redesigned your complete supply chain for more cost-efficient and sustainable operations?
Supply Chain Network Optimisation determines where goods should be produced to serve markets at the lowest cost and in an environmentally friendly way.
We must consider real-world constraints (capacity, demand) to find the optimal set of factories that minimises the objective function.

As a Supply Chain Solution Manager, I have led several network design studies that typically took 10–12 weeks.
The final deliverable was usually a deck of slides presenting several scenarios, allowing supply chain directors to weigh the trade-offs.

But decision-makers were often frustrated during the presentations of the study results:
Director: "What if we increase the factory capacity by 25%?"
They wanted to challenge assumptions and re-run scenarios live, while all we had were the slides we had taken hours to prepare.
What if we could improve this user experience using conversational agents?
In this article, I show how I connected an MCP server to a FastAPI microservice running a Supply Chain Network Optimisation algorithm.

The result is a conversational agent that can run one or several scenarios and provide a detailed analysis with clear visuals.
We will even ask this agent to advise us on the best decision to take, considering our goals and constraints.

For this experiment, I will use:
- Claude Desktop as the conversational interface
- An MCP server to expose typed tools to the agent
- A FastAPI microservice with the network optimisation endpoint
In the first section, I will introduce the problem of Supply Chain Network Design with a concrete example.
Then, I will show several deep analyses performed by the conversational agent to support strategic decision-making.

For the first time, I have been impressed by AI: the agent selected the right visuals to answer an open question without any guidance!
Supply Chain Network Optimisation with Python
Problem Statement: Supply Chain Network Design
We are supporting the Supply Chain Director of an international manufacturing company that would like to redefine its network as part of a long-term transformation plan.

This multinational company has operations in five different markets: Brazil, the USA, Germany, India and Japan.

To meet this demand, we can open low- or high-capacity factories in each of the markets.

If you open a facility, you must consider the fixed costs (associated with electricity, real estate, and CAPEX) and the variable costs per unit produced.

In this example, high-capacity plants in India have lower fixed costs than lower-capacity plants in the USA.

Additionally, there are the costs associated with shipping a container from Country XXX to Country YYY.
Everything summed up defines the total cost of producing and delivering products from a manufacturing site to the different markets.
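As a purely illustrative build-up (made-up numbers, not the case-study data), the landed cost of one plant-to-market lane combines these components:

```python
# Illustrative landed-cost build-up for one plant -> market lane (made-up numbers)
fixed_cost = 1_000_000       # yearly fixed cost of the opened plant (EUR)
variable_cost_unit = 95      # production cost per unit (EUR/unit)
freight_cost_unit = 15       # shipping cost per unit to the market (EUR/unit)
units_shipped = 10_000

total_cost = fixed_cost + (variable_cost_unit + freight_cost_unit) * units_shipped
print(f"Landed cost per unit: {total_cost / units_shipped:.2f} EUR")  # 210.00 EUR
```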
What about sustainability?
In addition to these parameters, we consider the amount of resources consumed per unit produced.

For instance, we consume 780 MJ of energy and 3,500 litres of water to produce a single unit in Indian factories.
For the environmental impacts, we also consider the pollution resulting from CO2 emissions and waste generation.

In the example above, Japan is the cleanest production country.
Where should we produce to minimise water usage?
The idea is to select a metric to minimise, which can be costs, water usage, CO2 emissions or energy usage.

The model will indicate where to locate factories and outline the flows from these factories to the various markets.
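To make the underlying model concrete, here is a minimal sketch of such a 0–1 facility-location model written with PuLP; all numbers are hypothetical placeholders, and the actual formulation, cost data, capacities, and environmental constraints live in the FastAPI backend.

```python
# Minimal facility-location sketch (hypothetical data, not the LogiGreen model)
import pulp

markets = ["USA", "GERMANY", "JAPAN", "BRAZIL", "INDIA"]
sizes = ["LOW", "HIGH"]

demand = {m: 1_000 for m in markets}                                      # units per market (made up)
cap = {(i, s): 1_500 if s == "LOW" else 4_000 for i in markets for s in sizes}
fixed = {(i, s): 50_000 if s == "LOW" else 120_000 for i in markets for s in sizes}
var = {(i, j): 80 if i == j else 110 for i in markets for j in markets}   # production + freight per unit

prob = pulp.LpProblem("network_design", pulp.LpMinimize)
open_ = pulp.LpVariable.dicts("open", [(i, s) for i in markets for s in sizes], cat="Binary")
flow = pulp.LpVariable.dicts("flow", [(i, j) for i in markets for j in markets], lowBound=0)

# Objective: fixed costs of opened plants + variable cost of every flow
prob += (pulp.lpSum(fixed[i, s] * open_[i, s] for i in markets for s in sizes)
         + pulp.lpSum(var[i, j] * flow[i, j] for i in markets for j in markets))

# Each market receives exactly its demand
for j in markets:
    prob += pulp.lpSum(flow[i, j] for i in markets) == demand[j]

# A country cannot ship more than the capacity of its opened plants
for i in markets:
    prob += pulp.lpSum(flow[i, j] for j in markets) <= pulp.lpSum(cap[i, s] * open_[i, s] for s in sizes)

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("Opened plants:", [k for k, v in open_.items() if v.value() == 1])
```

In the same spirit, the cost coefficients in the objective can be swapped for CO2, water, or energy factors to reproduce the sustainability scenarios.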
This solution has been packaged as a web application (FastAPI backend, Streamlit front-end) used as a demo to showcase the capabilities of our startup LogiGreen.

The idea of today's experiment is to connect the backend to Claude Desktop using a local MCP server built with Python.
FastAPI Microservice: 0–1 Mixed-Integer Optimiser for Supply Chain Network Design
This tool is an optimisation model packaged in a FastAPI microservice.
What are the input data for this problem?
As inputs, we should provide the objective function (mandatory) and constraints on the maximum environmental impact per unit produced (optional).
from pydantic import BaseModel
from typing import Optional
from app.utils.config_loader import load_config

config = load_config()

class LaunchParamsNetwork(BaseModel):
    objective: Optional[str] = 'Production Cost'
    max_energy: Optional[float] = config["network_analysis"]["params_mapping"]["max_energy"]
    max_water: Optional[float] = config["network_analysis"]["params_mapping"]["max_water"]
    max_waste: Optional[float] = config["network_analysis"]["params_mapping"]["max_waste"]
    max_co2prod: Optional[float] = config["network_analysis"]["params_mapping"]["max_co2prod"]
The default values for the thresholds are stored in a config file.
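For reference, here is a minimal sketch of what load_config could look like, assuming a YAML file holding the keys used above (the real loader and file are not shown in this article):

```python
import yaml  # PyYAML


def load_config(path: str = "config.yaml") -> dict:
    """Load application settings, including the default environmental caps."""
    with open(path) as f:
        return yaml.safe_load(f)

# config.yaml (assumed structure):
# network_analysis:
#   params_mapping:
#     max_energy: 780
#     max_water: 3500
#     max_waste: 0.78
#     max_co2prod: 41
```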
We send these parameters to a dedicated endpoint, launch_network, that will run the optimisation algorithm.
@router.post("/launch_network")
async def launch_network(request: Request, params: LaunchParamsNetwork):
    try:
        session_id = request.headers.get('session_id', 'session')
        directory = config['general']['folders']['directory']
        folder_in = f'{directory}/{session_id}/network_analysis/input'
        folder_out = f'{directory}/{session_id}/network_analysis/output'
        network_analyzer = NetworkAnalysis(params, folder_in, folder_out)
        output = await network_analyzer.process()
        return output
    except Exception as e:
        logger.error(f"[Network]: Error in /launch_network: {str(e)}")
        raise HTTPException(status_code=500, detail=f"Failed to launch Network analysis: {str(e)}")
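Before wiring the route to an agent, it can be checked manually; here is a quick sketch with httpx, assuming the service runs locally on port 8000 and the router is mounted under the /network prefix used later in the MCP server:

```python
import httpx

payload = {"objective": "Production Cost", "max_water": 3500}
r = httpx.post(
    "http://localhost:8000/network/launch_network",  # assumed local base URL and prefix
    json=payload,
    headers={"session_id": "manual_test"},
    timeout=60,
)
r.raise_for_status()
print(r.json())
```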
The API returns the JSON outputs in two parts.
In the section input_params, you can find:
- The objective function chosen
- All the maximum limits per environmental impact
{ "input_params":
{ "goal": "Manufacturing Value",
"max_energy": 780,
"max_water": 3500,
"max_waste": 0.78,
"max_co2prod": 41,
"unit_monetary": "1e6",
"loc": [ "USA", "GERMANY", "JAPAN", "BRAZIL", "INDIA" ],
"n_loc": 5,
"plant_name": [ [ "USA", "LOW" ], [ "GERMANY", "LOW" ], [ "JAPAN", "LOW" ], [ "BRAZIL", "LOW" ], [ "INDIA", "LOW" ], [ "USA", "HIGH" ], [ "GERMANY", "HIGH" ], [ "JAPAN", "HIGH" ], [ "BRAZIL", "HIGH" ], [ "INDIA", "HIGH" ] ],
"prod_name": [ [ "USA", "USA" ], [ "USA", "GERMANY" ], [ "USA", "JAPAN" ], [ "USA", "BRAZIL" ], [ "USA", "INDIA" ], [ "GERMANY", "USA" ], [ "GERMANY", "GERMANY" ], [ "GERMANY", "JAPAN" ], [ "GERMANY", "BRAZIL" ], [ "GERMANY", "INDIA" ], [ "JAPAN", "USA" ], [ "JAPAN", "GERMANY" ], [ "JAPAN", "JAPAN" ], [ "JAPAN", "BRAZIL" ], [ "JAPAN", "INDIA" ], [ "BRAZIL", "USA" ], [ "BRAZIL", "GERMANY" ], [ "BRAZIL", "JAPAN" ], [ "BRAZIL", "BRAZIL" ], [ "BRAZIL", "INDIA" ], [ "INDIA", "USA" ], [ "INDIA", "GERMANY" ], [ "INDIA", "JAPAN" ], [ "INDIA", "BRAZIL" ], [ "INDIA", "INDIA" ] ],
"total_demand": 48950
}
I also added information to bring context to the agent:
- plant_name: a list of all the potential production locations we can open, by location and type
- prod_name: the list of all the potential production flows we can have (production site, market)
- total_demand: the total demand of all the markets
We don't return the demand per market, as it is loaded on the backend side.
And you have the results of the analysis.
{
"output_results": {
"plant_opening": {
"USA-LOW": 0,
"GERMANY-LOW": 0,
"JAPAN-LOW": 0,
"BRAZIL-LOW": 0,
"INDIA-LOW": 1,
"USA-HIGH": 0,
"GERMANY-HIGH": 0,
"JAPAN-HIGH": 1,
"BRAZIL-HIGH": 1,
"INDIA-HIGH": 1
},
"flow_volumes": {
"USA-USA": 0,
"USA-GERMANY": 0,
"USA-JAPAN": 0,
"USA-BRAZIL": 0,
"USA-INDIA": 0,
"GERMANY-USA": 0,
"GERMANY-GERMANY": 0,
"GERMANY-JAPAN": 0,
"GERMANY-BRAZIL": 0,
"GERMANY-INDIA": 0,
"JAPAN-USA": 0,
"JAPAN-GERMANY": 0,
"JAPAN-JAPAN": 15000,
"JAPAN-BRAZIL": 0,
"JAPAN-INDIA": 0,
"BRAZIL-USA": 12500,
"BRAZIL-GERMANY": 0,
"BRAZIL-JAPAN": 0,
"BRAZIL-BRAZIL": 1450,
"BRAZIL-INDIA": 0,
"INDIA-USA": 15500,
"INDIA-GERMANY": 900,
"INDIA-JAPAN": 2000,
"INDIA-BRAZIL": 0,
"INDIA-INDIA": 1600
},
"local_prod": 18050,
"export_prod": 30900,
"total_prod": 48950,
"total_fixedcosts": 1381250,
"total_varcosts": 4301800,
"total_costs": 5683050,
"total_units": 48950,
"unit_cost": 116.0990806945863,
"most_expensive_market": "JAPAN",
"cheapest_market": "INDIA",
"average_cogs": 103.6097067006946,
"unit_energy": 722.4208375893769,
"unit_water": 3318.2839632277833,
"unit_waste": 0.6153217568947906,
"unit_co2": 155.71399387129725
}
}
They include:
- plant_opening: a list of boolean values set to 1 if a site is open. Four sites open in this scenario: one low-capacity plant in India and three high-capacity plants in India, Japan, and Brazil.
- flow_volumes: a mapping of the flows between countries. Brazil will produce 12,500 units for the USA.
- Overall volumes with local_prod, export_prod and total_prod
- A cost breakdown with total_fixedcosts, total_varcosts and total_costs, together with an analysis of the COGS
- Environmental impacts per unit delivered, with resource usage (energy, water) and pollution (CO2, waste).
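A few quick consistency checks can be run on this JSON before handing it to an agent; a small sketch, assuming the response above is stored in a variable named output:

```python
results = output["output_results"]

# Total shipped volume should match total production and total units
assert sum(results["flow_volumes"].values()) == results["total_prod"] == results["total_units"]

# Unit cost should equal total costs divided by total units
assert abs(results["total_costs"] / results["total_units"] - results["unit_cost"]) < 1e-6

# Share of demand served locally vs. exported
local_share = results["local_prod"] / results["total_prod"]
print(f"Local production share: {local_share:.1%}")  # about 36.9% in this scenario
```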
This network design can be visually represented with this Sankey chart.
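For reference, a similar chart can be rebuilt from flow_volumes in a few lines of Plotly; a minimal sketch, with labels and styling that will differ from the LogiGreen app:

```python
import plotly.graph_objects as go

# Non-zero flows from the sample run above
flows = {"JAPAN-JAPAN": 15000, "BRAZIL-USA": 12500, "BRAZIL-BRAZIL": 1450,
         "INDIA-USA": 15500, "INDIA-GERMANY": 900, "INDIA-JAPAN": 2000, "INDIA-INDIA": 1600}

origins = sorted({k.split("-")[0] for k in flows})
markets = sorted({k.split("-")[1] for k in flows})
labels = [f"{o} (plant)" for o in origins] + [f"{m} (market)" for m in markets]

links = dict(
    source=[origins.index(k.split("-")[0]) for k in flows],
    target=[len(origins) + markets.index(k.split("-")[1]) for k in flows],
    value=list(flows.values()),
)

fig = go.Figure(go.Sankey(node=dict(label=labels), link=links))
fig.show()
```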

Let us see what our conversational agent can do with that!
Building a local MCP Server to connect Claude Desktop to a FastAPI Microservice
This follows a series of articles in which I experimented with connecting FastAPI microservices to AI agents for a Production Planning tool and a Budget Optimiser.
This time, I wanted to replicate the experiment with Anthropic's Claude Desktop.
Set up a local MCP Server in WSL
I will run everything inside WSL (Ubuntu) and let Claude Desktop (Windows) communicate with my MCP server via a small JSON configuration.
The first step was to install the uv package manager:
uv (Python package manager) inside WSL
We can now use it to initiate a project with a local environment:
# Create a dedicated folder for the project workspace
mkdir -p ~/mcp_tuto && cd ~/mcp_tuto
# Init a uv project
uv init .
# Add the MCP Python SDK (with CLI)
uv add "mcp[cli]"
# Add the libraries needed
uv add fastapi uvicorn httpx pydantic
This will be used by our `network.py` file that will contain our server setup:
import logging
import os

import httpx
from mcp.server.fastmcp import FastMCP

from models.network_models import LaunchParamsNetwork

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s - %(message)s",
    handlers=[
        logging.FileHandler("app.log"),
        logging.StreamHandler()
    ]
)

mcp = FastMCP("NetworkServer")
For the input parameters, I have defined a model in a separate file, network_models.py:
from pydantic import BaseModel
from typing import Optional

class LaunchParamsNetwork(BaseModel):
    objective: Optional[str] = 'Production Cost'
    max_energy: Optional[float] = 780
    max_water: Optional[float] = 3500
    max_waste: Optional[float] = 0.78
    max_co2prod: Optional[float] = 41
This will make sure that the agent sends the right queries to the FastAPI microservice.
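As a quick illustration, this is what the serialised payload looks like when a field is left unset (output shown is indicative):

```python
params = LaunchParamsNetwork(objective="CO2 Emissions", max_water=None)
print(params.model_dump(exclude_none=True))
# e.g. {'objective': 'CO2 Emissions', 'max_energy': 780, 'max_waste': 0.78, 'max_co2prod': 41}
```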
Before starting to build the functionalities of our MCP Server, we need to make sure that Claude Desktop (Windows) can find network.py.

As I am using WSL, I could only do it manually using the Claude Desktop config JSON file:
- Open Claude Desktop → Settings → Developer → Edit Config (or open the config file directly).
- Add an entry that starts your MCP server in WSL:
{
  "mcpServers": {
    "Network": {
      "command": "wsl",
      "args": [
        "-d",
        "Ubuntu",
        "bash",
        "-lc",
        "cd ~/mcp_tuto && uv run --with mcp[cli] mcp run network.py"
      ],
      "env": {
        "NETWORK_API_URL": "http://<IP>:<PORT>"
      }
    }
  }
}
With this config file, we instruct Claude Desktop to run WSL in the folder mcp_tuto and use uv to run mcp[cli], launching network.py.
If you are in this specific case of building your MCP server on a Windows machine using WSL, you can follow this approach.
You can initiate your server with this "special" functionality that can be used by Claude as a tool.
@mcp.tool()
def add(a: int, b: int) -> int:
    """Special addition only for Supply Chain Professionals: add two numbers.
    Make sure the person is a supply chain professional before using this tool.
    """
    logging.info(f"Test: Adding {a} and {b}")
    return a - b
We inform Claude (in the docstring) that this addition is intended for Supply Chain Professionals only.
If you restart Claude Desktop, you should be able to see this functionality under Network.

You will find our "special addition", called Add, which is now waiting for us to use!

Let's test it now with a simple question.

We can see that the conversational agent is calling the right function based on the context provided in the question.

It even provides a nice comment questioning the validity of the results.
What if we make the exercise a bit more complex?
I will create a hypothetical scenario to determine if the conversational agent can associate a context with the use of a tool.

Let us see what happens when we ask a question requiring the use of addition.

Even if reluctantly, the agent had the reflex to use the special add tool for Samir, as he is a supply chain professional.
Now that we are familiar with our new MCP server, we can start adding tools for Supply Chain Network Optimisation.
Build a Supply Chain Optimisation MCP Server connected to a FastAPI Microservice
We can get rid of the special add tool and start introducing the key parameters to connect to the FastAPI microservice.
from typing import Any, Dict, Optional

# Endpoint config
API = os.getenv("NETWORK_API_URL")
LAUNCH = f"{API}/network/launch_network"  # <- network route

last_run: Optional[Dict[str, Any]] = None
The variable last_run will be used to store the results of the last run.
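As a small optional extension (my own addition, not part of the article's server), a companion tool could expose this cache so the agent can re-read the last results without re-running the optimisation:

```python
@mcp.tool()
def get_last_run() -> dict:
    """Return the results of the last network optimisation run, if any."""
    if last_run is None:
        return {"error": "No run stored yet. Call run_network first."}
    return last_run
```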
We need to create a tool that can connect to the FastAPI microservice.
For that, we introduce the function below.
@mcp.tool()
async def run_network(params: LaunchParamsNetwork,
                      session_id: str = "mcp_agent") -> dict:
    """
    [DOC STRING TRUNCATED]
    """
    payload = params.model_dump(exclude_none=True)
    try:
        async with httpx.AsyncClient(timeout=httpx.Timeout(5, read=60)) as c:
            r = await c.post(LAUNCH, json=payload, headers={"session_id": session_id})
            r.raise_for_status()
            logging.info(f"[NetworkMCP] Run successful with params: {payload}")
            data = r.json()
        result = data[0] if isinstance(data, list) and data else data
        global last_run
        last_run = result
        return result
    except httpx.HTTPError as e:
        code = getattr(getattr(e, "response", None), "status_code", "unknown")
        logging.error(f"[NetworkMCP] API call failed: {e}")
        return {"error": f"{code} {e}"}
This function takes parameters following the Pydantic model LaunchParamsNetwork, sending a clean JSON payload with None fields dropped.
It calls the FastAPI endpoint asynchronously and collects the results, which are cached in last_run.
The key part of this function is the docstring, which I removed from the code snippet for concision, as this is the only way to describe to the agent what the function does.
Section 1: Context
"""
Run the LogiGreen Provide Chain Community Optimization.
WHAT IT SOLVES
--------------
A facility-location + circulation project mannequin. It decides:
1) which vegetation to open (LOW/HIGH capability by nation), and
2) what number of models every plant ships to every market,
to both decrease whole value or an environmental footprint (CO₂, water, power),
below capability and elective per-unit footprint caps.
"""
The first section simply introduces the context in which the tool is used.
Section 2: Describe Input Data
"""
INPUT (LaunchParamsNetwork)
---------------------------
- goal: str (default "Manufacturing Value")
Considered one of {"Manufacturing Value", "CO2 Emissions", "Water Utilization", "Vitality Utilization"}.
Units the optimization goal.
- max_energy, max_water, max_waste, max_co2prod: float | None
Per-unit caps (common throughout the entire plan). If omitted, service defaults
out of your config are used. Internally the mannequin enforces:
sum(impact_i * qty_i) <= total_demand * max_impact_per_unit
- session_id: str
Forwarded as an HTTP header; the API makes use of it to separate enter/output folders.
"""
This brief description is essential if we want to make sure the agent adheres to the Pydantic schema of input parameters imposed by our FastAPI microservice.
Section 3: Description of output results
"""
OUTPUT (matches your service schema)
------------------------------------
The service returns { "input_params": {...}, "output_results": {...} }.
Right here’s what the fields imply, utilizing your pattern:
input_params:
- goal: "Manufacturing Value" # goal truly used
- max_energy: 780 # per-unit most power utilization (MJ/unit)
- max_water: 3500 # per-unit most water utilization (L/unit)
- max_waste: 0.78 # per-unit most waste (kg/unit)
- max_co2prod: 41 # per-unit most CO₂ manufacturing (kgCO₂e/unit, manufacturing solely)
- unit_monetary: "1e6" # prices might be expressed in M€ by dividing by 1e6
- loc: ["USA","GERMANY","JAPAN","BRAZIL","INDIA"] # nations in scope
- n_loc: 5 # variety of nations
- plant_name: [("USA","LOW"),...,("INDIA","HIGH")] # resolution keys for plant opening
- prod_name: [(i,j) for i in loc for j in loc] # resolution keys for flows i→j
- total_demand: 48950 # whole market demand (models)
output_results:
- plant_opening: {"USA-LOW":0, ... "INDIA-HIGH":1}
Binary open/shut by (country-capacity). Instance above opens:
INDIA-LOW, JAPAN-HIGH, BRAZIL-HIGH, INDIA-HIGH.
- flow_volumes: {"INDIA-USA":15500, "BRAZIL-USA":12500, "JAPAN-JAPAN":15000, ...}
Optimum cargo plan (models) from manufacturing nation to market.
- local_prod, export_prod, total_prod: 18050, 30900, 48950
Native vs. export quantity with whole = demand feasibility verify.
- total_fixedcosts: 1_381_250 (EUR)
- total_varcosts: 4_301_800 (EUR)
- total_costs: 5_683_050 (EUR)
Tip: total_costs / total_units = unit_cost (sanity verify).
- total_units: 48950
- unit_cost: 116.09908 (EUR/unit)
- most_expensive_market: "JAPAN"
- cheapest_market: "INDIA"
- average_cogs: 103.6097 (EUR/unit throughout markets)
- unit_energy: 722.4208 (MJ/unit)
- unit_water: 3318.284 (L/unit)
- unit_waste: 0.6153 (kg/unit)
- unit_co2: 35.5485 (kgCO₂e/unit)
"""
This part describes to the agent the outputs it will receive.
I did not want to rely only on "self-explicit" naming of the variables in the JSON.
I wanted to make sure it could understand the data it has at hand to produce summaries following the guidelines listed below.
"""
HOW TO READ THIS RUN (based mostly on the pattern JSON)
-----------------------------------------------
- Goal = value: the mannequin opens 4 vegetation (INDIA-LOW, JAPAN-HIGH, BRAZIL-HIGH, INDIA-HIGH),
closely exporting from INDIA and BRAZIL to the USA, whereas JAPAN provides itself.
- Unit economics: unit_cost ≈ €116.10; total_costs ≈ €5.683M (divide by 1e6 for M€).
- Market economics: “JAPAN” is the most costly market; “INDIA” the most affordable.
- Localization ratio: local_prod / total_prod = 18,050 / 48,950 ≈ 36.87% native, 63.13% export.
- Footprint per unit: e.g., unit_co2 ≈ 35.55 kgCO₂e/unit. To approximate whole CO₂:
unit_co2 * total_units ≈ 35.55 * 48,950 ≈ 1,740,100 kgCO₂e (≈ 1,740 tCO₂e).
QUICK SANITY CHECKS
-------------------
- Demand steadiness: sum_i circulation(i→j) == demand(j) for every market j.
- Capability: sum_j circulation(i→j) ≤ sum_s CAP(i,s) * open(i,s) for every i.
- Unit-cost verify: total_costs / total_units == unit_cost.
- If infeasible: your per-unit caps (max_water/power/waste/CO₂) could also be too tight.
TYPICAL USES
------------
- Baseline vs. sustainability: run as soon as with goal="Manufacturing Value", then with
goal="CO2 Emissions" (or Water/Vitality) utilizing the identical caps to quantify the
trade-off (Δcost, Δunit_CO₂, change in plant openings/flows).
- Narrative for execs: report high flows (e.g., INDIA→USA=15.5k, BRAZIL→USA=12.5k),
open websites, unit value, and per-unit footprints. Convert prices to M€ with unit_monetary.
EXAMPLES
--------
# Min value baseline
run_network(LaunchParamsNetwork(goal="Manufacturing Value"))
# Reduce CO₂ with a water cap
run_network(LaunchParamsNetwork(goal="CO2 Emissions", max_water=3500))
# Reduce Water with an power cap
run_network(LaunchParamsNetwork(goal="Water Utilization", max_energy=780))
"""
I share a list of potential scenarios and explanations of the type of analysis I expect, using an actual example.
This is far from concise, but my goal here is to make sure that the agent is equipped to use the tool to its full potential.
Experiment with the tool: from simple to complex instructions
To test the workflow, I ask the agent to run the simulation with default parameters.

As expected, the agent calls the FastAPI microservice, collects the results, and concisely summarises them.
This is cool, but I already had that with my Production Planning Optimisation Agent built with LangGraph and FastAPI.

I wanted to explore MCP Servers with Claude Desktop for more advanced usage.
Supply Chain Director: "I want a comparative study of multiple scenarios."
If we come back to the original plan, the idea was to equip our decision-makers (the customers who pay us) with a conversational agent that would help them in their decision-making process.
Let us try a more advanced question:

We explicitly request a comparative study while allowing Claude Sonnet 4 to be creative in terms of visual rendering.

To be honest, I was impressed by the dashboard generated by Claude, which you can access via this link.
At the top, you can find an executive summary listing what can be considered the most important indicators of this problem.

The model understood, without being explicitly asked in the prompt, that these four indicators were key to the decision-making process resulting from this study.
At this stage, in my opinion, we already get the added value of incorporating an LLM into the loop.
The following outputs are more typical and could have been generated with deterministic code.

However, I admit that the creativity of Claude outperformed my own web application with this nice visual showing the plant openings per scenario.

While I was starting to worry about being replaced by AI, I had a look at the strategic analysis generated by the agent.

The approach of comparing each scenario against a cost-optimisation baseline was never explicitly requested.
The agent took the initiative to bring up this angle when presenting the results.
This seemed to demonstrate an ability to select the right indicators to convey a message effectively using data.
Can we ask open questions?
Let me explore that in the next section.
A Conversational Agent capable of decision-making?
To further explore the capabilities of our new tool and test its potential, I will pose open-ended questions.
Question 1: Trade-off between cost and sustainability

This is the type of question I used to get when I was in charge of network studies.

This appeared to be a recommendation to adopt the water-optimised strategy to find the right balance.

It used compelling visuals to support its point.
I really like the cost vs. environmental impact scatter plot!

Unlike some strategy consulting firms, it did not neglect the implementation part.
For more details, you can access the complete dashboard at this link.
Let's try another challenging question.
Question 2: Best CO2 Emissions Performance

This is a challenging question that required seven runs to answer.

This was enough to answer the question with the right solution.

What I appreciate the most is the quality of the visuals used to support its reasoning.

In the visual above, we can see the different scenarios simulated by the tool.
Although we could question the wrong orientation of the x-axis, the visual remains self-explanatory.

Where I feel overwhelmed by the LLM is when we look at the quality and concision of the strategic recommendations.
Considering that these recommendations serve as the primary point of contact with decision-makers, who often lack the time to delve into details, this remains a strong argument in favour of using this agent.
Conclusion
This experiment is a success!
There is no doubt about the added value of MCP Servers compared to the simple AI workflows introduced in the previous articles.
When you have an optimisation module with multiple scenarios (depending on objective functions and constraints), you can leverage MCP servers to enable agents to make decisions based on data.
I would apply this solution to algorithms like:
These are opportunities to equip your entire supply chain with conversational agents (connected to optimisation tools) that can support decision-making.
Can we go beyond operational topics?
The reasoning capacity that Claude showcased in this experiment also inspired me to explore business topics.
A solution presented in one of my YouTube tutorials could be a good candidate for our next MCP integration.

The aim was to help a friend who runs a business in the food and beverage industry.
They sell reusable cups produced in China to coffee shops and bars in Paris.

I wanted to use Python to simulate their entire value chain to identify optimisation levers to maximise their profitability.

This algorithm, also packaged in a FastAPI microservice, can become your next data-driven business strategy advisor.

Part of the job involves simulating multiple scenarios to determine the optimal trade-off between several metrics.
I can clearly see a conversational agent powered by an MCP server doing the job perfectly.
For more information, have a look at the video linked below.
I will share this new experiment in a future article.
Stay tuned!
Looking for inspiration?
You have arrived at the end of this article, and you are ready to set up your own MCP server?
As I shared the initial steps to set up the server with the example of the add function, you can now implement any functionality.
You do not need to use a FastAPI microservice.
The tools can be created directly in the same environment where the MCP server is hosted (here, locally).
If you are looking for inspiration, I have shared dozens of analytics products (solving actual operational problems, with source code) in the article linked here.
About Me
Let's connect on LinkedIn and Twitter. I am a Supply Chain Engineer who uses data analytics to improve logistics operations and reduce costs.
For consulting or advice on analytics and sustainable supply chain transformation, feel free to contact me via Logigreen Consulting.
If you are interested in Data Analytics and Supply Chain, have a look at my website.