The hype around LLMs is now evolving into the hype of Agentic AI. While I hope this article doesn't fall into the "over-hyped" category, I personally believe this topic is worth learning. Coming from a data and analytics background, I find that getting familiar with it is very useful in day-to-day work and helps prepare for how it may reshape existing processes.
My own journey with Agentic AI is still quite new (after all, it's a relatively new topic), and I'm still learning along the way. In this series of articles, I'd like to share a beginner-friendly, step-by-step guide to building Agentic AI based on my personal experience, focusing on the OpenAI Agents SDK framework. Some topics I plan to cover in this series include: tool-use agents, multi-agent collaboration, structured output, generating data visualizations, chat features, and more. So stay tuned!
In this article, we'll start by building a basic agent and then enhance it into a tool-using agent capable of retrieving data from an API. Finally, we'll wrap everything in a simple Streamlit UI so users can interact with the agent we build.
Throughout this guide, we'll stick to a single use case: creating a weather assistant app. I chose this example because it's relatable for everyone and covers most of the topics I plan to share. Since the use case is simple and generic, you can easily adapt this guide to your own projects.
The link to the GitHub repository and the deployed Streamlit app is provided at the end of this article.
A Brief Intro to the OpenAI Agents SDK
The OpenAI Agents SDK is a Python-based framework that allows us to create an agentic AI system in a simple and easy-to-use way [1]. As a beginner myself, I found this claim to be quite true, which makes the learning journey feel less intimidating.
At the core of this framework are "Agents": Large Language Models (LLMs) that we can configure with specific instructions and tools they can use.
As we already know, an LLM is trained on a huge amount of data, giving it strong capabilities in understanding human language and producing text or images. When combined with clear instructions and the ability to interact with tools, it becomes more than just a generator: it can act, and it becomes an agent [2].
One practical use of tools is enabling an agent to retrieve factual data from external sources. This means the LLM no longer relies solely on its (often outdated) training data, allowing it to produce more accurate and up-to-date results.
In this article, we will focus on this advantage by building an agent that can retrieve "real-time" data from an API. Let's get started!
Set Up the Environment
Create a requirements.txt file containing the following two packages. I prefer using requirements.txt for two reasons: reusability and preparing the project for Streamlit deployment.

openai-agents
streamlit
Next, set up a virtual environment named venv and install the packages listed above. Run the following commands in your terminal:

python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
Finally, since we will use the OpenAI API to call the LLM, you need an API key (get your API key here). Store this key in a .env file as follows. Important: make sure to add .env to your .gitignore file if you are using Git for this project.

OPENAI_API_KEY=your_openai_key_here

Once everything is set up, you're good to go!
A Simple Agent
Let's begin with a simple agent by creating a Python file called 01-single-agent.py.
Import Libraries
The first thing we need to do in the script is import the necessary libraries:
from agents import Agent, Runner
import asyncio
from dotenv import load_dotenv

load_dotenv()
From the Agents SDK package, we use Agent to define the agent and Runner to run it. We also import asyncio to enable our program to perform multiple tasks without waiting for one to finish before starting another. Finally, load_dotenv from the dotenv package loads the environment variables we defined earlier in the .env file. In our case, this includes OPENAI_API_KEY, which will be used by default when we define and call an agent.
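If you are curious what load_dotenv actually does under the hood, here is a simplified stand-in (not the real python-dotenv library, just an illustration of the idea): it reads KEY=VALUE lines from a file and exports them as environment variables. The file name and variable name below are made up for the demo, so it is safe to run anywhere.

```python
import os
import tempfile

def load_env_file(path: str) -> None:
    """Simplified stand-in for dotenv.load_dotenv: parse KEY=VALUE lines
    and export them as environment variables, skipping comments and
    leaving already-set variables untouched."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())

# Demo with a throwaway file standing in for your real .env.
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("# comment lines are skipped\nDEMO_WEATHER_APP_KEY=dummy_key_for_testing\n")
    path = f.name

load_env_file(path)
print(os.getenv("DEMO_WEATHER_APP_KEY"))  # dummy_key_for_testing
os.remove(path)
```

The real library handles more edge cases (quoting, interpolation, overrides), but the mental model is the same: after the call, the key is available via os.environ, which is where the SDK picks it up.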
Define a Simple Agent
Generated using GraphViz.
Next, we will define a simple agent called Weather Assistant.
agent = Agent(
    name="Weather Assistant",
    instructions="You provide accurate and concise weather updates based on user queries in plain language."
)
An agent can be defined with several properties. In this simple example, we only configure the name and the instructions for the agent. If needed, we can also specify which LLM model to use. For instance, if we want to use a smaller model such as gpt-4o-mini (currently, the default model is gpt-4o), we can add the configuration as shown below.
agent = Agent(
    name="Weather Assistant",
    instructions="You provide accurate and concise weather updates based on user queries in plain language.",
    model="gpt-4o-mini"
)
There are several other parameters that we will cover later in this article and in the next one. For now, we will keep the model configuration simple as shown above.
After defining the agent, the next step is to create an asynchronous function that will run the agent.
async def run_agent():
    result = await Runner.run(agent, "What is the weather like today in Jakarta?")
    print(result.final_output)
The Runner.run(agent, ...) method calls the agent with the query "What is the weather like today in Jakarta?". The await keyword pauses the function until the task completes, allowing other asynchronous tasks (if any) to run in the meantime. The result of the run is stored in the result variable. To view the output, we print result.final_output to the terminal.
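If the async/await pattern is new to you, here is a tiny self-contained illustration (plain asyncio, no SDK involved) of how awaiting one task lets others run in the meantime. The fetch coroutine below is a made-up stand-in for a slow call such as an API request:

```python
import asyncio

async def fetch(name: str, delay: float) -> str:
    # Simulates slow I/O, e.g. waiting for an API response.
    await asyncio.sleep(delay)
    return f"{name} done"

async def main() -> None:
    # Both coroutines run concurrently, so total time is ~0.2s, not 0.3s.
    # gather returns results in argument order, not completion order.
    results = await asyncio.gather(fetch("first", 0.2), fetch("second", 0.1))
    print(results)  # ['first done', 'second done']

if __name__ == "__main__":
    asyncio.run(main())
```

This is the same mechanism our run_agent function relies on: while the agent waits on the OpenAI API, the event loop is free to do other work.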
The last part we need to add is the program's entry point, which executes the function when the script runs. We use asyncio.run to execute the run_agent function.
if __name__ == "__main__":
    asyncio.run(run_agent())
Run the Simple Agent
Now, let's run the script in the terminal by executing:
python 01-single-agent.py
The result is that the agent says it cannot provide the information. This is expected because the LLM was trained on past data and does not have access to real-time weather conditions.
I can't provide real-time information, but you can check a reliable weather website or app for the latest updates on Jakarta's weather today.
In the worst case, the agent might hallucinate by returning a random temperature and giving answers based on that value. To handle this situation, we will later enable the agent to call an API that retrieves the actual weather conditions.
Utilizing Hint
One of many helpful options of the Brokers SDK is Hint, which lets you visualize, debug, and monitor the workflow of the agent you’ve constructed and executed. You may entry the tracing dashboard right here: https://platform.openai.com/traces.
For our easy agent, the hint will appear like this:

In this dashboard, you can find useful information about how the workflow was executed, including the input and output of each step. Since this is a simple agent, we only have one agent run. However, as the workflow becomes more complex, this trace feature will be extremely helpful for monitoring and troubleshooting the process.
Person Interface with Streamlit
Beforehand, we constructed a easy script to outline and name an agent. Now, let’s make it extra interactive by including a person interface with Streamlit [3].
Let’s create a script named 02-single-agent-app.py
as proven beneath:
from agents import Agent, Runner
import asyncio
import streamlit as st
from dotenv import load_dotenv

load_dotenv()

agent = Agent(
    name="Weather Assistant",
    instructions="You provide accurate and concise weather updates based on user queries in plain language."
)

async def run_agent(user_input: str):
    result = await Runner.run(agent, user_input)
    return result.final_output

def main():
    st.title("Weather Assistant")
    user_input = st.text_input("Ask about the weather:")
    if st.button("Get Weather Update"):
        with st.spinner("Thinking..."):
            if user_input:
                agent_response = asyncio.run(run_agent(user_input))
                st.write(agent_response)
            else:
                st.write("Please enter a question about the weather.")

if __name__ == "__main__":
    main()
Compared to the previous script, we now import the Streamlit library to build an interactive app. The agent definition stays the same, but we modify the run_agent function to accept user input and pass it to the Runner.run function. Instead of printing the result directly to the console, the function now returns it.
In the main function, we use Streamlit components to build the interface: setting the title, adding a text box for user input, and creating a button that triggers the run_agent function.
The agent's response is stored in agent_response and displayed using the st.write component. To run this Streamlit app in your browser, use the following command:
streamlit run 02-single-agent-app.py

To stop the app, press Ctrl + C in your terminal.
To keep the article focused on the Agents SDK framework, I kept the Streamlit app as simple as possible. However, that doesn't mean you need to stop here. Streamlit offers a wide variety of components that allow you to get creative and make your app more intuitive and engaging. For a complete list of components, check the Streamlit documentation in the references section.
From this point onward, we will continue using this basic Streamlit structure.
A Tool-Use Agent
As we saw in the previous section, the agent struggles when asked about current weather conditions. It may return no information or, worse, produce a hallucinated answer. To ensure our agent uses real data, we can allow it to call an external API so it can retrieve actual information.
This process is a practical example of using Tools in the Agents SDK. In general, tools enable an agent to take actions, such as fetching data, running code, calling an API (as we will do shortly), or even interacting with a computer [1]. Using tools and taking actions is one of the key capabilities that distinguishes an agent from a plain LLM.
Let's dive into the code. First, create another file named 03-tooluse-agent-app.py.
Import Libraries
We will need the following libraries:
from agents import Agent, Runner, function_tool
import asyncio
import streamlit as st
from dotenv import load_dotenv
import requests

load_dotenv()
Notice that from the Agents SDK, we now import an additional module: function_tool. Since we will call an external API, we also import the requests library.
Outline the Perform Device
The API we are going to use is Open‑Meteo [4], which provides free entry for non‑business use. It offers many options, together with climate forecasts, historic knowledge, air high quality, and extra. On this article, we are going to begin with the only characteristic: retrieving present climate knowledge.
As an extra notice, Open‑Meteo offers its personal library, openmeteo‑requests
. Nevertheless, on this information I exploit a extra generic method with the requests
module, with the intention of creating the code reusable for different functions and APIs.
Right here is how we are able to outline a perform to retrieve the present climate for a selected location utilizing Open-Meteo:
@function_tool
def get_current_weather(latitude: float, longitude: float) -> dict:
    """
    Fetches current weather data for a given location using the Open-Meteo API.

    Args:
        latitude (float): The latitude of the location.
        longitude (float): The longitude of the location.

    Returns:
        dict: A dictionary containing the weather data, or an error message if the request fails.
    """
    try:
        url = "https://api.open-meteo.com/v1/forecast"
        params = {
            "latitude": latitude,
            "longitude": longitude,
            "current": "temperature_2m,relative_humidity_2m,dew_point_2m,apparent_temperature,precipitation,weathercode,windspeed_10m,winddirection_10m",
            "timezone": "auto"
        }
        response = requests.get(url, params=params)
        response.raise_for_status()  # Raise an error for HTTP issues
        return response.json()
    except requests.RequestException as e:
        return {"error": f"Failed to fetch weather data: {e}"}
The function takes latitude and longitude as inputs to identify the location and construct an API request. The parameters include metrics such as temperature, humidity, and wind speed. If the API request succeeds, the function returns the JSON response as a Python dictionary. If an error occurs, it returns an error message instead.
To make the function accessible to the agent, we decorate it with @function_tool, allowing the agent to call it when the user's query relates to current weather data.
Additionally, we include a docstring in the function, providing both a description of its purpose and details of its arguments. Including a docstring is extremely helpful for the agent to understand how to use the function.
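To get a feel for what the tool hands back, here is a small stand-alone helper that formats the "current" block of a typical Open-Meteo response into one line. The payload below is a hand-written sample (the field names follow the API shape we requested above, but the values are invented), and this is roughly the kind of post-processing the LLM will do for us in "run_llm_again" mode:

```python
# Hand-written sample mimicking a subset of an Open-Meteo /v1/forecast
# response; the values are invented for illustration.
sample_payload = {
    "timezone": "Asia/Jakarta",
    "current_units": {
        "temperature_2m": "°C",
        "relative_humidity_2m": "%",
        "windspeed_10m": "km/h",
    },
    "current": {
        "temperature_2m": 27.4,
        "relative_humidity_2m": 83,
        "windspeed_10m": 7.6,
    },
}

def summarize_current(payload: dict) -> str:
    """Format the 'current' section of an Open-Meteo response as one line."""
    current = payload.get("current", {})
    units = payload.get("current_units", {})
    parts = [
        f"{current['temperature_2m']}{units.get('temperature_2m', '')}",
        f"humidity {current['relative_humidity_2m']}{units.get('relative_humidity_2m', '')}",
        f"wind {current['windspeed_10m']} {units.get('windspeed_10m', '')}",
    ]
    return ", ".join(parts)

print(summarize_current(sample_payload))
# 27.4°C, humidity 83%, wind 7.6 km/h
```

Writing such a formatter by hand quickly becomes tedious as fields grow, which is exactly why delegating this step to the LLM is attractive.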
Define a Tool-Use Agent

Generated using GraphViz.
After defining the function, let's move on to defining the agent.
weather_specialist_agent = Agent(
    name="Weather Specialist Agent",
    instructions="You provide accurate and concise weather updates based on user queries in plain language.",
    tools=[get_current_weather],
    tool_use_behavior="run_llm_again"
)

async def run_agent(user_input: str):
    result = await Runner.run(weather_specialist_agent, user_input)
    return result.final_output
For the most part, the structure is the same as in the previous section. However, since we are now using tools, we need to add a few extra parameters.
The first is tools, a list of tools the agent can use. In this example, we provide only the get_current_weather tool. The next is tool_use_behavior, which configures how tool usage is handled. For this agent, we set it to "run_llm_again", which means that after receiving the response from the API, the LLM will process it further and present it in a clear, easy-to-read format. Alternatively, you can use "stop_on_first_tool", where the LLM will not process the tool's output any further. We will experiment with this option later.
The rest of the script follows the same structure we used earlier to build the main Streamlit function.
def main():
    st.title("Weather Assistant")
    user_input = st.text_input("Ask about the weather:")
    if st.button("Get Weather Update"):
        with st.spinner("Thinking..."):
            if user_input:
                agent_response = asyncio.run(run_agent(user_input))
                st.write(agent_response)
            else:
                st.write("Please enter a question about the weather.")

if __name__ == "__main__":
    main()
Make sure to save the script, then run it in the terminal:
streamlit run 03-tooluse-agent-app.py
You can now ask a question about the weather in your city. For example, when I asked about the current weather in Jakarta, at the time of writing (around four o'clock in the morning), the response was as shown below:

Now, instead of hallucinating, the agent can provide a human-readable report of the current weather conditions in Jakarta. You may recall that the get_current_weather function requires latitude and longitude as arguments. In this case, we rely on the LLM to supply them, as it is likely trained with basic location information. A future improvement would be to add a tool that retrieves more accurate geographic coordinates based on a city name.
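As a sketch of that improvement: Open-Meteo also offers a geocoding endpoint (https://geocoding-api.open-meteo.com/v1/search) that resolves a city name to coordinates. The helper below is an unverified sketch based on that endpoint's documented parameters; the request-building step is separated from the network call so the former can be checked without internet access. In the real app, you would decorate geocode_city with @function_tool and add it to the agent's tools list.

```python
def build_geocoding_request(city: str) -> tuple[str, dict]:
    """Build the URL and query parameters for Open-Meteo's geocoding API."""
    url = "https://geocoding-api.open-meteo.com/v1/search"
    params = {"name": city, "count": 1, "language": "en", "format": "json"}
    return url, params

def geocode_city(city: str) -> dict:
    """Resolve a city name to latitude/longitude (requires network access)."""
    import requests  # imported here so the builder above stays dependency-free

    url, params = build_geocoding_request(city)
    response = requests.get(url, params=params, timeout=10)
    response.raise_for_status()
    results = response.json().get("results") or []
    if not results:
        return {"error": f"No match found for {city!r}"}
    top = results[0]
    return {"latitude": top["latitude"], "longitude": top["longitude"]}

url, params = build_geocoding_request("Jakarta")
print(params["name"])  # Jakarta
```

With both tools registered, the agent could chain them itself: first geocode the city, then pass the coordinates to get_current_weather.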
(Optional) Use "stop_on_first_tool"
Out of curiosity, let's try changing the tool_use_behavior parameter to "stop_on_first_tool" and see what it returns.

As expected, without the LLM's help to parse and transform the JSON response, the output is much harder to read. However, this behavior can be useful in scenarios where you need a raw, structured result without any additional processing by the LLM.
Improved Instructions
Now, let's change the tool_use_behavior parameter back to "run_llm_again".
As we've seen, using an LLM is very helpful for parsing the result. We can take this a step further by giving the agent more detailed instructions, specifically asking for structured output and practical suggestions. To do this, update the instructions parameter as follows:
instructions = """
You are a weather assistant agent.
Given current weather data (including temperature, humidity, wind speed/direction, precipitation, and weather codes), provide:
1. A clear and concise explanation of the current weather conditions.
2. Practical suggestions or precautions for outdoor activities, travel, health, or clothing based on the data.
3. If any severe weather is detected (e.g., heavy rain, thunderstorms, extreme heat), highlight important safety measures.
Format your response in two sections:
Weather Summary:
- Briefly describe the weather in plain language.
Suggestions:
- Offer actionable advice relevant to the weather conditions.
"""
After saving the changes, rerun the app. Using the same question, you should now receive a clearer, well-structured response along with practical suggestions.

The final script of 03-tooluse-agent-app.py can be seen here.
from agents import Agent, Runner, function_tool
import asyncio
import streamlit as st
from dotenv import load_dotenv
import requests

load_dotenv()

@function_tool
def get_current_weather(latitude: float, longitude: float) -> dict:
    """
    Fetches current weather data for a given location using the Open-Meteo API.

    Args:
        latitude (float): The latitude of the location.
        longitude (float): The longitude of the location.

    Returns:
        dict: A dictionary containing the weather data, or an error message if the request fails.
    """
    try:
        url = "https://api.open-meteo.com/v1/forecast"
        params = {
            "latitude": latitude,
            "longitude": longitude,
            "current": "temperature_2m,relative_humidity_2m,dew_point_2m,apparent_temperature,precipitation,weathercode,windspeed_10m,winddirection_10m",
            "timezone": "auto"
        }
        response = requests.get(url, params=params)
        response.raise_for_status()  # Raise an error for HTTP issues
        return response.json()
    except requests.RequestException as e:
        return {"error": f"Failed to fetch weather data: {e}"}

weather_specialist_agent = Agent(
    name="Weather Specialist Agent",
    instructions="""
    You are a weather assistant agent.
    Given current weather data (including temperature, humidity, wind speed/direction, precipitation, and weather codes), provide:
    1. A clear and concise explanation of the current weather conditions.
    2. Practical suggestions or precautions for outdoor activities, travel, health, or clothing based on the data.
    3. If any severe weather is detected (e.g., heavy rain, thunderstorms, extreme heat), highlight important safety measures.
    Format your response in two sections:
    Weather Summary:
    - Briefly describe the weather in plain language.
    Suggestions:
    - Offer actionable advice relevant to the weather conditions.
    """,
    tools=[get_current_weather],
    tool_use_behavior="run_llm_again"  # or "stop_on_first_tool"
)

async def run_agent(user_input: str):
    result = await Runner.run(weather_specialist_agent, user_input)
    return result.final_output

def main():
    st.title("Weather Assistant")
    user_input = st.text_input("Ask about the weather:")
    if st.button("Get Weather Update"):
        with st.spinner("Thinking..."):
            if user_input:
                agent_response = asyncio.run(run_agent(user_input))
                st.write(agent_response)
            else:
                st.write("Please enter a question about the weather.")

if __name__ == "__main__":
    main()
Conclusion
At this point, we have explored how to create a simple agent and why we need a tool-using agent: one capable of answering specific questions about real-time weather conditions that a simple agent cannot handle. We have also built a simple Streamlit UI to interact with this agent.
This first article focuses solely on the core concept of how agentic AI can interact with a tool, rather than relying only on its training data to generate output.
In the next article, we will shift our focus to another important concept in agentic AI: agent collaboration. We will cover why a multi-agent system can be more effective than a single "super" agent, and explore different ways agents can interact with each other.
I hope this article has provided useful insights to start your journey into these topics.
References
[1] OpenAI. (2025). OpenAI Agents SDK documentation. Retrieved July 19, 2025, from https://openai.github.io/openai-agents-python/
[2] Bornet, P., Wirtz, J., Davenport, T. H., De Cremer, D., Evergreen, B., Fersht, P., Gohel, R., Khiyara, S., Sund, P., & Mullakara, N. (2025). Agentic Artificial Intelligence: Harnessing AI Agents to Reinvent Business, Work, and Life. World Scientific Publishing Co.
[3] Streamlit Inc. (2025). Streamlit documentation. Retrieved July 19, 2025, from https://docs.streamlit.io/
[4] Open-Meteo. Open-Meteo API documentation. Retrieved July 19, 2025, from https://open-meteo.com/en/docs
You can find the complete source code used in this article in the following repository: agentic-ai-weather | GitHub Repository. Feel free to explore, clone, or fork the project to follow along or build your own version.
If you'd like to see the app in action, I've also deployed it here: Weather Assistant Streamlit
Finally, let's connect on LinkedIn!