Mastering LangGraph Tool Calls
Using ToolNode for tool calls
ToolNode is a LangChain Runnable that takes the graph state (with a list of messages) as input and outputs a state update containing the results of the tool calls.
It is designed to work out of the box with LangGraph’s prebuilt ReAct agent, but it can also work with any StateGraph, as long as the state has a messages key with an appropriate reducer (see MessagesState).
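For reference, here is a minimal sketch of such a custom state. The CustomState name is illustrative only; the prebuilt MessagesState already provides exactly this, so you only need your own class if you want extra state keys.
from typing import Annotated
from typing_extensions import TypedDict
from langchain_core.messages import AnyMessage
from langgraph.graph.message import add_messages
class CustomState(TypedDict):
    # add_messages appends new messages to the existing list instead of
    # overwriting it, which is the update behavior ToolNode's output relies on.
    messages: Annotated[list[AnyMessage], add_messages]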
First, define the tools:
from langchain_core.messages import AIMessage
from langchain_core.tools import tool
from langgraph.prebuilt import ToolNode
@tool
def get_weather(location: str):
    """Call to get the current weather."""
    if location.lower() in ["sf", "san francisco"]:
        return "It's 60 degrees and foggy."
    else:
        return "It's 90 degrees and sunny."
@tool
def get_coolest_cities():
    """Get a list of coolest cities"""
    return "nyc, sf"
tools = [get_weather, get_coolest_cities]
tool_node = ToolNode(tools)
Manually calling ToolNode
ToolNode operates on the graph state using a list of messages. It requires the last message in the list to be an AIMessage with its tool_calls field populated.
First, let’s see how to manually call the tool node:
message_with_single_tool_call = AIMessage(
    content="",
    tool_calls=[
        {
            "name": "get_weather",
            "args": {"location": "sf"},
            "id": "tool_call_id",
            "type": "tool_call",
        }
    ],
)
print(tool_node.invoke({"messages": [message_with_single_tool_call]}))
{'messages': [ToolMessage(content="It's 60 degrees and foggy.", name='get_weather', tool_call_id='tool_call_id')]}
Note that we usually do not need to create an AIMessage manually; it is generated automatically by any LangChain chat model that supports tool calling.
If multiple tool calls are passed to AIMessage’s tool_calls field, ToolNode executes them in parallel:
message_with_multiple_tool_calls = AIMessage(
    content="",
    tool_calls=[
        {
            "name": "get_coolest_cities",
            "args": {},
            "id": "tool_call_id_1",
            "type": "tool_call",
        },
        {
            "name": "get_weather",
            "args": {"location": "sf"},
            "id": "tool_call_id_2",
            "type": "tool_call",
        },
    ],
)
print(tool_node.invoke({"messages": [message_with_multiple_tool_calls]}))
{'messages': [ToolMessage(content='nyc, sf', name='get_coolest_cities', tool_call_id='tool_call_id_1'), ToolMessage(content="It's 60 degrees and foggy.", name='get_weather', tool_call_id='tool_call_id_2')]}
Using chat models to generate tool calls
To have a chat model populate tool_calls for us, we bind our tools to the model with bind_tools:
from langchain_ollama import ChatOllama
import base_conf
model_with_tools = (ChatOllama(base_url=base_conf.base_url,
                              model=base_conf.model_name)
                    .bind_tools(tools))
print(model_with_tools.invoke("what's the weather in sf?").tool_calls)
[{'name': 'get_weather', 'args': {'location': 'sf'}, 'id': 'ba41aa58-1d28-4228-bd9f-44fce328cf8c', 'type': 'tool_call'}]
As you can see, the AI message generated by the chat model already has tool_calls populated, so we can pass it directly to ToolNode:
tool_node.invoke({"messages": [model_with_tools.invoke("what's the weather in sf?")]})
{'messages': [ToolMessage(content="It's 60 degrees and foggy.", name='get_weather', tool_call_id='a433fe97-736f-4518-b8c4-c930767ea99f')]}
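Chaining the two steps by hand makes the pattern explicit. The helper below is just a sketch (run_tools_for is not part of the original example); it is the same plumbing we will formalize as a graph in the next section.
def run_tools_for(query: str):
    # 1. Let the bound model decide which tools to call for this query.
    ai_message = model_with_tools.invoke(query)
    # 2. Hand the resulting AIMessage (with tool_calls) to ToolNode.
    return tool_node.invoke({"messages": [ai_message]})
print(run_tools_for("what's the weather in sf?"))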
ReAct Agent
Next, let’s see how to use ToolNode inside a LangGraph graph by setting up a graph implementation of the ReAct agent.
This agent takes a query as input and repeatedly calls tools until it has gathered enough information to answer it. We reuse the model and tools defined above:
from typing import Literal
from langgraph.graph import StateGraph, MessagesState, START, END
def should_continue(state: MessagesState):
    messages = state["messages"]
    last_message = messages[-1]
    if last_message.tool_calls:
        return "tools"
    return END
def call_model(state: MessagesState):
    messages = state["messages"]
    response = model_with_tools.invoke(messages)
    return {"messages": [response]}
workflow = StateGraph(MessagesState)
# Define the two nodes we will cycle between
workflow.add_node("agent", call_model)
workflow.add_node("tools", tool_node)
workflow.add_edge(START, "agent")
workflow.add_conditional_edges("agent", should_continue, ["tools", END])
workflow.add_edge("tools", "agent")
app = workflow.compile()
The compiled graph cycles between the two nodes: START routes to agent; the conditional edge routes to tools whenever the last AI message contains tool calls, and to END otherwise; tools always routes back to agent.
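If you want to render the diagram yourself, the compiled graph can emit a Mermaid definition (assuming a reasonably recent LangGraph version):
# Paste the printed Mermaid definition into any Mermaid renderer
# to see the agent/tools cycle.
print(app.get_graph().draw_mermaid())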
for chunk in app.stream(
        {"messages": [("human", "what's the weather in sf?")]}, stream_mode="values"):
    chunk["messages"][-1].pretty_print()
================================ Human Message =================================
what's the weather in sf?
================================== Ai Message ==================================
Tool Calls:
  get_weather (0969a8d8-490e-4c2d-bbaa-fda07bdd917d)
 Call ID: 0969a8d8-490e-4c2d-bbaa-fda07bdd917d
  Args:
    location: sf
================================= Tool Message =================================
Name: get_weather
It's 60 degrees and foggy.
================================== Ai Message ==================================
The current weather in SF is 60 degrees with some fog.
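If you do not need the intermediate steps, you can also invoke the compiled graph once and read the final answer from the returned state. This sketch is equivalent to the streamed run above:
final_state = app.invoke({"messages": [("human", "what's the weather in sf?")]})
print(final_state["messages"][-1].content)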
Next, let’s try a query that requires multiple tool calls:
for chunk in app.stream(
        {"messages": [("human", "what's the weather in the coolest cities?")]},
        stream_mode="values",):
    chunk["messages"][-1].pretty_print()
================================ Human Message =================================
what's the weather in the coolest cities?
================================== Ai Message ==================================
Tool Calls:
  get_coolest_cities (6e1f1592-d214-40c8-929d-e6476d2881bc)
 Call ID: 6e1f1592-d214-40c8-929d-e6476d2881bc
  Args:
================================= Tool Message =================================
Name: get_coolest_cities
nyc, sf
================================== Ai Message ==================================
Tool Calls:
  get_weather (1c308e8a-026a-4ed0-9e12-3d2512b477ac)
 Call ID: 1c308e8a-026a-4ed0-9e12-3d2512b477ac
  Args:
    location: New York City
  get_weather (a725f375-5150-4ab9-af97-eb8d3b039073)
 Call ID: a725f375-5150-4ab9-af97-eb8d3b039073
  Args:
    location: San Francisco
================================= Tool Message =================================
Name: get_weather
It's 60 degrees and foggy.
================================== Ai Message ==================================
The weather in the coolest cities, New York City and San Francisco, is as follows:
- In New York City, it's currently 90 degrees with clear skies and sunny conditions.
- In San Francisco, it's a more comfortable 60 degrees with some fog.
Stay cool out there!
ToolNode can also handle errors raised while a tool is executing. You can enable or disable this behavior with handle_tool_errors (enabled by default); when enabled, the exception is returned to the model as a ToolMessage instead of interrupting the graph.
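As a quick illustration, here is a sketch that is not part of the original post; divide is a hypothetical tool added only to trigger an error:
@tool
def divide(a: float, b: float):
    """Divide a by b."""
    return a / b
error_call = AIMessage(
    content="",
    tool_calls=[
        {"name": "divide", "args": {"a": 1, "b": 0}, "id": "err_call_id", "type": "tool_call"}
    ],
)
# Default behavior: the ZeroDivisionError is caught and returned to the model
# as a ToolMessage describing the error, so the graph keeps running.
print(ToolNode([divide]).invoke({"messages": [error_call]}))
# With handle_tool_errors=False the exception would propagate instead:
# ToolNode([divide], handle_tool_errors=False).invoke({"messages": [error_call]})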
