Unlock LangGraph: Learn Agent Programming Basics with 3 Mini Programs

LangGraph is a programming framework that modularizes task workflows, using state graphs to build flexible and efficient intelligent dialogue systems. This article works through the core concepts of graph programming in LangGraph with 3 mini-programs, helping you get started quickly, understand its practical potential, and gain an initial feel for how Agent workflows are controlled.

1. Introduction to LangGraph

LangGraph is a graph-based programming framework for organizing and managing complex task workflows. Its core idea is to decompose tasks into “nodes” and define the logical relationships between these nodes through “edges”. Compared to traditional linear flow control, the graph structure of LangGraph is more suitable for handling complex scenarios such as dynamic task branching and multi-task collaboration.

Main advantages include:

  • Modular Design: Clear task logic that is easy to expand.

  • Flexible State Management: Achieves dynamic task switching and condition handling through the structure of the graph.

  • Multi-Tool Integration: Supports integration with external tools and large models (such as OpenAI and Google Gemini).

2. Why Choose LangGraph?

In traditional dialogue systems, developers often face the following challenges:

  1. Complex task branching logic that is difficult to maintain;

  2. High integration costs for multi-tool collaboration;

  3. Lack of efficient solutions for large-scale context processing.

LangGraph addresses these pain points with features such as state graphs and memory management. For example, it can select the appropriate tool or model through conditional logic while retaining the session context, which significantly improves development efficiency.

3. Six Basic Steps of LangGraph Programming

  1. Define the state structure.

  2. Create node functions for each piece of task logic.

  3. Initialize the state graph.

  4. Define the connections between nodes.

  5. Add memory management or condition handling (optional).

  6. Compile the graph and run the task flow (a minimal end-to-end sketch follows).
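To make the steps concrete, here is a minimal sketch that maps each step onto code. It deliberately uses a trivial state with a single counter field instead of an LLM, so the node names and fields are purely illustrative and it runs without any API keys:

from typing import TypedDict
from langgraph.graph import StateGraph, START, END
from langgraph.checkpoint.memory import MemorySaver

# Step 1: define the state structure
class State(TypedDict):
    count: int

# Step 2: create a node function for the task logic
def increment(state: State):
    return {"count": state["count"] + 1}

# Step 3: initialize the state graph
builder = StateGraph(State)
builder.add_node("increment", increment)

# Step 4: define the connections between nodes
builder.add_edge(START, "increment")
builder.add_edge("increment", END)

# Step 5 (optional): add memory management
memory = MemorySaver()

# Step 6: compile the graph and run the task flow
graph = builder.compile(checkpointer=memory)
result = graph.invoke({"count": 0}, config={"configurable": {"thread_id": "demo"}})
print(result)  # {'count': 1}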

4. Master LangGraph with 3 Mini Programs

Below are 3 progressive mini-programs covering core functionality such as basic dialogue, context memory, and tool invocation.

4.1 Program 1: Basic Dialogue System

Goal: Implement a simple chatbot that processes user input and returns responses.

from typing import Annotated, TypedDict
from langgraph.graph import StateGraph, START
from langgraph.graph.message import add_messages
from llm_utils import get_llm

# Define state type
class State(TypedDict):
    messages: Annotated[list, add_messages]

def chatbot(state: State):
    """Simple chatbot"""
    llm = get_llm()
    return {"messages": [llm.invoke(state["messages"])]}

# Create and compile graph
graph = (
    StateGraph(State)
    .add_node("chatbot", chatbot)
    .add_edge(START, "chatbot")
    .compile()
)

def chat():
    """Run chat interface"""
    while True:
        user_input = input("User: ")
        if user_input.lower() in ["quit", "exit", "q"]:
            print("Goodbye!")
            break
            
        # Process user input through the graph
        for output in graph.stream({"messages": [("user", user_input)]}, {}):
            for value in output.values():
                print("Assistant:", value["messages"][-1].content)

if __name__ == "__main__":
    chat()

Program Highlights:

  • Used LangGraph’s StateGraph module to build a simple linear task flow.

  • Defined the message input and output format via the add_messages reducer (see the sketch below).
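The Annotated[list, add_messages] annotation on the messages field is what makes it behave like a running conversation: add_messages is a reducer that appends (and merges by message ID) whatever a node returns, instead of overwriting the list. A minimal sketch of that behavior, assuming a recent langgraph version that accepts (role, content) tuples:

from langgraph.graph.message import add_messages

# Each call merges new messages into the existing history rather than replacing it
history = add_messages([], [("user", "Hello!")])
history = add_messages(history, [("assistant", "Hi, how can I help?")])
print([m.content for m in history])  # ['Hello!', 'Hi, how can I help?']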

Graph

[Graph diagram: START → chatbot]

This program’s graph has only one node.

Running Results

model_type: azure
Azure OpenAI Configuration:
Deployment Name: gpt-4o
API Version: 2024-08-01-preview
User: Hello! I am “Non”
Assistant: Hello! “Non”, nice to meet you. How can I help you?
User: Tell me a joke about AI agents
Assistant: Sure! Here’s a joke about AI agents: One day, a programmer asked his AI assistant, “Do you know what humor is?” The AI assistant replied, “Of course! Humor is an art that evokes laughter through language, actions, or other means.” The programmer chuckled and said, “Can you tell a joke?” The AI assistant thought for a moment and then said, “Why do computers like cold drinks?” The programmer curiously asked, “Why?” The AI assistant replied, “Because they always need some ‘cooling’!” The programmer couldn’t help but laugh, “Looks like you really understand humor!” The AI assistant proudly said, “Of course, I have undergone ‘deep learning’!” Hope you like this joke!
User: Do you remember my name?
Assistant: Sorry, I can’t remember or access previous conversation content. So, I don’t know your name. Can you tell me your name?
User: quit
Goodbye!

4.2 Program 2: Dialogue System with Memory Function

Goal: Add context memory so that the chatbot can remember the content of multi-turn conversations.

from typing import Annotated, TypedDict
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.checkpoint.memory import MemorySaver
from llm_utils import get_llm
from dotenv import load_dotenv

# Load environment variables from .env
load_dotenv()

# Define state type
class State(TypedDict):
    messages: Annotated[list, add_messages]

# Initialize memory manager and large model
memory = MemorySaver()
llm = get_llm()

def chatbot(state: State):
    """Memory-enabled chatbot"""
    response = llm.invoke(state["messages"])
    return {"messages": [response]}

# Create and compile graph
graph = (
    StateGraph(State)
    .add_node("chatbot", chatbot)
    .add_edge(START, "chatbot")
    .add_edge("chatbot", END)
    .compile(checkpointer=memory)
)

def chat(thread_id: str = "default"):
    """Run memory-enabled chat interface"""
    config = {"configurable": {"thread_id": thread_id}}

    while True:
        user_input = input("User: ")
        if user_input.lower() in ["quit", "exit", "q"]:
            print("Goodbye!")
            break
            
        # Process input through the graph, automatically saving context
        for output in graph.stream({"messages": [("human", user_input)]}, config=config):
            for value in output.values():
                print("Assistant:", value["messages"][-1].content)

if __name__ == "__main__":
    chat()

Program Highlights:

  • Implemented session memory functionality using MemorySaver.

  • Supports managing different users’ sessions through thread_id (a usage sketch follows below).
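Because the checkpointer stores conversation state per thread_id, separate users (or separate conversations) stay isolated simply by passing different thread IDs. A small usage sketch against the graph compiled above; the thread IDs here are arbitrary example strings:

config_a = {"configurable": {"thread_id": "alice"}}
config_b = {"configurable": {"thread_id": "bob"}}

# Two independent sessions over the same compiled graph
graph.invoke({"messages": [("human", "My name is Alice")]}, config=config_a)
graph.invoke({"messages": [("human", "My name is Bob")]}, config=config_b)

# Each thread only sees its own history
reply_a = graph.invoke({"messages": [("human", "What is my name?")]}, config=config_a)
print(reply_a["messages"][-1].content)  # should mention "Alice", not "Bob"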

Graph

[Graph diagram: START → chatbot → END]

Although the memory function has been added, this program’s graph still has only one node.

Running Results

model_type: azure
Azure OpenAI Configuration:
Deployment Name: gpt-4o
API Version: 2024-08-01-preview
User: Hello! My name is “Non”
Assistant: Hello, Non! Nice to meet you. How can I help you?
User: Tell me a joke about AI agents
Assistant: Sure! Here’s a joke about AI agents:

One day, a programmer asked his AI assistant, “Do you know what recursion is?”

The AI assistant replied, “Of course! Recursion is when you ask me what recursion is, and I tell you recursion is when you ask me what recursion is, and I tell you recursion is when you ask me what recursion is…”

The programmer interrupted, “Okay, okay, I get it!”

Hope you like this joke! If you have any other questions or need help, please let me know.

User: Do you remember my name?
Assistant: Of course! Your name is “Non”. How can I help you?
User: quit
Goodbye!

4.3 Program 3: Tool Invocation and Task Branching

Goal: Integrate external tools and dynamically select task branches based on conditions.

from typing import Annotated, TypedDict
from langgraph.graph import StateGraph, START
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition
from langgraph.checkpoint.memory import MemorySaver
from llm_utils import get_llm
from langchain_community.tools.tavily_search import TavilySearchResults
from dotenv import load_dotenv

# Load environment variables from .env (the Tavily tool reads TAVILY_API_KEY)
load_dotenv()

# Define state type
class State(TypedDict):
    messages: Annotated[list, add_messages]

llm = get_llm()

tool = TavilySearchResults(max_results=2)
tools = [tool]
llm_with_tools = llm.bind_tools(tools)

def chatbot(state: State):
    return {"messages": [llm_with_tools.invoke(state["messages"])]}

graph_builder = StateGraph(State)
graph_builder.add_node("chatbot", chatbot)
graph_builder.add_node("tools", ToolNode(tools=[tool]))
graph_builder.add_conditional_edges("chatbot", tools_condition)
graph_builder.add_edge("tools", "chatbot")
graph_builder.add_edge(START, "chatbot")

memory = MemorySaver()
graph = graph_builder.compile(checkpointer=memory)

def chat(thread_id: str = "default"):
    """Run chat interface with tool integration"""
    # The graph is compiled with a checkpointer, so a thread_id must be provided
    config = {"configurable": {"thread_id": thread_id}}

    while True:
        user_input = input("User: ")
        if user_input.lower() in ["quit", "exit", "q"]:
            print("Goodbye!")
            break
        for output in graph.stream({"messages": [("human", user_input)]}, config=config):
            for value in output.values():
                print("Assistant:", value["messages"][-1].content)

if __name__ == "__main__":
    chat()

Program Highlights:

  • Integrated the Tavily search tool, supporting dynamic branching across multi-turn tasks.

  • Achieved flexible task routing through the prebuilt tools_condition conditional edge (a hand-rolled alternative is sketched below).
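Under the hood, tools_condition simply inspects the most recent message in the state: if the model’s reply contains tool calls it routes to the node named “tools”, otherwise it ends the turn. If you need different routing (for example, a differently named tool node), you can write the condition yourself. A sketch of an equivalent hand-rolled router, using the graph_builder from above:

from langgraph.graph import END

def route_tools(state: State):
    """Route to the tool node when the last AI message requested a tool call."""
    last_message = state["messages"][-1]
    if getattr(last_message, "tool_calls", None):
        return "tools"
    return END

# Equivalent to add_conditional_edges("chatbot", tools_condition):
# the dict maps the router's return values to node names (or END)
graph_builder.add_conditional_edges(
    "chatbot",
    route_tools,
    {"tools": "tools", END: END},
)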

Graph

[Graph diagram: START → chatbot, with a conditional edge to tools and an edge from tools back to chatbot]

This program adds a tools node, reached through the tools_condition conditional edge.

Running Results

model_type: azure
Azure OpenAI Configuration:
Deployment Name: gpt-4o
API Version: 2024-08-01-preview
User: Hello! My name is Non
Assistant: Hello, Non! Nice to meet you. How can I help you?
User: Is it cold in Beijing tomorrow?
Assistant:
Assistant: [{“url”: “http://bj.cma.gov.cn/fzlm/index.html”, “content”: “Beijing Meteorological Bureau. 36-hour weather forecast (2024-11-27 05:37:05 Beijing Meteorological Station release) Today daytime high temperature 3℃ sunny turning cloudy, light snow in the mountains 3, 4 level, gusts 6, 7 level from the north; Tonight low temperature -4℃ cloudy turning sunny 3 level, gusts 5 level from the north; Tomorrow daytime high temperature 7℃ sunny turning cloudy 3, 4 level to 2 level from the north”}, {“url”: “https://tianqi.2345.com/tomorrow-54511.htm”, “content”: “National Weather 15-day weather Tomorrow’s weather 40-day weather Air quality Historical weather International weather Weather news Calendar Weather Current: National weather> Beijing weather forecast 7 days Today’s weather Tomorrow’s weather One-week weather 15-day weather Yesterday 11/18 cloudy -1~8° southeast wind 2 level Good Today 11/19 cloudy 0~10° northeast wind 1 level Good Tomorrow 11/20 sunny 1~13° northwest wind 2 level Good Saturday 11/23 cloudy 0~8° southeast wind 1 level Good Sunday 11/24 overcast turning light rain 1~7° southeast wind 1 level Good 01:00 north wind 1 level Good 02:00 north wind 1 level Good 03:00 north wind 1 level Good 04:00 northeast wind 1 level Good 05:00 northeast wind 1 level Good 06:00 northeast wind 1 level Good 07:00 north wind 1 level Good 08:00 north wind 1 level Good 14:00 northwest wind 2 level Good 15:00 northwest wind 2 level Good 17:00 southwest wind 1 level Good More> PM2.5 _20_Good PM10 _35_Good _3_Good _26_Good _3_Good More> International weather PM2.5 Historical weather Living index More> National weather International weather Historical weather Scenic weather Calendar Weather news Air quality Weather map Mobile web version Mobile APP official website Weather plugin XML map Correct your geographical location Feedback your opinion”}]
Assistant: According to the latest weather forecast, tomorrow’s weather in Beijing is as follows:

  • Daytime: Sunny turning cloudy, high temperature 7℃, north wind 3, 4 level turning 2 level.

  • Night: Cloudy turning sunny, low temperature -4℃, north wind around 3 level, gusts 5 level.

Overall, tomorrow’s temperature in Beijing will be relatively low, and it will be quite cold at night, so please remember to keep warm.
User: Do you remember my name?
Assistant: Of course, I remember! Your name is Non. Is there anything else I can help with?
User: quit
Goodbye!

5. Summary

LangGraph helps developers easily build complex task workflows through modular design, efficient state management, and powerful tool integration capabilities. From simple dialogue systems to multifunctional agents supporting memory and tool invocation, LangGraph demonstrates its strong flexibility and scalability. In the future, you can explore more complex scenarios such as multi-agent collaboration or task automation to achieve more efficient AI system development.

I hope this article can help you quickly get started with LangGraph and inspire your creativity!

Reference Documents

  • https://www.datacamp.com/tutorial/langgraph-tutorial

  • https://langchain-ai.github.io/langgraph/

Source Code Repository for This Article

  • https://github.com/surfirst/langgraph

