Building Local Network Search Agents with Phidata and Ollama

Background: building search Agents on top of a fully local Agent framework.

Reference Website: https://docs.phidata.com/tools/website

Basic Environment: Command line tools (Linux/Mac), python3 (set up an independent conda environment).

Basic LLM: download and install from the Ollama official website (if you have an OpenAI API key, you can also use OpenAI's models instead).

AI Agent Framework: this time we use Phidata, together with two of its tools: DuckDuckGo (web search) and Website (direct access to specified URLs).

Environment Installation

1. Ollama’s Large Model (as the Agent brain)

  • You can also use an OpenAI model instead of Ollama; see the Phidata official documentation.

Install and run Ollama

  • https://ollama.com/download

  • llama3.1, which I used previously, did not work with Phidata's Tools, so here we use llama3.2 directly (at about 2 GB it is also a better fit for a personal computer).

$ ollama run llama3.2

2. Phidata Framework (as the Agent body)

Install Phidata

$ pip install phidata

Install DuckDuckGo Tool

  • Used to support the search Agent.

$ pip install duckduckgo-search

Install Website Tool

  • Used to support access to URLs by the Agent.

$ pip install beautifulsoup4

Start Agents

1. Customized Search Agent

  • Similar to the results produced by mainstream search engines, but with your own customization layered on top, e.g. translating results into Chinese, keeping answers within 50 words, or filtering out ads.

  • Later, this small plugin can be deployed as a service on a personal website, or added as one Agent in a Multi-Agent system (experiments to follow).

from phi.agent import Agent
from phi.model.ollama import Ollama
from phi.tools.duckduckgo import DuckDuckGo

web_agent = Agent(
    name="Web Agent",
    model=Ollama(id="llama3.2"),
    tools=[DuckDuckGo()],
    instructions=["Always include sources"],
    show_tool_calls=True,
    markdown=True,
    debug_mode=False)

web_agent.print_response("Tell me about OpenAI", stream=True)
  • Execution Results

2. Agent Capable of Accessing Specified URLs

  • Similar to a search engine, except that you provide specific website addresses; this makes it usable as an internal search Agent for a personal website.

  • You can require it to retrieve information only from the websites you provide, without falling back on what the model already knows, and you can ask for extra processing steps (all written into the instructions).

from phi.agent import Agent
from phi.model.ollama import Ollama
from phi.tools.website import WebsiteTools

web_agent = Agent(
    name="Web Agent",
    model=Ollama(id="llama3.2"),
    tools=[WebsiteTools()],
    instructions=["Always include sources"],
    show_tool_calls=True,
    markdown=True,
    debug_mode=True)

web_agent.print_response(
    "Summarize this website: https://docs.phidata.com/introduction.",
    stream=True,
)
  • Execution Results

Conclusion

  • With these tools, a supported LLM becomes an individual Agent; going forward, we can compose them into Multi-Agent systems or deploy them separately wherever they are most useful.

  • Next, I want to look into quickly deploying simple Agents as services on a personal website.

Thank you for reading!
