CrewAI

🤖 A cutting-edge framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI enables agents to work together seamlessly and tackle complex tasks.
•Why Choose CrewAI[2]
•Getting Started[3]
•Key Features[4]
•Examples[5]
•Local Open Source Models[6]
•CrewAI x AutoGen x ChatDev[7]
•Contributions[8]
•💬 CrewAI Discord Community[9]
•Hiring Consulting[10]
•License[11]
Why Choose CrewAI?
AI collaboration has a great deal to offer. CrewAI is designed to enable AI agents to take on roles, share goals, and operate as a cohesive unit – much like a well-trained crew. Whether you’re building an intelligent assistant platform, an automated customer service team, or a multi-agent research team, CrewAI provides the foundation for complex multi-agent interactions.
•🤖 Converse with Documentation[13]
•📄 Documentation Wiki[14]
Getting Started
To get started with CrewAI, follow these simple steps:
1. Install CrewAI:
pip install crewai
The example below also uses the duckduckgo-search package, so install it as well:
pip install duckduckgo-search
2. Set Up Your Team:
import os
from crewai import Agent, Task, Crew, Process
os.environ["OPENAI_API_KEY"] = "Your Key"
# You can choose to use a local model, for example via Ollama.
# from langchain.llms import Ollama
# ollama_llm = Ollama(model="openhermes")
# Install duckduckgo-search for this example:
# !pip install -U duckduckgo-search
from langchain.tools import DuckDuckGoSearchRun
search_tool = DuckDuckGoSearchRun()
# Define agents with roles and goals
researcher = Agent(
role='Senior Research Analyst',
goal='Reveal cutting-edge developments in AI and data science',
backstory="""You work at a leading tech think tank. You excel at identifying emerging trends. You are skilled at analyzing complex data and presenting actionable insights.""",
verbose=True,
allow_delegation=False,
tools=[search_tool]
# You can pass an optional llm attribute to specify the model you want to use.
# It can be a local model served via Ollama / LM Studio or a remote model such as OpenAI, Mistral, Anthropic, etc. (https://python.langchain.com/docs/integrations/llms/)
# Example:
# llm=ollama_llm
# or (requires: from langchain.chat_models import ChatOpenAI)
# llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0.7)
)
writer = Agent(
role='Technical Content Strategist',
goal='Write engaging content about technological advancements',
backstory="""You are a renowned content strategist known for your insightful and engaging articles. You transform complex concepts into compelling narratives.""",
verbose=True,
allow_delegation=True,
# (optional) llm=ollama_llm
)
# Create tasks for your agents
task1 = Task(
description="""Conduct a comprehensive analysis of the latest developments in AI for 2024. Identify key trends, breakthrough technologies, and potential industry impacts. Your final answer must be a complete analytical report""",
agent=researcher
)
task2 = Task(
description="""Using the provided insights, develop an engaging blog post highlighting the most significant advancements in AI. Your article should be both informative and easy to understand, suitable for a tech-savvy audience. Make it sound cool, avoiding complex vocabulary so it doesn't sound like AI. Your final answer must be a complete blog post of at least 4 paragraphs.""",
agent=writer
)
# Instantiate your team and adopt sequential processing
crew = Crew(
agents=[researcher, writer],
tasks=[task1, task2],
verbose=2, # You can set this to 1 or 2 for different logging levels
)
# Let your team start working!
result = crew.kickoff()
print("######################")
print(result)
Currently, the only supported process is Process.sequential, where one task executes after another and the output of each task is passed as additional context to the next.
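If you prefer the flow to be explicit rather than implied, the process can also be passed directly when building the crew. A minimal sketch, reusing the agents and tasks defined above (the process argument mirrors the sequential default):
crew = Crew(
    agents=[researcher, writer],
    tasks=[task1, task2],
    process=Process.sequential,  # explicit sequential flow, equivalent to the current default
    verbose=2
)
result = crew.kickoff()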
Key Features
·Role-Based Agent Design: Customize specific roles, goals, and tools for agents.
·Autonomous Agent Delegation: Agents can autonomously delegate tasks and ask each other questions, improving problem-solving efficiency.
·Flexible Task Management: Define tasks using customizable tools and dynamically assign them to agents (see the sketch after this list).
·Process-Driven: Currently only supports sequential task execution, but more complex processes such as consensus and hierarchical are in development.
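As a small illustration of the role-based, tool-driven design, the sketch below wraps a plain Python function as a LangChain tool and assigns it to an agent. The word_count tool and the Copy Editor agent are hypothetical examples, not part of CrewAI itself:
from langchain.tools import tool
from crewai import Agent

@tool("Word counter")
def word_count(text: str) -> str:
    """Counts the words in a piece of text."""
    return f"The text contains {len(text.split())} words."

editor = Agent(
    role='Copy Editor',
    goal='Keep articles concise and within the agreed length',
    backstory="A meticulous editor with an eye for brevity.",
    tools=[word_count],  # custom tool available to this agent only
    allow_delegation=False,
    verbose=True
)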
CrewAI Mind Map
Examples
You can test real-life scenarios of different AI teams in the Examples Repository.
Code
·Travel Planner
·Stock Analysis
·Login Page Generator
·Incorporate Human Input in Execution
Videos
Quick Tutorial
Travel Planner
Stock Analysis
Local Open Source Models
CrewAI supports integration with local models via tools such as Ollama[17], enhancing flexibility and customization capabilities. This allows you to use your own models, which is particularly useful for specialized tasks or data privacy concerns.
Setting Up Ollama
•Install Ollama: Ensure Ollama is correctly installed in your environment. Follow the installation guide provided by Ollama for detailed steps.
•Configure Ollama: Set up Ollama to work with your local models. You may need to adjust the model via a Modelfile[19]. I recommend adding Observation as a stop word and adjusting top_p and temperature; a sample Modelfile is sketched below.
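For illustration only (the values below are not prescribed anywhere; tune them for your model), such a Modelfile could look like this:
FROM openhermes
# stop generation once the agent emits the ReAct-style "Observation" marker
PARAMETER stop "Observation"
# illustrative sampling settings
PARAMETER top_p 0.9
PARAMETER temperature 0.6
Register it with ollama create and then reference the new name (here the hypothetical agent-openhermes) from LangChain:
ollama create agent-openhermes -f ./Modelfile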
Integrating Ollama with CrewAI
•Instantiate the Ollama model: Create an instance of the Ollama model. When instantiating, you can specify the model and the base URL. For example:
from langchain.llms import Ollama
ollama_openhermes = Ollama(model="openhermes")
•Pass the Ollama model to the agent: When creating an agent within the CrewAI framework, pass the Ollama model as the llm parameter of the Agent constructor. For example:
local_expert = Agent(
role='Local Expert of the City',
goal='Provide the best insights about the selected city',
backstory="""An informative local guide who knows the city, its attractions, and customs""",
tools=[
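# SearchTools and BrowserTools are custom helper tools (e.g. from the CrewAI examples repository); they are not part of the core library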
SearchTools.search_internet,
BrowserTools.scrape_and_summarize_website,
],
llm=ollama_openhermes, # Pass the Ollama model here
verbose=True
)
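If your Ollama server does not run at the default address, LangChain's Ollama class also accepts a base URL; a small sketch (the URL shown is Ollama's default local endpoint):
ollama_openhermes = Ollama(
    model="openhermes",
    base_url="http://localhost:11434"  # change if your Ollama server runs elsewhere
)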
CrewAI x AutoGen x ChatDev
·AutoGen: AutoGen excels at creating conversational agents that can work together, but it lacks an inherent concept of process. Orchestrating interactions between agents in AutoGen requires additional programming, which can become complex and cumbersome as tasks grow in scale.
·ChatDev: ChatDev introduces the idea of a process into the realm of AI agents, but its implementation is quite rigid. Customization in ChatDev is limited and not geared toward production environments, which may hinder scalability and flexibility in real-world applications.
Advantages of CrewAI:
CrewAI is built with production considerations in mind. It offers the flexibility of conversational agents from Autogen while incorporating a structured process approach from ChatDev, without the rigidity. The processes in CrewAI are designed to be dynamic and adaptable, seamlessly integrating into development and production workflows.
Contributions
CrewAI is open source, and we welcome contributions. If you wish to contribute, please:
·Fork the repository.
·Create a new branch for your feature.
·Add your feature or improvement.
·Send a pull request.
We appreciate your participation!
Installing Dependencies
poetry lock
poetry install
Virtual Environment
poetry shell
Pre-commit Hooks
pre-commit install
Running Tests
poetry run pytest
Packaging
poetry build
Local Installation
pip install dist/*.tar.gz
References
For more information, refer to: https://github.com/joaomdmoura/crewAI
[1] Two people rowing (CrewAI logo): https://github.com/joaomdmoura/crewAI/blob/main/crewai_logo.png
[2] Why Choose CrewAI: https://github.com/joaomdmoura/crewAI#why-crewai
[3] Getting Started: https://github.com/joaomdmoura/crewAI#getting-started
[4] Key Features: https://github.com/joaomdmoura/crewAI#key-features
[5] Examples: https://github.com/joaomdmoura/crewAI#examples
[6] Local Open Source Models: https://github.com/joaomdmoura/crewAI#local-open-source-models
[7] CrewAI x AutoGen x ChatDev: https://github.com/joaomdmoura/crewAI#how-crewai-compares
[8] Contributions: https://github.com/joaomdmoura/crewAI#contribution
[9] 💬 CrewAI Discord Community: https://discord.gg/4ZqbAStv
[10] Hiring Consulting: https://github.com/joaomdmoura/crewAI#hire-consulting
[11] License: https://github.com/joaomdmoura/crewAI#license
[12] Why Choose CrewAI (section anchor): https://github.com/joaomdmoura/crewAI#why-crewai
[13] Converse with Documentation: https://chat.openai.com/g/g-qqTuUWsBY-crewai-assistant
[14] Documentation Wiki: https://github.com/joaomdmoura/CrewAI/wiki
[15] Getting Started (section anchor): https://github.com/joaomdmoura/crewAI#getting-started
[16] Local Open Source Models (section anchor): https://github.com/joaomdmoura/crewAI#local-open-source-models
[17] Ollama: https://ollama.ai/
[18] Setting Up Ollama (section anchor): https://github.com/joaomdmoura/crewAI#setting-up-ollama
[19] Adjusting the Model: https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md
[20] Integrating Ollama with CrewAI (section anchor): https://github.com/joaomdmoura/crewAI#integrating-ollama-with-crewai