Discover the power of multi-agent systems for data generation, world simulation, and task automation
CAMEL-AI is the premier open-source framework for building and studying autonomous, communicative agents. It's revolutionizing how we approach multi-agent systems and collaborative AI.
Get assistance from our community or access full documentation.
Welcome to the comprehensive learning platform for CAMEL-AI, the cutting-edge open-source framework for building and studying autonomous, communicative agents. This course is designed for both individuals with Python knowledge and businesses looking to harness the power of multi-agent systems.
CAMEL-AI (Communicative Agents for "Mind" Exploration of Large Language Model Society) emerged as the earliest LLM-based multi-agent framework and has evolved into a powerful tool for real-world task solving. Whether you're interested in data generation, world simulation, or task automation, CAMEL-AI provides the infrastructure you need.
Learn how to implement CAMEL-AI's powerful tools and APIs in your applications.
Discover how CAMEL-AI can automate workflows and generate valuable data.
Explore agent scaling laws and contribute to cutting-edge AI research.
CAMEL stands for Communicative Agents for "Mind" Exploration of Large Language Model Society. It's an advanced framework designed to facilitate autonomous cooperation among communicative agents, enabling them to solve complex tasks with minimal human intervention.
"Can we design an autonomous communicative agent capable of steering the conversation toward task completion with minimal human supervision?"
This guiding question shapes the development of CAMEL-AI, focusing on creating agents that can work independently while effectively collaborating with other agents.
Agents interact and coordinate with minimal human intervention
Multiple agents collaborate to solve complex problems
Examines agent behaviors across different contexts
The framework scales from simple tasks to complex systems
Community-driven development and improvement
Learns from surroundings and improves over time
At its core, CAMEL-AI provides a structured environment for agents with different roles to collaborate on tasks. Each agent utilizes Large Language Models (LLMs) to enhance cognitive capabilities, enabling natural language understanding and generation.
The framework facilitates flexible communication between agents, equips them with tools to interact with the external world, and provides memory capabilities for more grounded learning and inference.
CAMEL-AI consists of several key modules that work together to create powerful agent systems. Understanding these components is essential for effectively implementing CAMEL-AI in your projects.
Architectures and customization options for agent intelligence, supporting a wide range of LLMs.
Messaging protocols for agent communication, enabling standardized information exchange.
Memory storage and retrieval mechanisms for more grounded learning and inference.
Integration with external tools, allowing agents to interact with the external world.
Prompt engineering and customization for guiding agent behavior and responses.
Task creation and management for agent workflows and goal-directed behavior.
Components for building agent societies and facilitating inter-agent collaboration.
Retrieval methods for knowledge access, enhancing agent capabilities with external information.
In a typical CAMEL-AI system, these components interact as follows: agents built on model backends exchange structured messages, retain context through their memory modules, call tools to act on the external world, and follow prompts and tasks that direct their behavior, while societies coordinate multiple agents and retrievers bring in external knowledge when needed.
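The snippet below is a minimal sketch of how a few of these modules fit together in a single agent, using the default model backend; each module is covered in depth in the curriculum that follows.
from camel.agents import ChatAgent
from camel.models import ModelFactory
from camel.toolkits import MathToolkit
from camel.types import ModelPlatformType, ModelType

# Models module: select an LLM backend for the agent
model = ModelFactory.create(
    model_platform=ModelPlatformType.DEFAULT,
    model_type=ModelType.DEFAULT,
)

# The agent ties the modules together: the system message defines the role,
# tools give access to the outside world, and the message window acts as memory
agent = ChatAgent(
    system_message="You are a helpful assistant.",
    model=model,
    tools=MathToolkit().get_tools(),
    message_window_size=10,
)

# Messages module: interaction happens one message (step) at a time
response = agent.step("What is 12 * 7?")
print(response.msgs[0].content)

# Memory module: inspect the context the agent has retained so far
print(agent.memory.get_context())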
Our comprehensive curriculum takes you from CAMEL-AI fundamentals to advanced multi-agent applications. Each module builds on previous knowledge, providing a structured learning path.
This module introduces the core concepts of CAMEL-AI, its architecture, and the philosophy behind multi-agent systems.
# Install the base CAMEL library
pip install camel-ai
# For all dependencies
pip install 'camel-ai[all]'
# For HuggingFace agents
pip install 'camel-ai[huggingface-agent]'
# For additional tool integrations
pip install 'camel-ai[tools]'
import os
# For OpenAI models
os.environ["OPENAI_API_KEY"] = "your-api-key-here"
# Alternative: Use a .env file
from dotenv import load_dotenv
load_dotenv() # This loads variables from .env file
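If you use the .env approach, it is worth confirming the key actually loaded before creating any agents. A minimal check (assuming the variable is named OPENAI_API_KEY, as above):
import os
from dotenv import load_dotenv

# Load OPENAI_API_KEY (and any other keys) from a local .env file
load_dotenv()

# Fail early with a clear message instead of at the first model call
if not os.getenv("OPENAI_API_KEY"):
    raise RuntimeError("OPENAI_API_KEY is not set; add it to your environment or .env file")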
Learn how to create, customize, and interact with CAMEL-AI agents. This module covers the essential steps for bringing your first agent to life.
from camel.agents import ChatAgent
from camel.models import ModelFactory
from camel.types import ModelPlatformType, ModelType
# Define the model
model = ModelFactory.create(
model_platform=ModelPlatformType.OPENAI,
model_type=ModelType.GPT_4O_MINI,
)
# Create your agent
agent = ChatAgent(
system_message="You are a helpful AI assistant that specializes in Python programming.",
model=model,
message_window_size=10 # Optional: set chat memory length
)
# Interact with the agent
response = agent.step("Can you explain how to use list comprehensions in Python?")
print(response.msgs[0].content)
In CAMEL-AI, an agent's role is defined through its system message, which guides its behavior, expertise, and how it approaches tasks. The more specific and detailed the role definition, the more focused the agent's responses will be.
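To see the effect, compare a generic assistant with a narrowly scoped one; this small illustration reuses the model created above and only varies the system message:
generic_agent = ChatAgent(
    system_message="You are a helpful assistant.",
    model=model,
)

specialist_agent = ChatAgent(
    system_message=(
        "You are a senior Python performance engineer. "
        "Always discuss trade-offs and suggest how to benchmark the options."
    ),
    model=model,
)

question = "Should I use a list comprehension or a for loop here?"
print(generic_agent.step(question).msgs[0].content)     # likely a broad, general answer
print(specialist_agent.step(question).msgs[0].content)  # likely focused on performance trade-offs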
Enhance your agents with tools for external interaction and memory capabilities for context retention and learning.
from camel.agents import ChatAgent
from camel.models import ModelFactory
from camel.toolkits import MathToolkit, SearchToolkit
# Create an agent with tools
tooled_agent = ChatAgent(
system_message="You are a research assistant with access to tools.",
model=ModelFactory.create(),
tools=[
*MathToolkit().get_tools(), # Math operations
*SearchToolkit().get_tools(), # Web search capabilities
]
)
# Use the agent with tools
response = tooled_agent.step("What is the square root of 169, and who discovered it?")
print(response.msgs[0].content)
# Check tool calls
print(response.info['tool_calls'])
from camel.agents import ChatAgent
from camel.messages import BaseMessage
from camel.models import ModelFactory
# Create an agent with extended memory
memory_agent = ChatAgent(
system_message="You are an assistant with a good memory.",
model=ModelFactory.create(),
message_window_size=20, # Remember more context
)
# Access agent memory
context = memory_agent.memory.get_context()
# Add a message to memory
memory_agent.record_message(BaseMessage.make_user_message(
    role_name="User",  # role_name is required by make_user_message
    content="Remember this important information."
))
Discover how to create collaborative agent societies where multiple agents work together to solve complex tasks.
from camel.agents import ChatAgent
from camel.societies.workforce import Workforce
from camel.tasks.task import Task
from camel.models import ModelFactory
from camel.toolkits import SearchToolkit
# Create individual agents
researcher = ChatAgent(
system_message="You are a research specialist who finds information.",
model=ModelFactory.create(),
    tools=SearchToolkit().get_tools()  # get_tools() already returns a list
)
analyst = ChatAgent(
system_message="You analyze information and extract insights.",
model=ModelFactory.create()
)
writer = ChatAgent(
system_message="You create clear, engaging content from analysis.",
model=ModelFactory.create()
)
# Create workforce
team = Workforce('Content Creation Team')
team.add_single_agent_worker("Researcher", worker=researcher)
team.add_single_agent_worker("Analyst", worker=analyst)
team.add_single_agent_worker("Writer", worker=writer)
# Define and process a task
content_task = Task(
content="Create a comprehensive blog post about renewable energy trends.",
id='blog_001',
)
result = team.process_task(content_task)
print(result.result)
In multi-agent systems, each agent should have a specialized role with clear responsibilities. This specialization allows the workforce to handle complex tasks efficiently, with each agent focusing on what it does best. The Workforce module in CAMEL-AI manages the coordination between these specialized agents.
Explore advanced applications of CAMEL-AI, including Retrieval-Augmented Generation (RAG), knowledge graphs, and synthetic data generation.
from camel.agents import ChatAgent
from camel.retrievers import VectorDBRetriever
from camel.models import ModelFactory
# Set up a retriever with vector database (conceptual example)
retriever = VectorDBRetriever(
database_url="your-vector-db-connection",
embedding_model="text-embedding-ada-002",
)
# Create a RAG agent
rag_agent = ChatAgent(
system_message="You are a knowledge assistant that answers questions using a knowledge base.",
model=ModelFactory.create(),
retriever=retriever,
)
# Query the agent
response = rag_agent.step("What are the latest approaches to renewable energy storage?")
from camel.agents import ChatAgent
from camel.messages import BaseMessage
from camel.models import ModelFactory
from camel.types import ModelPlatformType, ModelType
# Create customer and service agent for synthetic conversations
customer_agent = ChatAgent(
BaseMessage.make_assistant_message(
role_name="Customer",
content="You are a customer with specific questions about a product."
),
model=ModelFactory.create(
model_platform=ModelPlatformType.DEFAULT,
model_type=ModelType.DEFAULT,
)
)
service_agent = ChatAgent(
BaseMessage.make_assistant_message(
role_name="Support Agent",
content="You are a helpful customer support agent for a tech company."
),
model=ModelFactory.create(
model_platform=ModelPlatformType.DEFAULT,
model_type=ModelType.DEFAULT,
)
)
# Generate synthetic conversation
conversations = []
initial_query = "I'm having trouble setting up my new device."
# Conversation loop
customer_msg = BaseMessage.make_user_message(role_name="Customer", content=initial_query)
for i in range(5):  # 5 turns of conversation
    service_response = service_agent.step(customer_msg.content)
    conversations.append({"role": "customer", "message": customer_msg.content})
    conversations.append({"role": "service", "message": service_response.msgs[0].content})
    if i < 4:  # Skip last customer turn
        # step() returns a ChatAgentResponse; keep the BaseMessage for the next turn
        customer_msg = customer_agent.step(service_response.msgs[0].content).msgs[0]
# Save synthetic data
import json
with open("synthetic_support_conversations.json", "w") as f:
json.dump(conversations, f, indent=2)
Explore real-world implementations of CAMEL-AI through these practical examples, each demonstrating a different aspect of the framework's capabilities.
This example demonstrates how to use CAMEL-AI to automatically generate a knowledge graph from unstructured text, creating a structured representation of entities and their relationships.
A company with vast amounts of documentation wants to create a navigable knowledge graph to help employees find information more efficiently. Instead of manually creating the graph, they use CAMEL-AI to automatically extract entities and relationships from their documents.
from camel.agents import ChatAgent
from camel.models import ModelFactory
from camel.toolkits import KnowledgeGraphToolkit
from camel.messages import BaseMessage
# Create a knowledge graph agent (conceptual example; toolkit names may differ in the current API)
kg_agent = ChatAgent(
BaseMessage.make_assistant_message(
role_name="Knowledge Graph Specialist",
content="You are an expert at extracting entities and relationships from text to build knowledge graphs."
),
model=ModelFactory.create(),
tools=KnowledgeGraphToolkit().get_tools(),
)
# Sample document text
document = """
Renewable energy comes from sources that are naturally replenishing but flow-limited.
They are virtually inexhaustible in duration but limited in the amount of energy that
is available per unit of time. Renewable energy sources include biomass, hydropower,
geothermal, wind, and solar. Wind energy is captured through wind turbines, which convert
kinetic energy from wind into mechanical power. Solar energy is derived from the sun through
solar panels using photovoltaic cells.
"""
# Generate knowledge graph
response = kg_agent.step(f"Create a knowledge graph from this text: {document}")
# The response would contain a structured knowledge graph with entities like
# "Renewable Energy", "Wind Energy", "Solar Energy" and their relationships
This example showcases a sophisticated customer support system with multiple specialized agents working together to handle customer inquiries.
An e-commerce company wants to automate their customer support to handle common inquiries. They implement a multi-agent system where different agents handle different aspects of customer service, working together to provide comprehensive assistance.
from camel.agents import ChatAgent
from camel.societies.workforce import Workforce
from camel.tasks.task import Task
from camel.models import ModelFactory
from camel.toolkits import SearchToolkit
# Create greeter agent
greeter_agent = ChatAgent(
system_message="You are the initial greeter for our customer support. Welcome customers warmly and identify their needs.",
model=ModelFactory.create()
)
# Create product specialist
product_agent = ChatAgent(
system_message="You are a product specialist who knows all details about our product catalog.",
model=ModelFactory.create(),
    tools=SearchToolkit().get_tools()  # Can search product database
)
# Create support specialist
support_agent = ChatAgent(
system_message="You handle technical support issues and customer complaints with patience and expertise.",
model=ModelFactory.create()
)
# Set up the workforce
support_team = Workforce('Customer Support')
support_team.add_single_agent_worker("Greeter", worker=greeter_agent)
support_team.add_single_agent_worker("Product Specialist", worker=product_agent)
support_team.add_single_agent_worker("Support Specialist", worker=support_agent)
# Process customer query
customer_query = Task(
content="I received my order yesterday but the product is damaged. I'd like a replacement or refund.",
id='customer_123',
)
response = support_team.process_task(customer_query)
print(response.result)
This example demonstrates a complex travel planning system using multiple agents with specialized tools to create personalized travel itineraries.
A travel agency wants to create personalized travel itineraries for their clients. They implement a multi-agent system where different agents handle various aspects of travel planning, from destination research to logistics and itinerary creation.
from camel.agents import ChatAgent
from camel.societies.workforce import Workforce
from camel.tasks.task import Task
from camel.models import ModelFactory
from camel.toolkits import GoogleMapsToolkit, SearchToolkit
# Create destination expert agent
destination_agent = ChatAgent(
system_message="You are a travel destination expert with deep knowledge about global destinations.",
model=ModelFactory.create(),
    tools=SearchToolkit().get_tools()
)
# Create logistics agent with mapping capabilities
logistics_agent = ChatAgent(
system_message="You handle travel logistics including flights, accommodations, and local transportation.",
model=ModelFactory.create(),
tools=GoogleMapsToolkit().get_tools()
)
# Create itinerary planner
itinerary_agent = ChatAgent(
system_message="You create detailed day-by-day travel itineraries based on traveler preferences.",
model=ModelFactory.create()
)
# Set up travel planning workforce
travel_team = Workforce('Travel Planning')
travel_team.add_single_agent_worker("Destination Expert", worker=destination_agent)
travel_team.add_single_agent_worker("Logistics Specialist", worker=logistics_agent)
travel_team.add_single_agent_worker("Itinerary Planner", worker=itinerary_agent)
# Process travel request
travel_request = Task(
content="I want a 5-day trip to Tokyo in October for a family of four with two teenagers. We're interested in technology, anime, and traditional culture.",
id='travel_001',
)
itinerary = travel_team.process_task(travel_request)
print(itinerary.result)
Discover how businesses across various industries are leveraging CAMEL-AI to solve real-world challenges, automate processes, and create value.
Marketing & Content Production
Marketing teams use CAMEL-AI to automate complex content production workflows, from data collection and analysis to report generation.
E-commerce & Service Industries
E-commerce companies implement multi-agent support systems to handle customer inquiries, provide product information, and resolve issues.
Finance & Healthcare
Financial institutions and healthcare organizations use CAMEL-AI to generate synthetic data for model training while preserving privacy and security.
Travel & Hospitality
Travel agencies leverage CAMEL-AI to create personalized travel experiences with multi-agent systems that handle all aspects of trip planning.
A practical roadmap for businesses looking to implement CAMEL-AI in their operations, from initial exploration to full deployment.
Reduce manual effort and increase throughput for knowledge-intensive tasks
Provide continuous service without staffing constraints
Handle growing workloads without linear staffing increases
Standardized processes result in more consistent quality
Create training data without privacy concerns
Budget for ongoing LLM API usage and compute resources
Initial investment in building and deploying systems
Ongoing resources for system oversight and updates
Costs associated with organizational adoption
Budget for ongoing improvements and adjustments
A simplified approach to calculating return on investment for CAMEL-AI implementations:
ROI = (Total Benefits - Total Costs) / Total Costs × 100%
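As a quick worked example with hypothetical annual figures (your own benefit and cost estimates will differ):
# Hypothetical annual figures for an automated support workflow
total_benefits = 250_000  # e.g. labor hours saved, faster resolution, higher throughput
total_costs = 100_000     # e.g. LLM API usage, development, monitoring, change management

roi = (total_benefits - total_costs) / total_costs * 100
print(f"ROI: {roi:.0f}%")  # ROI: 150%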
This comprehensive cheatsheet provides quick reference code snippets for implementing CAMEL-AI features, from basic setup to advanced applications.
# Basic installation
pip install camel-ai
# Full installation with all dependencies
pip install 'camel-ai[all]'
# Setting up API keys
import os
# For OpenAI models
os.environ["OPENAI_API_KEY"] = "your-api-key-here"
# Using .env file
from dotenv import load_dotenv
load_dotenv() # Loads variables from .env file
# Simple agent creation
from camel.agents import ChatAgent
from camel.models import ModelFactory
agent = ChatAgent(
system_message="You are a helpful assistant.",
model=ModelFactory.create()
)
# Agent with specific role using BaseMessage
from camel.messages import BaseMessage
role_agent = ChatAgent(
BaseMessage.make_assistant_message(
role_name="Marketing Specialist",
content="You are an expert in digital marketing strategies."
),
model=ModelFactory.create()
)
# Agent with custom model
from camel.types import ModelPlatformType, ModelType
custom_model_agent = ChatAgent(
system_message="You are a technical writer specializing in documentation.",
model=ModelFactory.create(
model_platform=ModelPlatformType.OPENAI,
model_type=ModelType.GPT_4O_MINI,
)
)
# Interacting with an agent
response = agent.step("Tell me about machine learning.")
print(response.msgs[0].content) # Access the response content
# Adding tools to an agent
from camel.toolkits import MathToolkit, SearchToolkit, GoogleMapsToolkit
# Agent with math tools
math_agent = ChatAgent(
system_message="You are a math assistant.",
model=ModelFactory.create(),
tools=MathToolkit().get_tools()
)
# Agent with search capabilities
search_agent = ChatAgent(
system_message="You are a research assistant.",
model=ModelFactory.create(),
tools=SearchToolkit().get_tools()
)
# Agent with multiple tool types
multi_tool_agent = ChatAgent(
system_message="You are an assistant with multiple capabilities.",
model=ModelFactory.create(),
tools=[
*MathToolkit().get_tools(),
*SearchToolkit().get_tools(),
*GoogleMapsToolkit().get_tools()
]
)
# Checking tool calls in response
response = search_agent.step("Who won the Nobel Prize in Physics in 2023?")
print(response.info['tool_calls']) # See which tools were called
# Creating an agent with custom memory window
memory_agent = ChatAgent(
system_message="You are an assistant with enhanced memory.",
model=ModelFactory.create(),
message_window_size=20 # Remember more context
)
# Accessing agent memory
context = memory_agent.memory.get_context()
print(context) # Shows current memory contents
# Adding a message to memory
from camel.messages import BaseMessage
new_message = BaseMessage.make_user_message(
    role_name="User",  # role_name is required by make_user_message
    content="Remember this important fact for later reference."
)
memory_agent.record_message(new_message)
# Clearing agent memory
memory_agent.memory.clear()
# Setting up external memory (conceptual example)
from camel.memory import VectorDBMemory
external_memory = VectorDBMemory(
connection_string="your-vector-db-connection",
embedding_model="text-embedding-ada-002"
)
agent_with_external_memory = ChatAgent(
system_message="You have access to long-term memory.",
model=ModelFactory.create(),
memory=external_memory
)
# Setting up a multi-agent workforce
from camel.societies.workforce import Workforce
from camel.tasks.task import Task
# Create individual agents
agent1 = ChatAgent(
system_message="You are a research specialist.",
model=ModelFactory.create(),
tools=SearchToolkit().get_tools()
)
agent2 = ChatAgent(
system_message="You analyze information and extract insights.",
model=ModelFactory.create()
)
agent3 = ChatAgent(
system_message="You create well-structured reports.",
model=ModelFactory.create()
)
# Create the workforce
team = Workforce('Research Team')
team.add_single_agent_worker("Researcher", worker=agent1)
team.add_single_agent_worker("Analyst", worker=agent2)
team.add_single_agent_worker("Writer", worker=agent3)
# Define and process a task
research_task = Task(
content="Create a comprehensive report on quantum computing advancements in 2024.",
id='research_001',
)
result = team.process_task(research_task)
print(result.result) # Final output from all agents
# Setting up RAG with CAMEL-AI (conceptual example)
from camel.agents import ChatAgent
from camel.retrievers import VectorDBRetriever
from camel.models import ModelFactory
# Set up a retriever with vector database
retriever = VectorDBRetriever(
database_url="your-vector-db-connection",
embedding_model="text-embedding-ada-002",
similarity_top_k=3 # Number of documents to retrieve
)
# Create a RAG agent
rag_agent = ChatAgent(
system_message="You are a knowledge assistant that uses a database to answer questions accurately.",
model=ModelFactory.create(),
retriever=retriever,
)
# Query the agent (will automatically retrieve relevant information)
response = rag_agent.step("What are the key benefits of quantum computing for cryptography?")
print(response.msgs[0].content)
# Generating synthetic conversation data
from camel.agents import ChatAgent
from camel.messages import BaseMessage
from camel.models import ModelFactory
# Create role-playing agents
customer_agent = ChatAgent(
BaseMessage.make_assistant_message(
role_name="Customer",
content="You are a customer contacting support about a recent purchase."
),
model=ModelFactory.create(),
)
support_agent = ChatAgent(
BaseMessage.make_assistant_message(
role_name="Support Agent",
content="You are a helpful customer support representative."
),
model=ModelFactory.create(),
)
# Generate conversation
conversations = []
initial_query = "I recently purchased your product but it's not working as expected."
# Conversation loop
customer_msg = BaseMessage.make_user_message(role_name="Customer", content=initial_query)
for i in range(5):  # 5 turns of conversation
    support_response = support_agent.step(customer_msg.content)
    conversations.append({"role": "customer", "message": customer_msg.content})
    conversations.append({"role": "support", "message": support_response.msgs[0].content})
    if i < 4:  # Skip last customer turn
        # step() returns a ChatAgentResponse; keep the BaseMessage for the next turn
        customer_msg = customer_agent.step(support_response.msgs[0].content).msgs[0]
# Save synthetic data
import json
with open("synthetic_conversations.json", "w") as f:
json.dump(conversations, f, indent=2)
# Setting agent output language
agent = ChatAgent(
system_message="You are a helpful assistant.",
model=ModelFactory.create(),
)
agent.set_output_language('french') # Responses will be in French
# Setting token limit
limited_agent = ChatAgent(
system_message="You provide concise information.",
model=ModelFactory.create(),
token_limit=1000 # Limit response length
)
# Using custom response terminators
terminator_agent = ChatAgent(
system_message="You help with code examples.",
model=ModelFactory.create(),
response_terminators=["END", "STOP"] # Custom signals to end responses
)
# Resetting an agent to initial state
agent.reset()
# Using a specific model configuration
from camel.configs import ChatGPTConfig
custom_config_agent = ChatAgent(
system_message="You provide detailed explanations.",
model=ModelFactory.create(
model_platform=ModelPlatformType.OPENAI,
model_type=ModelType.GPT_4O_MINI,
model_config_dict=ChatGPTConfig(
temperature=0.7,
top_p=0.9
).as_dict()
)
)
You now have a comprehensive understanding of CAMEL-AI and its capabilities. The next step is to start building your own multi-agent systems for your specific use cases.