Use Cases
- Use cases
- Notebooks
- All Notebooks
- Websockets: Streaming input and output using websockets
- Perplexity Search Tool
- Auto Generated Agent Chat: Task Solving with Code Generation, Execution, Debugging & Human Feedback
- Using a local Telemetry server to monitor a GraphRAG agent
- Auto Generated Agent Chat: Task Solving with Provided Tools as Functions
- RealtimeAgent with gemini client
- Agentic RAG workflow on tabular data from a PDF file
- Language Agent Tree Search
- Config loader utility functions
- Wikipedia Agent
- Google Drive Tools
- FSM - User can input speaker transition constraints
- DeepResearchAgent
- Group Chat with Retrieval Augmented Generation
- AgentOptimizer: An Agentic Way to Train Your LLM Agent
- Using RetrieveChat with Qdrant for Retrieve Augmented Code Generation and Question Answering
- RealtimeAgent with WebRTC connection
- Agent with memory using Mem0
- Preprocessing Chat History with `TransformMessages`
- RealtimeAgent in a Swarm Orchestration
- RAG OpenAI Assistants in AG2
- Using Guidance with AG2
- Using RetrieveChat Powered by MongoDB Atlas for Retrieve Augmented Code Generation and Question Answering
- Using Neo4j's graph database with AG2 agents for Question & Answering
- Use AG2 in Databricks with DBRX
- Wikipedia Search Tools
- Solving Complex Tasks with Nested Chats
- DeepSeek: Adding Browsing Capabilities to AG2
- Solving Complex Tasks with A Sequence of Nested Chats
- Nested Chats for Tool Use in Conversational Chess
- OpenAI Assistants in AG2
- Group Chat with Coder and Visualization Critic
- Agent Chat with Multimodal Models: LLaVA
- SocietyOfMindAgent
- Conversational Workflows with MCP: A Marie Antoinette Take on The Eiffel Tower
- Agent Chat with Multimodal Models: DALLE and GPT-4V
- StateFlow: Build Workflows through State-Oriented Actions
- Small, Local Model (IBM Granite) Multi-Agent RAG
- Group Chat with Tools
- DuckDuckGo Search Tool
- Load the configuration including the response format
- MCP Clients
- Agent Tracking with AgentOps
- SQL Agent for Spider text-to-SQL benchmark
- AutoBuild
- Automatically Build Multi-agent System from Agent Library
- Generate Dalle Images With Conversable Agents
- Agent Observability with OpenLIT
- Using RetrieveChat Powered by Couchbase Capella for Retrieve Augmented Code Generation and Question Answering
- A Uniform interface to call different LLMs
- OptiGuide with Nested Chats in AG2
- Conversational Chess using non-OpenAI clients
- Chat Context Dependency Injection
- `run` function examples with event processing
- From Dad Jokes To Sad Jokes: Function Calling with GPTAssistantAgent
- Mitigating Prompt hacking with JSON Mode in Autogen
- Group Chat with Customized Speaker Selection Method
- Trip planning with a FalkorDB GraphRAG agent using a Swarm
- Tavily Search Tool
- Structured output
- Adding Google Search Capability to AG2
- Task Solving with Provided Tools as Functions (Asynchronous Function Calls)
- Conversational Workflows with MCP: A French joke on a random Wikipedia article
- Auto Generated Agent Chat: Using MathChat to Solve Math Problems
- Agent Chat with custom model loading
- Auto Generated Agent Chat: Function Inception
- Use AG2 to Tune ChatGPT
- Auto Generated Agent Chat: Group Chat with GPTAssistantAgent
- Using FalkorGraphRagCapability with agents for GraphRAG Question & Answering
- Solving Multiple Tasks in a Sequence of Async Chats
- Discord, Slack, and Telegram messaging tools
- RealtimeAgent in a Swarm Orchestration using WebRTC
- Runtime Logging with AG2
- WebSurferAgent
- Use MongoDBQueryEngine to query Markdown files
- Conversational Workflows with MCP: A Shakespearean Take on arXiv Abstracts
- RAG with DocAgent
- Solving Multiple Tasks in a Sequence of Chats
- Cross-Framework LLM Tool for CaptainAgent
- Web Scraping using Apify Tools
- Auto Generated Agent Chat: Collaborative Task Solving with Coding and Planning Agent
- Currency Calculator: Task Solving with Provided Tools as Functions
- Use ChromaDBQueryEngine to query Markdown files
- Using RetrieveChat for Retrieve Augmented Code Generation and Question Answering
- Writing a software application using function calls
- Using OpenAI’s Web Search Tool with AG2
- Using RetrieveChat Powered by PGVector for Retrieve Augmented Code Generation and Question Answering
- Enhanced Swarm Orchestration with AG2
- Perform Research with Multi-Agent Group Chat
- Auto Generated Agent Chat: Teaching AI New Skills via Natural Language Interaction
- Usage tracking with AG2
- Using Neo4j's native GraphRAG SDK with AG2 agents for Question & Answering
- Groupchat with Llamaindex agents
- Assistants with Azure Cognitive Search and Azure Identity
- Swarm Orchestration with AG2
- Tools with Dependency Injection
- Structured output from json configuration
- Solving Multiple Tasks in a Sequence of Chats with Different Conversable Agent Pairs
- Chat with OpenAI Assistant using function call in AG2: OSS Insights for Advanced GitHub Data Analysis
- Auto Generated Agent Chat: Collaborative Task Solving with Multiple Agents and Human Users
- Adding YouTube Search Capability to AG2
- Chatting with a teachable agent
- Making OpenAI Assistants Teachable
- Translating Video audio using Whisper and GPT-3.5-turbo
- Run a standalone AssistantAgent
- Auto Generated Agent Chat: Task Solving with Langchain Provided Tools as Functions
- Use LLamaIndexQueryEngine to query Markdown files
- Auto Generated Agent Chat: GPTAssistant with Code Interpreter
- Interactive LLM Agent Dealing with Data Stream
- Agent Chat with Async Human Inputs
- ReasoningAgent - Advanced LLM Reasoning with Multiple Search Strategies
- Auto Generated Agent Chat: Solving Tasks Requiring Web Info
- Use AG2 to Tune OpenAI Models
- Engaging with Multimodal Models: GPT-4V in AG2
- Supercharging Web Crawling with Crawl4AI
- Use AG2 in Microsoft Fabric
- Cross-Framework LLM Tool Integration with AG2
- Demonstrating the `AgentEval` framework using the task of solving math problems as an example
- Group Chat
- Adding Browsing Capabilities to AG2
- CaptainAgent
- (Legacy) Implement Swarm-style orchestration with GroupChat
- Task Solving with Code Generation, Execution and Debugging
- RealtimeAgent with local websocket connection
- Community Gallery
Notebooks
Agent Chat with Async Human Inputs
Demonstrates handling asynchronous human inputs in an agent chat.
```python
%pip install "autogen" chromadb sentence_transformers tiktoken pypdf nest-asyncio
```
```python
import asyncio
from typing import Dict, Optional, Union

import nest_asyncio

from autogen import AssistantAgent
from autogen.agentchat.user_proxy_agent import UserProxyAgent


# Define an asynchronous function that simulates some asynchronous task (e.g., an I/O operation)
async def my_asynchronous_function():
    print("Start asynchronous function")
    await asyncio.sleep(2)  # Simulate some asynchronous task (e.g., an I/O operation)
    print("End asynchronous function")
    return "input"


# Define a custom class `CustomisedUserProxyAgent` that extends `UserProxyAgent`
class CustomisedUserProxyAgent(UserProxyAgent):
    # Asynchronous function to get human input
    async def a_get_human_input(self, prompt: str) -> str:
        # Call the asynchronous function to get user input asynchronously
        user_input = await my_asynchronous_function()
        return user_input

    # Asynchronous function to receive a message
    async def a_receive(
        self,
        message: Union[Dict, str],
        sender,
        request_reply: Optional[bool] = None,
        silent: Optional[bool] = False,
    ):
        # Call the superclass method to handle message reception asynchronously
        await super().a_receive(message, sender, request_reply, silent)


class CustomisedAssistantAgent(AssistantAgent):
    # Asynchronous function to get human input
    async def a_get_human_input(self, prompt: str) -> str:
        # Call the asynchronous function to get user input asynchronously
        user_input = await my_asynchronous_function()
        return user_input

    # Asynchronous function to receive a message
    async def a_receive(
        self,
        message: Union[Dict, str],
        sender,
        request_reply: Optional[bool] = None,
        silent: Optional[bool] = False,
    ):
        # Call the superclass method to handle message reception asynchronously
        await super().a_receive(message, sender, request_reply, silent)
```
```python
def create_llm_config(model, temperature, seed):
    config_list = [
        {
            "model": model,
            "api_key": "<api_key>",
        },
    ]

    llm_config = {
        "seed": int(seed),
        "config_list": config_list,
        "temperature": float(temperature),
    }

    return llm_config
```
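The next cell calls `nest_asyncio.apply()` before awaiting `main()`. This is needed because Jupyter already runs an event loop, and `asyncio.run()` (and libraries that call it internally) refuse to start from inside a running loop; `nest_asyncio` patches the loop to permit such re-entry. A stdlib-only sketch of the error it works around:

```python
import asyncio


async def nested():
    return "ok"


async def outer():
    # Inside a running loop, asyncio.run() raises RuntimeError;
    # nest_asyncio.apply() patches the loop so nested calls like this succeed.
    coro = nested()
    try:
        return asyncio.run(coro)
    except RuntimeError as exc:
        coro.close()  # avoid a "coroutine was never awaited" warning
        return f"RuntimeError: {exc}"


result = asyncio.run(outer())
print(result)
```

Running this in a plain script prints the `RuntimeError` that a notebook would hit, since `outer()` itself executes inside a running loop.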
```python
nest_asyncio.apply()


async def main():
    boss = CustomisedUserProxyAgent(
        name="boss",
        human_input_mode="ALWAYS",
        max_consecutive_auto_reply=0,
        code_execution_config=False,
    )

    assistant = CustomisedAssistantAgent(
        name="assistant",
        system_message="The user will provide an agenda, and you will create questions for an interview meeting. Every time you generate questions, ask the user for feedback. If the user provides feedback, incorporate it and generate a new set of questions; if the user does not want to update, terminate the process and exit.",
        llm_config=create_llm_config("gpt-4", "0.4", "23"),
    )

    await boss.a_initiate_chat(
        assistant,
        message="Resume Review, Technical Skills Assessment, Project Discussion, Job Role Expectations, Closing Remarks.",
        n_results=3,
    )


await main()
```
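The key idea above is that overriding `a_get_human_input` lets an agent await any coroutine (a websocket read, a queue, a timer) instead of blocking the event loop on `input()`. The same pattern, reduced to plain `asyncio` with illustrative class names (no AG2 required):

```python
import asyncio


class BaseAgent:
    """Minimal stand-in for an agent with an async human-input hook."""

    async def a_get_human_input(self, prompt: str) -> str:
        # Default behaviour: no input available.
        return ""


class AsyncInputAgent(BaseAgent):
    async def a_get_human_input(self, prompt: str) -> str:
        # Await an asynchronous source instead of blocking the event loop.
        await asyncio.sleep(0.01)  # stands in for real async I/O
        return "simulated user reply"


async def demo():
    agent = AsyncInputAgent()
    return await agent.a_get_human_input("Your feedback: ")


result = asyncio.run(demo())
print(result)
```

In the notebook, `my_asynchronous_function` plays the role of the asynchronous source, sleeping for two seconds and returning a canned string in place of real user input.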