OpenAI Assistants in AG2
Two-agent chat with OpenAI assistants.

This notebook shows a very basic example of the GPTAssistantAgent, an experimental AG2 agent class that leverages the OpenAI Assistant API for conversational capabilities, working with UserProxyAgent in AG2.
import logging
import os

from autogen import AssistantAgent, UserProxyAgent, config_list_from_json
from autogen.agentchat.contrib.gpt_assistant_agent import GPTAssistantAgent

logger = logging.getLogger(__name__)
logger.setLevel(logging.WARNING)

# Reuse an existing assistant if its id is provided; otherwise a new one is created.
assistant_id = os.environ.get("ASSISTANT_ID", None)

# Load the model configuration from the OAI_CONFIG_LIST file (or environment variable).
config_list = config_list_from_json("OAI_CONFIG_LIST")
llm_config = {"config_list": config_list}
assistant_config = {"assistant_id": assistant_id}
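If no OAI_CONFIG_LIST file is set up yet, the configuration can also be built inline. A minimal sketch, assuming the API key is available in the OPENAI_API_KEY environment variable and that the chosen model supports the Assistants API:

# Inline alternative to config_list_from_json (model name and key source are
# assumptions; substitute your own values).
config_list = [
    {
        "model": "gpt-4o",
        "api_key": os.environ["OPENAI_API_KEY"],
    }
]
llm_config = {"config_list": config_list}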
# The assistant agent backed by the OpenAI Assistant API.
gpt_assistant = GPTAssistantAgent(
    name="assistant",
    instructions=AssistantAgent.DEFAULT_SYSTEM_MESSAGE,
    llm_config=llm_config,
    assistant_config=assistant_config,
)
# The user proxy executes the code the assistant writes and drives the conversation.
user_proxy = UserProxyAgent(
    name="user_proxy",
    code_execution_config={
        "work_dir": "coding",
        "use_docker": False,  # Set use_docker=True if Docker is available; it is safer than running generated code directly.
    },
    is_termination_msg=lambda msg: "TERMINATE" in msg["content"],
    human_input_mode="NEVER",
    max_consecutive_auto_reply=1,
)
user_proxy.initiate_chat(gpt_assistant, message="Print hello world")
user_proxy.initiate_chat(gpt_assistant, message="Write py code to eval 2 + 2", clear_history=True)
gpt_assistant.delete_assistant()
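delete_assistant() removes the assistant that was created on the OpenAI side. To reuse the same assistant across runs instead, skip the deletion and pass its id through the ASSISTANT_ID environment variable read at the top of the example.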