Use planning agent in a function call.

`AssistantAgent` is an LLM-based agent that can write and debug Python code (in a Python coding block) for a user to execute for a given task. `UserProxyAgent` is an agent that serves as a proxy for a user to execute the code written by `AssistantAgent`. We further create a planning agent for the assistant agent to consult. The planning agent is a variation of the LLM-based `AssistantAgent` with a different system message.
This example requires Python>=3.9. To run this notebook example, please install autogen and docker:
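A typical install command (the package name `pyautogen` is an assumption based on the library's published packaging; `docker` provides the Python client for sandboxed code execution):

```shell
pip install pyautogen docker
```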
The `config_list_from_json` function loads a list of configurations from an environment variable or a JSON file. It first looks for an environment variable with the specified name; the value of that environment variable needs to be a valid JSON string. If the variable is not found, it looks for a JSON file with the same name. It then filters the configs by `filter_dict`.
It’s OK to have only the OpenAI API key, or only the Azure OpenAI API
key + base.
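The lookup-and-filter order described above can be sketched in plain Python. This is a minimal illustration, not the library's actual implementation; the helper name `load_config_list` and the placeholder API keys are invented for the example:

```python
import json
import os

def load_config_list(env_or_file, filter_dict=None):
    """Sketch of the lookup order: environment variable first,
    then a JSON file with the same name; finally filter."""
    raw = os.environ.get(env_or_file)
    if raw is not None:
        configs = json.loads(raw)  # the env var must hold a valid JSON string
    else:
        with open(env_or_file) as f:  # fall back to a JSON file
            configs = json.load(f)
    if filter_dict:
        # keep only configs whose fields take one of the allowed values
        configs = [
            c for c in configs
            if all(c.get(k) in v for k, v in filter_dict.items())
        ]
    return configs

# Illustrative configuration stored in an environment variable.
os.environ["OAI_CONFIG_LIST"] = json.dumps([
    {"model": "gpt-4", "api_key": "PLACEHOLDER-1"},
    {"model": "gpt-3.5-turbo", "api_key": "PLACEHOLDER-2"},
])

print(load_config_list("OAI_CONFIG_LIST", filter_dict={"model": ["gpt-4"]}))
# keeps only the gpt-4 entry
```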
We set `human_input_mode` to “NEVER” in the user proxy agent that talks to the planner, so it will never ask for human feedback. We define an `ask_planner` function to send a message to the planner and return the suggestion from the planner.
We then create the assistant agent and the user proxy agent for the actual task, setting `human_input_mode` to “TERMINATE” in the user proxy agent, which will ask for feedback when it receives a “TERMINATE” signal from the assistant agent. We set the `functions` in `AssistantAgent` and the `function_map` in `UserProxyAgent` to use the created `ask_planner` function.
We invoke the `initiate_chat()` method of the user proxy agent to start
the conversation. When you run the cell below, you will be prompted to
provide feedback after the assistant agent sends a “TERMINATE” signal at
the end of the message. If you don’t provide any feedback (by pressing
Enter directly), the conversation will finish. Before the “TERMINATE”
signal, the user proxy agent will try to execute the code suggested by
the assistant agent on behalf of the user.
During the conversation, the assistant agent may call the provided `ask_planner` function. When this happens, a line like the following will be displayed:
***** Suggested function Call: ask_planner *****