Use Langchain tools.

In this notebook, we demonstrate how to use `AssistantAgent` and `UserProxyAgent` to make function calls with the function-calling feature of OpenAI models (introduced in model version 0613), using a set of Langchain-provided tools and toolkits to show how to leverage the 35+ tools available. A specified prompt and the function configs must be passed to `AssistantAgent` to initialize the agent, and the corresponding functions must be passed to `UserProxyAgent`, which will execute any function calls made by `AssistantAgent`. Besides this requirement of matching descriptions with functions, we recommend checking the system message of the `AssistantAgent` to make sure its instructions align with the function call descriptions.
AG2 requires `Python>=3.9`. To run this notebook example, please install `ag2` and Langchain:
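For example (the exact extras and version pins may differ in your environment):

```bash
pip install ag2 langchain
```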
The `LLMConfig.from_json` function tries to create a list of configurations using Azure OpenAI endpoints and OpenAI endpoints for the provided list of models. It assumes the API keys and API bases are stored in the corresponding environment variables or local txt files:

- OpenAI API key: the corresponding environment variable or a local file, e.g. `openai_api_key_file="key_openai.txt"`.
- Azure OpenAI API key: the corresponding environment variable or a local file, e.g. `aoai_api_key_file="key_aoai.txt"`. Multiple keys can be stored, one per line.
- Azure OpenAI API base: the corresponding environment variable or a local file, e.g. `aoai_api_base_file="base_aoai.txt"`. Multiple bases can be stored, one per line.

The configuration used in this notebook excludes Azure OpenAI endpoints, since some of them do not support function calls yet; remove the `exclude` argument if they do.
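A minimal sketch of the configuration step, assuming a local `OAI_CONFIG_LIST` JSON file (the file name is an assumption, and the exact keyword arguments, including the `exclude` filter mentioned above, may vary between AG2 versions):

```python
import autogen

# Load candidate model configurations from a local JSON file; point the path
# at your own config list (or key/base txt files) as described above.
llm_config = autogen.LLMConfig.from_json(path="OAI_CONFIG_LIST")
```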
In this example, we demonstrate function call execution with `AssistantAgent` and `UserProxyAgent`. With the default system prompt of `AssistantAgent`, we allow the LLM assistant to perform tasks with code, and the `UserProxyAgent` extracts code blocks from the LLM response and executes them. With the new `function_call` feature, we instead define functions and specify their descriptions in the OpenAI config for the `AssistantAgent`, and then register the functions in `UserProxyAgent`.
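A condensed sketch of this flow, using Langchain's `ReadFileTool` as the exposed tool; the inline config list, the `generate_llm_config` helper, the agent names, and the task message are illustrative assumptions rather than the notebook's exact code:

```python
import os

import autogen
from langchain_community.tools.file_management import ReadFileTool  # import path may vary by LangChain version


def generate_llm_config(tool):
    # Derive an OpenAI function schema from a Langchain tool: name and
    # description come from the tool, parameters from its argument schema.
    return {
        "name": tool.name,
        "description": tool.description,
        "parameters": {
            "type": "object",
            "properties": tool.args or {},
            "required": list(tool.args or {}),  # treat all arguments as required for simplicity
        },
    }


read_file_tool = ReadFileTool()

# The function description goes into the AssistantAgent's LLM config
# (config list built inline here; in the notebook it comes from the loader above).
llm_config = {
    "config_list": [{"model": "gpt-4", "api_key": os.environ["OPENAI_API_KEY"]}],
    "functions": [generate_llm_config(read_file_tool)],
    "timeout": 120,
}

chatbot = autogen.AssistantAgent(
    name="chatbot",
    system_message="For coding tasks, only use the functions you have been provided with. "
    "Reply TERMINATE when the task is done.",
    llm_config=llm_config,
)

# The matching callable is registered with the UserProxyAgent, which executes
# any function calls made by the assistant. Mapping the tool's `_run` method
# lets the model's keyword arguments (e.g. file_path) line up directly.
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=5,
    code_execution_config=False,
    is_termination_msg=lambda msg: (msg.get("content") or "").rstrip().endswith("TERMINATE"),
)
user_proxy.register_function(function_map={read_file_tool.name: read_file_tool._run})

user_proxy.initiate_chat(
    chatbot,
    message="Read the file named 'radius.txt' and report its contents.",
)
```

The key point is that the schema passed under `functions` and the callable registered in `function_map` share the same name (`read_file_tool.name`); that shared name is how `UserProxyAgent` routes the assistant's function calls to the right tool.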