Use Databricks DBRX and Foundation Model APIs to build AG2 applications backed by open-source LLMs.
This notebook demonstrates a few examples using AssistantAgent, UserProxyAgent, and ConversableAgent. These demos are not intended to be exhaustive - feel free to use them as a base to build upon!
This example requires Python>=3.9. It includes the %pip magic command to install AG2 (%pip install ag2), as well as other necessary libraries.
This code has been tested on:
* Serverless Notebooks (in public preview as of Apr 18, 2024)
* Databricks Runtime 14.3 LTS ML (docs)
This code can run in any Databricks workspace in a region where DBRX is available via pay-per-token APIs (or provisioned throughput). To check whether your region is supported, see Foundation Model Region Availability. If it is, the workspace must also be enabled by an admin for Foundation Model APIs (docs).
First, set up the config_list with the LLM configuration:
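A minimal sketch of such a config_list, assuming the Databricks OpenAI-compatible serving endpoint at <workspace-url>/serving-endpoints and the pay-per-token model name databricks-dbrx-instruct; DATABRICKS_HOST and DATABRICKS_TOKEN are placeholder environment variable names for your workspace URL and personal access token:

```python
import os

# Placeholder credentials: set DATABRICKS_HOST to your workspace URL and
# DATABRICKS_TOKEN to a personal access token before running for real.
config_list = [
    {
        "model": "databricks-dbrx-instruct",
        "api_key": os.environ.get("DATABRICKS_TOKEN", "<your-token>"),
        "base_url": os.environ.get("DATABRICKS_HOST", "https://<workspace>")
        + "/serving-endpoints",
    }
]
```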
The simplest example is a UserProxyAgent asking a question to an AssistantAgent. This is based on the tutorial demo here. After sending the question and seeing a response, you can type exit to end the chat or continue to converse.
In this example, the UserProxyAgent will take advantage of our code_executor; after the code is shown on screen, press Return/Enter in the chatbox to have it execute locally on your cluster via the bot’s auto-reply. Note: with generative AI coding assistants, you should always manually read and review the code before executing it, as LLM results are non-deterministic and may lead to unintended consequences.
Below, we define a helper class that extends the capabilities of autogen.runtime_logging (docs): it wraps .start() and .stop(), and adds try/except for error handling.
In the request field above, we can also see the system prompt for the LLM - this can be useful for prompt engineering as well as debugging.
Note that when you deploy this to Databricks Model Serving, model responses are auto-logged using Lakehouse Monitoring; still, the approach above provides a simple mechanism to log chats from the client side.
Let’s now persist these results to a Delta table in Unity Catalog:
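For example, a sketch assuming this runs in a Databricks notebook, where spark (a SparkSession) is predefined, and that the chat logs have been collected into a pandas DataFrame named log_df; the three-level table name is a placeholder:

```python
# Databricks notebook fragment: `spark` is predefined there.
# `log_df` is an assumed pandas DataFrame of logged chat records, and
# "main.default.ag2_chat_logs" is a placeholder Unity Catalog table name.
spark_df = spark.createDataFrame(log_df)
(
    spark_df.write.format("delta")
    .mode("append")
    .saveAsTable("main.default.ag2_chat_logs")
)
```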