
Why Dependency Injection Is Essential
Here’s why dependency injection is a game-changer for secure LLM workflows:
- Enhanced Security: Your sensitive data is never directly exposed to the LLM.
- Simplified Development: Secure data can be seamlessly accessed by functions without requiring complex configurations.
- Unmatched Flexibility: It supports safe integration of diverse workflows, allowing you to scale and adapt with ease.
Installation
To install `AG2`, simply run the following command:
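```bash
pip install ag2
```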
If you have been using `autogen` or `ag2`, all you need to do is upgrade it using:
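```bash
pip install -U ag2   # or equivalently: pip install -U autogen
```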
`autogen` and `ag2` are aliases for the same PyPI package.
Imports
The functionality demonstrated in this guide is located in the `autogen.tools.dependency_injection` module. This module provides key components for dependency injection:
- `BaseContext`: an abstract base class used to define and encapsulate data contexts, such as user account information, which can then be injected into functions or agents securely.
- `Depends`: a function used to declare and inject dependencies, either from a context (like `BaseContext`) or a function, ensuring sensitive data is provided securely without direct exposure.
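A typical set of imports for the examples below might look like this; the `pydantic` and `typing.Annotated` imports are assumptions based on how `Account` and `Depends` are used later:

```python
import os
from typing import Annotated

from pydantic import BaseModel

from autogen import AssistantAgent, GroupChat, GroupChatManager, LLMConfig, UserProxyAgent
from autogen.tools.dependency_injection import BaseContext, Depends
```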
Define a BaseContext Class
We start by defining a `BaseContext` class for accounts. This will act as the base structure for dependency injection. By using this approach, sensitive information like usernames and passwords is never exposed to the LLM.
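A minimal sketch of such a class, assuming the context is modeled as a Pydantic model combined with `BaseContext`; the `Account` fields, credentials, and the toy balance lookup are illustrative:

```python
# Account combines BaseContext with a Pydantic model so its fields are validated
# and can be injected into tool functions without being exposed to the LLM.
class Account(BaseContext, BaseModel):
    username: str
    password: str
    currency: str = "USD"


# Example accounts used throughout this guide (illustrative credentials).
alice_account = Account(username="alice", password="password123")
bob_account = Account(username="bob", password="password456")

# A toy "database" mapping credentials to balances.
account_balance_dict = {
    (alice_account.username, alice_account.password): 300,
    (bob_account.username, bob_account.password): 200,
}
```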
Helper Functions
To ensure that the provided account is valid and retrieve its balance, we create two helper functions.
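A sketch of the two helpers, building on the `Account` class and `account_balance_dict` above; the names `_verify_account` and `_get_balance` are illustrative:

```python
def _verify_account(account: Account) -> None:
    # Reject credentials that are not present in the toy "database".
    if (account.username, account.password) not in account_balance_dict:
        raise ValueError("Invalid username or password")


def _get_balance(account: Account) -> str:
    # Validate the account first, then look up and format its balance.
    _verify_account(account)
    balance = account_balance_dict[(account.username, account.password)]
    return f"Your balance is {balance} {account.currency}"
```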
Injecting BaseContext Parameter
Dependency injection simplifies passing data to a function. Here, we’ll inject an `Account` instance into a function automatically.
Agent Configuration
Configure the agents for the interaction:
- `LLMConfig` defines the LLM configuration, including the model and API key.
- `UserProxyAgent` simulates user inputs without requiring actual human interaction (human input mode set to `NEVER`).
- `AssistantAgent` represents the AI agent, configured with the LLM settings.
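A minimal configuration sketch; the model name, API-key environment variable, and agent names are assumptions, so substitute your own:

```python
# LLM configuration; exact fields may vary with your provider.
llm_config = LLMConfig(api_type="openai", model="gpt-4o-mini", api_key=os.environ["OPENAI_API_KEY"])

# The assistant talks to the LLM and proposes tool calls...
assistant = AssistantAgent(name="assistant", llm_config=llm_config)

# ...while the user proxy executes them and never asks for human input.
user_proxy = UserProxyAgent(name="user_proxy", human_input_mode="NEVER", llm_config=False)
```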
Register the Function with Dependency Injection
We register a function where the account information for `bob` is injected as a dependency.
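A sketch of the registration, assuming the agents and helpers defined above; the function name `get_balance_1` and the tool description are illustrative:

```python
@user_proxy.register_for_execution()
@assistant.register_for_llm(description="Get the balance of the account")
def get_balance_1(
    # bob_account is injected at call time; the LLM never sees the credentials.
    account: Annotated[Account, Depends(bob_account)],
) -> str:
    return _get_balance(account)
```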
Note: You can also use `account: Account = Depends(bob_account)` as an alternative syntax.
Initiate the Chat
Finally, we initiate a chat to retrieve the balance.
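A sketch of kicking off the conversation; the message text and `max_turns` value are illustrative:

```python
user_proxy.initiate_chat(assistant, message="Get the balance of the account", max_turns=2)
```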
Injecting Parameters Without BaseContext
Sometimes, you might not want to use `BaseContext`. Here’s how to inject simple parameters directly.
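One way to do this is to expose each value through a small provider function that `Depends` can call; the names `get_username` and `get_password` are illustrative:

```python
def get_username() -> str:
    # Provider function for the username; never shown to the LLM.
    return "bob"


def get_password() -> str:
    # Provider function for the password; never shown to the LLM.
    return "password456"
```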
Agent Configuration
Configure the agents for the interaction:
- `LLMConfig` defines the LLM configuration, including the model and API key.
- `UserProxyAgent` simulates user inputs without requiring actual human interaction (human input mode set to `NEVER`).
- `AssistantAgent` represents the AI agent, configured with the LLM settings.
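The setup mirrors the earlier configuration sketch, with the same assumptions about the model and API key:

```python
llm_config = LLMConfig(api_type="openai", model="gpt-4o-mini", api_key=os.environ["OPENAI_API_KEY"])
assistant = AssistantAgent(name="assistant", llm_config=llm_config)
user_proxy = UserProxyAgent(name="user_proxy", human_input_mode="NEVER", llm_config=False)
```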
Register the Function with Direct Parameter Injection
Instead of injecting a full context like `Account`, you can directly inject individual parameters, such as the username and password, into a function. This gives you more granular control over the injected data while still ensuring that sensitive information is managed securely.
Here’s how you can set it up:
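A sketch of the registration, assuming the provider functions defined above; the function name `get_balance_2` is illustrative:

```python
@user_proxy.register_for_execution()
@assistant.register_for_llm(description="Get the balance of the account")
def get_balance_2(
    # Each credential is injected separately via its provider function.
    username: Annotated[str, Depends(get_username)],
    password: Annotated[str, Depends(get_password)],
) -> str:
    account = Account(username=username, password=password)
    return _get_balance(account)
```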
Initiate the Chat
As before, initiate a chat to test the function.
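Again, a minimal kick-off sketch with an illustrative message and turn limit:

```python
user_proxy.initiate_chat(assistant, message="Get the balance of the account", max_turns=2)
```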
Assigning Different Contexts to Multiple Agents
You can assign different contexts, such as distinct account data, to different agents within the same group chat. This ensures that each assistant works with its own unique set of data; for example, one assistant can use `alice_account` while another uses `bob_account`.
GroupChat Configuration
Let’s configure a `GroupChat` with two assistant agents.
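A sketch of the group chat setup, reusing the earlier `llm_config`; the agent names and `max_round` value are illustrative:

```python
assistant_1 = AssistantAgent(name="assistant_1", llm_config=llm_config)
assistant_2 = AssistantAgent(name="assistant_2", llm_config=llm_config)
user_proxy = UserProxyAgent(name="user_proxy", human_input_mode="NEVER", llm_config=False)

groupchat = GroupChat(agents=[user_proxy, assistant_1, assistant_2], messages=[], max_round=5)
manager = GroupChatManager(groupchat=groupchat, llm_config=llm_config)
```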
Register Functions for Each Assistant
- For `assistant_1`, we inject the `alice_account` context using `Depends(alice_account)`, ensuring that it retrieves the balance for Alice’s account.
- For `assistant_2`, we inject the `bob_account` context using `Depends(bob_account)`, ensuring that it retrieves the balance for Bob’s account (see the sketch below).
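A sketch of both registrations, assuming the agents and helpers above; the function names are illustrative:

```python
@user_proxy.register_for_execution()
@assistant_1.register_for_llm(description="Get the balance of the account")
def get_balance_for_assistant_1(
    # assistant_1 only ever works with Alice's account.
    account: Annotated[Account, Depends(alice_account)],
) -> str:
    return _get_balance(account)


@user_proxy.register_for_execution()
@assistant_2.register_for_llm(description="Get the balance of the account")
def get_balance_for_assistant_2(
    # assistant_2 only ever works with Bob's account.
    account: Annotated[Account, Depends(bob_account)],
) -> str:
    return _get_balance(account)
```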