LM Studio
This notebook shows how to use AG2 with multiple local models via LM Studio’s multi-model serving feature, which has been available since LM Studio 0.2.17.
To use the multi-model serving feature in LM Studio, start a “Multi Model Session” from the “Playground” tab and select the models you want to load. Once the models are loaded, click “Start Server”; the models are then served from a locally hosted OpenAI-compatible endpoint.
Installing AG2 with OpenAI API support
Run a command like the following to install AG2 with the OpenAI package, as LM Studio exposes an OpenAI-compatible API (the exact [openai] extra name is assumed here):
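```bash
# Installs AG2 together with the OpenAI client it needs to talk to LM Studio.
pip install "ag2[openai]"
```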
If you have been using autogen or pyautogen, all you need to do is upgrade it with a command like the following (the [openai] extra is assumed here as well):
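```bash
# Assumed upgrade command; updates an existing autogen install in place.
pip install -U "autogen[openai]"
```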
or
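```bash
# Assumed upgrade command for the pyautogen alias of the same package.
pip install -U "pyautogen[openai]"
```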
as pyautogen, autogen, and ag2 are aliases for the same PyPI package.
Two Agent Chats
In this example, we create a comedy chat between two agents using two different local models, Phi-2 and Gemma-it.
We first create configurations for the models.
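A minimal sketch of what these configurations might look like, assuming LM Studio’s default local endpoint at http://localhost:1234/v1; the model identifiers below are placeholders and should be replaced with the names shown in your LM Studio multi-model session:

```python
# One OpenAI-compatible config per local model served by LM Studio.
phi2 = {
    "config_list": [
        {
            "model": "phi-2",  # placeholder; use the identifier shown in LM Studio
            "base_url": "http://localhost:1234/v1",  # LM Studio's default server address
            "api_key": "lm-studio",  # the local server does not validate keys; any non-empty string works
        }
    ],
    "cache_seed": None,  # disable caching so every run hits the local server
}

gemma = {
    "config_list": [
        {
            "model": "gemma-2b-it",  # placeholder; use the identifier shown in LM Studio
            "base_url": "http://localhost:1234/v1",
            "api_key": "lm-studio",
        }
    ],
    "cache_seed": None,
}
```

Each dictionary can then be passed as the llm_config of a ConversableAgent, so the two agents in the chat are backed by different local models.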