LiteLLM Proxy Server
LiteLLM with OpenAI
Before starting this guide, ensure you have completed the Installation Guide and installed all required dependencies.
Run LiteLLM as a Docker Container
To connect LiteLLM with an OpenAI model, configure your `litellm_config.yaml` as follows:
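A minimal sketch of such a config, assuming the standard LiteLLM `model_list` schema and `gpt-4` as the served model name (substitute your own):

```yaml
model_list:
  - model_name: gpt-4                      # name clients will request
    litellm_params:
      model: gpt-4                         # underlying OpenAI model
      api_key: os.environ/OPENAI_API_KEY   # read from the container environment
```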
Before starting the container, ensure you have correctly set the following environment variables:
`OPENAI_API_KEY`
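For example, exporting it in the shell you will launch the container from (the value shown is a placeholder):

```bash
export OPENAI_API_KEY="sk-..."  # placeholder; use your real key
```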
Run the container using:
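A typical invocation, assuming the official `ghcr.io/berriai/litellm` image, `litellm_config.yaml` in the current directory, and the container name `litellm` (chosen here so later commands can reference it):

```bash
docker run -d --name litellm \
  -v "$(pwd)/litellm_config.yaml:/app/config.yaml" \
  -e OPENAI_API_KEY \
  -p 4000:4000 \
  ghcr.io/berriai/litellm:main-latest \
  --config /app/config.yaml
```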
Once running, LiteLLM will be accessible at: http://0.0.0.0:4000
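To verify the endpoint is reachable, you can query the OpenAI-compatible `/v1/models` route (assuming no `master_key` is set in the config, so no auth header is needed):

```bash
curl http://0.0.0.0:4000/v1/models
```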
To confirm that `config.yaml` is correctly mounted inside the container, check the logs:
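Assuming the container was named `litellm` as above:

```bash
docker logs litellm   # look for a line confirming the config file was loaded
```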
Initiate Chat
To communicate with LiteLLM, configure the model in `config_list` and initiate a chat session.
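A minimal sketch using AutoGen, assuming the proxy serves the `gpt-4` model name configured above; the `api_key` value is a placeholder, since the proxy injects the real `OPENAI_API_KEY`:

```python
from autogen import AssistantAgent, UserProxyAgent

# Model name must match a model_name entry in litellm_config.yaml
# (assumed "gpt-4" here); the key is a placeholder because the
# proxy holds the real OpenAI credentials.
config_list = [
    {
        "model": "gpt-4",
        "base_url": "http://0.0.0.0:4000",
        "api_key": "NotRequired",
    }
]

assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
user_proxy = UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",      # run non-interactively for this demo
    max_consecutive_auto_reply=1,  # one round trip, then stop
    code_execution_config=False,   # no local code execution
)

user_proxy.initiate_chat(assistant, message="Tell me a joke about programming.")
```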