Run LiteLLM as a Docker Container
To connect LiteLLM with an OpenAI model, configure your litellm_config.yaml to read the OPENAI_API_KEY environment variable. Once the container is running, the proxy is served at http://0.0.0.0:4000.
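A minimal litellm_config.yaml sketch is shown below; the model name gpt-4o is an assumption and should be replaced with whichever OpenAI model you want to expose. The os.environ/ prefix tells LiteLLM to read the key from the container's environment:

```yaml
model_list:
  - model_name: gpt-4o            # name clients will request; assumption for this example
    litellm_params:
      model: openai/gpt-4o        # underlying OpenAI model
      api_key: os.environ/OPENAI_API_KEY   # read the key from the environment
```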
To confirm that config.yaml is correctly mounted inside the container, check the logs:
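One way to do this is sketched below, assuming the official LiteLLM image; the container name litellm and the image tag are assumptions, not requirements. The run command mounts the local config as /app/config.yaml, and the logs should show the proxy loading that config and listening on port 4000:

```shell
# Start the proxy with the local config mounted into the container
# (container name and image tag are illustrative choices).
docker run -d --name litellm \
    -v $(pwd)/litellm_config.yaml:/app/config.yaml \
    -e OPENAI_API_KEY=$OPENAI_API_KEY \
    -p 4000:4000 \
    ghcr.io/berriai/litellm:main-latest \
    --config /app/config.yaml

# Inspect the startup logs to verify the config was found and loaded.
docker logs litellm
```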
Initiate Chat
To communicate with LiteLLM, configure the model in config_list and initiate a chat session.
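A minimal sketch of such a session, assuming the AutoGen package (pyautogen) is installed and the LiteLLM proxy from the previous steps is running locally. The model name gpt-4o must match the model_name registered in litellm_config.yaml, and the placeholder API key assumes the proxy has no authentication configured:

```python
from autogen import AssistantAgent, UserProxyAgent

# Point AutoGen at the local LiteLLM proxy instead of OpenAI directly.
config_list = [
    {
        "model": "gpt-4o",                  # must match model_name in litellm_config.yaml
        "base_url": "http://0.0.0.0:4000",  # LiteLLM proxy endpoint
        "api_key": "sk-placeholder",        # placeholder; proxy auth is assumed off
    }
]

assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
user = UserProxyAgent("user", code_execution_config=False)

# Kick off the conversation through the proxy.
user.initiate_chat(assistant, message="Write a haiku about containers.")
```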