`api_type` and `model`. We’ll demonstrate how to use them below.
The community is continuing to enhance and build new client classes as cloud-based inference providers arrive. So, watch this space, and feel free to discuss or develop another one.
`name` field on a message, so be sure to use your agent’s name in their `system_message` and `description` fields, as well as instructing the LLM to ‘act as’ them. This is particularly important for “auto” speaker selection in group chats, as we need to guide the LLM to choose the next agent based on a name, so tweak `select_speaker_message_template`, `select_speaker_prompt_template`, and `select_speaker_auto_multiple_template` with more guidance.

`OAI_CONFIG_LIST`. Ensure you specify the `api_type` to initialize the respective client (Anthropic, Mistral, or Together).
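The speaker-selection templates mentioned above are plain format strings. As a rough illustration of the kind of extra guidance you can add, here is a self-contained sketch that fills a customised template the way a group chat manager might. The `{roles}` and `{agentlist}` placeholders follow AG2's template conventions, but the wording, agent names, and `render_template` helper below are invented for illustration:

```python
# A sketch of a customised speaker-selection message with extra guidance.
# The {roles} and {agentlist} placeholders follow AG2's GroupChat templates;
# this wording is illustrative, not the library default.
select_speaker_message_template = (
    "You are in a role play game. The following roles are available:\n"
    "{roles}\n"
    "Read the following conversation, then select the next role from "
    "{agentlist} to play. Only return the role name."
)

def render_template(template: str, agents: list, descriptions: dict) -> str:
    """Fill the template the way a group chat manager might (toy version)."""
    roles = "\n".join(f"{name}: {descriptions[name]}" for name in agents)
    return template.format(roles=roles, agentlist=str(agents))

agents = ["Clarifier", "Coder", "Reviewer"]
descriptions = {
    "Clarifier": "Asks questions to clarify requirements.",
    "Coder": "Writes the code.",
    "Reviewer": "Reviews code for correctness.",
}
print(render_template(select_speaker_message_template, agents, descriptions))
```

Because the next speaker is chosen purely from these names and descriptions, keeping both distinctive gives the selecting LLM much more to work with.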
[`LLMConfig.from_json`](https://docs.ag2.ai/latest/docs/api-reference/autogen/llm_config/LLMConfig) method loads a list of configurations from an environment variable or a JSON file.
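For illustration, a config list for these three providers might look like the entries below, and a file- or environment-based loader in the spirit of `LLMConfig.from_json` can be sketched with the standard library alone. The model names and the `load_config_list` helper are examples, not the library implementation:

```python
import json
import os
import tempfile

# Illustrative config list: one entry per provider, each with the api_type
# that selects the respective client class. Model names are examples only.
config_list = [
    {"api_type": "anthropic", "model": "claude-3-5-sonnet-20240620", "api_key": "sk-..."},
    {"api_type": "mistral", "model": "mistral-large-latest", "api_key": "..."},
    {"api_type": "together", "model": "mistralai/Mixtral-8x7B-Instruct-v0.1", "api_key": "..."},
]

# A minimal stand-in for loading such a list from an environment variable or
# a JSON file, in the spirit of LLMConfig.from_json (toy version).
def load_config_list(env=None, path=None):
    if env is not None and env in os.environ:
        return json.loads(os.environ[env])
    if path is not None:
        with open(path) as f:
            return json.load(f)
    raise ValueError("Provide an environment variable name or a file path.")

# Round-trip through a temporary file to show file-based loading.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(config_list, f)
loaded = load_config_list(path=f.name)
print([entry["api_type"] for entry in loaded])  # ['anthropic', 'mistral', 'together']
```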
`functionbot` to not reply with more than necessary. Even so, it can’t always help itself!
Let’s start with setting up our configuration and agents.
`user_proxy` for execution and `functionbot` for the LLM to consider using them.
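This division of labour can be sketched in plain Python: the executor agent holds the callables, while the LLM-facing agent only sees their schemas. The `ToySendAgent`, `ToyExecAgent`, and `weather` names below are hypothetical stand-ins for illustration, not AG2 classes:

```python
from typing import Callable

class ToySendAgent:
    """Stands in for functionbot: advertises tool schemas, holds no code."""
    def __init__(self):
        self.tool_schemas = []

class ToyExecAgent:
    """Stands in for user_proxy: actually runs the tools."""
    def __init__(self):
        self.tools = {}
    def execute(self, name: str, **kwargs):
        return self.tools[name](**kwargs)

def register_function(func: Callable, *, caller: ToySendAgent,
                      executor: ToyExecAgent, description: str) -> None:
    """Two-sided registration: implementation to the executor, schema to the caller."""
    executor.tools[func.__name__] = func
    caller.tool_schemas.append({"name": func.__name__, "description": description})

functionbot, user_proxy = ToySendAgent(), ToyExecAgent()

def weather(city: str) -> str:
    return f"Sunny in {city}"  # placeholder implementation

register_function(weather, caller=functionbot, executor=user_proxy,
                  description="Get the current weather for a city.")

print(user_proxy.execute("weather", city="Paris"))  # Sunny in Paris
```

The point of the split is that the LLM only ever sees the tool's name and description; when it decides to call the tool, the executor agent runs the real function.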
`summary_method`. Using `summary_prompt`, we guide Sonnet to give us an email output.
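Conceptually, the summary step replays the transcript to the summarising model with the summary prompt appended as its final instruction. The sketch below illustrates that assembly only; the prompt wording, message contents, and `build_summary_request` helper are invented, not AG2 internals:

```python
# A toy sketch of how a summary prompt shapes the closing reflection step:
# the transcript is replayed to the summarising model together with
# summary_prompt as the final instruction.
summary_prompt = (
    "Summarise the conversation as a short email to the customer, "
    "with a subject line and a sign-off."
)

def build_summary_request(messages: list, prompt: str) -> str:
    """Assemble the transcript plus the final summarising instruction."""
    transcript = "\n".join(f"{m['name']}: {m['content']}" for m in messages)
    return f"{transcript}\n\n{prompt}"

messages = [
    {"name": "user_proxy", "content": "What will the weather be in Paris?"},
    {"name": "functionbot", "content": "Sunny in Paris."},
]
request = build_summary_request(messages, summary_prompt)
print(request.endswith("sign-off."))  # True
```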