Use `build()` to let the build manager (with a `builder_model` as its backbone) complete the group chat agent generation.
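A minimal sketch of how this might look, assuming a config file named `OAI_CONFIG_LIST` and an illustrative building task (the model names and the task text below are assumptions, not requirements):

```python
from autogen.agentchat.contrib.agent_builder import AgentBuilder

# Create the builder; builder_model drives the generation process,
# agent_model is the backbone for the generated agents.
builder = AgentBuilder(
    config_file_or_env="OAI_CONFIG_LIST",  # assumed location of the LLM config list
    builder_model="gpt-4-1106-preview",    # assumed build manager backbone
    agent_model="gpt-4-1106-preview",      # assumed backbone for the generated agents
)

building_task = "Find a recent paper about gpt-4 on arxiv and summarize its potential software applications."
default_llm_config = {"temperature": 0}

# Let the build manager generate the group chat agents for this task.
agent_list, agent_configs = builder.build(building_task, default_llm_config)
```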
If you think coding is necessary for your task, you can use `coding=True` to add a user proxy (a local code interpreter) into the agent list, for example:
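Reusing the names from the sketch above:

```python
# Explicitly ask for a user proxy (local code interpreter) as part of the agent list.
agent_list, agent_configs = builder.build(building_task, default_llm_config, coding=True)
```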
If `coding` is not specified, AgentBuilder will determine on its own whether the user proxy should be added, according to the task.
The generated `agent_list` is a list of `AssistantAgent` instances. If `coding` is true, a user proxy (a `UserProxyAssistant` instance) will be added as the first element of `agent_list`. `agent_configs` is a list of agent configurations including agent name, backbone LLM model, and system message.
For example:
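A hypothetical illustration of what `agent_configs` could contain; the agent names and system messages below are made up for the example task:

```python
# Example content of agent_configs (illustrative only).
[
    {
        "name": "ArXiv_Data_Scraper_Developer",
        "model": "gpt-4-1106-preview",
        "system_message": "You collect recent arXiv papers about GPT-4 and extract their key findings ...",
    },
    {
        "name": "Software_Application_Analyst",
        "model": "gpt-4-1106-preview",
        "system_message": "You analyze the collected papers and summarize potential software applications ...",
    },
]
```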
The agents generated by `build()` can then complete the task collaboratively in a group chat.
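One way to do this is to wrap the generated `agent_list` in a standard AutoGen group chat. This is a sketch under the assumption that `agent_list` and `default_llm_config` come from the build step above; the config file name and `max_round` value are arbitrary choices:

```python
import autogen

def start_task(execution_task: str, agent_list: list, llm_config: dict):
    # Load the model endpoints (assumed config file name).
    config_list = autogen.config_list_from_json("OAI_CONFIG_LIST")

    # Put all generated agents into one group chat managed by a GroupChatManager.
    group_chat = autogen.GroupChat(agents=agent_list, messages=[], max_round=12)
    manager = autogen.GroupChatManager(
        groupchat=group_chat,
        llm_config={"config_list": config_list, **llm_config},
    )

    # The first agent kicks off the conversation with the task description.
    agent_list[0].initiate_chat(manager, message=execution_task)

start_task(
    execution_task="Find a recent paper about gpt-4 on arxiv and summarize its potential software applications.",
    agent_list=agent_list,
    llm_config=default_llm_config,
)
```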
When the task is finished, you can clear all generated agents; if necessary, pass `recycle_endpoint=False` to retain the previous open-source LLM's endpoint server.
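A sketch, assuming `builder` is the AgentBuilder instance from the earlier steps:

```python
# Clear all generated agents; recycle_endpoint=False keeps the open-source LLM
# endpoint server alive so it can be reused for the next build.
builder.clear_all_agents(recycle_endpoint=False)
```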
By default, AgentBuilder saves the built agents' configuration with the generated filename `save_config_TASK_MD5.json`. You can load the saved config and skip the building process: AgentBuilder will create the agents from that information without prompting the build manager.
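A sketch of the save/load round trip (the config file name is an assumption; `builder` is the instance from the earlier steps):

```python
# Save the built agents' configuration; the returned path uses the generated filename.
saved_path = builder.save()

# Later, or in another process: recreate the same agents without prompting the build manager.
new_builder = AgentBuilder(config_file_or_env="OAI_CONFIG_LIST")
agent_list, agent_configs = new_builder.load(saved_path)
```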
You can also build agents backed by the OpenAI Assistants API by adding `use_oai_assistant=True` to `build()`.
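For example, reusing `new_builder`, `building_task`, and `default_llm_config` from above:

```python
# Build agents backed by the OpenAI Assistants API instead of plain chat-completion agents.
agent_list, agent_configs = new_builder.build(
    building_task, default_llm_config, use_oai_assistant=True
)
```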
In summary, we introduced AutoBuild with a new class, `AgentBuilder`.
AutoBuild can help users solve their complex tasks with an automatically built multi-agent system.
AutoBuild supports open-source LLMs and the GPTs API, giving users more flexibility to choose their favorite models.
More advanced features are coming soon.