# Building a Multi-Agent Application with AutoGen and Chainlit
In this sample, we demonstrate how to build a simple chat interface that interacts with an AgentChat agent or a team of agents, using Chainlit, with support for streaming messages.
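
Concretely, the glue between Chainlit and AgentChat is a Chainlit message handler that forwards the user's text to the agent and streams tokens back into the UI. Below is a minimal sketch of that pattern, not the full sample code; it assumes an agent was created in an `on_chat_start` handler and stored in the user session under the (hypothetical) key `"agent"`:

```python
# Minimal sketch of the Chainlit <-> AgentChat glue; see app_agent.py for the full version.
import chainlit as cl

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.messages import ModelClientStreamingChunkEvent


@cl.on_message
async def on_message(message: cl.Message) -> None:
    # The agent is created once per session in on_chat_start (not shown) and reused here.
    agent: AssistantAgent = cl.user_session.get("agent")
    response = cl.Message(content="")
    # run_stream yields streaming chunks and full messages, ending with a TaskResult.
    async for event in agent.run_stream(task=message.content):
        if isinstance(event, ModelClientStreamingChunkEvent):
            await response.stream_token(event.content)
    await response.send()
```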
## Installation
To run this sample, you will need to install the following packages:
```shell
pip install -U chainlit autogen-agentchat "autogen-ext[openai]" pyyaml
```
To use other model providers, you will need to install a different extra for the `autogen-ext` package. See the Models documentation for more information.
## Model Configuration
Create a configuration file named `model_config.yaml` to configure the model you want to use. Use `model_config_template.yaml` as a template.
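
The samples load this file at startup and build a model client from it. Here is a minimal sketch of that pattern, assuming `model_config.yaml` contains a model client component configuration like the one in `model_config_template.yaml`:

```python
# Sketch: build a model client from model_config.yaml (see the samples for the full code).
import yaml

from autogen_core.models import ChatCompletionClient

with open("model_config.yaml", "r") as f:
    model_config = yaml.safe_load(f)

# Instantiate whichever client the config describes (OpenAI, Azure OpenAI, etc.).
model_client = ChatCompletionClient.load_component(model_config)
```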
## Running the Agent Sample
The first sample demonstrates how to interact with a single `AssistantAgent` from the chat interface.
```shell
chainlit run app_agent.py -h
```
You can use one of the starters. For example, ask "What the weather in Seattle?".
The agent will respond by first using the tools provided and then reflecting on the result of the tool execution.
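
For reference, an agent for this kind of sample can be constructed roughly as in the sketch below. The `get_weather` function is a placeholder tool for illustration, `reflect_on_tool_use=True` makes the agent summarize the tool result in natural language, and `model_client_stream=True` enables the token streaming used by the Chainlit handler shown earlier:

```python
# Sketch of an AssistantAgent with a tool; the real sample's tool and prompts may differ.
from autogen_agentchat.agents import AssistantAgent


async def get_weather(city: str) -> str:
    """Placeholder weather tool used for illustration."""
    return f"The weather in {city} is 73 degrees and sunny."


agent = AssistantAgent(
    name="assistant",
    model_client=model_client,  # e.g. loaded from model_config.yaml as shown above
    tools=[get_weather],
    model_client_stream=True,   # stream tokens so the UI can render them incrementally
    reflect_on_tool_use=True,   # answer in natural language after the tool call
)
```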
## Running the Team Sample
The second sample demonstrates how to interact with a team of agents from the chat interface.
```shell
chainlit run app_team.py -h
```
You can use one of the starters. For example, ask "Write a poem about winter.".
The team is a `RoundRobinGroupChat`, so each agent responds in turn. There are two agents in the team: one is instructed to be generally helpful, and the other is instructed to act as a critic and provide feedback. The two agents respond in round-robin fashion until the critic agent mentions 'APPROVE'.
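
The team construction looks roughly like the following sketch: two `AssistantAgent`s in a `RoundRobinGroupChat` with a `TextMentionTermination` condition that stops the run once "APPROVE" appears. The agent names and system messages here are illustrative, not necessarily the sample's exact wording:

```python
# Sketch of the round-robin team with an 'APPROVE'-based termination condition.
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import TextMentionTermination
from autogen_agentchat.teams import RoundRobinGroupChat

writer = AssistantAgent(
    name="writer",
    model_client=model_client,
    system_message="You are a helpful assistant.",
)
critic = AssistantAgent(
    name="critic",
    model_client=model_client,
    system_message="Provide constructive feedback. Reply with 'APPROVE' when the writing is good enough.",
)

# The run ends as soon as any message contains the text 'APPROVE'.
team = RoundRobinGroupChat(
    [writer, critic],
    termination_condition=TextMentionTermination("APPROVE"),
)
```

In the Chainlit handler, the team is driven the same way as the single agent, via `team.run_stream(task=...)`.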
## Running the Team Sample with UserProxyAgent
The third sample demonstrates how to interact with a team of agents that includes a `UserProxyAgent` for approval or rejection.
```shell
chainlit run app_team_user_proxy.py -h
```
You can use one of the starters. For example, ask "Write code to reverse a string.".
By default, the `UserProxyAgent` will request an input action from the user to approve or reject the team's response. When the user approves, the `UserProxyAgent` sends a message containing the text "APPROVE" to the team, and the team stops responding.
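
The approval step works by giving the `UserProxyAgent` an input function that asks the Chainlit user for a reply. Below is a minimal sketch of that wiring, with an illustrative timeout and timeout fallback; the real sample's choices may differ:

```python
# Sketch of routing UserProxyAgent input requests through the Chainlit UI.
import chainlit as cl

from autogen_agentchat.agents import UserProxyAgent
from autogen_core import CancellationToken


async def user_input(prompt: str, cancellation_token: CancellationToken | None = None) -> str:
    """Ask the Chainlit user for input and return their reply to the agent."""
    reply = await cl.AskUserMessage(content=prompt, timeout=300).send()
    if reply is None:
        # Treat a timeout as a rejection; this is an illustrative choice, not the sample's.
        return "REJECT"
    return reply["output"]


user_proxy = UserProxyAgent(name="user", input_func=user_input)
```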
## Next Steps
There are a few ways you can extend this example: