autogen/python/samples/agentchat_streamlit
Eric Zhu 025490a1bd
Use class hierarchy to organize AgentChat message types and introduce StructuredMessage type (#5998)
This PR refactors the `AgentEvent` and `ChatMessage` union types into
abstract base classes. This allows user-defined message types that
subclass one of the base classes to be used in AgentChat.

To support a unified interface for working with messages, the base
classes add abstract methods to:
- Convert content to a string
- Convert content to a `UserMessage` for a model client
- Convert content for rendering in the console
- Dump the message into a dictionary
- Load and create a new instance from a dictionary

This way, all agents, such as `AssistantAgent` and `SocietyOfMindAgent`,
can use the unified interface to work with any built-in or
user-defined message type.
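The unified-interface idea can be sketched with stdlib-only code. Note that the class and method names below (`BaseTextMessage`, `to_text`, `dump`, `load`) are illustrative stand-ins mirroring the list above, not the actual AgentChat API:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass, asdict
from typing import Any, Dict


# Illustrative sketch only: BaseTextMessage mirrors the interface
# described above but is NOT the actual AgentChat base class.
class BaseTextMessage(ABC):
    @abstractmethod
    def to_text(self) -> str:
        """Convert content to a string."""

    @abstractmethod
    def dump(self) -> Dict[str, Any]:
        """Dump the message into a dictionary."""

    @classmethod
    @abstractmethod
    def load(cls, data: Dict[str, Any]) -> "BaseTextMessage":
        """Load and create a new instance from a dictionary."""


# A user-defined message type only needs to implement the abstract
# methods; agents can then treat it like any built-in message.
@dataclass
class GreetingMessage(BaseTextMessage):
    source: str
    greeting: str

    def to_text(self) -> str:
        return f"{self.source}: {self.greeting}"

    def dump(self) -> Dict[str, Any]:
        return asdict(self)

    @classmethod
    def load(cls, data: Dict[str, Any]) -> "GreetingMessage":
        return cls(**data)


msg = GreetingMessage(source="user", greeting="hello")
assert msg.to_text() == "user: hello"
assert GreetingMessage.load(msg.dump()) == msg  # round-trips cleanly
```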

This PR also introduces a new message type, `StructuredMessage`, for
AgentChat (Resolves #5131): a generic type that requires a
user-specified content type.

You can create a `StructuredMessage` as follows:

```python
from typing import List

from pydantic import BaseModel

from autogen_agentchat.messages import StructuredMessage


class MessageType(BaseModel):
    data: str
    references: List[str]


message = StructuredMessage[MessageType](
    content=MessageType(data="data", references=["a", "b"]),
    source="user",
)

# message.content is of type `MessageType`.
```
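The generic mechanics can be illustrated with a stdlib-only stand-in. The `Structured` class below is a hypothetical illustration of how a generic message carries a user-specified content type, not the real `StructuredMessage`:

```python
from dataclasses import dataclass
from typing import Generic, List, TypeVar

T = TypeVar("T")


# Hypothetical stand-in for StructuredMessage: a generic message type
# whose `content` field is the user-specified type T.
@dataclass
class Structured(Generic[T]):
    content: T
    source: str


@dataclass
class MessageType:
    data: str
    references: List[str]


m = Structured[MessageType](
    content=MessageType(data="data", references=["a", "b"]),
    source="user",
)
assert m.content.data == "data"  # content is a MessageType instance
```

Because the content type is part of the message's type, static type checkers can verify accesses like `m.content.data` rather than treating content as an untyped blob.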

This PR addresses the receiving side of this message type. Producing
this message type from `AssistantAgent` continues in #5934.

Added unit tests to verify this message type works with agents and
teams.
2025-03-26 16:19:52 -07:00
.gitignore adding streamlit sample app (#5306) 2025-01-31 12:19:16 -08:00
README.md adding streamlit sample app (#5306) 2025-01-31 12:19:16 -08:00
agent.py Use class hierarchy to organize AgentChat message types and introduce StructuredMessage type (#5998) 2025-03-26 16:19:52 -07:00
main.py fix: type issues in streamlit sample and add streamlit to dev dependencies (#5309) 2025-01-31 14:06:59 -08:00

README.md

Streamlit AgentChat Sample Application

This is a sample AI chat assistant built with Streamlit.

Setup

Install the streamlit package with the following command:

pip install streamlit

To use Azure OpenAI models or models hosted on OpenAI-compatible API endpoints, you need to install the autogen-ext[openai,azure] package. You can install it with the following command:

pip install "autogen-ext[openai,azure]"
# pip install "autogen-ext[openai]" for OpenAI models

Create a new file named model_config.yml in the same directory as the script to configure the model you want to use.

For example, to use the gpt-4o-mini model from Azure OpenAI, you can use the following configuration:

provider: autogen_ext.models.openai.AzureOpenAIChatCompletionClient
config:
  azure_deployment: "gpt-4o-mini"
  model: gpt-4o-mini
  api_version: REPLACE_WITH_MODEL_API_VERSION
  azure_endpoint: REPLACE_WITH_MODEL_ENDPOINT
  api_key: REPLACE_WITH_MODEL_API_KEY
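Similarly, for models hosted directly on OpenAI, a configuration using the `OpenAIChatCompletionClient` provider might look like the following (the model name and key placeholder here are illustrative):

```yaml
provider: autogen_ext.models.openai.OpenAIChatCompletionClient
config:
  model: gpt-4o-mini
  api_key: REPLACE_WITH_YOUR_OPENAI_API_KEY
```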

For more information on how to configure the model and use other providers, please refer to the Models documentation.

Run

Run the following command to start the web application:

streamlit run main.py