mirror of https://github.com/microsoft/autogen.git
cleanup
This commit is contained in: parent 38639fcf12, commit e549fc4f80
@ -37,6 +37,9 @@ user_proxy.initiate_chat(assistant, message="Plot a chart of META and TESLA stoc
The figure below shows an example conversation flow with AutoGen.
* [Code examples](/docs/Examples/AutoGen-AgentChat).
* [Documentation](/docs/Use-Cases/agent_chat).
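
For orientation, a minimal two-agent setup that produces a conversation flow like the one described above might look like the sketch below. It combines AutoGen's `AssistantAgent` and `UserProxyAgent` with the `initiate_chat` call shown in the hunk above; the `OAI_CONFIG_LIST` file name, the `coding` working directory, and the example message are illustrative assumptions rather than part of this page.

```python
import autogen

# Load model endpoints/keys; "OAI_CONFIG_LIST" is an assumed local config file.
config_list = autogen.config_list_from_json("OAI_CONFIG_LIST")

# The assistant agent drafts replies and code with the LLM.
assistant = autogen.AssistantAgent("assistant", llm_config={"config_list": config_list})

# The user proxy agent runs the generated code and feeds results back.
user_proxy = autogen.UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",
    code_execution_config={"work_dir": "coding"},  # assumed working directory
)

# Kick off the automated back-and-forth between the two agents.
user_proxy.initiate_chat(assistant, message="Plot a chart of META and TESLA stock prices.")
```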
#### Enhanced LLM Inferences
AutoGen also helps maximize the utility of expensive LLMs such as ChatGPT and GPT-4. It offers a drop-in replacement for `openai.Completion` and `openai.ChatCompletion`, adding powerful functionalities such as tuning, caching, error handling, and templating. For example, you can optimize LLM generations against your own tuning data, success metrics, and budgets.
```python
@ -54,11 +57,13 @@ config, analysis = autogen.Completion.tune(
# perform inference for a test instance with the tuned config
response = autogen.Completion.create(context=test_instance, **config)
```
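
For context, the `tune()` call that produces `config` in the snippet above might look roughly like the following sketch. The toy dataset, the evaluation function, and the parameter names (`data`, `metric`, `mode`, `eval_func`, `prompt`, `inference_budget`, `optimization_budget`, `num_samples`) are illustrative assumptions and may not match every version of the API.

```python
import autogen

# A toy tuning set; in practice this would be real task instances.
tune_data = [
    {"problem": "What is 2 + 2?", "solution": "4"},
    {"problem": "What is 7 * 6?", "solution": "42"},
]

def eval_func(responses, **instance):
    # Hypothetical success metric: does any response contain the reference solution?
    return {"success": any(instance["solution"] in r for r in responses)}

# Search over models, prompts, and inference hyperparameters under cost budgets.
config, analysis = autogen.Completion.tune(
    data=tune_data,
    metric="success",
    mode="max",
    eval_func=eval_func,
    prompt="{problem}",        # prompt template; {problem} is filled from each instance
    inference_budget=0.05,     # rough cap on average cost per instance during evaluation
    optimization_budget=1,     # rough total budget for the search
    num_samples=-1,            # let the budget decide how many configurations to try
)

# Reuse the tuned configuration for a new instance, as in the snippet above.
test_instance = {"problem": "What is 3 + 5?"}
response = autogen.Completion.create(context=test_instance, **config)
```

The tuned `config` carries the chosen model, prompt, and inference settings, which is why it can be unpacked directly into `Completion.create`.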
* [Code examples](/docs/Examples/AutoGen-Inference).
* [Documentation](/docs/Use-Cases/enhanced_inference).
### Where to Go Next?
* Understand the use cases for [multi-agent conversation](/docs/Use-Cases/agent_chat) and [enhanced LLM inference](/docs/Use-Cases/enhanced_inference).
* Find [code examples](/docs/Examples/AutoGen-AgentChat).
* Read [SDK](/docs/reference/agentchat/conversable_agent/).
* Learn about [research](/docs/Research) around AutoGen.
* Chat on [Discord](https://discord.gg/pAbnFJrkgZ).