From 7dad875f43d03b39bdd4051de83f9812d71be47c Mon Sep 17 00:00:00 2001
From: Yogesh Haribhau Kulkarni
Date: Wed, 25 Oct 2023 23:00:05 +0530
Subject: [PATCH] Added LM Studio way of serving open-source models (#377)

* Added LM Studio way of serving open-source models

  Works on Windows too. The currently suggested way, 'modelz', works on UNIX only.

* Update open_source_language_model_example.ipynb
---
 notebook/open_source_language_model_example.ipynb | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/notebook/open_source_language_model_example.ipynb b/notebook/open_source_language_model_example.ipynb
index d5d010e4f..45d96bc1b 100644
--- a/notebook/open_source_language_model_example.ipynb
+++ b/notebook/open_source_language_model_example.ipynb
@@ -10,7 +10,9 @@
     "AutoGen is compatible with the OpenAI API library in Python for executing language models. Consequently, it can work with any models employing a similar API without the necessity to modify your AutoGen code.\n",
     "\n",
     "\n",
-    "In this guide, we will utilize the [modelz-llm](https://github.com/tensorchord/modelz-llm) package to illustrate how to locally serve a model and integrate AutoGen with the served model.\n"
+    "In this guide, we will utilize the [modelz-llm](https://github.com/tensorchord/modelz-llm) package to illustrate how to locally serve a model and integrate AutoGen with the served model.\n",
+    "There are in fact multiple ways to serve a local model behind an OpenAI-compatible API. At the time of writing, [modelz-llm](https://github.com/tensorchord/modelz-llm) works on UNIX only.\n",
+    "On Windows, [LM Studio](https://lmstudio.ai/) can be used instead. It supports downloading, inspecting, and serving local open-source models. More details on how to use it are [here](https://medium.com/p/97cba96b0f75).\n"
   ]
 },
 {
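As a reference for the change above: once LM Studio's local server is running, connecting AutoGen to it is only a matter of pointing the OpenAI-compatible configuration at the local endpoint. A minimal sketch, assuming LM Studio's default endpoint `http://localhost:1234/v1` and the AutoGen/openai configuration keys in use when this PR was made (`api_base`, `api_key`); the model name is a placeholder for whatever model is loaded in LM Studio:

```python
# Minimal sketch: an AutoGen config list pointing at a locally served,
# OpenAI-API-compatible model (here assumed to be LM Studio's default server).
config_list = [
    {
        "model": "local-model",  # placeholder; set to the model loaded in LM Studio
        "api_base": "http://localhost:1234/v1",  # LM Studio's default local endpoint
        "api_key": "NULL",  # dummy value; no real key is needed for a local server
    }
]

# With autogen installed and the server running, this config is passed to
# agents in the usual way, e.g.:
# assistant = autogen.AssistantAgent("assistant",
#                                    llm_config={"config_list": config_list})
```

The same configuration shape works for any local server that mimics the OpenAI API, including modelz-llm on UNIX; only the `api_base` URL changes.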