mirror of https://github.com/microsoft/autogen.git
Add ChromaDBVectorMemory in Extensions (#5308)
## Why are these changes needed?

Shows an example of how to use the `Memory` interface to implement a just-in-time vector memory based on chromadb.

```python
import os
from pathlib import Path

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.ui import Console
from autogen_core.memory import MemoryContent, MemoryMimeType
from autogen_ext.memory.chromadb import ChromaDBVectorMemory, PersistentChromaDBVectorMemoryConfig
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def get_weather(city: str, units: str = "metric") -> str:
    # Toy tool used in this example; the tutorial notebook defines a similar helper.
    return f"The weather in {city} is 23 °C and Sunny."


# Initialize ChromaDB memory with custom config
chroma_user_memory = ChromaDBVectorMemory(
    config=PersistentChromaDBVectorMemoryConfig(
        collection_name="preferences",
        persistence_path=os.path.join(str(Path.home()), ".chromadb_autogen"),
        k=2,  # Return top k results
        score_threshold=0.4,  # Minimum similarity score
    )
)
# An HttpChromaDBVectorMemoryConfig is also supported for connecting to a remote ChromaDB server
# (see the sketch after the checklist below).

# Add user preferences to memory
await chroma_user_memory.add(
    MemoryContent(
        content="The weather should be in metric units",
        mime_type=MemoryMimeType.TEXT,
        metadata={"category": "preferences", "type": "units"},
    )
)

await chroma_user_memory.add(
    MemoryContent(
        content="Meal recipe must be vegan",
        mime_type=MemoryMimeType.TEXT,
        metadata={"category": "preferences", "type": "dietary"},
    )
)

# Create assistant agent with ChromaDB memory
assistant_agent = AssistantAgent(
    name="assistant_agent",
    model_client=OpenAIChatCompletionClient(
        model="gpt-4o",
    ),
    tools=[get_weather],
    memory=[chroma_user_memory],
)

stream = assistant_agent.run_stream(task="What is the weather in New York?")
await Console(stream)

await chroma_user_memory.close()
```

```txt
---------- user ----------
What is the weather in New York?
---------- assistant_agent ----------
[MemoryContent(content='The weather should be in metric units', mime_type='MemoryMimeType.TEXT', metadata={'category': 'preferences', 'mime_type': 'MemoryMimeType.TEXT', 'type': 'units', 'score': 0.4342913043162201, 'id': '8a8d683c-5866-41e1-ac17-08c4fda6da86'}), MemoryContent(content='The weather should be in metric units', mime_type='MemoryMimeType.TEXT', metadata={'category': 'preferences', 'mime_type': 'MemoryMimeType.TEXT', 'type': 'units', 'score': 0.4342913043162201, 'id': 'f27af42c-cb63-46f0-b26b-ffcc09955ca1'})]
---------- assistant_agent ----------
[FunctionCall(id='call_a8U3YEj2dxA065vyzdfXDtNf', arguments='{"city":"New York","units":"metric"}', name='get_weather')]
---------- assistant_agent ----------
[FunctionExecutionResult(content='The weather in New York is 23 °C and Sunny.', call_id='call_a8U3YEj2dxA065vyzdfXDtNf', is_error=False)]
---------- assistant_agent ----------
The weather in New York is 23 °C and Sunny.
```

Note that the `MemoryContent` objects in the `MemoryQueryEvent` carry useful metadata, such as the similarity score and the id of each retrieved memory.
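The memory store can also be queried directly (e.g. before the `close()` call above) to inspect these scores. A minimal sketch; the query string is illustrative:

```python
# Query the store directly; the k and score_threshold from the config above still apply.
results = await chroma_user_memory.query("What units should the weather be reported in?")
for item in results.results:
    print(item.content, (item.metadata or {}).get("score"), (item.metadata or {}).get("id"))
```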
## Related issue number

## Checks

- [ ] I've included any doc changes needed for https://microsoft.github.io/autogen/. See https://microsoft.github.io/autogen/docs/Contribute#documentation to build and test documentation locally.
- [ ] I've added tests (if relevant) corresponding to the changes introduced in this PR.
- [ ] I've made sure all auto checks have passed.
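For reference, the remote-server option mentioned in the code comment above would use `HttpChromaDBVectorMemoryConfig`. A minimal sketch, assuming the config class is exported from `autogen_ext.memory.chromadb` alongside the others and that a ChromaDB server is reachable at the given host and port (both illustrative):

```python
from autogen_ext.memory.chromadb import ChromaDBVectorMemory, HttpChromaDBVectorMemoryConfig

# Connect to a remote ChromaDB server instead of a local persistent store.
remote_memory = ChromaDBVectorMemory(
    config=HttpChromaDBVectorMemoryConfig(
        collection_name="preferences",  # illustrative collection name
        host="localhost",  # assumed server host
        port=8000,  # assumed server port
        ssl=False,
        k=2,
        score_threshold=0.4,
    )
)
```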
This commit is contained in:
parent 7bbf8e075d
commit 78ef148c88
|
@ -4,29 +4,29 @@
|
|||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Memory\n",
|
||||
"## Memory \n",
|
||||
"\n",
|
||||
"There are several use cases where it is valuable to maintain a _store_ of useful facts that can be intelligently added to the context of the agent just before a specific step. The typically use case here is a RAG pattern where a query is used to retrieve relevant information from a database that is then added to the agent's context.\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"AgentChat provides a {py:class}`~autogen_core.memory.Memory` protocol that can be extended to provide this functionality. The key methods are `query`, `update_context`, `add`, `clear`, and `close`.\n",
|
||||
"AgentChat provides a {py:class}`~autogen_core.memory.Memory` protocol that can be extended to provide this functionality. The key methods are `query`, `update_context`, `add`, `clear`, and `close`. \n",
|
||||
"\n",
|
||||
"- `add`: add new entries to the memory store\n",
|
||||
"- `query`: retrieve relevant information from the memory store\n",
|
||||
"- `update_context`: mutate an agent's internal `model_context` by adding the retrieved information (used in the {py:class}`~autogen_agentchat.agents.AssistantAgent` class)\n",
|
||||
"- `query`: retrieve relevant information from the memory store \n",
|
||||
"- `update_context`: mutate an agent's internal `model_context` by adding the retrieved information (used in the {py:class}`~autogen_agentchat.agents.AssistantAgent` class) \n",
|
||||
"- `clear`: clear all entries from the memory store\n",
|
||||
"- `close`: clean up any resources used by the memory store\n",
|
||||
"- `close`: clean up any resources used by the memory store \n",
|
||||
"\n",
|
||||
"\n",
|
||||
"## ListMemory Example\n",
|
||||
"\n",
|
||||
"{py:class}`~autogen_core.memory.ListMemory` is provided as an example implementation of the {py:class}`~autogen_core.memory.Memory protocol`. It is a simple list-based memory implementation that maintains memories in chronological order, appending the most recent memories to the model's context. The implementation is designed to be straightforward and predictable, making it easy to understand and debug.\n",
|
||||
"{py:class}~autogen_core.memory.ListMemory is provided as an example implementation of the {py:class}~autogen_core.memory.Memory protocol. It is a simple list-based memory implementation that maintains memories in chronological order, appending the most recent memories to the model's context. The implementation is designed to be straightforward and predictable, making it easy to understand and debug.\n",
|
||||
"In the following example, we will use ListMemory to maintain a memory bank of user preferences and demonstrate how it can be used to provide consistent context for agent responses over time."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 1,
|
||||
"execution_count": 2,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
|
@ -38,7 +38,7 @@
|
|||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 2,
|
||||
"execution_count": 3,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
|
@ -72,16 +72,32 @@
|
|||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 3,
|
||||
"execution_count": 4,
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"---------- user ----------\n",
|
||||
"What is the weather in New York?\n",
|
||||
"---------- assistant_agent ----------\n",
|
||||
"[MemoryContent(content='The weather should be in metric units', mime_type=<MemoryMimeType.TEXT: 'text/plain'>, metadata=None), MemoryContent(content='Meal recipe must be vegan', mime_type=<MemoryMimeType.TEXT: 'text/plain'>, metadata=None)]\n",
|
||||
"---------- assistant_agent ----------\n",
|
||||
"[FunctionCall(id='call_GpimUGED5zUbfxORaGo2JD6F', arguments='{\"city\":\"New York\",\"units\":\"metric\"}', name='get_weather')]\n",
|
||||
"---------- assistant_agent ----------\n",
|
||||
"[FunctionExecutionResult(content='The weather in New York is 23 °C and Sunny.', call_id='call_GpimUGED5zUbfxORaGo2JD6F', is_error=False)]\n",
|
||||
"---------- assistant_agent ----------\n",
|
||||
"The weather in New York is 23 °C and Sunny.\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"TaskResult(messages=[TextMessage(source='user', models_usage=None, content='What is the weather in New York?', type='TextMessage'), MemoryQueryEvent(source='assistant_agent', models_usage=None, content=[MemoryContent(content='The weather should be in metric units', mime_type=<MemoryMimeType.TEXT: 'text/plain'>, metadata=None, timestamp=None, source=None, score=None), MemoryContent(content='Meal recipe must be vegan', mime_type=<MemoryMimeType.TEXT: 'text/plain'>, metadata=None, timestamp=None, source=None, score=None)], type='MemoryQueryEvent'), ToolCallRequestEvent(source='assistant_agent', models_usage=RequestUsage(prompt_tokens=123, completion_tokens=20), content=[FunctionCall(id='call_pHq4p89gW6oGjGr3VsVETCYX', arguments='{\"city\":\"New York\",\"units\":\"metric\"}', name='get_weather')], type='ToolCallRequestEvent'), ToolCallExecutionEvent(source='assistant_agent', models_usage=None, content=[FunctionExecutionResult(content='The weather in New York is 23 °C and Sunny.', call_id='call_pHq4p89gW6oGjGr3VsVETCYX')], type='ToolCallExecutionEvent'), ToolCallSummaryMessage(source='assistant_agent', models_usage=None, content='The weather in New York is 23 °C and Sunny.', type='ToolCallSummaryMessage')], stop_reason=None)"
|
||||
"TaskResult(messages=[TextMessage(source='user', models_usage=None, metadata={}, content='What is the weather in New York?', type='TextMessage'), MemoryQueryEvent(source='assistant_agent', models_usage=None, metadata={}, content=[MemoryContent(content='The weather should be in metric units', mime_type=<MemoryMimeType.TEXT: 'text/plain'>, metadata=None), MemoryContent(content='Meal recipe must be vegan', mime_type=<MemoryMimeType.TEXT: 'text/plain'>, metadata=None)], type='MemoryQueryEvent'), ToolCallRequestEvent(source='assistant_agent', models_usage=RequestUsage(prompt_tokens=123, completion_tokens=20), metadata={}, content=[FunctionCall(id='call_GpimUGED5zUbfxORaGo2JD6F', arguments='{\"city\":\"New York\",\"units\":\"metric\"}', name='get_weather')], type='ToolCallRequestEvent'), ToolCallExecutionEvent(source='assistant_agent', models_usage=None, metadata={}, content=[FunctionExecutionResult(content='The weather in New York is 23 °C and Sunny.', call_id='call_GpimUGED5zUbfxORaGo2JD6F', is_error=False)], type='ToolCallExecutionEvent'), ToolCallSummaryMessage(source='assistant_agent', models_usage=None, metadata={}, content='The weather in New York is 23 °C and Sunny.', type='ToolCallSummaryMessage')], stop_reason=None)"
|
||||
]
|
||||
},
|
||||
"execution_count": 3,
|
||||
"execution_count": 4,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
}
|
||||
|
@ -101,7 +117,7 @@
|
|||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 4,
|
||||
"execution_count": 5,
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
|
@ -109,11 +125,11 @@
|
|||
"text/plain": [
|
||||
"[UserMessage(content='What is the weather in New York?', source='user', type='UserMessage'),\n",
|
||||
" SystemMessage(content='\\nRelevant memory content (in chronological order):\\n1. The weather should be in metric units\\n2. Meal recipe must be vegan\\n', type='SystemMessage'),\n",
|
||||
" AssistantMessage(content=[FunctionCall(id='call_pHq4p89gW6oGjGr3VsVETCYX', arguments='{\"city\":\"New York\",\"units\":\"metric\"}', name='get_weather')], source='assistant_agent', type='AssistantMessage'),\n",
|
||||
" FunctionExecutionResultMessage(content=[FunctionExecutionResult(content='The weather in New York is 23 °C and Sunny.', call_id='call_pHq4p89gW6oGjGr3VsVETCYX')], type='FunctionExecutionResultMessage')]"
|
||||
" AssistantMessage(content=[FunctionCall(id='call_GpimUGED5zUbfxORaGo2JD6F', arguments='{\"city\":\"New York\",\"units\":\"metric\"}', name='get_weather')], thought=None, source='assistant_agent', type='AssistantMessage'),\n",
|
||||
" FunctionExecutionResultMessage(content=[FunctionExecutionResult(content='The weather in New York is 23 °C and Sunny.', call_id='call_GpimUGED5zUbfxORaGo2JD6F', is_error=False)], type='FunctionExecutionResultMessage')]"
|
||||
]
|
||||
},
|
||||
"execution_count": 4,
|
||||
"execution_count": 5,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
}
|
||||
|
@ -128,21 +144,65 @@
|
|||
"source": [
|
||||
"We see above that the weather is returned in Centigrade as stated in the user preferences. \n",
|
||||
"\n",
|
||||
"Similarly, assuming we ask a separate question about generating a meal plan, the agent is able to retrieve relevant information from the memory store and provide a personalized response."
|
||||
"Similarly, assuming we ask a separate question about generating a meal plan, the agent is able to retrieve relevant information from the memory store and provide a personalized (vegan) response."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 5,
|
||||
"execution_count": 6,
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"---------- user ----------\n",
|
||||
"Write brief meal recipe with broth\n",
|
||||
"---------- assistant_agent ----------\n",
|
||||
"[MemoryContent(content='The weather should be in metric units', mime_type=<MemoryMimeType.TEXT: 'text/plain'>, metadata=None), MemoryContent(content='Meal recipe must be vegan', mime_type=<MemoryMimeType.TEXT: 'text/plain'>, metadata=None)]\n",
|
||||
"---------- assistant_agent ----------\n",
|
||||
"Here's a brief vegan broth recipe:\n",
|
||||
"\n",
|
||||
"**Vegan Vegetable Broth**\n",
|
||||
"\n",
|
||||
"**Ingredients:**\n",
|
||||
"- 2 tablespoons olive oil\n",
|
||||
"- 1 large onion, chopped\n",
|
||||
"- 2 carrots, sliced\n",
|
||||
"- 2 celery stalks, sliced\n",
|
||||
"- 4 cloves garlic, minced\n",
|
||||
"- 1 teaspoon salt\n",
|
||||
"- 1/2 teaspoon pepper\n",
|
||||
"- 1 bay leaf\n",
|
||||
"- 1 teaspoon thyme\n",
|
||||
"- 1 teaspoon rosemary\n",
|
||||
"- 8 cups water\n",
|
||||
"- 1 cup mushrooms, sliced\n",
|
||||
"- 1 cup chopped leafy greens (e.g., kale, spinach)\n",
|
||||
"- 1 tablespoon soy sauce (optional)\n",
|
||||
"\n",
|
||||
"**Instructions:**\n",
|
||||
"\n",
|
||||
"1. **Sauté Vegetables:** In a large pot, heat olive oil over medium heat. Add the onion, carrots, and celery, and sauté until they begin to soften, about 5-7 minutes.\n",
|
||||
"\n",
|
||||
"2. **Add Garlic & Seasonings:** Stir in the garlic, salt, pepper, bay leaf, thyme, and rosemary. Cook for another 2 minutes until fragrant.\n",
|
||||
"\n",
|
||||
"3. **Simmer Broth:** Pour in the water, add mushrooms and soy sauce (if using). Increase heat and bring to a boil. Reduce heat and let it simmer for 30-45 minutes.\n",
|
||||
"\n",
|
||||
"4. **Add Greens:** In the last 5 minutes of cooking, add the chopped leafy greens.\n",
|
||||
"\n",
|
||||
"5. **Strain & Serve:** Remove from heat, strain out the vegetables (or leave them in for a chunkier texture), and adjust seasoning if needed. Serve hot as a base for soups or enjoy as is!\n",
|
||||
"\n",
|
||||
"Enjoy your flavorful, nourishing vegan broth! TERMINATE\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"TaskResult(messages=[TextMessage(source='user', models_usage=None, content='Write brief meal recipe with broth', type='TextMessage'), MemoryQueryEvent(source='assistant_agent', models_usage=None, content=[MemoryContent(content='The weather should be in metric units', mime_type=<MemoryMimeType.TEXT: 'text/plain'>, metadata=None, timestamp=None, source=None, score=None), MemoryContent(content='Meal recipe must be vegan', mime_type=<MemoryMimeType.TEXT: 'text/plain'>, metadata=None, timestamp=None, source=None, score=None)], type='MemoryQueryEvent'), TextMessage(source='assistant_agent', models_usage=RequestUsage(prompt_tokens=208, completion_tokens=253), content=\"Here's a brief vegan meal recipe using broth:\\n\\n**Vegan Mushroom & Herb Broth Soup**\\n\\n**Ingredients:**\\n- 1 tablespoon olive oil\\n- 1 onion, diced\\n- 2 cloves garlic, minced\\n- 250g mushrooms, sliced\\n- 1 carrot, diced\\n- 1 celery stalk, diced\\n- 4 cups vegetable broth\\n- 1 teaspoon thyme\\n- 1 teaspoon rosemary\\n- Salt and pepper to taste\\n- Fresh parsley for garnish\\n\\n**Instructions:**\\n1. Heat the olive oil in a large pot over medium heat. Add the diced onion and garlic, and sauté until the onion becomes translucent.\\n\\n2. Add the sliced mushrooms, carrot, and celery. Continue to sauté until the mushrooms are cooked through and the vegetables begin to soften, about 5 minutes.\\n\\n3. Pour in the vegetable broth. Stir in the thyme and rosemary, and bring the mixture to a boil.\\n\\n4. Reduce the heat to low and let the soup simmer for about 15 minutes, allowing the flavors to meld together.\\n\\n5. Season with salt and pepper to taste.\\n\\n6. Serve hot, garnished with fresh parsley.\\n\\nEnjoy your warm and comforting vegan mushroom & herb broth soup! \\n\\nTERMINATE\", type='TextMessage')], stop_reason=None)"
|
||||
"TaskResult(messages=[TextMessage(source='user', models_usage=None, metadata={}, content='Write brief meal recipe with broth', type='TextMessage'), MemoryQueryEvent(source='assistant_agent', models_usage=None, metadata={}, content=[MemoryContent(content='The weather should be in metric units', mime_type=<MemoryMimeType.TEXT: 'text/plain'>, metadata=None), MemoryContent(content='Meal recipe must be vegan', mime_type=<MemoryMimeType.TEXT: 'text/plain'>, metadata=None)], type='MemoryQueryEvent'), TextMessage(source='assistant_agent', models_usage=RequestUsage(prompt_tokens=208, completion_tokens=331), metadata={}, content=\"Here's a brief vegan broth recipe:\\n\\n**Vegan Vegetable Broth**\\n\\n**Ingredients:**\\n- 2 tablespoons olive oil\\n- 1 large onion, chopped\\n- 2 carrots, sliced\\n- 2 celery stalks, sliced\\n- 4 cloves garlic, minced\\n- 1 teaspoon salt\\n- 1/2 teaspoon pepper\\n- 1 bay leaf\\n- 1 teaspoon thyme\\n- 1 teaspoon rosemary\\n- 8 cups water\\n- 1 cup mushrooms, sliced\\n- 1 cup chopped leafy greens (e.g., kale, spinach)\\n- 1 tablespoon soy sauce (optional)\\n\\n**Instructions:**\\n\\n1. **Sauté Vegetables:** In a large pot, heat olive oil over medium heat. Add the onion, carrots, and celery, and sauté until they begin to soften, about 5-7 minutes.\\n\\n2. **Add Garlic & Seasonings:** Stir in the garlic, salt, pepper, bay leaf, thyme, and rosemary. Cook for another 2 minutes until fragrant.\\n\\n3. **Simmer Broth:** Pour in the water, add mushrooms and soy sauce (if using). Increase heat and bring to a boil. Reduce heat and let it simmer for 30-45 minutes.\\n\\n4. **Add Greens:** In the last 5 minutes of cooking, add the chopped leafy greens.\\n\\n5. **Strain & Serve:** Remove from heat, strain out the vegetables (or leave them in for a chunkier texture), and adjust seasoning if needed. Serve hot as a base for soups or enjoy as is!\\n\\nEnjoy your flavorful, nourishing vegan broth! TERMINATE\", type='TextMessage')], stop_reason=None)"
|
||||
]
|
||||
},
|
||||
"execution_count": 5,
|
||||
"execution_count": 6,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
}
|
||||
|
@ -160,8 +220,124 @@
|
|||
"\n",
|
||||
"You can build on the `Memory` protocol to implement more complex memory stores. For example, you could implement a custom memory store that uses a vector database to store and retrieve information, or a memory store that uses a machine learning model to generate personalized responses based on the user's preferences etc.\n",
|
||||
"\n",
|
||||
"Specifically, you will need to overload the `add`, `query` and `update_context` methods to implement the desired functionality and pass the memory store to your agent.\n"
|
||||
"Specifically, you will need to overload the `add`, `query` and `update_context` methods to implement the desired functionality and pass the memory store to your agent.\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"Currently the following example memory stores are available as part of the {py:class}`~autogen_ext` extensions package. \n",
|
||||
"\n",
|
||||
"- `autogen_ext.memory.chromadb.ChromaDBVectorMemory`: A memory store that uses a vector database to store and retrieve information. "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"name": "stdout",
|
||||
"output_type": "stream",
|
||||
"text": [
|
||||
"---------- user ----------\n",
|
||||
"What is the weather in New York?\n",
|
||||
"---------- assistant_agent ----------\n",
|
||||
"[MemoryContent(content='The weather should be in metric units', mime_type=<MemoryMimeType.TEXT: 'text/plain'>, metadata=None), MemoryContent(content='Meal recipe must be vegan', mime_type=<MemoryMimeType.TEXT: 'text/plain'>, metadata=None)]\n",
|
||||
"---------- assistant_agent ----------\n",
|
||||
"[FunctionCall(id='call_PKcfeEHXimGG2QOhJwXzCLuZ', arguments='{\"city\":\"New York\",\"units\":\"metric\"}', name='get_weather')]\n",
|
||||
"---------- assistant_agent ----------\n",
|
||||
"[FunctionExecutionResult(content='The weather in New York is 23 °C and Sunny.', call_id='call_PKcfeEHXimGG2QOhJwXzCLuZ', is_error=False)]\n",
|
||||
"---------- assistant_agent ----------\n",
|
||||
"The weather in New York is 23 °C and Sunny.\n"
|
||||
]
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"import os\n",
|
||||
"from pathlib import Path\n",
|
||||
"\n",
|
||||
"from autogen_agentchat.agents import AssistantAgent\n",
|
||||
"from autogen_agentchat.ui import Console\n",
|
||||
"from autogen_core.memory import MemoryContent, MemoryMimeType\n",
|
||||
"from autogen_ext.memory.chromadb import ChromaDBVectorMemory, PersistentChromaDBVectorMemoryConfig\n",
|
||||
"from autogen_ext.models.openai import OpenAIChatCompletionClient\n",
|
||||
"\n",
|
||||
"# Initialize ChromaDB memory with custom config\n",
|
||||
"chroma_user_memory = ChromaDBVectorMemory(\n",
|
||||
" config=PersistentChromaDBVectorMemoryConfig(\n",
|
||||
" collection_name=\"preferences\",\n",
|
||||
" persistence_path=os.path.join(str(Path.home()), \".chromadb_autogen\"),\n",
|
||||
" k=2, # Return top k results\n",
|
||||
" score_threshold=0.4, # Minimum similarity score\n",
|
||||
" )\n",
|
||||
")\n",
|
||||
"# a HttpChromaDBVectorMemoryConfig is also supported for connecting to a remote ChromaDB server\n",
|
||||
"\n",
|
||||
"# Add user preferences to memory\n",
|
||||
"await chroma_user_memory.add(\n",
|
||||
" MemoryContent(\n",
|
||||
" content=\"The weather should be in metric units\",\n",
|
||||
" mime_type=MemoryMimeType.TEXT,\n",
|
||||
" metadata={\"category\": \"preferences\", \"type\": \"units\"},\n",
|
||||
" )\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"await chroma_user_memory.add(\n",
|
||||
" MemoryContent(\n",
|
||||
" content=\"Meal recipe must be vegan\",\n",
|
||||
" mime_type=MemoryMimeType.TEXT,\n",
|
||||
" metadata={\"category\": \"preferences\", \"type\": \"dietary\"},\n",
|
||||
" )\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"# Create assistant agent with ChromaDB memory\n",
|
||||
"assistant_agent = AssistantAgent(\n",
|
||||
" name=\"assistant_agent\",\n",
|
||||
" model_client=OpenAIChatCompletionClient(\n",
|
||||
" model=\"gpt-4o\",\n",
|
||||
" ),\n",
|
||||
" tools=[get_weather],\n",
|
||||
" memory=[user_memory],\n",
|
||||
")\n",
|
||||
"\n",
|
||||
"stream = assistant_agent.run_stream(task=\"What is the weather in New York?\")\n",
|
||||
"await Console(stream)\n",
|
||||
"\n",
|
||||
"await user_memory.close()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Note that you can also serialize the ChromaDBVectorMemory and save it to disk."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": 5,
|
||||
"metadata": {},
|
||||
"outputs": [
|
||||
{
|
||||
"data": {
|
||||
"text/plain": [
|
||||
"'{\"provider\":\"autogen_ext.memory.chromadb.ChromaDBVectorMemory\",\"component_type\":\"memory\",\"version\":1,\"component_version\":1,\"description\":\"ChromaDB-based vector memory implementation with similarity search.\",\"label\":\"ChromaDBVectorMemory\",\"config\":{\"client_type\":\"persistent\",\"collection_name\":\"preferences\",\"distance_metric\":\"cosine\",\"k\":2,\"score_threshold\":0.4,\"allow_reset\":false,\"tenant\":\"default_tenant\",\"database\":\"default_database\",\"persistence_path\":\"/Users/victordibia/.chromadb_autogen\"}}'"
|
||||
]
|
||||
},
|
||||
"execution_count": 5,
|
||||
"metadata": {},
|
||||
"output_type": "execute_result"
|
||||
}
|
||||
],
|
||||
"source": [
|
||||
"chroma_user_memory.dump_component().model_dump_json()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
|
|
|
@ -33,6 +33,7 @@ file-surfer = [
"markitdown>=0.0.1a2",
]
graphrag = ["graphrag>=1.0.1"]
chromadb = ["chromadb"]
web-surfer = [
"autogen-agentchat==0.4.8",
"playwright>=1.48.0",
|
|
@ -0,0 +1 @@
|
|||
|
|
@ -0,0 +1,422 @@
|
|||
import logging
|
||||
import uuid
|
||||
from typing import Any, Dict, List, Literal
|
||||
|
||||
from autogen_core import CancellationToken, Component, Image
|
||||
from autogen_core.memory import Memory, MemoryContent, MemoryMimeType, MemoryQueryResult, UpdateContextResult
|
||||
from autogen_core.model_context import ChatCompletionContext
|
||||
from autogen_core.models import SystemMessage
|
||||
from chromadb import HttpClient, PersistentClient
|
||||
from chromadb.api import ClientAPI
|
||||
from chromadb.api.models.Collection import Collection
|
||||
from chromadb.api.types import Document, IncludeEnum, Metadata
|
||||
from pydantic import BaseModel, Field
|
||||
from typing_extensions import Self
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
try:
|
||||
from chromadb.api import ClientAPI
|
||||
except ImportError as e:
|
||||
raise ImportError(
|
||||
"To use the ChromaDBVectorMemory the chromadb extra must be installed. Run `pip install autogen-ext[chromadb]`"
|
||||
) from e
|
||||
|
||||
|
||||
class ChromaDBVectorMemoryConfig(BaseModel):
|
||||
"""Base configuration for ChromaDB-based memory implementation."""
|
||||
|
||||
client_type: Literal["persistent", "http"]
|
||||
collection_name: str = Field(default="memory_store", description="Name of the ChromaDB collection")
|
||||
distance_metric: str = Field(default="cosine", description="Distance metric for similarity search")
|
||||
k: int = Field(default=3, description="Number of results to return in queries")
|
||||
score_threshold: float | None = Field(default=None, description="Minimum similarity score threshold")
|
||||
allow_reset: bool = Field(default=False, description="Whether to allow resetting the ChromaDB client")
|
||||
tenant: str = Field(default="default_tenant", description="Tenant to use")
|
||||
database: str = Field(default="default_database", description="Database to use")
|
||||
|
||||
|
||||
class PersistentChromaDBVectorMemoryConfig(ChromaDBVectorMemoryConfig):
|
||||
"""Configuration for persistent ChromaDB memory."""
|
||||
|
||||
client_type: Literal["persistent", "http"] = "persistent"
|
||||
persistence_path: str = Field(default="./chroma_db", description="Path for persistent storage")
|
||||
|
||||
|
||||
class HttpChromaDBVectorMemoryConfig(ChromaDBVectorMemoryConfig):
|
||||
"""Configuration for HTTP ChromaDB memory."""
|
||||
|
||||
client_type: Literal["persistent", "http"] = "http"
|
||||
host: str = Field(default="localhost", description="Host of the remote server")
|
||||
port: int = Field(default=8000, description="Port of the remote server")
|
||||
ssl: bool = Field(default=False, description="Whether to use HTTPS")
|
||||
headers: Dict[str, str] | None = Field(default=None, description="Headers to send to the server")
|
||||
|
||||
|
||||
class ChromaDBVectorMemory(Memory, Component[ChromaDBVectorMemoryConfig]):
|
||||
"""
|
||||
Store and retrieve memory using vector similarity search powered by ChromaDB.
|
||||
|
||||
`ChromaDBVectorMemory` provides a vector-based memory implementation that uses ChromaDB for
|
||||
storing and retrieving content based on semantic similarity. It enhances agents with the ability
|
||||
to recall contextually relevant information during conversations by leveraging vector embeddings
|
||||
to find similar content.
|
||||
|
||||
This implementation serves as a reference for more complex memory systems using vector embeddings.
|
||||
For advanced use cases requiring specialized formatting of retrieved content, users should extend
|
||||
this class and override the `update_context()` method.
|
||||
|
||||
.. note::
|
||||
|
||||
This implementation requires the ChromaDB extra to be installed. Install with:
|
||||
`pip install autogen-ext[chromadb]`
|
||||
|
||||
Args:
|
||||
config (ChromaDBVectorMemoryConfig | None): Configuration for the ChromaDB memory.
|
||||
If None, defaults to a PersistentChromaDBVectorMemoryConfig with default values.
|
||||
Two config types are supported:
|
||||
- PersistentChromaDBVectorMemoryConfig: For local storage
|
||||
- HttpChromaDBVectorMemoryConfig: For connecting to a remote ChromaDB server
|
||||
|
||||
Example:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
import os
|
||||
from pathlib import Path
|
||||
from autogen_agentchat.agents import AssistantAgent
|
||||
from autogen_core.memory import MemoryContent, MemoryMimeType
|
||||
from autogen_ext.memory.chromadb import ChromaDBVectorMemory, PersistentChromaDBVectorMemoryConfig
|
||||
from autogen_ext.models.openai import OpenAIChatCompletionClient
|
||||
|
||||
# Initialize ChromaDB memory with custom config
|
||||
memory = ChromaDBVectorMemory(
|
||||
config=PersistentChromaDBVectorMemoryConfig(
|
||||
collection_name="user_preferences",
|
||||
persistence_path=os.path.join(str(Path.home()), ".chromadb_autogen"),
|
||||
k=3, # Return top 3 results
|
||||
score_threshold=0.5, # Minimum similarity score
|
||||
)
|
||||
)
|
||||
|
||||
# Add user preferences to memory
|
||||
await memory.add(
|
||||
MemoryContent(
|
||||
content="The user prefers temperatures in Celsius",
|
||||
mime_type=MemoryMimeType.TEXT,
|
||||
metadata={"category": "preferences", "type": "units"},
|
||||
)
|
||||
)
|
||||
|
||||
# Create assistant agent with ChromaDB memory
|
||||
assistant = AssistantAgent(
|
||||
name="assistant",
|
||||
model_client=OpenAIChatCompletionClient(
|
||||
model="gpt-4o",
|
||||
),
|
||||
memory=[memory],
|
||||
)
|
||||
|
||||
# The memory will automatically retrieve relevant content during conversations
|
||||
stream = assistant.run_stream(task="What's the weather in New York?")
|
||||
|
||||
# Remember to close the memory when finished
|
||||
await memory.close()
|
||||
"""
|
||||
|
||||
component_config_schema = ChromaDBVectorMemoryConfig
|
||||
component_provider_override = "autogen_ext.memory.chromadb.ChromaDBVectorMemory"
|
||||
|
||||
def __init__(self, config: ChromaDBVectorMemoryConfig | None = None) -> None:
|
||||
"""Initialize ChromaDBVectorMemory."""
|
||||
self._config = config or PersistentChromaDBVectorMemoryConfig()
|
||||
self._client: ClientAPI | None = None
|
||||
self._collection: Collection | None = None
|
||||
|
||||
@property
|
||||
def collection_name(self) -> str:
|
||||
"""Get the name of the ChromaDB collection."""
|
||||
return self._config.collection_name
|
||||
|
||||
def _ensure_initialized(self) -> None:
|
||||
"""Ensure ChromaDB client and collection are initialized."""
|
||||
if self._client is None:
|
||||
try:
|
||||
from chromadb.config import Settings
|
||||
|
||||
settings = Settings(allow_reset=self._config.allow_reset)
|
||||
|
||||
if isinstance(self._config, PersistentChromaDBVectorMemoryConfig):
|
||||
self._client = PersistentClient(
|
||||
path=self._config.persistence_path,
|
||||
settings=settings,
|
||||
tenant=self._config.tenant,
|
||||
database=self._config.database,
|
||||
)
|
||||
elif isinstance(self._config, HttpChromaDBVectorMemoryConfig):
|
||||
self._client = HttpClient(
|
||||
host=self._config.host,
|
||||
port=self._config.port,
|
||||
ssl=self._config.ssl,
|
||||
headers=self._config.headers,
|
||||
settings=settings,
|
||||
tenant=self._config.tenant,
|
||||
database=self._config.database,
|
||||
)
|
||||
else:
|
||||
raise ValueError(f"Unsupported config type: {type(self._config)}")
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to initialize ChromaDB client: {e}")
|
||||
raise
|
||||
|
||||
if self._collection is None:
|
||||
try:
|
||||
self._collection = self._client.get_or_create_collection(
|
||||
name=self._config.collection_name, metadata={"distance_metric": self._config.distance_metric}
|
||||
)
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to get/create collection: {e}")
|
||||
raise
|
||||
|
||||
def _extract_text(self, content_item: str | MemoryContent) -> str:
|
||||
"""Extract searchable text from content."""
|
||||
if isinstance(content_item, str):
|
||||
return content_item
|
||||
|
||||
content = content_item.content
|
||||
mime_type = content_item.mime_type
|
||||
|
||||
if mime_type in [MemoryMimeType.TEXT, MemoryMimeType.MARKDOWN]:
|
||||
return str(content)
|
||||
elif mime_type == MemoryMimeType.JSON:
|
||||
if isinstance(content, dict):
|
||||
# Store original JSON string representation
|
||||
return str(content).lower()
|
||||
raise ValueError("JSON content must be a dict")
|
||||
elif isinstance(content, Image):
|
||||
raise ValueError("Image content cannot be converted to text")
|
||||
else:
|
||||
raise ValueError(f"Unsupported content type: {mime_type}")
|
||||
|
||||
def _calculate_score(self, distance: float) -> float:
|
||||
"""Convert ChromaDB distance to a similarity score."""
|
||||
if self._config.distance_metric == "cosine":
|
||||
return 1.0 - (distance / 2.0)
|
||||
return 1.0 / (1.0 + distance)
|
||||
|
||||
async def update_context(
|
||||
self,
|
||||
model_context: ChatCompletionContext,
|
||||
) -> UpdateContextResult:
|
||||
"""
|
||||
Update the model context with relevant memory content.
|
||||
|
||||
This method retrieves memory content relevant to the last message in the context
|
||||
and adds it as a system message. It serves as the primary customization point for
|
||||
how retrieved memories are incorporated into the conversation.
|
||||
|
||||
By default, this implementation:
|
||||
1. Uses the last message as a query to find semantically similar memories
|
||||
2. Formats retrieved memories as a numbered list
|
||||
3. Adds them to the context as a system message
|
||||
|
||||
For custom memory formatting, extend this class and override this method.
|
||||
|
||||
Args:
|
||||
model_context (ChatCompletionContext): The model context to update with relevant memories.
|
||||
|
||||
Returns:
|
||||
UpdateContextResult: Object containing the memories that were used to update the context.
|
||||
|
||||
Example:
|
||||
|
||||
.. code-block:: python
|
||||
|
||||
from autogen_core.memory import Memory, MemoryContent, MemoryQueryResult, UpdateContextResult
|
||||
from autogen_core.model_context import ChatCompletionContext
|
||||
from autogen_core.models import SystemMessage
|
||||
from autogen_ext.memory.chromadb import ChromaDBVectorMemory, PersistentChromaDBVectorMemoryConfig
|
||||
|
||||
|
||||
class CustomVectorMemory(ChromaDBVectorMemory):
|
||||
async def update_context(
|
||||
self,
|
||||
model_context: ChatCompletionContext,
|
||||
) -> UpdateContextResult:
|
||||
# Get the last message to use as query
|
||||
messages = await model_context.get_messages()
|
||||
if not messages:
|
||||
return UpdateContextResult(memories=MemoryQueryResult(results=[]))
|
||||
|
||||
# Get query results
|
||||
last_message = messages[-1]
|
||||
query_text = last_message.content if isinstance(last_message.content, str) else str(last_message)
|
||||
query_results = await self.query(query_text)
|
||||
|
||||
if query_results.results:
|
||||
# Custom formatting based on memory category
|
||||
memory_strings = []
|
||||
for memory in query_results.results:
|
||||
category = memory.metadata.get("category", "general")
|
||||
if category == "preferences":
|
||||
memory_strings.append(f"User Preference: {memory.content}")
|
||||
else:
|
||||
memory_strings.append(f"Memory: {memory.content}")
|
||||
|
||||
# Add to context with custom header
|
||||
memory_context = "IMPORTANT USER INFORMATION:\n" + "\n".join(memory_strings)
|
||||
await model_context.add_message(SystemMessage(content=memory_context))
|
||||
|
||||
return UpdateContextResult(memories=query_results)
|
||||
"""
|
||||
messages = await model_context.get_messages()
|
||||
if not messages:
|
||||
return UpdateContextResult(memories=MemoryQueryResult(results=[]))
|
||||
|
||||
# Extract query from last message
|
||||
last_message = messages[-1]
|
||||
query_text = last_message.content if isinstance(last_message.content, str) else str(last_message)
|
||||
|
||||
# Query memory and get results
|
||||
query_results = await self.query(query_text)
|
||||
|
||||
if query_results.results:
|
||||
# Format results for context
|
||||
memory_strings = [f"{i}. {str(memory.content)}" for i, memory in enumerate(query_results.results, 1)]
|
||||
memory_context = "\nRelevant memory content:\n" + "\n".join(memory_strings)
|
||||
|
||||
# Add to context
|
||||
await model_context.add_message(SystemMessage(content=memory_context))
|
||||
|
||||
return UpdateContextResult(memories=query_results)
|
||||
|
||||
async def add(self, content: MemoryContent, cancellation_token: CancellationToken | None = None) -> None:
|
||||
"""Add a memory content to ChromaDB."""
|
||||
self._ensure_initialized()
|
||||
if self._collection is None:
|
||||
raise RuntimeError("Failed to initialize ChromaDB")
|
||||
|
||||
try:
|
||||
# Extract text from content
|
||||
text = self._extract_text(content)
|
||||
|
||||
# Use metadata directly from content
|
||||
metadata_dict = content.metadata or {}
|
||||
metadata_dict["mime_type"] = str(content.mime_type)
|
||||
|
||||
# Add to ChromaDB
|
||||
self._collection.add(documents=[text], metadatas=[metadata_dict], ids=[str(uuid.uuid4())])
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to add content to ChromaDB: {e}")
|
||||
raise
|
||||
|
||||
async def query(
|
||||
self,
|
||||
query: str | MemoryContent,
|
||||
cancellation_token: CancellationToken | None = None,
|
||||
**kwargs: Any,
|
||||
) -> MemoryQueryResult:
|
||||
"""Query memory content based on vector similarity."""
|
||||
self._ensure_initialized()
|
||||
if self._collection is None:
|
||||
raise RuntimeError("Failed to initialize ChromaDB")
|
||||
|
||||
try:
|
||||
# Extract text for query
|
||||
query_text = self._extract_text(query)
|
||||
|
||||
# Query ChromaDB
|
||||
results = self._collection.query(
|
||||
query_texts=[query_text],
|
||||
n_results=self._config.k,
|
||||
include=[IncludeEnum.documents, IncludeEnum.metadatas, IncludeEnum.distances],
|
||||
**kwargs,
|
||||
)
|
||||
|
||||
# Convert results to MemoryContent list
|
||||
memory_results: List[MemoryContent] = []
|
||||
|
||||
if (
|
||||
not results
|
||||
or not results.get("documents")
|
||||
or not results.get("metadatas")
|
||||
or not results.get("distances")
|
||||
):
|
||||
return MemoryQueryResult(results=memory_results)
|
||||
|
||||
documents: List[Document] = results["documents"][0] if results["documents"] else []
|
||||
metadatas: List[Metadata] = results["metadatas"][0] if results["metadatas"] else []
|
||||
distances: List[float] = results["distances"][0] if results["distances"] else []
|
||||
ids: List[str] = results["ids"][0] if results["ids"] else []
|
||||
|
||||
for doc, metadata_dict, distance, doc_id in zip(documents, metadatas, distances, ids, strict=False):
|
||||
# Calculate score
|
||||
score = self._calculate_score(distance)
|
||||
metadata = dict(metadata_dict)
|
||||
metadata["score"] = score
|
||||
metadata["id"] = doc_id
|
||||
if self._config.score_threshold is not None and score < self._config.score_threshold:
|
||||
continue
|
||||
|
||||
# Extract mime_type from metadata
|
||||
mime_type = str(metadata_dict.get("mime_type", MemoryMimeType.TEXT.value))
|
||||
|
||||
# Create MemoryContent
|
||||
content = MemoryContent(
|
||||
content=doc,
|
||||
mime_type=mime_type,
|
||||
metadata=metadata,
|
||||
)
|
||||
memory_results.append(content)
|
||||
|
||||
return MemoryQueryResult(results=memory_results)
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to query ChromaDB: {e}")
|
||||
raise
|
||||
|
||||
async def clear(self) -> None:
|
||||
"""Clear all entries from memory."""
|
||||
self._ensure_initialized()
|
||||
if self._collection is None:
|
||||
raise RuntimeError("Failed to initialize ChromaDB")
|
||||
|
||||
try:
|
||||
results = self._collection.get()
|
||||
if results and results["ids"]:
|
||||
self._collection.delete(ids=results["ids"])
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to clear ChromaDB collection: {e}")
|
||||
raise
|
||||
|
||||
async def close(self) -> None:
|
||||
"""Clean up ChromaDB client and resources."""
|
||||
self._collection = None
|
||||
self._client = None
|
||||
|
||||
async def reset(self) -> None:
|
||||
"""Reset the memory by deleting all data."""
|
||||
self._ensure_initialized()
|
||||
if not self._config.allow_reset:
|
||||
raise RuntimeError("Reset not allowed. Set allow_reset=True in config to enable.")
|
||||
|
||||
if self._client is not None:
|
||||
try:
|
||||
self._client.reset()
|
||||
except Exception as e:
|
||||
logger.error(f"Error during ChromaDB reset: {e}")
|
||||
finally:
|
||||
self._collection = None
|
||||
|
||||
def _to_config(self) -> ChromaDBVectorMemoryConfig:
|
||||
"""Serialize the memory configuration."""
|
||||
|
||||
return self._config
|
||||
|
||||
@classmethod
|
||||
def _from_config(cls, config: ChromaDBVectorMemoryConfig) -> Self:
|
||||
"""Deserialize the memory configuration."""
|
||||
|
||||
return cls(config=config)
|
|
@ -0,0 +1,242 @@
|
|||
from pathlib import Path
|
||||
|
||||
import pytest
|
||||
from autogen_core.memory import MemoryContent, MemoryMimeType
|
||||
from autogen_core.model_context import BufferedChatCompletionContext
|
||||
from autogen_core.models import UserMessage
|
||||
from autogen_ext.memory.chromadb import ChromaDBVectorMemory, PersistentChromaDBVectorMemoryConfig
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def base_config(tmp_path: Path) -> PersistentChromaDBVectorMemoryConfig:
|
||||
"""Create base configuration without score threshold."""
|
||||
return PersistentChromaDBVectorMemoryConfig(
|
||||
collection_name="test_collection", allow_reset=True, k=3, persistence_path=str(tmp_path / "chroma_db")
|
||||
)
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def strict_config(tmp_path: Path) -> PersistentChromaDBVectorMemoryConfig:
|
||||
"""Create configuration with strict score threshold."""
|
||||
return PersistentChromaDBVectorMemoryConfig(
|
||||
collection_name="test_collection",
|
||||
allow_reset=True,
|
||||
k=3,
|
||||
score_threshold=0.8, # High threshold for strict matching
|
||||
persistence_path=str(tmp_path / "chroma_db_strict"),
|
||||
)
|
||||
|
||||
|
||||
@pytest.fixture
|
||||
def lenient_config(tmp_path: Path) -> PersistentChromaDBVectorMemoryConfig:
|
||||
"""Create configuration with lenient score threshold."""
|
||||
return PersistentChromaDBVectorMemoryConfig(
|
||||
collection_name="test_collection",
|
||||
allow_reset=True,
|
||||
k=3,
|
||||
score_threshold=0.0, # No threshold for maximum retrieval
|
||||
persistence_path=str(tmp_path / "chroma_db_lenient"),
|
||||
)
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_basic_workflow(base_config: PersistentChromaDBVectorMemoryConfig) -> None:
|
||||
"""Test basic memory operations with default threshold."""
|
||||
memory = ChromaDBVectorMemory(config=base_config)
|
||||
await memory.clear()
|
||||
|
||||
await memory.add(
|
||||
MemoryContent(
|
||||
content="Paris is known for the Eiffel Tower and amazing cuisine.",
|
||||
mime_type=MemoryMimeType.TEXT,
|
||||
metadata={"category": "city", "country": "France"},
|
||||
)
|
||||
)
|
||||
|
||||
results = await memory.query("Tell me about Paris")
|
||||
assert len(results.results) > 0
|
||||
assert any("Paris" in str(r.content) for r in results.results)
|
||||
assert all(isinstance(r.metadata.get("score"), float) for r in results.results if r.metadata)
|
||||
|
||||
await memory.close()
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_content_types(lenient_config: PersistentChromaDBVectorMemoryConfig) -> None:
|
||||
"""Test different content types with lenient matching."""
|
||||
memory = ChromaDBVectorMemory(config=lenient_config)
|
||||
await memory.clear()
|
||||
|
||||
# Test text content
|
||||
text_content = MemoryContent(content="Simple text content for testing", mime_type=MemoryMimeType.TEXT)
|
||||
await memory.add(text_content)
|
||||
|
||||
# Test JSON content
|
||||
json_data = {"key": "value", "number": 42}
|
||||
json_content = MemoryContent(content=json_data, mime_type=MemoryMimeType.JSON)
|
||||
await memory.add(json_content)
|
||||
|
||||
# Query for text content
|
||||
results = await memory.query("simple text content")
|
||||
assert len(results.results) > 0
|
||||
assert any("Simple text content" in str(r.content) for r in results.results)
|
||||
|
||||
# Query for JSON content
|
||||
results = await memory.query("value")
|
||||
result_contents = [str(r.content).lower() for r in results.results]
|
||||
assert any("value" in content for content in result_contents)
|
||||
|
||||
await memory.close()
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_strict_matching(strict_config: PersistentChromaDBVectorMemoryConfig) -> None:
|
||||
"""Test matching behavior with high score threshold."""
|
||||
memory = ChromaDBVectorMemory(config=strict_config)
|
||||
await memory.clear()
|
||||
|
||||
await memory.add(
|
||||
MemoryContent(content="Specific technical details about quantum computing", mime_type=MemoryMimeType.TEXT)
|
||||
)
|
||||
|
||||
# Exact query should match
|
||||
exact_results = await memory.query("quantum computing details")
|
||||
assert len(exact_results.results) > 0
|
||||
assert all(
|
||||
result.metadata and result.metadata.get("score", 0) >= strict_config.score_threshold
|
||||
for result in exact_results.results
|
||||
)
|
||||
|
||||
# Unrelated query should not match due to high threshold
|
||||
unrelated_results = await memory.query("recipe for cake")
|
||||
assert len(unrelated_results.results) == 0
|
||||
|
||||
await memory.close()
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_metadata_handling(base_config: PersistentChromaDBVectorMemoryConfig) -> None:
|
||||
"""Test metadata handling with default threshold."""
|
||||
memory = ChromaDBVectorMemory(config=base_config)
|
||||
await memory.clear()
|
||||
|
||||
test_content = "Test content with specific metadata"
|
||||
content = MemoryContent(
|
||||
content=test_content,
|
||||
mime_type=MemoryMimeType.TEXT,
|
||||
metadata={"test_category": "test", "test_priority": 1, "test_weight": 0.5, "test_verified": True},
|
||||
)
|
||||
await memory.add(content)
|
||||
|
||||
results = await memory.query(test_content)
|
||||
assert len(results.results) > 0
|
||||
result = results.results[0]
|
||||
|
||||
assert result.metadata is not None
|
||||
assert result.metadata.get("test_category") == "test"
|
||||
assert result.metadata.get("test_priority") == 1
|
||||
assert isinstance(result.metadata.get("test_weight"), float)
|
||||
assert result.metadata.get("test_verified") is True
|
||||
|
||||
await memory.close()
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_error_handling(base_config: PersistentChromaDBVectorMemoryConfig) -> None:
|
||||
"""Test error cases with default threshold."""
|
||||
memory = ChromaDBVectorMemory(config=base_config)
|
||||
await memory.clear()
|
||||
|
||||
with pytest.raises(ValueError, match="Unsupported content type"):
|
||||
await memory.add(MemoryContent(content=b"binary data", mime_type=MemoryMimeType.BINARY))
|
||||
|
||||
with pytest.raises(ValueError, match="JSON content must be a dict"):
|
||||
await memory.add(MemoryContent(content="not a dict", mime_type=MemoryMimeType.JSON))
|
||||
|
||||
await memory.close()
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_initialization(base_config: PersistentChromaDBVectorMemoryConfig) -> None:
|
||||
"""Test initialization with default threshold."""
|
||||
memory = ChromaDBVectorMemory(config=base_config)
|
||||
|
||||
# Test that the collection_name property returns the expected value
|
||||
# This implicitly tests that initialization succeeds
|
||||
assert memory.collection_name == "test_collection"
|
||||
|
||||
# Add something to verify the collection is working
|
||||
test_content = MemoryContent(content="Test initialization content", mime_type=MemoryMimeType.TEXT)
|
||||
await memory.add(test_content)
|
||||
|
||||
# Verify we can query the added content
|
||||
results = await memory.query("Test initialization")
|
||||
assert len(results.results) > 0
|
||||
|
||||
# Use the public reset method
|
||||
await memory.reset()
|
||||
|
||||
# Verify the reset worked by checking that the previous content is gone
|
||||
results_after_reset = await memory.query("Test initialization")
|
||||
assert len(results_after_reset.results) == 0
|
||||
|
||||
# Add new content to verify re-initialization happened automatically
|
||||
new_content = MemoryContent(content="New test content after reset", mime_type=MemoryMimeType.TEXT)
|
||||
await memory.add(new_content)
|
||||
|
||||
# Verify we can query the new content
|
||||
new_results = await memory.query("New test")
|
||||
assert len(new_results.results) > 0
|
||||
|
||||
await memory.close()
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_model_context_update(base_config: PersistentChromaDBVectorMemoryConfig) -> None:
|
||||
"""Test updating model context with retrieved memories."""
|
||||
memory = ChromaDBVectorMemory(config=base_config)
|
||||
await memory.clear()
|
||||
|
||||
# Add content to memory
|
||||
await memory.add(
|
||||
MemoryContent(
|
||||
content="Jupiter is the largest planet in our solar system.",
|
||||
mime_type=MemoryMimeType.TEXT,
|
||||
metadata={"category": "astronomy"},
|
||||
)
|
||||
)
|
||||
|
||||
# Create a model context with a message
|
||||
context = BufferedChatCompletionContext(buffer_size=5)
|
||||
await context.add_message(UserMessage(content="Tell me about Jupiter", source="user"))
|
||||
|
||||
# Update context with memory
|
||||
result = await memory.update_context(context)
|
||||
|
||||
# Verify results
|
||||
assert len(result.memories.results) > 0
|
||||
assert any("Jupiter" in str(r.content) for r in result.memories.results)
|
||||
|
||||
# Verify context was updated
|
||||
messages = await context.get_messages()
|
||||
assert len(messages) > 1 # Should have the original message plus the memory content
|
||||
|
||||
await memory.close()
|
||||
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_component_serialization(base_config: PersistentChromaDBVectorMemoryConfig) -> None:
|
||||
"""Test serialization and deserialization of the component."""
|
||||
memory = ChromaDBVectorMemory(config=base_config)
|
||||
|
||||
# Serialize
|
||||
memory_config = memory.dump_component()
|
||||
assert memory_config.config["collection_name"] == base_config.collection_name
|
||||
|
||||
# Deserialize
|
||||
loaded_memory = ChromaDBVectorMemory.load_component(memory_config)
|
||||
|
||||
assert isinstance(loaded_memory, ChromaDBVectorMemory)
|
||||
|
||||
await memory.close()
|
||||
await loaded_memory.close()
|
347
python/uv.lock
@ -355,6 +355,18 @@ wheels = [
|
|||
{ url = "https://files.pythonhosted.org/packages/f8/ed/e97229a566617f2ae958a6b13e7cc0f585470eac730a73e9e82c32a3cdd2/arrow-1.3.0-py3-none-any.whl", hash = "sha256:c728b120ebc00eb84e01882a6f5e7927a53960aa990ce7dd2b10f39005a67f80", size = 66419 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "asgiref"
|
||||
version = "3.8.1"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "typing-extensions", marker = "python_full_version < '3.11'" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/29/38/b3395cc9ad1b56d2ddac9970bc8f4141312dbaec28bc7c218b0dfafd0f42/asgiref-3.8.1.tar.gz", hash = "sha256:c343bd80a0bec947a9860adb4c432ffa7db769836c64238fc34bdc3fec84d590", size = 35186 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/39/e3/893e8757be2612e6c266d9bb58ad2e3651524b5b40cf56761e985a28b13e/asgiref-3.8.1-py3-none-any.whl", hash = "sha256:3e1e3ecc849832fe52ccf2cb6686b7a55f82bb1d6aee72a58826471390335e47", size = 23828 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "asttokens"
|
||||
version = "2.4.1"
|
||||
|
@ -582,6 +594,9 @@ azure = [
|
|||
{ name = "azure-core" },
|
||||
{ name = "azure-identity" },
|
||||
]
|
||||
chromadb = [
|
||||
{ name = "chromadb" },
|
||||
]
|
||||
diskcache = [
|
||||
{ name = "diskcache" },
|
||||
]
|
||||
|
@ -704,6 +719,7 @@ requires-dist = [
|
|||
{ name = "azure-ai-inference", marker = "extra == 'azure'", specifier = ">=1.0.0b7" },
|
||||
{ name = "azure-core", marker = "extra == 'azure'" },
|
||||
{ name = "azure-identity", marker = "extra == 'azure'" },
|
||||
{ name = "chromadb", marker = "extra == 'chromadb'" },
|
||||
{ name = "diskcache", marker = "extra == 'diskcache'", specifier = ">=5.6.3" },
|
||||
{ name = "docker", marker = "extra == 'docker'", specifier = "~=7.0" },
|
||||
{ name = "ffmpeg-python", marker = "extra == 'video-surfer'" },
|
||||
|
@ -957,6 +973,47 @@ wheels = [
|
|||
{ url = "https://files.pythonhosted.org/packages/ed/20/bc79bc575ba2e2a7f70e8a1155618bb1301eaa5132a8271373a6903f73f8/babel-2.16.0-py3-none-any.whl", hash = "sha256:368b5b98b37c06b7daf6696391c3240c938b37767d4584413e8438c5c435fa8b", size = 9587599 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "backoff"
|
||||
version = "2.2.1"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/47/d7/5bbeb12c44d7c4f2fb5b56abce497eb5ed9f34d85701de869acedd602619/backoff-2.2.1.tar.gz", hash = "sha256:03f829f5bb1923180821643f8753b0502c3b682293992485b0eef2807afa5cba", size = 17001 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/df/73/b6e24bd22e6720ca8ee9a85a0c4a2971af8497d8f3193fa05390cbd46e09/backoff-2.2.1-py3-none-any.whl", hash = "sha256:63579f9a0628e06278f7e47b7d7d5b6ce20dc65c5e96a6f3ca99a6adca0396e8", size = 15148 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "bcrypt"
|
||||
version = "4.2.1"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/56/8c/dd696962612e4cd83c40a9e6b3db77bfe65a830f4b9af44098708584686c/bcrypt-4.2.1.tar.gz", hash = "sha256:6765386e3ab87f569b276988742039baab087b2cdb01e809d74e74503c2faafe", size = 24427 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/bc/ca/e17b08c523adb93d5f07a226b2bd45a7c6e96b359e31c1e99f9db58cb8c3/bcrypt-4.2.1-cp37-abi3-macosx_10_12_universal2.whl", hash = "sha256:1340411a0894b7d3ef562fb233e4b6ed58add185228650942bdc885362f32c17", size = 489982 },
|
||||
{ url = "https://files.pythonhosted.org/packages/6a/be/e7c6e0fd6087ee8fc6d77d8d9e817e9339d879737509019b9a9012a1d96f/bcrypt-4.2.1-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b1ee315739bc8387aa36ff127afc99120ee452924e0df517a8f3e4c0187a0f5f", size = 273108 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d6/53/ac084b7d985aee1a5f2b086d501f550862596dbf73220663b8c17427e7f2/bcrypt-4.2.1-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8dbd0747208912b1e4ce730c6725cb56c07ac734b3629b60d4398f082ea718ad", size = 278733 },
|
||||
{ url = "https://files.pythonhosted.org/packages/8e/ab/b8710a3d6231c587e575ead0b1c45bb99f5454f9f579c9d7312c17b069cc/bcrypt-4.2.1-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:aaa2e285be097050dba798d537b6efd9b698aa88eef52ec98d23dcd6d7cf6fea", size = 273856 },
|
||||
{ url = "https://files.pythonhosted.org/packages/9d/e5/2fd1ea6395358ffdfd4afe370d5b52f71408f618f781772a48971ef3b92b/bcrypt-4.2.1-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:76d3e352b32f4eeb34703370e370997065d28a561e4a18afe4fef07249cb4396", size = 279067 },
|
||||
{ url = "https://files.pythonhosted.org/packages/4e/ef/f2cb7a0f7e1ed800a604f8ab256fb0afcf03c1540ad94ff771ce31e794aa/bcrypt-4.2.1-cp37-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:b7703ede632dc945ed1172d6f24e9f30f27b1b1a067f32f68bf169c5f08d0425", size = 306851 },
|
||||
{ url = "https://files.pythonhosted.org/packages/de/cb/578b0023c6a5ca16a177b9044ba6bd6032277bd3ef020fb863eccd22e49b/bcrypt-4.2.1-cp37-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:89df2aea2c43be1e1fa066df5f86c8ce822ab70a30e4c210968669565c0f4685", size = 310793 },
|
||||
{ url = "https://files.pythonhosted.org/packages/98/bc/9d501ee9d754f63d4b1086b64756c284facc3696de9b556c146279a124a5/bcrypt-4.2.1-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:04e56e3fe8308a88b77e0afd20bec516f74aecf391cdd6e374f15cbed32783d6", size = 320957 },
|
||||
{ url = "https://files.pythonhosted.org/packages/a1/25/2ec4ce5740abc43182bfc064b9acbbf5a493991246985e8b2bfe231ead64/bcrypt-4.2.1-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:cfdf3d7530c790432046c40cda41dfee8c83e29482e6a604f8930b9930e94139", size = 339958 },
|
||||
{ url = "https://files.pythonhosted.org/packages/6d/64/fd67788f64817727897d31e9cdeeeba3941eaad8540733c05c7eac4aa998/bcrypt-4.2.1-cp37-abi3-win32.whl", hash = "sha256:adadd36274510a01f33e6dc08f5824b97c9580583bd4487c564fc4617b328005", size = 160912 },
|
||||
{ url = "https://files.pythonhosted.org/packages/00/8f/fe834eaa54abbd7cab8607e5020fa3a0557e929555b9e4ca404b4adaab06/bcrypt-4.2.1-cp37-abi3-win_amd64.whl", hash = "sha256:8c458cd103e6c5d1d85cf600e546a639f234964d0228909d8f8dbeebff82d526", size = 152981 },
|
||||
{ url = "https://files.pythonhosted.org/packages/4a/57/23b46933206daf5384b5397d9878746d2249fe9d45efaa8e1467c87d3048/bcrypt-4.2.1-cp39-abi3-macosx_10_12_universal2.whl", hash = "sha256:8ad2f4528cbf0febe80e5a3a57d7a74e6635e41af1ea5675282a33d769fba413", size = 489842 },
|
||||
{ url = "https://files.pythonhosted.org/packages/fd/28/3ea8a39ddd4938b6c6b6136816d72ba5e659e2d82b53d843c8c53455ac4d/bcrypt-4.2.1-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:909faa1027900f2252a9ca5dfebd25fc0ef1417943824783d1c8418dd7d6df4a", size = 272500 },
|
||||
{ url = "https://files.pythonhosted.org/packages/77/7f/b43622999f5d4de06237a195ac5501ac83516adf571b907228cd14bac8fe/bcrypt-4.2.1-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cde78d385d5e93ece5479a0a87f73cd6fa26b171c786a884f955e165032b262c", size = 278368 },
|
||||
{ url = "https://files.pythonhosted.org/packages/50/68/f2e3959014b4d8874c747e6e171d46d3e63a3a39aaca8417a8d837eda0a8/bcrypt-4.2.1-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:533e7f3bcf2f07caee7ad98124fab7499cb3333ba2274f7a36cf1daee7409d99", size = 273335 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d6/c3/4b4bad4da852924427c651589d464ad1aa624f94dd904ddda8493b0a35e5/bcrypt-4.2.1-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:687cf30e6681eeda39548a93ce9bfbb300e48b4d445a43db4298d2474d2a1e54", size = 278614 },
|
||||
{ url = "https://files.pythonhosted.org/packages/6e/5a/ee107961e84c41af2ac201d0460f962b6622ff391255ffd46429e9e09dc1/bcrypt-4.2.1-cp39-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:041fa0155c9004eb98a232d54da05c0b41d4b8e66b6fc3cb71b4b3f6144ba837", size = 306464 },
|
||||
{ url = "https://files.pythonhosted.org/packages/5c/72/916e14fa12d2b1d1fc6c26ea195337419da6dd23d0bf53ac61ef3739e5c5/bcrypt-4.2.1-cp39-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:f85b1ffa09240c89aa2e1ae9f3b1c687104f7b2b9d2098da4e923f1b7082d331", size = 310674 },
|
||||
{ url = "https://files.pythonhosted.org/packages/97/92/3dc76d8bfa23300591eec248e950f85bd78eb608c96bd4747ce4cc06acdb/bcrypt-4.2.1-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:c6f5fa3775966cca251848d4d5393ab016b3afed251163c1436fefdec3b02c84", size = 320577 },
|
||||
{ url = "https://files.pythonhosted.org/packages/5d/ab/a6c0da5c2cf86600f74402a72b06dfe365e1a1d30783b1bbeec460fd57d1/bcrypt-4.2.1-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:807261df60a8b1ccd13e6599c779014a362ae4e795f5c59747f60208daddd96d", size = 339836 },
|
||||
{ url = "https://files.pythonhosted.org/packages/b4/b4/e75b6e9a72a030a04362034022ebe317c5b735d04db6ad79237101ae4a5c/bcrypt-4.2.1-cp39-abi3-win32.whl", hash = "sha256:b588af02b89d9fad33e5f98f7838bf590d6d692df7153647724a7f20c186f6bf", size = 160911 },
|
||||
{ url = "https://files.pythonhosted.org/packages/76/b9/d51d34e6cd6d887adddb28a8680a1d34235cc45b9d6e238ce39b98199ca0/bcrypt-4.2.1-cp39-abi3-win_amd64.whl", hash = "sha256:e84e0e6f8e40a242b11bce56c313edc2be121cec3e0ec2d76fce01f6af33c07c", size = 153078 },
|
||||
{ url = "https://files.pythonhosted.org/packages/4e/6e/7193067042de23af3d71882f898c8c0bd2b18e6ee44a4f76e395dfadb5a8/bcrypt-4.2.1-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:76132c176a6d9953cdc83c296aeaed65e1a708485fd55abf163e0d9f8f16ce0e", size = 270069 },
|
||||
{ url = "https://files.pythonhosted.org/packages/3b/05/2546085c6dc07a45627460a39e6291b82382b434fff2bd0167ff3bc31eb1/bcrypt-4.2.1-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:e158009a54c4c8bc91d5e0da80920d048f918c61a581f0a63e4e93bb556d362f", size = 274652 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "beartype"
|
||||
version = "0.18.5"
|
||||
|
@ -1036,6 +1093,22 @@ wheels = [
|
|||
{ url = "https://files.pythonhosted.org/packages/3e/05/43bae794c8e5f42d79e1c24205bc0c7447b3909a446de46cf231fa6b39dd/botocore-1.36.8-py3-none-any.whl", hash = "sha256:59d3fdfbae6d916b046e973bebcbeb70a102f9e570ca86d5ba512f1854b78fc2", size = 13318382 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "build"
|
||||
version = "1.2.2.post1"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "colorama", marker = "(os_name == 'nt' and platform_machine != 'aarch64' and sys_platform == 'linux') or (os_name == 'nt' and sys_platform != 'darwin' and sys_platform != 'linux')" },
|
||||
{ name = "importlib-metadata", marker = "python_full_version < '3.10.2'" },
|
||||
{ name = "packaging" },
|
||||
{ name = "pyproject-hooks" },
|
||||
{ name = "tomli", marker = "python_full_version < '3.11'" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/7d/46/aeab111f8e06793e4f0e421fcad593d547fb8313b50990f31681ee2fb1ad/build-1.2.2.post1.tar.gz", hash = "sha256:b36993e92ca9375a219c99e606a122ff365a760a2d4bba0caa09bd5278b608b7", size = 46701 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/84/c2/80633736cd183ee4a62107413def345f7e6e3c01563dbca1417363cf957e/build-1.2.2.post1-py3-none-any.whl", hash = "sha256:1d61c0887fa860c01971625baae8bdd338e517b836a2f70dd1f7aa3a6b2fc5b5", size = 22950 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "cachetools"
|
||||
version = "5.5.1"
|
||||
|
@ -1205,6 +1278,70 @@ wheels = [
|
|||
{ url = "https://files.pythonhosted.org/packages/52/93/342cc62a70ab727e093ed98e02a725d85b746345f05d2b5e5034649f4ec8/chevron-0.14.0-py3-none-any.whl", hash = "sha256:fbf996a709f8da2e745ef763f482ce2d311aa817d287593a5b990d6d6e4f0443", size = 11595 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "chroma-hnswlib"
|
||||
version = "0.7.6"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "numpy" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/73/09/10d57569e399ce9cbc5eee2134996581c957f63a9addfa6ca657daf006b8/chroma_hnswlib-0.7.6.tar.gz", hash = "sha256:4dce282543039681160259d29fcde6151cc9106c6461e0485f57cdccd83059b7", size = 32256 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/a8/74/b9dde05ea8685d2f8c4681b517e61c7887e974f6272bb24ebc8f2105875b/chroma_hnswlib-0.7.6-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f35192fbbeadc8c0633f0a69c3d3e9f1a4eab3a46b65458bbcbcabdd9e895c36", size = 195821 },
|
||||
{ url = "https://files.pythonhosted.org/packages/fd/58/101bfa6bc41bc6cc55fbb5103c75462a7bf882e1704256eb4934df85b6a8/chroma_hnswlib-0.7.6-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:6f007b608c96362b8f0c8b6b2ac94f67f83fcbabd857c378ae82007ec92f4d82", size = 183854 },
|
||||
{ url = "https://files.pythonhosted.org/packages/17/ff/95d49bb5ce134f10d6aa08d5f3bec624eaff945f0b17d8c3fce888b9a54a/chroma_hnswlib-0.7.6-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:456fd88fa0d14e6b385358515aef69fc89b3c2191706fd9aee62087b62aad09c", size = 2358774 },
|
||||
{ url = "https://files.pythonhosted.org/packages/3a/6d/27826180a54df80dbba8a4f338b022ba21c0c8af96fd08ff8510626dee8f/chroma_hnswlib-0.7.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5dfaae825499c2beaa3b75a12d7ec713b64226df72a5c4097203e3ed532680da", size = 2392739 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d6/63/ee3e8b7a8f931918755faacf783093b61f32f59042769d9db615999c3de0/chroma_hnswlib-0.7.6-cp310-cp310-win_amd64.whl", hash = "sha256:2487201982241fb1581be26524145092c95902cb09fc2646ccfbc407de3328ec", size = 150955 },
|
||||
{ url = "https://files.pythonhosted.org/packages/f5/af/d15fdfed2a204c0f9467ad35084fbac894c755820b203e62f5dcba2d41f1/chroma_hnswlib-0.7.6-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:81181d54a2b1e4727369486a631f977ffc53c5533d26e3d366dda243fb0998ca", size = 196911 },
|
||||
{ url = "https://files.pythonhosted.org/packages/0d/19/aa6f2139f1ff7ad23a690ebf2a511b2594ab359915d7979f76f3213e46c4/chroma_hnswlib-0.7.6-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4b4ab4e11f1083dd0a11ee4f0e0b183ca9f0f2ed63ededba1935b13ce2b3606f", size = 185000 },
|
||||
{ url = "https://files.pythonhosted.org/packages/79/b1/1b269c750e985ec7d40b9bbe7d66d0a890e420525187786718e7f6b07913/chroma_hnswlib-0.7.6-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:53db45cd9173d95b4b0bdccb4dbff4c54a42b51420599c32267f3abbeb795170", size = 2377289 },
|
||||
{ url = "https://files.pythonhosted.org/packages/c7/2d/d5663e134436e5933bc63516a20b5edc08b4c1b1588b9680908a5f1afd04/chroma_hnswlib-0.7.6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5c093f07a010b499c00a15bc9376036ee4800d335360570b14f7fe92badcdcf9", size = 2411755 },
|
||||
{ url = "https://files.pythonhosted.org/packages/3e/79/1bce519cf186112d6d5ce2985392a89528c6e1e9332d680bf752694a4cdf/chroma_hnswlib-0.7.6-cp311-cp311-win_amd64.whl", hash = "sha256:0540b0ac96e47d0aa39e88ea4714358ae05d64bbe6bf33c52f316c664190a6a3", size = 151888 },
|
||||
{ url = "https://files.pythonhosted.org/packages/93/ac/782b8d72de1c57b64fdf5cb94711540db99a92768d93d973174c62d45eb8/chroma_hnswlib-0.7.6-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:e87e9b616c281bfbe748d01705817c71211613c3b063021f7ed5e47173556cb7", size = 197804 },
|
||||
{ url = "https://files.pythonhosted.org/packages/32/4e/fd9ce0764228e9a98f6ff46af05e92804090b5557035968c5b4198bc7af9/chroma_hnswlib-0.7.6-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ec5ca25bc7b66d2ecbf14502b5729cde25f70945d22f2aaf523c2d747ea68912", size = 185421 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d9/3d/b59a8dedebd82545d873235ef2d06f95be244dfece7ee4a1a6044f080b18/chroma_hnswlib-0.7.6-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:305ae491de9d5f3c51e8bd52d84fdf2545a4a2bc7af49765cda286b7bb30b1d4", size = 2389672 },
|
||||
{ url = "https://files.pythonhosted.org/packages/74/1e/80a033ea4466338824974a34f418e7b034a7748bf906f56466f5caa434b0/chroma_hnswlib-0.7.6-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:822ede968d25a2c88823ca078a58f92c9b5c4142e38c7c8b4c48178894a0a3c5", size = 2436986 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "chromadb"
|
||||
version = "0.6.3"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "bcrypt" },
|
||||
{ name = "build" },
|
||||
{ name = "chroma-hnswlib" },
|
||||
{ name = "fastapi" },
|
||||
{ name = "grpcio" },
|
||||
{ name = "httpx" },
|
||||
{ name = "importlib-resources" },
|
||||
{ name = "kubernetes" },
|
||||
{ name = "mmh3" },
|
||||
{ name = "numpy" },
|
||||
{ name = "onnxruntime" },
|
||||
{ name = "opentelemetry-api" },
|
||||
{ name = "opentelemetry-exporter-otlp-proto-grpc" },
|
||||
{ name = "opentelemetry-instrumentation-fastapi" },
|
||||
{ name = "opentelemetry-sdk" },
|
||||
{ name = "orjson" },
|
||||
{ name = "overrides" },
|
||||
{ name = "posthog" },
|
||||
{ name = "pydantic" },
|
||||
{ name = "pypika" },
|
||||
{ name = "pyyaml" },
|
||||
{ name = "rich" },
|
||||
{ name = "tenacity" },
|
||||
{ name = "tokenizers" },
|
||||
{ name = "tqdm" },
|
||||
{ name = "typer" },
|
||||
{ name = "typing-extensions" },
|
||||
{ name = "uvicorn", extra = ["standard"] },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/39/cd/f0f2de3f466ff514fb6b58271c14f6d22198402bb5b71b8d890231265946/chromadb-0.6.3.tar.gz", hash = "sha256:c8f34c0b704b9108b04491480a36d42e894a960429f87c6516027b5481d59ed3", size = 29297929 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/28/8e/5c186c77bf749b6fe0528385e507e463f1667543328d76fd00a49e1a4e6a/chromadb-0.6.3-py3-none-any.whl", hash = "sha256:4851258489a3612b558488d98d09ae0fe0a28d5cad6bd1ba64b96fdc419dc0e5", size = 611129 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "chromedriver-autoinstaller"
|
||||
version = "0.6.4"
|
||||
|
@ -1649,6 +1786,15 @@ wheels = [
|
|||
{ url = "https://files.pythonhosted.org/packages/8f/d7/9322c609343d929e75e7e5e6255e614fcc67572cfd083959cdef3b7aad79/docutils-0.21.2-py3-none-any.whl", hash = "sha256:dafca5b9e384f0e419294eb4d2ff9fa826435bf15f15b7bd45723e8ad76811b2", size = 587408 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "durationpy"
|
||||
version = "0.9"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/31/e9/f49c4e7fccb77fa5c43c2480e09a857a78b41e7331a75e128ed5df45c56b/durationpy-0.9.tar.gz", hash = "sha256:fd3feb0a69a0057d582ef643c355c40d2fa1c942191f914d12203b1a01ac722a", size = 3186 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/4c/a3/ac312faeceffd2d8f86bc6dcb5c401188ba5a01bc88e69bed97578a0dfcd/durationpy-0.9-py3-none-any.whl", hash = "sha256:e65359a7af5cedad07fb77a2dd3f390f8eb0b74cb845589fa6c057086834dd38", size = 3461 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "email-validator"
|
||||
version = "2.2.0"
|
||||
|
@ -2674,6 +2820,15 @@ wheels = [
|
|||
{ url = "https://files.pythonhosted.org/packages/a0/d9/a1e041c5e7caa9a05c925f4bdbdfb7f006d1f74996af53467bc394c97be7/importlib_metadata-8.5.0-py3-none-any.whl", hash = "sha256:45e54197d28b7a7f1559e60b95e7c567032b602131fbd588f1497f47880aa68b", size = 26514 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "importlib-resources"
|
||||
version = "6.5.2"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/cf/8c/f834fbf984f691b4f7ff60f50b514cc3de5cc08abfc3295564dd89c5e2e7/importlib_resources-6.5.2.tar.gz", hash = "sha256:185f87adef5bcc288449d98fb4fba07cea78bc036455dd44c5fc4a2fe78fed2c", size = 44693 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/a4/ed/1f1afb2e9e7f38a545d628f864d562a5ae64fe6f7a10e28ffb9b185b4e89/importlib_resources-6.5.2-py3-none-any.whl", hash = "sha256:789cfdc3ed28c78b67a06acb8126751ced69a3d5f79c095a98298cd8a760ccec", size = 37461 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "iniconfig"
|
||||
version = "2.0.0"
|
||||
|
@ -3049,6 +3204,28 @@ wheels = [
|
|||
{ url = "https://files.pythonhosted.org/packages/3a/1d/50ad811d1c5dae091e4cf046beba925bcae0a610e79ae4c538f996f63ed5/kiwisolver-1.4.8-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:65ea09a5a3faadd59c2ce96dc7bf0f364986a315949dc6374f04396b0d60e09b", size = 71762 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "kubernetes"
|
||||
version = "32.0.1"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "certifi" },
|
||||
{ name = "durationpy" },
|
||||
{ name = "google-auth" },
|
||||
{ name = "oauthlib" },
|
||||
{ name = "python-dateutil" },
|
||||
{ name = "pyyaml" },
|
||||
{ name = "requests" },
|
||||
{ name = "requests-oauthlib" },
|
||||
{ name = "six" },
|
||||
{ name = "urllib3" },
|
||||
{ name = "websocket-client" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/b7/e8/0598f0e8b4af37cd9b10d8b87386cf3173cb8045d834ab5f6ec347a758b3/kubernetes-32.0.1.tar.gz", hash = "sha256:42f43d49abd437ada79a79a16bd48a604d3471a117a8347e87db693f2ba0ba28", size = 946691 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/08/10/9f8af3e6f569685ce3af7faab51c8dd9d93b9c38eba339ca31c746119447/kubernetes-32.0.1-py2.py3-none-any.whl", hash = "sha256:35282ab8493b938b08ab5526c7ce66588232df00ef5e1dbe88a419107dc10998", size = 1988070 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "lancedb"
|
||||
version = "0.17.0"
|
||||
|
@ -3956,6 +4133,71 @@ wheels = [
|
|||
{ url = "https://files.pythonhosted.org/packages/58/e7/7147c75c383a975c58c33f8e7ee7dbbb0e7390fbcb1ecd321f63e4c73efd/mistralai-1.5.0-py3-none-any.whl", hash = "sha256:9372537719f87bd6f9feef4747d0bf1f4fbe971f8c02945ca4b4bf3c94571c97", size = 271559 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "mmh3"
|
||||
version = "5.1.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/47/1b/1fc6888c74cbd8abad1292dde2ddfcf8fc059e114c97dd6bf16d12f36293/mmh3-5.1.0.tar.gz", hash = "sha256:136e1e670500f177f49ec106a4ebf0adf20d18d96990cc36ea492c651d2b406c", size = 33728 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/a1/01/9d06468928661765c0fc248a29580c760a4a53a9c6c52cf72528bae3582e/mmh3-5.1.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:eaf4ac5c6ee18ca9232238364d7f2a213278ae5ca97897cafaa123fcc7bb8bec", size = 56095 },
|
||||
{ url = "https://files.pythonhosted.org/packages/e4/d7/7b39307fc9db867b2a9a20c58b0de33b778dd6c55e116af8ea031f1433ba/mmh3-5.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:48f9aa8ccb9ad1d577a16104834ac44ff640d8de8c0caed09a2300df7ce8460a", size = 40512 },
|
||||
{ url = "https://files.pythonhosted.org/packages/4f/85/728ca68280d8ccc60c113ad119df70ff1748fbd44c89911fed0501faf0b8/mmh3-5.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d4ba8cac21e1f2d4e436ce03a82a7f87cda80378691f760e9ea55045ec480a3d", size = 40110 },
|
||||
{ url = "https://files.pythonhosted.org/packages/e4/96/beaf0e301472ffa00358bbbf771fe2d9c4d709a2fe30b1d929e569f8cbdf/mmh3-5.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d69281c281cb01994f054d862a6bb02a2e7acfe64917795c58934b0872b9ece4", size = 100151 },
|
||||
{ url = "https://files.pythonhosted.org/packages/c3/ee/9381f825c4e09ffafeffa213c3865c4bf7d39771640de33ab16f6faeb854/mmh3-5.1.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4d05ed3962312fbda2a1589b97359d2467f677166952f6bd410d8c916a55febf", size = 106312 },
|
||||
{ url = "https://files.pythonhosted.org/packages/67/dc/350a54bea5cf397d357534198ab8119cfd0d8e8bad623b520f9c290af985/mmh3-5.1.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:78ae6a03f4cff4aa92ddd690611168856f8c33a141bd3e5a1e0a85521dc21ea0", size = 104232 },
|
||||
{ url = "https://files.pythonhosted.org/packages/b2/5d/2c6eb4a4ec2f7293b98a9c07cb8c64668330b46ff2b6511244339e69a7af/mmh3-5.1.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:95f983535b39795d9fb7336438faae117424c6798f763d67c6624f6caf2c4c01", size = 91663 },
|
||||
{ url = "https://files.pythonhosted.org/packages/f1/ac/17030d24196f73ecbab8b5033591e5e0e2beca103181a843a135c78f4fee/mmh3-5.1.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d46fdd80d4c7ecadd9faa6181e92ccc6fe91c50991c9af0e371fdf8b8a7a6150", size = 99166 },
|
||||
{ url = "https://files.pythonhosted.org/packages/b9/ed/54ddc56603561a10b33da9b12e95a48a271d126f4a4951841bbd13145ebf/mmh3-5.1.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:0f16e976af7365ea3b5c425124b2a7f0147eed97fdbb36d99857f173c8d8e096", size = 101555 },
|
||||
{ url = "https://files.pythonhosted.org/packages/1c/c3/33fb3a940c9b70908a5cc9fcc26534aff8698180f9f63ab6b7cc74da8bcd/mmh3-5.1.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:6fa97f7d1e1f74ad1565127229d510f3fd65d931fdedd707c1e15100bc9e5ebb", size = 94813 },
|
||||
{ url = "https://files.pythonhosted.org/packages/61/88/c9ff76a23abe34db8eee1a6fa4e449462a16c7eb547546fc5594b0860a72/mmh3-5.1.0-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:4052fa4a8561bd62648e9eb993c8f3af3bdedadf3d9687aa4770d10e3709a80c", size = 109611 },
|
||||
{ url = "https://files.pythonhosted.org/packages/0b/8e/27d04f40e95554ebe782cac7bddda2d158cf3862387298c9c7b254fa7beb/mmh3-5.1.0-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:3f0e8ae9f961037f812afe3cce7da57abf734285961fffbeff9a4c011b737732", size = 100515 },
|
||||
{ url = "https://files.pythonhosted.org/packages/7b/00/504ca8f462f01048f3c87cd93f2e1f60b93dac2f930cd4ed73532a9337f5/mmh3-5.1.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:99297f207db967814f1f02135bb7fe7628b9eacb046134a34e1015b26b06edce", size = 100177 },
|
||||
{ url = "https://files.pythonhosted.org/packages/6f/1d/2efc3525fe6fdf8865972fcbb884bd1f4b0f923c19b80891cecf7e239fa5/mmh3-5.1.0-cp310-cp310-win32.whl", hash = "sha256:2e6c8dc3631a5e22007fbdb55e993b2dbce7985c14b25b572dd78403c2e79182", size = 40815 },
|
||||
{ url = "https://files.pythonhosted.org/packages/38/b5/c8fbe707cb0fea77a6d2d58d497bc9b67aff80deb84d20feb34d8fdd8671/mmh3-5.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:e4e8c7ad5a4dddcfde35fd28ef96744c1ee0f9d9570108aa5f7e77cf9cfdf0bf", size = 41479 },
|
||||
{ url = "https://files.pythonhosted.org/packages/a1/f1/663e16134f913fccfbcea5b300fb7dc1860d8f63dc71867b013eebc10aec/mmh3-5.1.0-cp310-cp310-win_arm64.whl", hash = "sha256:45da549269883208912868a07d0364e1418d8292c4259ca11699ba1b2475bd26", size = 38883 },
|
||||
{ url = "https://files.pythonhosted.org/packages/56/09/fda7af7fe65928262098382e3bf55950cfbf67d30bf9e47731bf862161e9/mmh3-5.1.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:0b529dcda3f951ff363a51d5866bc6d63cf57f1e73e8961f864ae5010647079d", size = 56098 },
|
||||
{ url = "https://files.pythonhosted.org/packages/0c/ab/84c7bc3f366d6f3bd8b5d9325a10c367685bc17c26dac4c068e2001a4671/mmh3-5.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:4db1079b3ace965e562cdfc95847312f9273eb2ad3ebea983435c8423e06acd7", size = 40513 },
|
||||
{ url = "https://files.pythonhosted.org/packages/4f/21/25ea58ca4a652bdc83d1528bec31745cce35802381fb4fe3c097905462d2/mmh3-5.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:22d31e3a0ff89b8eb3b826d6fc8e19532998b2aa6b9143698043a1268da413e1", size = 40112 },
|
||||
{ url = "https://files.pythonhosted.org/packages/bd/78/4f12f16ae074ddda6f06745254fdb50f8cf3c85b0bbf7eaca58bed84bf58/mmh3-5.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2139bfbd354cd6cb0afed51c4b504f29bcd687a3b1460b7e89498329cc28a894", size = 102632 },
|
||||
{ url = "https://files.pythonhosted.org/packages/48/11/8f09dc999cf2a09b6138d8d7fc734efb7b7bfdd9adb9383380941caadff0/mmh3-5.1.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8c8105c6a435bc2cd6ea2ef59558ab1a2976fd4a4437026f562856d08996673a", size = 108884 },
|
||||
{ url = "https://files.pythonhosted.org/packages/bd/91/e59a66538a3364176f6c3f7620eee0ab195bfe26f89a95cbcc7a1fb04b28/mmh3-5.1.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:57730067174a7f36fcd6ce012fe359bd5510fdaa5fe067bc94ed03e65dafb769", size = 106835 },
|
||||
{ url = "https://files.pythonhosted.org/packages/25/14/b85836e21ab90e5cddb85fe79c494ebd8f81d96a87a664c488cc9277668b/mmh3-5.1.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bde80eb196d7fdc765a318604ded74a4378f02c5b46c17aa48a27d742edaded2", size = 93688 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ac/aa/8bc964067df9262740c95e4cde2d19f149f2224f426654e14199a9e47df6/mmh3-5.1.0-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e9c8eddcb441abddeb419c16c56fd74b3e2df9e57f7aa2903221996718435c7a", size = 101569 },
|
||||
{ url = "https://files.pythonhosted.org/packages/70/b6/1fb163cbf919046a64717466c00edabebece3f95c013853fec76dbf2df92/mmh3-5.1.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:99e07e4acafbccc7a28c076a847fb060ffc1406036bc2005acb1b2af620e53c3", size = 98483 },
|
||||
{ url = "https://files.pythonhosted.org/packages/70/49/ba64c050dd646060f835f1db6b2cd60a6485f3b0ea04976e7a29ace7312e/mmh3-5.1.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:9e25ba5b530e9a7d65f41a08d48f4b3fedc1e89c26486361166a5544aa4cad33", size = 96496 },
|
||||
{ url = "https://files.pythonhosted.org/packages/9e/07/f2751d6a0b535bb865e1066e9c6b80852571ef8d61bce7eb44c18720fbfc/mmh3-5.1.0-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:bb9bf7475b4d99156ce2f0cf277c061a17560c8c10199c910a680869a278ddc7", size = 105109 },
|
||||
{ url = "https://files.pythonhosted.org/packages/b7/02/30360a5a66f7abba44596d747cc1e6fb53136b168eaa335f63454ab7bb79/mmh3-5.1.0-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:2a1b0878dd281ea3003368ab53ff6f568e175f1b39f281df1da319e58a19c23a", size = 98231 },
|
||||
{ url = "https://files.pythonhosted.org/packages/8c/60/8526b0c750ff4d7ae1266e68b795f14b97758a1d9fcc19f6ecabf9c55656/mmh3-5.1.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:25f565093ac8b8aefe0f61f8f95c9a9d11dd69e6a9e9832ff0d293511bc36258", size = 97548 },
|
||||
{ url = "https://files.pythonhosted.org/packages/6d/4c/26e1222aca65769280d5427a1ce5875ef4213449718c8f03958d0bf91070/mmh3-5.1.0-cp311-cp311-win32.whl", hash = "sha256:1e3554d8792387eac73c99c6eaea0b3f884e7130eb67986e11c403e4f9b6d372", size = 40810 },
|
||||
{ url = "https://files.pythonhosted.org/packages/98/d5/424ba95062d1212ea615dc8debc8d57983f2242d5e6b82e458b89a117a1e/mmh3-5.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:8ad777a48197882492af50bf3098085424993ce850bdda406a358b6ab74be759", size = 41476 },
|
||||
{ url = "https://files.pythonhosted.org/packages/bd/08/0315ccaf087ba55bb19a6dd3b1e8acd491e74ce7f5f9c4aaa06a90d66441/mmh3-5.1.0-cp311-cp311-win_arm64.whl", hash = "sha256:f29dc4efd99bdd29fe85ed6c81915b17b2ef2cf853abf7213a48ac6fb3eaabe1", size = 38880 },
|
||||
{ url = "https://files.pythonhosted.org/packages/f4/47/e5f452bdf16028bfd2edb4e2e35d0441e4a4740f30e68ccd4cfd2fb2c57e/mmh3-5.1.0-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:45712987367cb9235026e3cbf4334670522a97751abfd00b5bc8bfa022c3311d", size = 56152 },
|
||||
{ url = "https://files.pythonhosted.org/packages/60/38/2132d537dc7a7fdd8d2e98df90186c7fcdbd3f14f95502a24ba443c92245/mmh3-5.1.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b1020735eb35086ab24affbea59bb9082f7f6a0ad517cb89f0fc14f16cea4dae", size = 40564 },
|
||||
{ url = "https://files.pythonhosted.org/packages/c0/2a/c52cf000581bfb8d94794f58865658e7accf2fa2e90789269d4ae9560b16/mmh3-5.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:babf2a78ce5513d120c358722a2e3aa7762d6071cd10cede026f8b32452be322", size = 40104 },
|
||||
{ url = "https://files.pythonhosted.org/packages/83/33/30d163ce538c54fc98258db5621447e3ab208d133cece5d2577cf913e708/mmh3-5.1.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d4f47f58cd5cbef968c84a7c1ddc192fef0a36b48b0b8a3cb67354531aa33b00", size = 102634 },
|
||||
{ url = "https://files.pythonhosted.org/packages/94/5c/5a18acb6ecc6852be2d215c3d811aa61d7e425ab6596be940877355d7f3e/mmh3-5.1.0-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2044a601c113c981f2c1e14fa33adc9b826c9017034fe193e9eb49a6882dbb06", size = 108888 },
|
||||
{ url = "https://files.pythonhosted.org/packages/1f/f6/11c556324c64a92aa12f28e221a727b6e082e426dc502e81f77056f6fc98/mmh3-5.1.0-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c94d999c9f2eb2da44d7c2826d3fbffdbbbbcde8488d353fee7c848ecc42b968", size = 106968 },
|
||||
{ url = "https://files.pythonhosted.org/packages/5d/61/ca0c196a685aba7808a5c00246f17b988a9c4f55c594ee0a02c273e404f3/mmh3-5.1.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a015dcb24fa0c7a78f88e9419ac74f5001c1ed6a92e70fd1803f74afb26a4c83", size = 93771 },
|
||||
{ url = "https://files.pythonhosted.org/packages/b4/55/0927c33528710085ee77b808d85bbbafdb91a1db7c8eaa89cac16d6c513e/mmh3-5.1.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:457da019c491a2d20e2022c7d4ce723675e4c081d9efc3b4d8b9f28a5ea789bd", size = 101726 },
|
||||
{ url = "https://files.pythonhosted.org/packages/49/39/a92c60329fa470f41c18614a93c6cd88821412a12ee78c71c3f77e1cfc2d/mmh3-5.1.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:71408579a570193a4ac9c77344d68ddefa440b00468a0b566dcc2ba282a9c559", size = 98523 },
|
||||
{ url = "https://files.pythonhosted.org/packages/81/90/26adb15345af8d9cf433ae1b6adcf12e0a4cad1e692de4fa9f8e8536c5ae/mmh3-5.1.0-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:8b3a04bc214a6e16c81f02f855e285c6df274a2084787eeafaa45f2fbdef1b63", size = 96628 },
|
||||
{ url = "https://files.pythonhosted.org/packages/8a/4d/340d1e340df972a13fd4ec84c787367f425371720a1044220869c82364e9/mmh3-5.1.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:832dae26a35514f6d3c1e267fa48e8de3c7b978afdafa0529c808ad72e13ada3", size = 105190 },
|
||||
{ url = "https://files.pythonhosted.org/packages/d3/7c/65047d1cccd3782d809936db446430fc7758bda9def5b0979887e08302a2/mmh3-5.1.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:bf658a61fc92ef8a48945ebb1076ef4ad74269e353fffcb642dfa0890b13673b", size = 98439 },
|
||||
{ url = "https://files.pythonhosted.org/packages/72/d2/3c259d43097c30f062050f7e861075099404e8886b5d4dd3cebf180d6e02/mmh3-5.1.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3313577453582b03383731b66447cdcdd28a68f78df28f10d275d7d19010c1df", size = 97780 },
|
||||
{ url = "https://files.pythonhosted.org/packages/29/29/831ea8d4abe96cdb3e28b79eab49cac7f04f9c6b6e36bfc686197ddba09d/mmh3-5.1.0-cp312-cp312-win32.whl", hash = "sha256:1d6508504c531ab86c4424b5a5ff07c1132d063863339cf92f6657ff7a580f76", size = 40835 },
|
||||
{ url = "https://files.pythonhosted.org/packages/12/dd/7cbc30153b73f08eeac43804c1dbc770538a01979b4094edbe1a4b8eb551/mmh3-5.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:aa75981fcdf3f21759d94f2c81b6a6e04a49dfbcdad88b152ba49b8e20544776", size = 41509 },
|
||||
{ url = "https://files.pythonhosted.org/packages/80/9d/627375bab4c90dd066093fc2c9a26b86f87e26d980dbf71667b44cbee3eb/mmh3-5.1.0-cp312-cp312-win_arm64.whl", hash = "sha256:a4c1a76808dfea47f7407a0b07aaff9087447ef6280716fd0783409b3088bb3c", size = 38888 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "monotonic"
|
||||
version = "1.6"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/ea/ca/8e91948b782ddfbd194f323e7e7d9ba12e5877addf04fb2bf8fca38e86ac/monotonic-1.6.tar.gz", hash = "sha256:3a55207bcfed53ddd5c5bae174524062935efed17792e9de2ad0205ce9ad63f7", size = 7615 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/9a/67/7e8406a29b6c45be7af7740456f7f37025f0506ae2e05fb9009a53946860/monotonic-1.6-py2.py3-none-any.whl", hash = "sha256:68687e19a14f11f26d140dd5c86f3dba4bf5df58003000ed467e0e2a69bca96c", size = 8154 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "more-itertools"
|
||||
version = "10.6.0"
|
||||
|
@ -4372,6 +4614,7 @@ name = "nvidia-cublas-cu12"
|
|||
version = "12.4.5.8"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/7f/7f/7fbae15a3982dc9595e49ce0f19332423b260045d0a6afe93cdbe2f1f624/nvidia_cublas_cu12-12.4.5.8-py3-none-manylinux2014_aarch64.whl", hash = "sha256:0f8aa1706812e00b9f19dfe0cdb3999b092ccb8ca168c0db5b8ea712456fd9b3", size = 363333771 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ae/71/1c91302526c45ab494c23f61c7a84aa568b8c1f9d196efa5993957faf906/nvidia_cublas_cu12-12.4.5.8-py3-none-manylinux2014_x86_64.whl", hash = "sha256:2fc8da60df463fdefa81e323eef2e36489e1c94335b5358bcb38360adf75ac9b", size = 363438805 },
|
||||
]
|
||||
|
||||
|
@ -4380,6 +4623,7 @@ name = "nvidia-cuda-cupti-cu12"
|
|||
version = "12.4.127"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/93/b5/9fb3d00386d3361b03874246190dfec7b206fd74e6e287b26a8fcb359d95/nvidia_cuda_cupti_cu12-12.4.127-py3-none-manylinux2014_aarch64.whl", hash = "sha256:79279b35cf6f91da114182a5ce1864997fd52294a87a16179ce275773799458a", size = 12354556 },
|
||||
{ url = "https://files.pythonhosted.org/packages/67/42/f4f60238e8194a3106d06a058d494b18e006c10bb2b915655bd9f6ea4cb1/nvidia_cuda_cupti_cu12-12.4.127-py3-none-manylinux2014_x86_64.whl", hash = "sha256:9dec60f5ac126f7bb551c055072b69d85392b13311fcc1bcda2202d172df30fb", size = 13813957 },
|
||||
]
|
||||
|
||||
|
@ -4388,6 +4632,7 @@ name = "nvidia-cuda-nvrtc-cu12"
|
|||
version = "12.4.127"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/77/aa/083b01c427e963ad0b314040565ea396f914349914c298556484f799e61b/nvidia_cuda_nvrtc_cu12-12.4.127-py3-none-manylinux2014_aarch64.whl", hash = "sha256:0eedf14185e04b76aa05b1fea04133e59f465b6f960c0cbf4e37c3cb6b0ea198", size = 24133372 },
|
||||
{ url = "https://files.pythonhosted.org/packages/2c/14/91ae57cd4db3f9ef7aa99f4019cfa8d54cb4caa7e00975df6467e9725a9f/nvidia_cuda_nvrtc_cu12-12.4.127-py3-none-manylinux2014_x86_64.whl", hash = "sha256:a178759ebb095827bd30ef56598ec182b85547f1508941a3d560eb7ea1fbf338", size = 24640306 },
|
||||
]
|
||||
|
||||
|
@ -4396,6 +4641,7 @@ name = "nvidia-cuda-runtime-cu12"
|
|||
version = "12.4.127"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/a1/aa/b656d755f474e2084971e9a297def515938d56b466ab39624012070cb773/nvidia_cuda_runtime_cu12-12.4.127-py3-none-manylinux2014_aarch64.whl", hash = "sha256:961fe0e2e716a2a1d967aab7caee97512f71767f852f67432d572e36cb3a11f3", size = 894177 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ea/27/1795d86fe88ef397885f2e580ac37628ed058a92ed2c39dc8eac3adf0619/nvidia_cuda_runtime_cu12-12.4.127-py3-none-manylinux2014_x86_64.whl", hash = "sha256:64403288fa2136ee8e467cdc9c9427e0434110899d07c779f25b5c068934faa5", size = 883737 },
|
||||
]
|
||||
|
||||
|
@ -4418,6 +4664,7 @@ dependencies = [
|
|||
{ name = "nvidia-nvjitlink-cu12", marker = "(platform_machine != 'aarch64' and sys_platform == 'linux') or (sys_platform != 'darwin' and sys_platform != 'linux')" },
|
||||
]
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/7a/8a/0e728f749baca3fbeffad762738276e5df60851958be7783af121a7221e7/nvidia_cufft_cu12-11.2.1.3-py3-none-manylinux2014_aarch64.whl", hash = "sha256:5dad8008fc7f92f5ddfa2101430917ce2ffacd86824914c82e28990ad7f00399", size = 211422548 },
|
||||
{ url = "https://files.pythonhosted.org/packages/27/94/3266821f65b92b3138631e9c8e7fe1fb513804ac934485a8d05776e1dd43/nvidia_cufft_cu12-11.2.1.3-py3-none-manylinux2014_x86_64.whl", hash = "sha256:f083fc24912aa410be21fa16d157fed2055dab1cc4b6934a0e03cba69eb242b9", size = 211459117 },
|
||||
]
|
||||
|
||||
|
@ -4426,6 +4673,7 @@ name = "nvidia-curand-cu12"
|
|||
version = "10.3.5.147"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/80/9c/a79180e4d70995fdf030c6946991d0171555c6edf95c265c6b2bf7011112/nvidia_curand_cu12-10.3.5.147-py3-none-manylinux2014_aarch64.whl", hash = "sha256:1f173f09e3e3c76ab084aba0de819c49e56614feae5c12f69883f4ae9bb5fad9", size = 56314811 },
|
||||
{ url = "https://files.pythonhosted.org/packages/8a/6d/44ad094874c6f1b9c654f8ed939590bdc408349f137f9b98a3a23ccec411/nvidia_curand_cu12-10.3.5.147-py3-none-manylinux2014_x86_64.whl", hash = "sha256:a88f583d4e0bb643c49743469964103aa59f7f708d862c3ddb0fc07f851e3b8b", size = 56305206 },
|
||||
]
|
||||
|
||||
|
@ -4439,6 +4687,7 @@ dependencies = [
|
|||
{ name = "nvidia-nvjitlink-cu12", marker = "(platform_machine != 'aarch64' and sys_platform == 'linux') or (sys_platform != 'darwin' and sys_platform != 'linux')" },
|
||||
]
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/46/6b/a5c33cf16af09166845345275c34ad2190944bcc6026797a39f8e0a282e0/nvidia_cusolver_cu12-11.6.1.9-py3-none-manylinux2014_aarch64.whl", hash = "sha256:d338f155f174f90724bbde3758b7ac375a70ce8e706d70b018dd3375545fc84e", size = 127634111 },
|
||||
{ url = "https://files.pythonhosted.org/packages/3a/e1/5b9089a4b2a4790dfdea8b3a006052cfecff58139d5a4e34cb1a51df8d6f/nvidia_cusolver_cu12-11.6.1.9-py3-none-manylinux2014_x86_64.whl", hash = "sha256:19e33fa442bcfd085b3086c4ebf7e8debc07cfe01e11513cc6d332fd918ac260", size = 127936057 },
|
||||
]
|
||||
|
||||
|
@ -4450,6 +4699,7 @@ dependencies = [
|
|||
{ name = "nvidia-nvjitlink-cu12", marker = "(platform_machine != 'aarch64' and sys_platform == 'linux') or (sys_platform != 'darwin' and sys_platform != 'linux')" },
|
||||
]
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/96/a9/c0d2f83a53d40a4a41be14cea6a0bf9e668ffcf8b004bd65633f433050c0/nvidia_cusparse_cu12-12.3.1.170-py3-none-manylinux2014_aarch64.whl", hash = "sha256:9d32f62896231ebe0480efd8a7f702e143c98cfaa0e8a76df3386c1ba2b54df3", size = 207381987 },
|
||||
{ url = "https://files.pythonhosted.org/packages/db/f7/97a9ea26ed4bbbfc2d470994b8b4f338ef663be97b8f677519ac195e113d/nvidia_cusparse_cu12-12.3.1.170-py3-none-manylinux2014_x86_64.whl", hash = "sha256:ea4f11a2904e2a8dc4b1833cc1b5181cde564edd0d5cd33e3c168eff2d1863f1", size = 207454763 },
|
||||
]
|
||||
|
||||
|
@ -4466,6 +4716,7 @@ name = "nvidia-nvjitlink-cu12"
|
|||
version = "12.4.127"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/02/45/239d52c05074898a80a900f49b1615d81c07fceadd5ad6c4f86a987c0bc4/nvidia_nvjitlink_cu12-12.4.127-py3-none-manylinux2014_aarch64.whl", hash = "sha256:4abe7fef64914ccfa909bc2ba39739670ecc9e820c83ccc7a6ed414122599b83", size = 20552510 },
|
||||
{ url = "https://files.pythonhosted.org/packages/ff/ff/847841bacfbefc97a00036e0fce5a0f086b640756dc38caea5e1bb002655/nvidia_nvjitlink_cu12-12.4.127-py3-none-manylinux2014_x86_64.whl", hash = "sha256:06b3b9b25bf3f8af351d664978ca26a16d2c5127dbd53c0497e28d1fb9611d57", size = 21066810 },
|
||||
]
|
||||
|
||||
|
@ -4474,9 +4725,19 @@ name = "nvidia-nvtx-cu12"
|
|||
version = "12.4.127"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/06/39/471f581edbb7804b39e8063d92fc8305bdc7a80ae5c07dbe6ea5c50d14a5/nvidia_nvtx_cu12-12.4.127-py3-none-manylinux2014_aarch64.whl", hash = "sha256:7959ad635db13edf4fc65c06a6e9f9e55fc2f92596db928d169c0bb031e88ef3", size = 100417 },
|
||||
{ url = "https://files.pythonhosted.org/packages/87/20/199b8713428322a2f22b722c62b8cc278cc53dffa9705d744484b5035ee9/nvidia_nvtx_cu12-12.4.127-py3-none-manylinux2014_x86_64.whl", hash = "sha256:781e950d9b9f60d8241ccea575b32f5105a5baf4c2351cab5256a24869f12a1a", size = 99144 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "oauthlib"
|
||||
version = "3.2.2"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/6d/fa/fbf4001037904031639e6bfbfc02badfc7e12f137a8afa254df6c4c8a670/oauthlib-3.2.2.tar.gz", hash = "sha256:9859c40929662bec5d64f34d01c99e093149682a3f38915dc0655d5a633dd918", size = 177352 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/7e/80/cab10959dc1faead58dc8384a781dfbf93cb4d33d50988f7a69f1b7c9bbe/oauthlib-3.2.2-py3-none-any.whl", hash = "sha256:8139f29aac13e25d502680e9e19963e83f16838d48a0d71c287fe40e7067fbca", size = 151688 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "ollama"
|
||||
version = "0.4.7"
|
||||
|
@ -4745,6 +5006,38 @@ wheels = [
|
|||
{ url = "https://files.pythonhosted.org/packages/ff/b1/55a77152a83ec8998e520a3a575f44af1020cfe4bdc000b7538583293b85/opentelemetry_instrumentation-0.50b0-py3-none-any.whl", hash = "sha256:b8f9fc8812de36e1c6dffa5bfc6224df258841fb387b6dfe5df15099daa10630", size = 30728 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "opentelemetry-instrumentation-asgi"
|
||||
version = "0.50b0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "asgiref" },
|
||||
{ name = "opentelemetry-api" },
|
||||
{ name = "opentelemetry-instrumentation" },
|
||||
{ name = "opentelemetry-semantic-conventions" },
|
||||
{ name = "opentelemetry-util-http" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/49/cc/a7b2fd243c6d2621803092eba62e450071b6752dfe4f64f530bbfd91a328/opentelemetry_instrumentation_asgi-0.50b0.tar.gz", hash = "sha256:3ca4cb5616ae6a3e8ce86e7d5c360a8d8cc8ed722cf3dc8a5e44300774e87d49", size = 24105 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/d2/81/0899c6b56b1023835f266d909250d439174afa0c34ed5944c5021d3da263/opentelemetry_instrumentation_asgi-0.50b0-py3-none-any.whl", hash = "sha256:2ba1297f746e55dec5a17fe825689da0613662fb25c004c3965a6c54b1d5be22", size = 16304 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "opentelemetry-instrumentation-fastapi"
|
||||
version = "0.50b0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "opentelemetry-api" },
|
||||
{ name = "opentelemetry-instrumentation" },
|
||||
{ name = "opentelemetry-instrumentation-asgi" },
|
||||
{ name = "opentelemetry-semantic-conventions" },
|
||||
{ name = "opentelemetry-util-http" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/8d/f8/1917b0b3e414e23c7d71c9a33f0ce020f94bc47d22a30f54ace704e07588/opentelemetry_instrumentation_fastapi-0.50b0.tar.gz", hash = "sha256:16b9181682136da210295def2bb304a32fb9bdee9a935cdc9da43567f7c1149e", size = 19214 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/cb/d6/37784bb30b213e2dd6838b9f96c2940907022c1b75ef1ff18a99afe42433/opentelemetry_instrumentation_fastapi-0.50b0-py3-none-any.whl", hash = "sha256:8f03b738495e4705fbae51a2826389c7369629dace89d0f291c06ffefdff5e52", size = 12079 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "opentelemetry-proto"
|
||||
version = "1.29.0"
|
||||
|
@ -4784,6 +5077,15 @@ wheels = [
|
|||
{ url = "https://files.pythonhosted.org/packages/da/fb/dc15fad105450a015e913cfa4f5c27b6a5f1bea8fb649f8cae11e699c8af/opentelemetry_semantic_conventions-0.50b0-py3-none-any.whl", hash = "sha256:e87efba8fdb67fb38113efea6a349531e75ed7ffc01562f65b802fcecb5e115e", size = 166602 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "opentelemetry-util-http"
|
||||
version = "0.50b0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/69/10/ce3f0d1157cedbd819194f0b27a6bbb7c19a8bceb3941e4a4775014076cf/opentelemetry_util_http-0.50b0.tar.gz", hash = "sha256:dc4606027e1bc02aabb9533cc330dd43f874fca492e4175c31d7154f341754af", size = 7859 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/64/8a/9e1b54f50d1fddebbeac9a9b0632f8db6ece7add904fb593ee2e268ee4de/opentelemetry_util_http-0.50b0-py3-none-any.whl", hash = "sha256:21f8aedac861ffa3b850f8c0a6c373026189eb8630ac6e14a2bf8c55695cc090", size = 6942 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "orjson"
|
||||
version = "3.10.15"
|
||||
|
@ -5133,6 +5435,23 @@ wheels = [
|
|||
{ url = "https://files.pythonhosted.org/packages/9b/fb/a70a4214956182e0d7a9099ab17d50bfcba1056188e9b14f35b9e2b62a0d/portalocker-2.10.1-py3-none-any.whl", hash = "sha256:53a5984ebc86a025552264b459b46a2086e269b21823cb572f8f28ee759e45bf", size = 18423 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "posthog"
|
||||
version = "3.16.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "backoff" },
|
||||
{ name = "distro" },
|
||||
{ name = "monotonic" },
|
||||
{ name = "python-dateutil" },
|
||||
{ name = "requests" },
|
||||
{ name = "six" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/b4/cd/d349468731e2cdbd61bc9655acae5dac961156f4b9c652f011b8433d906e/posthog-3.16.0.tar.gz", hash = "sha256:953176a443b30b1404c0f36010a95caad60a83c31ecb17b427f6d986f6f765c1", size = 65192 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/75/89/5524d64b421e946f85a42d9e95348bfd1b43335eadb9f3ee4a0e368a1b47/posthog-3.16.0-py2.py3-none-any.whl", hash = "sha256:6d2140f58823e540855885a77474a32045f77c2276351791db4dca844f278b37", size = 75934 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pot"
|
||||
version = "0.9.5"
|
||||
|
@ -5636,6 +5955,21 @@ wheels = [
|
|||
{ url = "https://files.pythonhosted.org/packages/3e/6e/9aa158121eb5a6af5537af0bde9e38092a97c40a5a0ecaec7cc9688b2c2e/pypdf-5.2.0-py3-none-any.whl", hash = "sha256:d107962ec45e65e3bd10c1d9242bdbbedaa38193c9e3a6617bd6d996e5747b19", size = 298686 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pypika"
|
||||
version = "0.48.9"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/c7/2c/94ed7b91db81d61d7096ac8f2d325ec562fc75e35f3baea8749c85b28784/PyPika-0.48.9.tar.gz", hash = "sha256:838836a61747e7c8380cd1b7ff638694b7a7335345d0f559b04b2cd832ad5378", size = 67259 }
|
||||
|
||||
[[package]]
|
||||
name = "pyproject-hooks"
|
||||
version = "1.2.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/e7/82/28175b2414effca1cdac8dc99f76d660e7a4fb0ceefa4b4ab8f5f6742925/pyproject_hooks-1.2.0.tar.gz", hash = "sha256:1e859bd5c40fae9448642dd871adf459e5e2084186e8d2c2a79a824c970da1f8", size = 19228 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/bd/24/12818598c362d7f300f18e74db45963dbcb85150324092410c8b49405e42/pyproject_hooks-1.2.0-py3-none-any.whl", hash = "sha256:9e5c6bfa8dcc30091c74b0cf803c81fdd29d94f01992a7707bc97babb1141913", size = 10216 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pyreadline3"
|
||||
version = "3.5.4"
|
||||
|
@ -6036,6 +6370,19 @@ wheels = [
|
|||
{ url = "https://files.pythonhosted.org/packages/d7/25/dd878a121fcfdf38f52850f11c512e13ec87c2ea72385933818e5b6c15ce/requests_file-2.1.0-py2.py3-none-any.whl", hash = "sha256:cf270de5a4c5874e84599fc5778303d496c10ae5e870bfa378818f35d21bda5c", size = 4244 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "requests-oauthlib"
|
||||
version = "2.0.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "oauthlib" },
|
||||
{ name = "requests" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/42/f2/05f29bc3913aea15eb670be136045bf5c5bbf4b99ecb839da9b422bb2c85/requests-oauthlib-2.0.0.tar.gz", hash = "sha256:b3dffaebd884d8cd778494369603a9e7b58d29111bf6b41bdc2dcd87203af4e9", size = 55650 }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/3b/5d/63d4ae3b9daea098d5d6f5da83984853c1bbacd5dc826764b249fe119d24/requests_oauthlib-2.0.0-py2.py3-none-any.whl", hash = "sha256:7dd8a5c40426b779b0868c404bdef9768deccf22749cde15852df527e6269b36", size = 24179 },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "requests-toolbelt"
|
||||
version = "1.0.0"
|
||||
|
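As a minimal sketch of how a consumer would pick up these newly locked packages, assuming the extra is exposed as `chromadb` on `autogen-ext` (as the `extra == 'chromadb'` markers in the diff suggest), the snippet below checks that the optional dependency and the `autogen_ext.memory.chromadb` module are importable after installation:

```python
# Sketch: verify the optional ChromaDB memory extra is installed.
# Assumed install command (hypothetical, based on the extra name in the lockfile):
#   pip install "autogen-ext[chromadb]"
import importlib.util


def available(module: str) -> bool:
    """Return True if the module can be found without importing it fully."""
    try:
        return importlib.util.find_spec(module) is not None
    except ModuleNotFoundError:
        # Parent package (e.g. autogen_ext) is not installed at all.
        return False


for module in ("chromadb", "autogen_ext.memory.chromadb"):
    print(f"{module}: {'available' if available(module) else 'missing'}")
```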