Rig

Rig is a Rust library for building LLM-powered applications that focuses on ergonomics and modularity.

More information about this crate can be found in the crate documentation.

High-level features

  • Full support for LLM completion and embedding workflows
  • Simple but powerful common abstractions over LLM providers (e.g. OpenAI, Cohere) and vector stores (e.g. MongoDB, SQLite, in-memory)
  • Integrate LLMs in your app with minimal boilerplate

Installation

```bash
cargo add rig-core
```

Simple example:

```rust
use rig::{completion::Prompt, providers::openai};

#[tokio::main]
async fn main() {
    // Create OpenAI client and model
    // This requires the `OPENAI_API_KEY` environment variable to be set.
    let openai_client = openai::Client::from_env();

    let gpt4 = openai_client.model("gpt-4").build();

    // Prompt the model and print its response
    let response = gpt4
        .prompt("Who are you?")
        .await
        .expect("Failed to prompt GPT-4");

    println!("GPT-4: {response}");
}
```

Note: using `#[tokio::main]` requires you to enable tokio's `macros` and `rt-multi-thread` features, or just `full` to enable all features (`cargo add tokio --features macros,rt-multi-thread`).
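After running both `cargo add` commands, the `[dependencies]` section of your `Cargo.toml` might look roughly like this (version numbers here are illustrative, not taken from the project):

```toml
[dependencies]
# Core Rig library for LLM completion and embedding workflows
rig-core = "0.1"
# Async runtime; `macros` enables #[tokio::main], `rt-multi-thread` the default multi-threaded runtime
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
```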

Integrations

Rig supports the following LLM providers natively:

  • OpenAI
  • Cohere
  • Anthropic
  • Perplexity
  • Google Gemini
  • xAI
  • DeepSeek

Additionally, Rig currently has the following integration sub-libraries:

  • MongoDB vector store: rig-mongodb