Note
This is an unofficial OpenAI SDK for Rust, providing a convenient abstraction over OpenAI's API. It enables you to easily perform tasks such as generating completions, creating and editing images, moderating text, fine-tuning models, and more.
Tip
Make sure you have Rust and Cargo installed on your system before proceeding. Visit rust-lang.org for installation instructions.
To use this SDK, add the following dependencies to your Cargo.toml file:
```toml
[dependencies]
rusty-openai = "0.1.11"
serde_json = "1.0"
tokio = { version = "1", features = ["full"] }
reqwest = { version = "0.12.5", features = ["json", "multipart"] }
```
To get started with the OpenAI Rust SDK, follow these steps:
Important
You'll need an OpenAI API key to use this SDK. Get your API key from the OpenAI dashboard.
First, create an instance of the OpenAI struct with your API key:
```rust
use rusty_openai::openai::OpenAI;

#[tokio::main]
async fn main() {
    let openai = OpenAI::new("YOUR_API_KEY", "https://api.openai.com/v1");
}
```
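The second argument to `OpenAI::new` is the base URL, which is how the SDK supports OpenAI-compatible gateways and proxies. Below is a minimal sketch of that idea; the gateway URL is only a placeholder, and reading the key from the environment keeps it out of source control:

```rust
use rusty_openai::openai::OpenAI;
use std::env;

#[tokio::main]
async fn main() {
    // Read the key from the environment rather than hardcoding it.
    let api_key = env::var("OPENAI_API_KEY").expect("API key not set");

    // Point the client at an OpenAI-compatible gateway instead of the default
    // endpoint; this URL is a placeholder, not a real service.
    let openai = OpenAI::new(&api_key, "https://my-gateway.example.com/v1");
}
```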
Note
The SDK now supports structured outputs with JSON Schema validation, allowing for more controlled and predictable responses.
Here's an example of generating chat completions with structured output:
```rust
use rusty_openai::openai::OpenAI;
use rusty_openai::openai_api::completion::ChatCompletionRequest;
use serde_json::json;
use std::env;

#[tokio::main]
async fn main() {
    let api_key = env::var("OPENAI_API_KEY").expect("API key not set");
    let openai = OpenAI::new(&api_key, "https://api.openai.com/v1");

    // Example with structured outputs using JSON Schema
    let schema = json!({
        "type": "object",
        "properties": {
            "steps": {
                "type": "array",
                "items": {
                    "type": "object",
                    "properties": {
                        "explanation": {"type": "string"},
                        "output": {"type": "string"}
                    },
                    "required": ["explanation", "output"]
                }
            },
            "final_answer": {"type": "string"}
        },
        "required": ["steps", "final_answer"]
    });

    let messages = vec![
        json!({
            "role": "user",
            "content": "Solve this equation: 2x + 5 = 13"
        })
    ];

    // JSON Schema structured outputs require a gpt-4o-family (or newer) model.
    let request = ChatCompletionRequest::new_json_schema(
        "gpt-4o-2024-08-06".to_string(),
        messages,
        "math_reasoning".to_string(),
        schema
    )
    .temperature(0.7);

    let chat_response = openai.completions().create(request).await;

    match chat_response {
        Ok(chat) => println!("{}", json!(chat).to_string()),
        Err(err) => eprintln!("Error: {}", err),
    }
}
```
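Because the response is constrained by the `math_reasoning` schema, the assistant's message content can be deserialized straight into your own types. Here is a minimal sketch, assuming you also add `serde = { version = "1", features = ["derive"] }` to your dependencies and have already extracted the assistant message content as a string (the exact accessor depends on the SDK's response type; see the SDK documentation):

```rust
use serde::Deserialize;

// Rust-side mirror of the "math_reasoning" schema from the example above.
#[derive(Debug, Deserialize)]
struct Step {
    explanation: String,
    output: String,
}

#[derive(Debug, Deserialize)]
struct MathReasoning {
    steps: Vec<Step>,
    final_answer: String,
}

// `content` is the assistant message content; because the model was
// constrained by the JSON Schema, it should parse cleanly into MathReasoning.
fn parse_reasoning(content: &str) -> serde_json::Result<MathReasoning> {
    serde_json::from_str(content)
}
```

With that in place you can work with `reasoning.final_answer` and `reasoning.steps` directly instead of navigating raw JSON.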
Tip
The SDK supports all major OpenAI API endpoints and features:
- 🤖 Chat Completions with GPT-4 and GPT-3.5
- 🎨 Image Generation and Editing (DALL·E)
- 📝 Text Moderation
- 🔄 Fine-tuning Models
- 🔍 Embeddings Generation
- 📊 Structured Outputs with JSON Schema
- 🌐 Custom Base URL Support
- ⚡ Async/Await Support (see the concurrency sketch after this list)
- 🔒 Type-safe API
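Since every request method is async, independent calls can be issued concurrently with standard Tokio tooling. The sketch below fans out two completion requests with `tokio::join!`, reusing the client and request types from the example above; the trimmed-down schema and the model name are illustrative only:

```rust
use rusty_openai::openai::OpenAI;
use rusty_openai::openai_api::completion::ChatCompletionRequest;
use serde_json::json;
use std::env;

#[tokio::main]
async fn main() {
    let api_key = env::var("OPENAI_API_KEY").expect("API key not set");
    let openai = OpenAI::new(&api_key, "https://api.openai.com/v1");

    // Build one structured-output request per question; the schema here is a
    // trimmed-down, illustrative version of the one shown earlier.
    let build = |question: &str| {
        ChatCompletionRequest::new_json_schema(
            "gpt-4o-2024-08-06".to_string(),
            vec![json!({ "role": "user", "content": question })],
            "math_reasoning".to_string(),
            json!({
                "type": "object",
                "properties": { "final_answer": { "type": "string" } },
                "required": ["final_answer"]
            }),
        )
    };

    // Run both requests concurrently; join! waits for both futures to finish.
    let (first, second) = tokio::join!(
        openai.completions().create(build("Solve 2x + 5 = 13")),
        openai.completions().create(build("Solve 3x - 4 = 11")),
    );

    for result in [first, second] {
        match result {
            Ok(chat) => println!("{}", json!(chat)),
            Err(err) => eprintln!("Error: {}", err),
        }
    }
}
```

`join!` runs both calls concurrently on the same task, so no extra spawning or `'static` bounds are needed.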
Note
For detailed information on all available endpoints and their respective methods, please refer to the SDK Documentation.
Important
This SDK is licensed under the MIT License. See the LICENSE file for details.