Initial Setup
Add Raif to your application’s Gemfile:
gem "raif"
And then execute:
bundle install
Run the install generator:
rails generate raif:install
This will:
- Create a configuration file at `config/initializers/raif.rb`
- Copy Raif's database migrations to your application
- Mount Raif's engine at `/raif` in your application's `config/routes.rb` file (see the sketch below)
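If you want to verify the mount, the generated line in `config/routes.rb` will look roughly like the following (a sketch based on standard Rails engine mounting; the file the generator actually writes is authoritative):

```ruby
# config/routes.rb
Rails.application.routes.draw do
  # Raif's engine (added by raif:install) is mounted at /raif
  mount Raif::Engine => "/raif"
end
```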
Next, run the migrations. Raif is compatible with both PostgreSQL and MySQL databases.
rails db:migrate
Configuring LLM Providers & API Keys
You must configure at least one API key for an LLM provider (OpenAI, Anthropic, AWS Bedrock, OpenRouter).
By default, the initializer will load them from environment variables (e.g. ENV["OPENAI_API_KEY"], ENV["ANTHROPIC_API_KEY"], ENV["OPEN_ROUTER_API_KEY"]). Alternatively, you can set them directly in config/initializers/raif.rb.
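For example, to set a key directly in the initializer instead of relying on the environment-variable defaults, you can assign it inside the `Raif.configure` block (a minimal sketch reusing the `config.open_ai_api_key` option shown below; the Rails encrypted-credentials lookup is just one possible source):

```ruby
# config/initializers/raif.rb
Raif.configure do |config|
  config.open_ai_models_enabled = true
  # Assumption: the key is stored in Rails encrypted credentials under openai.api_key;
  # substitute whatever credential store your app uses.
  config.open_ai_api_key = Rails.application.credentials.dig(:openai, :api_key)
end
```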
OpenAI
OpenAI Responses API
Use this adapter to access OpenAI's newer Responses API, which supports provider-managed tools, including web search, code execution, and image generation.
Note: OpenAI's GPT-OSS models are not available via OpenAI's API, but can be used via OpenRouter.
Raif.configure do |config|
config.open_ai_models_enabled = true
config.open_ai_api_key = ENV["OPENAI_API_KEY"]
config.default_llm_model_key = "open_ai_responses_gpt_4o"
end
Currently supported OpenAI Responses API models:
- `open_ai_responses_gpt_5`
- `open_ai_responses_gpt_5_mini`
- `open_ai_responses_gpt_5_nano`
- `open_ai_responses_gpt_3_5_turbo`
- `open_ai_responses_gpt_4_1`
- `open_ai_responses_gpt_4_1_mini`
- `open_ai_responses_gpt_4_1_nano`
- `open_ai_responses_gpt_4o`
- `open_ai_responses_gpt_4o_mini`
- `open_ai_responses_o1`
- `open_ai_responses_o1_mini`
- `open_ai_responses_o1_pro`
- `open_ai_responses_o3`
- `open_ai_responses_o3_mini`
- `open_ai_responses_o3_pro`
- `open_ai_responses_o4_mini`
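Any of these keys can be used as `config.default_llm_model_key`, or passed explicitly when requesting an LLM. A minimal sketch, assuming the `Raif.llm`/`chat` interface covered in the chat documentation linked at the end of this page (the method names documented there are authoritative):

```ruby
# Assumes Raif.llm(model_key) returns a Raif::Llm instance with a chat method,
# per the "Chatting with the LLM" docs; adjust to the documented interface.
llm = Raif.llm(:open_ai_responses_gpt_4o)
response = llm.chat(message: "Say hello")
puts response.raw_response
```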
OpenAI Completions API
This adapter uses OpenAI's legacy Completions API, which does not support provider-managed tools such as web search, code execution, and image generation. To use those, switch to the newer Responses API adapter instead.
Raif.configure do |config|
config.open_ai_models_enabled = true
config.open_ai_api_key = ENV["OPENAI_API_KEY"]
config.default_llm_model_key = "open_ai_gpt_4o"
end
Currently supported OpenAI Completions API models:
- `open_ai_gpt_5`
- `open_ai_gpt_5_mini`
- `open_ai_gpt_5_nano`
- `open_ai_gpt_3_5_turbo`
- `open_ai_gpt_4_1`
- `open_ai_gpt_4_1_mini`
- `open_ai_gpt_4_1_nano`
- `open_ai_gpt_4o`
- `open_ai_gpt_4o_mini`
- `open_ai_o1`
- `open_ai_o1_mini`
- `open_ai_o3`
- `open_ai_o3_mini`
- `open_ai_o4_mini`
Anthropic
The Anthropic adapter provides access to provider-managed tools for web search and code execution.
Raif.configure do |config|
config.anthropic_models_enabled = true
config.anthropic_api_key = ENV["ANTHROPIC_API_KEY"]
config.default_llm_model_key = "anthropic_claude_3_5_sonnet"
end
Currently supported Anthropic models:
- `anthropic_claude_3_5_haiku`
- `anthropic_claude_3_5_sonnet`
- `anthropic_claude_3_7_sonnet`
- `anthropic_claude_3_opus`
- `anthropic_claude_4_opus`
- `anthropic_claude_4_1_opus`
- `anthropic_claude_4_sonnet`
- `anthropic_claude_4_5_sonnet`
AWS Bedrock
Note: Raif uses the AWS Bedrock gem, so AWS credentials should be configured via the AWS SDK (environment variables, IAM role, etc.).
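For example, if you are not relying on an IAM role or the SDK's standard environment variables, credentials can be set explicitly through the AWS SDK's global configuration (a sketch using `Aws.config` from `aws-sdk-core`; the Rails encrypted-credentials keys here are placeholders):

```ruby
# config/initializers/aws.rb
# Explicit AWS SDK configuration; Bedrock calls made through Raif use the SDK's
# global settings unless credentials are supplied another way (ENV, IAM role).
Aws.config.update(
  region: "us-east-1",
  credentials: Aws::Credentials.new(
    Rails.application.credentials.dig(:aws, :access_key_id),
    Rails.application.credentials.dig(:aws, :secret_access_key)
  )
)
```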
Raif.configure do |config|
config.bedrock_models_enabled = true
config.aws_bedrock_region = "us-east-1"
config.default_llm_model_key = "bedrock_claude_3_5_sonnet"
end
Currently supported Bedrock models:
- `bedrock_claude_3_5_haiku`
- `bedrock_claude_3_5_sonnet`
- `bedrock_claude_3_7_sonnet`
- `bedrock_claude_3_opus`
- `bedrock_claude_4_opus`
- `bedrock_claude_4_1_opus`
- `bedrock_claude_4_sonnet`
- `bedrock_claude_4_5_sonnet`
- `bedrock_amazon_nova_lite`
- `bedrock_amazon_nova_micro`
- `bedrock_amazon_nova_pro`
OpenRouter
OpenRouter is a unified API that provides access to models from many providers, including Anthropic, Meta, and Google.
See Adding LLM Models for more information on adding new OpenRouter models to your application.
Raif.configure do |config|
config.open_router_models_enabled = true
config.open_router_api_key = ENV["OPEN_ROUTER_API_KEY"]
config.open_router_app_name = "Your App Name" # Optional
config.open_router_site_url = "https://yourdomain.com" # Optional
config.default_llm_model_key = "open_router_claude_3_7_sonnet"
end
Currently included OpenRouter models:
- `open_router_claude_3_7_sonnet`
- `open_router_deepseek_chat_v3`
- `open_router_gemini_2_0_flash`
- `open_router_llama_3_1_8b_instruct`
- `open_router_llama_3_3_70b_instruct`
- `open_router_llama_4_maverick`
- `open_router_llama_4_scout`
- `open_router_open_ai_gpt_oss_120b`
- `open_router_open_ai_gpt_oss_20b`
Read next: Chatting with the LLM