Class: Raif::Llms::Anthropic
- Inherits: Raif::Llm
  - Object
  - Raif::Llm
  - Raif::Llms::Anthropic
- Includes:
- Concerns::Llms::Anthropic::MessageFormatting, Concerns::Llms::Anthropic::ResponseToolCalls, Concerns::Llms::Anthropic::ToolFormatting
- Defined in:
- app/models/raif/llms/anthropic.rb
Class Method Summary
- .cache_creation_input_token_cost_multiplier ⇒ Object
- .cache_read_input_token_cost_multiplier ⇒ Object
- .prompt_tokens_include_cached_tokens? ⇒ Boolean
Instance Method Summary
- #perform_model_completion!(model_completion, &block) ⇒ Object
Class Method Details
.cache_creation_input_token_cost_multiplier ⇒ Object
# File 'app/models/raif/llms/anthropic.rb', line 16

def self.cache_creation_input_token_cost_multiplier
  1.25
end
.cache_read_input_token_cost_multiplier ⇒ Object
# File 'app/models/raif/llms/anthropic.rb', line 12

def self.cache_read_input_token_cost_multiplier
  0.1
end
.prompt_tokens_include_cached_tokens? ⇒ Boolean
# File 'app/models/raif/llms/anthropic.rb', line 8

def self.prompt_tokens_include_cached_tokens?
  false
end
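The three class methods above describe how cached prompt tokens are priced relative to ordinary input tokens: cache reads cost 0.1× the base rate, cache writes cost 1.25×, and (since `prompt_tokens_include_cached_tokens?` is false) cached tokens are reported separately from the prompt-token count. A minimal sketch of how these multipliers might combine into an input-cost estimate, assuming a hypothetical `base_rate_per_token` and a hypothetical helper name (this is illustrative arithmetic, not Raif's actual billing code):

```ruby
# Hypothetical helper: estimates input cost from separately reported token
# counts, using the multipliers defined on Raif::Llms::Anthropic.
def estimated_input_cost(uncached_tokens:, cache_read_tokens:, cache_creation_tokens:, base_rate_per_token:)
  effective_tokens =
    uncached_tokens +                 # billed at the base rate
    cache_read_tokens * 0.1 +         # cache_read_input_token_cost_multiplier
    cache_creation_tokens * 1.25      # cache_creation_input_token_cost_multiplier
  effective_tokens * base_rate_per_token
end

# 1000 + (2000 * 0.1) + (500 * 1.25) = 1825 effective tokens
estimated_input_cost(
  uncached_tokens: 1000,
  cache_read_tokens: 2000,
  cache_creation_tokens: 500,
  base_rate_per_token: 0.000003
)
```

Because `prompt_tokens_include_cached_tokens?` returns false, the uncached and cached counts are added rather than the cached counts being subtracted from a combined total.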
Instance Method Details
#perform_model_completion!(model_completion, &block) ⇒ Object
# File 'app/models/raif/llms/anthropic.rb', line 20

def perform_model_completion!(model_completion, &block)
  params = build_request_parameters(model_completion)
  response = connection.post("messages") do |req|
    req.body = params
    req.options.on_data = streaming_chunk_handler(model_completion, &block) if model_completion.stream_response?
  end

  unless model_completion.stream_response?
    update_model_completion(model_completion, response.body)
  end

  model_completion
end
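The method above has two paths: when streaming, a chunk handler consumes the response incrementally during the HTTP call, and when not streaming, the full response body is applied to the model completion after the call returns. A minimal sketch of that control flow, using a stand-in struct and canned chunks in place of Raif's model classes and the actual HTTP connection (all names here are illustrative, not Raif's API):

```ruby
# Stand-in for a Raif model completion record (illustrative only).
StubCompletion = Struct.new(:stream_response, :raw_body) do
  def stream_response?
    stream_response
  end
end

# Mirrors the branching in perform_model_completion!: streaming consumes
# chunks as they arrive; non-streaming applies the whole body afterwards.
def perform_stub_completion(completion)
  received_chunks = []
  if completion.stream_response?
    # Stands in for req.options.on_data invoking the streaming chunk handler.
    ["chunk1", "chunk2"].each { |chunk| received_chunks << chunk }
  else
    # Stands in for update_model_completion(model_completion, response.body).
    completion.raw_body = "full response"
  end
  [completion, received_chunks]
end
```

Returning the completion record in both cases matches the real method, which always returns `model_completion` whether or not the response was streamed.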