
Temperature.normalize incorrectly forces temperature=1.0 for all gpt-5* models #719

@adamcooper

Description


RubyLLM::Providers::OpenAI::Temperature.normalize uses the regex /^(o\d|gpt-5)/ to detect models that require temperature=1.0. This correctly targets reasoning models (o1, o3, gpt-5-pro, etc.) but also matches standard completion models like gpt-5.4-nano and gpt-5.4-mini, which fully support custom temperatures.

When calling Chat.new(model: "gpt-5.4-nano").with_temperature(0.5).ask(...), the temperature is silently overridden to 1.0 before the request is sent to OpenAI.

Reproduction

chat = RubyLLM::Chat.new(model: "gpt-5.4-nano").with_temperature(0.5)
# Expected: request sent with temperature=0.5
# Actual: request sent with temperature=1.0
# Log: "Model gpt-5.4-nano requires temperature=1.0, setting that instead."

Root cause

In lib/ruby_llm/providers/openai/temperature.rb:

def normalize(temperature, model_id)
  if model_id.match?(/^(o\d|gpt-5)/) && !temperature.nil? && !temperature_close_to_one?(temperature)
    RubyLLM.logger.debug { "Model #{model_id} requires temperature=1.0, setting that instead." }
    1.0
  # ...

The gpt-5 branch of the regex is too broad: it matches the entire gpt-5.* family rather than just the reasoning variants that actually require a fixed temperature.
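The over-match is easy to see in isolation with a plain Regexp check using the same pattern as the normalize guard:

```ruby
# The pattern currently used by Temperature.normalize
pattern = /^(o\d|gpt-5)/

# Reasoning models the check is meant to catch
puts pattern.match?("o1")           # => true
puts pattern.match?("gpt-5-pro")    # => true

# Standard completion models it should NOT catch, but does
puts pattern.match?("gpt-5.4-nano") # => true
puts pattern.match?("gpt-5.4-mini") # => true
```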

Workaround

We're currently monkey-patching in our app's initializer:

RubyLLM::Providers::OpenAI::Temperature.module_eval do
  class << self
    alias_method :original_normalize, :normalize

    def normalize(temperature, model_id)
      return temperature if model_id.match?(/^gpt-5\.\d+-(nano|mini)/)

      original_normalize(temperature, model_id)
    end
  end
end
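One caveat with this patch: its regex requires a minor version number (`\.\d+`), so a hypothetical base-version id like gpt-5-nano (not a confirmed model id, used here only for illustration) would not be caught and would still be clamped to 1.0:

```ruby
patch_pattern = /^gpt-5\.\d+-(nano|mini)/

puts patch_pattern.match?("gpt-5.4-nano") # => true  (temperature passed through)
puts patch_pattern.match?("gpt-5-nano")   # => false (hypothetical id; would still be clamped)
```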

AI-suggested fix

Only force temperature=1.0 for reasoning model variants (-pro suffix and the base gpt-5 / gpt-5.N models without a size suffix):

def normalize(temperature, model_id)
  if reasoning_model?(model_id) && !temperature.nil? && !temperature_close_to_one?(temperature)
    RubyLLM.logger.debug { "Model #{model_id} requires temperature=1.0, setting that instead." }
    1.0
  elsif model_id.include?('-search')
    RubyLLM.logger.debug { "Model #{model_id} does not accept temperature parameter, removing" }
    nil
  else
    temperature
  end
end

def reasoning_model?(model_id)
  model_id.match?(/^o\d/) ||                                    # o1, o3, o4-mini, etc.
    model_id.match?(/^gpt-5(\.\d+)?(-\d{4}-\d{2}-\d{2})?$/) ||  # gpt-5, gpt-5.4, gpt-5.4-2026-03-05
    model_id.match?(/^gpt-5(\.\d+)?-pro/)                       # gpt-5-pro, gpt-5.4-pro
end

(Note the snapshot branch must match the full YYYY-MM-DD suffix; `(-\d{4})?$` alone would reject gpt-5.4-2026-03-05.)
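A quick sanity check of the predicate, copied standalone here with the snapshot-date branch written to match a full YYYY-MM-DD suffix:

```ruby
# Standalone copy of the proposed predicate for quick testing.
def reasoning_model?(model_id)
  model_id.match?(/^o\d/) ||
    model_id.match?(/^gpt-5(\.\d+)?(-\d{4}-\d{2}-\d{2})?$/) ||
    model_id.match?(/^gpt-5(\.\d+)?-pro/)
end

{
  "o1"                 => true,
  "gpt-5"              => true,
  "gpt-5.4"            => true,
  "gpt-5.4-2026-03-05" => true,
  "gpt-5-pro"          => true,
  "gpt-5.4-nano"       => false,
  "gpt-5.4-mini"       => false
}.each do |id, expected|
  puts "#{id}: #{reasoning_model?(id) == expected ? 'ok' : 'FAIL'}"
end
```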

Version

ruby_llm 1.14.1
