---
layout: default
title: Agents
nav_order: 6
description: Define reusable AI assistants with class-based configuration, runtime context, and prompt conventions
---

# Agents
{: .d-inline-block .no_toc }

New in 1.12
{: .label .label-green }

{{ page.description }}
{: .fs-6 .fw-300 }

## Table of contents
{: .no_toc .text-delta }

- TOC
{:toc}
After reading this guide, you will know:

- How to define agents with a class-based DSL
- How to use agents with plain Ruby chats and Rails-backed chats
- How runtime context works (`chat`, `inputs`, and lazy evaluation)
- How prompt conventions work in `app/prompts`
- Which methods are available on agent instances
Agents are a class-based way to define a chat setup once and reuse it everywhere.
For example, instead of re-adding the same instructions and tools in every controller, job, or service, you define them once in an agent class and call that agent wherever you need it.

```ruby
class SupportAgent < RubyLLM::Agent
  model "{{ site.models.default_chat }}"
  instructions "You are a concise support assistant."
  tools SearchDocs, LookupAccount
end

response = SupportAgent.new.ask "How do I reset my API key?"
```

In other words, an agent is a named wrapper around the same configuration you would otherwise apply progressively with `chat.with_*` calls (`with_instructions`, `with_tools`, `with_params`, and so on).
Agents work in two modes:

- Plain Ruby mode via `.chat` (returns `RubyLLM::Chat`)
- Rails mode via `.create` / `.create!` / `.find` when `chat_model` is configured (returns your ActiveRecord chat model)
Example of Rails mode:

```ruby
class WorkAssistant < RubyLLM::Agent
  chat_model Chat # this activates the Rails integration

  model "{{ site.models.default_chat }}"
  instructions "You are a helpful assistant."
  tools SearchDocs, LookupAccount
end

chat = WorkAssistant.create!(user: current_user)
same_chat = WorkAssistant.find(chat.id)
```

## Defining an Agent

Create a class that inherits from `RubyLLM::Agent` and declare its configuration:
```ruby
# app/agents/work_assistant.rb
class WorkAssistant < RubyLLM::Agent
  model "{{ site.models.default_chat }}"
  instructions "You are a helpful assistant."
  tools SearchDocs, LookupAccount
  temperature 0.2
  params max_output_tokens: 256
end
```

Supported class macros:
These macros use the same arguments you already know from `RubyLLM.chat(...)` and `Chat#with_*` methods.
For example, `model` maps to `RubyLLM.chat(model:, provider:, ...)`, `tools` maps to `with_tools`, `instructions` maps to `with_instructions`, and so on.

- `model` (see [Chat Basics]({% link _core_features/chat.md %}))
- `tools` (see [Tools]({% link _core_features/tools.md %}))
- `instructions` (see [Chat Basics]({% link _core_features/chat.md %}))
- `temperature` (see [Chat Basics]({% link _core_features/chat.md %}))
- `thinking` (see [Thinking]({% link _core_features/thinking.md %}))
- `params` (see [Chat Basics]({% link _core_features/chat.md %}))
- `headers` (see [Chat Basics]({% link _core_features/chat.md %}))
- `schema` (see [Chat Basics]({% link _core_features/chat.md %}))
- `context` (see [Configuration]({% link _getting_started/configuration.md %}))
- `chat_model` (Rails-backed mode)
- `inputs` (declared runtime inputs)
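If you are curious how a declarative macro like `model "..."` can work, here is a minimal, self-contained sketch of the general class-macro pattern. This is an illustration only, not RubyLLM's actual implementation; `MiniAgent` and `DemoAgent` are hypothetical names:

```ruby
# Sketch of the class-macro pattern: each macro stores a value at the class
# level when called with an argument, and reads it back when called without one.
class MiniAgent
  class << self
    def model(value = nil)
      @model = value unless value.nil?
      @model
    end

    def temperature(value = nil)
      @temperature = value unless value.nil?
      @temperature
    end

    def tools(*klasses)
      @tools = klasses unless klasses.empty?
      @tools || []
    end
  end
end

class DemoAgent < MiniAgent
  model "demo-model"
  temperature 0.2
  tools :search, :lookup
end

DemoAgent.model        # => "demo-model"
DemoAgent.temperature  # => 0.2
DemoAgent.tools        # => [:search, :lookup]
```

Because the macros run in the class body, each subclass keeps its own configuration, which the library can later turn into the equivalent `with_*` calls.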
`schema` supports:

- A schema class (for example `PersonSchema`), same as `with_schema`
- A JSON schema hash, same as `with_schema`
- An inline DSL block with `schema do ... end`, an agent-specific convenience
Inline DSL example:

```ruby
class CriticAgent < RubyLLM::Agent
  schema do
    string :verdict, enum: ["pass", "revise"]
    string :feedback
  end
end
```

## Runtime Context

Agents support runtime-evaluated values using blocks and lambdas.
Declare additional runtime inputs with `inputs`:

```ruby
class WorkAssistant < RubyLLM::Agent
  chat_model Chat
  inputs :workspace

  instructions { "You are helping #{workspace.name}" }
end
```

`chat` is always available in execution context:
- In `.chat` mode, `chat` is a `RubyLLM::Chat`
- In `.create` / `.create!` / `.find` mode, `chat` is your `chat_model` record
This enables Rails-style usage:

```ruby
class WorkAssistant < RubyLLM::Agent
  chat_model Chat

  instructions current_date_time: -> { Time.current.strftime("%B %d, %Y") },
               display_name: -> { chat.user.display_name_or_email },
               full_name: -> { chat.user.full_name.presence || chat.user.display_name_or_email }

  tools do
    [
      TodoTool.new(chat: chat),
      GoogleDriveListTool.new(user: chat.user),
      GoogleDriveSearchTool.new(user: chat.user),
      GoogleDriveReadTool.new(user: chat.user)
    ]
  end
end
```

Important: values that depend on the runtime `chat` must be lazy (blocks or lambdas), not eager class-load expressions.
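The laziness requirement can be seen in a small, self-contained sketch. This illustrates the underlying Ruby mechanism (`instance_exec`), not RubyLLM's internals; `ExecutionContext` is a hypothetical stand-in, and a plain hash stands in for the chat record:

```ruby
# A block captured at class-definition time is only executed later, inside a
# context object that exposes the runtime `chat`.
class ExecutionContext
  attr_reader :chat

  def initialize(chat)
    @chat = chat
  end

  # Evaluate a stored block with `self` set to this context, so the block can
  # call `chat` even though no chat existed when the block was written.
  def evaluate(block)
    instance_exec(&block)
  end
end

# Evaluated eagerly at class-load time, this would fail: there is no `chat` yet.
lazy_instructions = -> { "Helping user ##{chat[:user_id]}" }

context = ExecutionContext.new({ user_id: 42 })
context.evaluate(lazy_instructions)  # => "Helping user #42"
```

The same principle explains the `tools do ... end` block above: the array of tool instances is built only once a concrete `chat` is available.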
## Prompt Conventions

Agents have prompt conventions built in.

Calling `instructions` with no arguments enables default prompt lookup:

```ruby
class WorkAssistant < RubyLLM::Agent
  chat_model Chat
  instructions
end
```

RubyLLM looks for:

```
app/prompts/work_assistant/instructions.txt.erb
```

If the file exists, it is rendered and used as instructions. If it does not exist, RubyLLM raises `RubyLLM::PromptNotFoundError`.
You can pass locals directly:

```ruby
class WorkAssistant < RubyLLM::Agent
  chat_model Chat
  instructions display_name: -> { chat.user.display_name_or_email }
end
```

This also renders `instructions.txt.erb` for that agent path.
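Under an `.erb` template, rendering with locals works like standard Ruby ERB. A minimal sketch using only the standard library, where the template string stands in for the contents of `instructions.txt.erb`:

```ruby
require "erb"

# Render an ERB template with named locals, as an .erb prompt file would be.
template = "You are assisting <%= display_name %>."

rendered = ERB.new(template).result_with_hash(display_name: "Ada")
rendered # => "You are assisting Ada."
```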
Within execution context you can call the `prompt` helper explicitly:

```ruby
instructions { prompt("instructions", display_name: chat.user.display_name_or_email) }
```

The agent prompt path is derived from the class name:

- `WorkAssistant` -> `app/prompts/work_assistant/...`
- `Admin::SupportAgent` -> `app/prompts/admin/support_agent/...`

The prompt extension defaults to `.txt.erb`.
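The path convention above can be sketched in plain Ruby. This is an illustration of the naming rule, not RubyLLM's own code; `prompt_path` is a hypothetical helper:

```ruby
# Derive a prompt directory from a class name: namespaces become directories,
# and CamelCase segments become snake_case.
def prompt_path(class_name)
  class_name
    .split("::")
    .map { |part| part.gsub(/([a-z0-9])([A-Z])/, '\1_\2').downcase }
    .join("/")
end

prompt_path("WorkAssistant")        # => "work_assistant"
prompt_path("Admin::SupportAgent")  # => "admin/support_agent"
```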
## Using Agents in Plain Ruby

```ruby
chat = WorkAssistant.chat

response = chat.ask("Hello")
puts response.content
```

`WorkAssistant.chat(...)` returns a configured `RubyLLM::Chat`.
You can still instantiate and use an agent instance directly:

```ruby
agent = WorkAssistant.new
agent.ask("Hello")
```

Agent instances delegate the full `RubyLLM::Chat` instance API to the underlying chat object (or to `to_llm` when using a Rails-backed chat model).

Delegated methods include:

- `model`, `messages`, `tools`, `params`, `headers`, `schema`
- `ask`, `say`, `complete`
- `add_message`, `reset_messages!`, `each`
- `with_tool`, `with_tools`
- `with_model`, `with_temperature`, `with_thinking`, `with_context`
- `with_params`, `with_headers`, `with_schema`
- `on_new_message`, `on_end_message`, `on_tool_call`, `on_tool_result`

You can always access the wrapped chat object directly via `agent.chat`.
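This kind of delegation is commonly built with Ruby's standard `Forwardable` module. A minimal sketch of the pattern (illustrative, not RubyLLM's source; `InnerChat` and `AgentWrapper` are hypothetical names):

```ruby
require "forwardable"

# A stand-in for the underlying chat object.
class InnerChat
  def ask(message)
    "echo: #{message}"
  end

  def messages
    []
  end
end

# The wrapper forwards chat methods to the wrapped object via def_delegators.
class AgentWrapper
  extend Forwardable

  attr_reader :chat
  def_delegators :chat, :ask, :messages

  def initialize(chat)
    @chat = chat
  end
end

agent = AgentWrapper.new(InnerChat.new)
agent.ask("Hello")  # => "echo: Hello"
```

The wrapper stays thin: callers use the familiar chat API, and `agent.chat` remains available for anything not delegated.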
## Rails Mode

Set `chat_model` to use your ActiveRecord chat model:

```ruby
class WorkAssistant < RubyLLM::Agent
  chat_model Chat

  model "{{ site.models.default_chat }}"
  instructions "You are a helpful assistant."
  tools SearchDocs, LookupAccount
end
```

Then you can:
```ruby
# Create a persisted chat with the agent configuration applied
chat = WorkAssistant.create!(user: current_user)

# Load an existing persisted chat with runtime config applied (no DB write)
chat = WorkAssistant.find(params[:id])

# Explicitly persist/sync the current agent instructions if you've modified them
WorkAssistant.sync_instructions!(chat)
```

`create` / `create!` / `find` require `chat_model`. Calling them without it raises an error.

Instruction persistence contract in Rails mode:

- `create` / `create!` applies and persists instructions
- `find` applies instructions at runtime only (no persistence side effects)
- `sync_instructions!` explicitly persists the current agent instructions
If you already have a `Chat` record, pass it to `Agent.new(chat:)` instead of calling `Agent.find`. This applies all agent configuration (instructions, tools, and so on) without an extra database query:

```ruby
chat_record = Chat.find(params[:id])
chat = WorkAssistant.new(chat: chat_record)
chat.ask("Hello")
```

## Agents vs. `RubyLLM.chat`

Use `RubyLLM.chat` for one-off, inline conversations:
```ruby
chat = RubyLLM.chat(model: "{{ site.models.default_chat }}")
chat.with_instructions "Explain this clearly."
```

Use agents when you want named, reusable behavior:

```ruby
class WorkAssistant < RubyLLM::Agent
  model "{{ site.models.default_chat }}"
  instructions "You are a helpful assistant."
  tools SearchDocs, LookupAccount
end
```

Think of `RubyLLM.chat` as ad hoc and `RubyLLM::Agent` as reusable application architecture.
These two styles are equivalent in capability but optimized for different contexts.

Use progressive `Chat#with_*` calls when configuration is local and one-off:

```ruby
chat = RubyLLM.chat(model: "{{ site.models.default_chat }}")
chat.with_instructions("You are a helpful assistant.")
chat.with_tools(SearchDocs, LookupAccount)

chat.ask("Help me find docs about callbacks.")
```

Use agents when that setup should be centralized and reused:
```ruby
class WorkAssistant < RubyLLM::Agent
  model "{{ site.models.default_chat }}"
  instructions "You are a helpful assistant."
  tools SearchDocs, LookupAccount
end

WorkAssistant.new.ask("Help me find docs about callbacks.")
```

## Next Steps

- Learn about [Chat Basics]({% link _core_features/chat.md %})
- Explore [Tools]({% link _core_features/tools.md %})
- Review [Rails Integration]({% link _advanced/rails.md %})