[FEATURE] Add support for client-side tool calls #681

@mtoneil

Description

Scope check

  • This is core LLM communication (not application logic)
  • This benefits most users (not just my use case)
  • This can't be solved in application code with current RubyLLM
  • I read the Contributing Guide

Due diligence

  • I searched existing issues
  • I checked the documentation

What problem does this solve?

Currently, RubyLLM only supports server-side tool execution. However, client-side tools can significantly enrich a chat experience: fetching page content, requesting the user's location, displaying custom components to gather input beyond free-form text, and so on.

Vercel's ai-sdk has first-class support for client-side tool calls:
https://ai-sdk.dev/docs/ai-sdk-ui/chatbot-tool-usage

Proposed solution

  1. Support a client_side macro on Tool subclasses, e.g.:

class RequestLocation < RubyLLM::Tool
  description "Get the user's current geographic location"
  client_side
end

  2. Server-side tools execute normally, but client-side tool calls are skipped and a Tool::Pause is returned containing the client tool calls the caller needs to handle. (I'd prefer to skip :on_tool_call for client tools, but would defer to you on that.)

  3. After executing client tools, the caller resumes the conversation with existing APIs: chat.add_message(role: :tool, content: "...", tool_call_id: "call_1") and chat.continue.
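To illustrate the pause/resume flow described above, here is a minimal runnable sketch. ToolPause, ToolCall, and FakeChat are stand-ins invented for this example — none of them are RubyLLM's real API — but the caller-side shape (check for a pause, execute client tools, feed results back, continue) matches the proposal:

```ruby
# Stand-in types for the proposed API -- hypothetical, not RubyLLM's real classes.
ToolCall  = Struct.new(:id, :name, :arguments)
ToolPause = Struct.new(:tool_calls) # returned instead of executing client tools

class FakeChat
  attr_reader :messages

  def initialize
    @messages = []
  end

  # Simulate the model requesting a client-side tool: instead of running it
  # server-side, return a ToolPause for the caller to handle.
  def ask(text)
    @messages << { role: :user, content: text }
    ToolPause.new([ToolCall.new("call_1", "request_location", {})])
  end

  def add_message(role:, content:, tool_call_id:)
    @messages << { role: role, content: content, tool_call_id: tool_call_id }
  end

  # Resume the conversation once the client tool results have been added.
  def continue
    "You appear to be near 38.7, -9.1."
  end
end

chat = FakeChat.new
result = chat.ask("Where am I?")

if result.is_a?(ToolPause)
  result.tool_calls.each do |call|
    # The application runs the tool on the client (browser geolocation,
    # a custom UI component, etc.) and reports back via the existing APIs:
    chat.add_message(role: :tool, content: "38.7, -9.1", tool_call_id: call.id)
  end
  reply = chat.continue
end
```

The key point of the sketch is that no new resume API is needed: once the caller has handled the paused tool calls, the existing add_message and continue methods close the loop.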

I'd be happy to open a PR for this change.

Why this belongs in RubyLLM

This can't be accomplished today without misusing halt and then correctly identifying and deleting the halted tool call result.

Metadata

Labels: enhancement (New feature or request)
