mruby-llm is mruby's most capable AI runtime.
It brings a single runtime for providers, agents, tools, skills, MCP, streaming, files, and persisted state to mruby, in a form that can be embedded into small standalone applications. The project began as a fork of llm.rb, and a large number of its features turned out to be portable. The two projects improve each other, and code continues to flow both ways.
There is support for OpenAI, Anthropic, Google Gemini, DeepSeek, xAI, Z.ai, Ollama, and llama.cpp. The mruby port keeps the same overall execution model as llm.rb, but adapts it to mruby's constraints. Much of the original llm.rb runtime remains supported.
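Each provider is reached through a constructor method on the LLM module. A sketch, assuming the non-OpenAI constructors follow llm.rb's naming (only LLM.openai appears in the examples below):

```ruby
# Hosted providers (constructor names assumed to follow llm.rb)
llm = LLM.openai(key: ENV["OPENAI_SECRET"])
llm = LLM.anthropic(key: ENV["ANTHROPIC_SECRET"])
# Local inference through Ollama (assumed; typically no key required)
llm = LLM.ollama(key: nil)
```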
The LLM::Context object is at the heart of the runtime. Almost all other features build on top of it. It is a low-level interface to a model, and requires tool execution to be managed manually:
```ruby
llm = LLM.openai(key: ENV["OPENAI_SECRET"])
ctx = LLM::Context.new(llm, stream: $stdout)
ctx.talk("Hello world")
```
The LLM::Agent object is implemented on top of LLM::Context. It provides the same interface, but manages tool execution for you. It also includes loop guards that detect repeated tool-call patterns and advise the model to change course rather than raise an error:

```ruby
llm = LLM.openai(key: ENV["OPENAI_SECRET"])
agent = LLM::Agent.new(llm, stream: $stdout)
agent.talk("Hello world")
```

The LLM::Tool class can be subclassed to implement your own tools that extend the abilities of a model:
```ruby
class ReadFile < LLM::Tool
  name "read-file"
  description "Read a file"
  parameter :path, String, "The filename or path"
  required %i[path]

  def call(path:)
    {contents: File.read(path)}
  end
end
```
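A tool class is passed through the tools: option. With LLM::Agent, which provides the same interface as LLM::Context, the tool call is executed for you; a sketch (the prompt is illustrative):

```ruby
# Assumes LLM::Agent accepts the same tools: option as LLM::Context
llm = LLM.openai(key: ENV["OPENAI_SECRET"])
agent = LLM::Agent.new(llm, stream: $stdout, tools: [ReadFile])
agent.talk("Read mrbgem.rake and list the dependencies")
```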
The LLM::MCP object lets mruby-llm use tools provided by an MCP server. Those tools are exposed through the same runtime as local tools, so you can pass them to either LLM::Context or LLM::Agent:

```ruby
llm = LLM.openai(key: ENV["OPENAI_SECRET"])
mcp = LLM::MCP.stdio(argv: ["ruby", "server.rb"])
mcp.run do
  ctx = LLM::Context.new(llm, stream: $stdout, tools: mcp.tools)
  ctx.talk("Use the available tools to inspect the environment.")
  ctx.talk(ctx.wait(:call)) while ctx.functions?
end
```

Skills are reusable instructions loaded from a directory containing a SKILL.md file.
They let you package behavior and tool access together, and they plug
into the same runtime as tools, agents, and MCP:
```markdown
---
name: release
description: Prepare a release
tools: ["read-file"]
---
## Task
Review the release state and summarize what changed.
```

A skill directory is attached to an agent subclass via the skills directive:

```ruby
class ReleaseAgent < LLM::Agent
  model "gpt-4.1-mini"
  skills "./skills/release"
end
```
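As a sketch, assuming the subclass keeps the LLM::Agent constructor shown earlier (the prompt is illustrative):

```ruby
# Hypothetical usage; assumes ReleaseAgent inherits LLM::Agent.new(llm, ...)
llm = LLM.openai(key: ENV["OPENAI_SECRET"])
agent = ReleaseAgent.new(llm, stream: $stdout)
agent.talk("Prepare the release")
```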
require "llm"
class Stream < LLM::Stream
def on_content(content)
$stdout << content
end
end
llm = LLM.openai(key: ENV["KEY"])
ctx = LLM::Context.new(llm, stream: Stream.new)
ctx.talk "Write a haiku about Ruby."The
LLM::Stream
object can also resolve tool calls while output is still streaming. In
on_tool_call, you can spawn the tool, push the work onto the stream
queue, and later drain it with wait:
require "llm"
class Stream < LLM::Stream
def on_content(content)
$stdout << content
end
def on_tool_call(tool, error)
return queue << error if error
queue << ctx.spawn(tool, :thread)
end
end
llm = LLM.openai(key: ENV["KEY"])
ctx = LLM::Context.new(llm, stream: Stream.new, tools: [ReadFile])
ctx.talk "Read README.md and summarize the quick start."
ctx.talk(ctx.wait) while ctx.functions?The LLM::Context
object can be serialized to JSON, which makes it suitable for storing
in a file, a database column, or a Redis queue. The built-in
ActiveRecord and Sequel plugins are built on top of this feature:
require "llm"
llm = LLM.openai(key: ENV["KEY"])
# Serialize a context
ctx1 = LLM::Context.new(llm)
ctx1.talk "Remember that my favorite language is Ruby"
string = ctx1.to_json
# Restore a context (from JSON)
ctx2 = LLM::Context.new(llm, stream: $stdout)
ctx2.restore(string:)
ctx2.talk "What is my favorite language?"Add to your mruby build config:
MRuby::Build.new("app") do |conf|
curldir = File.expand_path(ENV["CURLDIR"] || "/usr/local")
conf.toolchain
conf.cc.include_paths << File.join(curldir, "include")
conf.linker.library_paths << File.join(curldir, "lib")
conf.gembox "default"
conf.gem github: "llmrb/mruby-llm", branch: "main"
conf.enable_debug
endFor local development in this repository, use the bundled Makefile:
```sh
make
make test
```

The Makefile expects an mruby checkout at ../mruby. Override that with MRUBY_DIR=/absolute/path/to/mruby if needed.

For direct integration into another mruby build, build through your mruby checkout:

```sh
ruby minirake MRUBY_CONFIG=/absolute/path/to/build_config.rb
```

Dependencies are declared in mrbgem.rake. In practice the main external build requirement is libcurl, because the runtime depends on mruby-curl and mruby-http. The local build also expects curl headers and libraries under /usr/local by default; override with CURLDIR=/absolute/path.
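Both overrides can be passed on the make command line, assuming the bundled Makefile forwards them; the paths below are placeholders:

```sh
make test MRUBY_DIR=$HOME/src/mruby CURLDIR=/opt/curl
```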
Declared mrbgem dependencies include:
- mruby-http
- mruby-curl
- mruby-json
- mruby-stringio
- mruby-process
- mruby-enumerator
- mruby-io
- mruby-time
- mruby-env
- mruby-struct
- mruby-regexp
See mrbgem.rake.
