Technical reference for the OpenClaw framework.
OpenClaw can run local AI CLIs as a text-only fallback when API providers are down, rate-limited, or temporarily misbehaving. This is intentionally conservative: CLI backends are text-only, and tool support is opt-in via `bundleMcp: true` (covered below). It is designed as a safety net rather than a primary path. Use it when you want "always works" text responses without relying on external APIs.
If you want a full harness runtime with ACP session controls, background tasks, thread/conversation binding, and persistent external coding sessions, use ACP Agents instead. CLI backends are not ACP.
You can use Codex CLI without any config (the bundled OpenAI plugin registers a default backend):
```bash
openclaw agent --message "hi" --model codex-cli/gpt-5.5
```
If your gateway runs under launchd/systemd and PATH is minimal, add just the command path:
```json5
{
  agents: {
    defaults: {
      cliBackends: {
        "codex-cli": {
          command: "/opt/homebrew/bin/codex",
        },
      },
    },
  },
}
```
That’s it. No keys, no extra auth config needed beyond the CLI itself.
If you use a bundled CLI backend as the primary message provider on a gateway host, OpenClaw now auto-loads the owning bundled plugin when your config explicitly references that backend in a model ref or under `agents.defaults.cliBackends`.

Add a CLI backend to your fallback list so it only runs when primary models fail:
```json5
{
  agents: {
    defaults: {
      model: {
        primary: "anthropic/claude-opus-4-6",
        fallbacks: ["codex-cli/gpt-5.5"],
      },
      models: {
        "anthropic/claude-opus-4-6": { alias: "Opus" },
        "codex-cli/gpt-5.5": {},
      },
    },
  },
}
```
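The fallback order above can be sketched as a simple "first ref that succeeds wins" loop. This is a minimal sketch, not OpenClaw source; `tryModels` and its `run` callback are hypothetical:

```typescript
// Sketch: try the primary model ref first, then each fallback in order.
// tryModels is a hypothetical helper, not part of OpenClaw itself.
async function tryModels(
  refs: string[],
  run: (ref: string) => Promise<string>,
): Promise<string> {
  let lastErr: unknown;
  for (const ref of refs) {
    try {
      return await run(ref); // first ref that succeeds wins
    } catch (err) {
      lastErr = err; // remember the failure, move on to the next ref
    }
  }
  throw lastErr; // every ref failed
}

// Usage: primary first, then the CLI fallback.
tryModels(["anthropic/claude-opus-4-6", "codex-cli/gpt-5.5"], async (ref) =>
  ref.startsWith("anthropic/")
    ? Promise.reject(new Error("rate-limited"))
    : `ok via ${ref}`,
).then((r) => console.log(r)); // ok via codex-cli/gpt-5.5
```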
Notes:

- A model ref used as a fallback must also be declared under `agents.defaults.models`, as in the example above.

All CLI backends live under:

```text
agents.defaults.cliBackends
```
Each entry is keyed by a provider id (e.g. `codex-cli`, `my-cli`). Model refs take the form:

```text
<provider>/<model>
```
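Splitting such a ref is a one-liner; the only subtlety is splitting at the first slash, since model names may themselves contain slashes. A minimal sketch (`parseModelRef` is a hypothetical helper, not OpenClaw source):

```typescript
// Sketch: split a "<provider>/<model>" ref at the FIRST slash only.
// parseModelRef is a hypothetical helper, not part of OpenClaw itself.
function parseModelRef(ref: string): { provider: string; model: string } {
  const i = ref.indexOf("/");
  if (i < 0) throw new Error(`not a model ref: ${ref}`);
  return { provider: ref.slice(0, i), model: ref.slice(i + 1) };
}

console.log(parseModelRef("codex-cli/gpt-5.5"));
// { provider: 'codex-cli', model: 'gpt-5.5' }
```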
```json5
{
  agents: {
    defaults: {
      cliBackends: {
        "codex-cli": {
          command: "/opt/homebrew/bin/codex",
        },
        "my-cli": {
          command: "my-cli",
          args: ["--json"],
          output: "json",
          input: "arg",
          modelArg: "--model",
          modelAliases: {
            "claude-opus-4-6": "opus",
            "claude-sonnet-4-6": "sonnet",
          },
          sessionArg: "--session",
          sessionMode: "existing",
          sessionIdFields: ["session_id", "conversation_id"],
          systemPromptArg: "--system",
          // For CLIs with a dedicated prompt-file flag:
          // systemPromptFileArg: "--system-file",
          // Codex-style CLIs can point at a prompt file instead:
          // systemPromptFileConfigArg: "-c",
          // systemPromptFileConfigKey: "model_instructions_file",
          systemPromptWhen: "first",
          imageArg: "--image",
          imageMode: "repeat",
          serialize: true,
        },
      },
    },
  },
}
```
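To make the interplay of `modelAliases`, `modelArg`, and `input: "arg"` concrete, here is a rough sketch of how the `my-cli` options above could combine into an argv. `buildArgv` and the `CliBackend` shape are hypothetical, not OpenClaw internals:

```typescript
// Sketch: aliases are applied before modelArg, and input: "arg"
// appends the prompt as the final argument. Hypothetical helper.
interface CliBackend {
  command: string;
  args?: string[];
  modelArg?: string;
  modelAliases?: Record<string, string>;
}

function buildArgv(b: CliBackend, model: string, prompt: string): string[] {
  const mapped = b.modelAliases?.[model] ?? model; // alias wins if present
  const argv = [b.command, ...(b.args ?? [])];
  if (b.modelArg) argv.push(b.modelArg, mapped);
  argv.push(prompt); // input: "arg" → prompt as trailing argument
  return argv;
}

const argv = buildArgv(
  {
    command: "my-cli",
    args: ["--json"],
    modelArg: "--model",
    modelAliases: { "claude-opus-4-6": "opus" },
  },
  "claude-opus-4-6",
  "hi",
);
console.log(argv.join(" ")); // my-cli --json --model opus hi
```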
The bundled OpenAI plugin registers `codex-cli` and the bundled Anthropic plugin registers `claude-cli`, so model refs like `codex-cli/...` and `claude-cli/...` work out of the box.

The bundled OpenAI `codex-cli` backend delivers the system prompt through Codex's `model_instructions_file` setting (`-c model_instructions_file="..."`) rather than an `--append-system-prompt`-style flag. The bundled Anthropic `claude-cli` backend uses `--append-system-prompt` and loads its integration via `--plugin-dir`.

Claude CLI also has its own noninteractive permission mode. OpenClaw maps that to the existing exec policy instead of adding Claude-specific config: when the effective requested exec policy is YOLO (`tools.exec.security: "full"` with `tools.exec.ask: "off"`), OpenClaw passes `--permission-mode bypassPermissions`; otherwise the per-agent `agents.list[].tools.exec` or global `tools.exec` policy maps to `--permission-mode default` or `--permission-mode acceptEdits`. To override, set `agents.defaults.cliBackends.claude-cli.args` (and `resumeArgs`) yourself.

Before OpenClaw can use the bundled `claude-cli` backend, authenticate the CLI and point OpenClaw at it:

```bash
claude auth login
claude auth status --text
openclaw models auth login --provider anthropic --method cli --set-default
```
Use `agents.defaults.cliBackends.claude-cli.command` if the `claude` binary is not on `PATH`.

Session flags: `sessionArg` (e.g. `--session-id`) or `sessionArgs` with a `{sessionId}` placeholder control how the session id is passed; `resumeArgs` replace `args` when resuming an existing session, and `resumeOutput` can declare a different output format for resume runs. `sessionMode` is one of `always`, `existing`, or `none`. The bundled `claude-cli` backend uses `liveSession: "claude-stdio"` with `output: "jsonl"` and `input: "stdin"`; if a resume fails with `reason=transcript-missing`, OpenClaw falls back from `--resume` to a fresh run, and `/reset` maps to `session.reset`.
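The `{sessionId}` placeholder substitution described above amounts to a string template over `resumeArgs`. A minimal sketch (`buildResumeArgs` is a hypothetical helper, not OpenClaw source):

```typescript
// Sketch: substitute the {sessionId} placeholder into each resumeArgs entry.
// buildResumeArgs is a hypothetical helper, not part of OpenClaw itself.
function buildResumeArgs(resumeArgs: string[], sessionId: string): string[] {
  return resumeArgs.map((arg) => arg.split("{sessionId}").join(sessionId));
}

const args = buildResumeArgs(
  ["exec", "resume", "{sessionId}", "--skip-git-repo-check"],
  "abc-123",
);
console.log(args.join(" ")); // exec resume abc-123 --skip-git-repo-check
```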
Serialization notes: with `serialize: true`, OpenClaw runs at most one CLI invocation at a time for that backend and queues the rest.

When a `claude-cli` backend runs via `agents.defaults.model.fallbacks`, Claude CLI keeps its own transcripts under `~/.claude/projects/`. A `claude-cli` `/compact` emits a `compact_boundary` marker, and tool activity is summarized in the transcript as `(tool call: name)` / `(tool result: …)`, with long content marked `(truncated)`; that summarized transcript is what `claude-cli` replays on `--resume`.

If your CLI accepts image paths, set `imageArg`:
```json5
imageArg: "--image",
imageMode: "repeat",
```
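With `imageMode: "repeat"`, the flag is emitted once per image path. A minimal sketch of that expansion (`addImageArgs` is a hypothetical helper, not OpenClaw source):

```typescript
// Sketch of imageMode: "repeat": one flag occurrence per image path.
// addImageArgs is a hypothetical helper, not part of OpenClaw itself.
function addImageArgs(imageArg: string, paths: string[]): string[] {
  return paths.flatMap((p) => [imageArg, p]);
}

console.log(addImageArgs("--image", ["/tmp/a.png", "/tmp/b.png"]));
// [ '--image', '/tmp/a.png', '--image', '/tmp/b.png' ]
```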
OpenClaw will write base64 images to temp files and pass their paths via `imageArg`. If `imageArg` is not set, images cannot be forwarded to the CLI.

Output modes: `output: "json"` parses a single JSON object from stdout (text in `response`, token counts in `stats`/`usage`); `output: "jsonl"` parses line-delimited JSON, typically enabled by a flag such as `--json`; `output: "text"` uses stdout verbatim.
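As a concrete illustration of the `output: "json"` shape, here is a sketch that pulls the text from `response` and derives non-cached input tokens from `stats` (the field names follow the Gemini CLI notes later on this page; `parseJsonOutput` itself is hypothetical):

```typescript
// Sketch: parse an output: "json" payload with { response, stats },
// deriving input tokens net of cache. Hypothetical helper, not OpenClaw.
interface CliJsonOutput {
  response: string;
  stats?: { input_tokens?: number; cached?: number };
}

function parseJsonOutput(stdout: string): { text: string; inputTokens: number } {
  const parsed = JSON.parse(stdout) as CliJsonOutput;
  const input = parsed.stats?.input_tokens ?? 0;
  const cached = parsed.stats?.cached ?? 0;
  return { text: parsed.response, inputTokens: input - cached };
}

const out = parseJsonOutput('{"response":"hi","stats":{"input_tokens":10,"cached":4}}');
console.log(out); // { text: 'hi', inputTokens: 6 }
```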
Input modes: `input: "arg"` appends the prompt as a command-line argument (bounded by `maxPromptArgChars`), while `input: "stdin"` pipes the prompt to the process.

The bundled OpenAI plugin also registers a default for `codex-cli`:
```json5
"codex-cli": {
  command: "codex",
  args: ["exec", "--json", "--color", "never", "--sandbox", "workspace-write", "--skip-git-repo-check"],
  resumeArgs: ["exec", "resume", "{sessionId}", "-c", 'sandbox_mode="workspace-write"', "--skip-git-repo-check"],
  output: "jsonl",
  resumeOutput: "text",
  modelArg: "--model",
  imageArg: "--image",
  sessionMode: "existing",
},
```

The bundled Google plugin also registers a default for `google-gemini-cli`:
```json5
"google-gemini-cli": {
  command: "gemini",
  args: ["--output-format", "json", "--prompt", "{prompt}"],
  resumeArgs: ["--resume", "{sessionId}", "--output-format", "json", "--prompt", "{prompt}"],
  imageArg: "@",
  imagePathScope: "workspace",
  modelArg: "--model",
  sessionMode: "existing",
  sessionIdFields: ["session_id", "sessionId"],
},
```

Prerequisite: the local Gemini CLI must be installed and available as `gemini` on `PATH`:
```bash
brew install gemini-cli
# or
npm install -g @google/gemini-cli
```

Gemini CLI JSON notes:
- The text comes from `response`; `stats` maps to OpenClaw's `usage`.
- `stats.cached` maps to `cacheRead`.
- Input tokens are reported net of cache: `stats.input` = `stats.input_tokens - stats.cached`.

Override only if needed (common: an absolute `command` path).

CLI backend defaults are now part of the plugin surface:
Plugins register defaults via `api.registerCliBackend(...)`; the backend's `id` becomes the config key `agents.defaults.cliBackends.<id>`, and a `normalizeConfig` hook lets the plugin normalize user-provided config. Plugins that need tiny prompt/message compatibility shims can declare bidirectional text transforms without replacing a provider or CLI backend:
```typescript
api.registerTextTransforms({
  input: [
    { from: /red basket/g, to: "blue basket" },
    { from: /paper ticket/g, to: "digital ticket" },
    { from: /left shelf/g, to: "right shelf" },
  ],
  output: [
    { from: /blue basket/g, to: "red basket" },
    { from: /digital ticket/g, to: "paper ticket" },
    { from: /right shelf/g, to: "left shelf" },
  ],
});
```
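At runtime, a transform list like the one above can be applied as an ordered sequence of regex replacements. This is only a sketch of the idea; `applyTransforms` and the `TextTransform` shape are hypothetical, not the OpenClaw implementation:

```typescript
// Sketch: apply each { from, to } pair in order to a piece of text.
// applyTransforms is a hypothetical helper, not part of OpenClaw itself.
interface TextTransform {
  from: RegExp;
  to: string;
}

function applyTransforms(text: string, transforms: TextTransform[]): string {
  return transforms.reduce((acc, t) => acc.replace(t.from, t.to), text);
}

const input: TextTransform[] = [
  { from: /red basket/g, to: "blue basket" },
  { from: /paper ticket/g, to: "digital ticket" },
];
console.log(applyTransforms("a red basket and a paper ticket", input));
// a blue basket and a digital ticket
```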
`input` transforms apply to text sent to the model; `output` transforms apply to text coming back, so a mirrored pair like the one above round-trips cleanly.

For CLIs that emit Claude Code stream-json compatible JSONL, set `jsonlDialect: "claude-stream-json"`.

CLI backends do not receive OpenClaw tool calls directly, but a backend can opt into a generated MCP config overlay with `bundleMcp: true`. Current bundled behavior:
- `claude-cli` and `codex-cli` opt in (Codex consumes the overlay via its `mcp_servers` config), as does `google-gemini-cli`.

When bundle MCP is enabled, OpenClaw generates the per-run MCP config and authenticates its bundled servers with an `OPENCLAW_MCP_TOKEN` token. If no MCP servers are enabled, OpenClaw still injects a strict config when a backend opts into bundle MCP so background runs stay isolated.
Session-scoped bundled MCP runtimes are cached for reuse within a session, then reaped after the idle TTL in `mcp.sessionIdleTtlMs` (a value of `0` disables the idle cache).
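The idle-TTL reaping described above can be sketched roughly as a map of session entries stamped with their last-use time. The `Reaper` class is hypothetical, not OpenClaw source; its `ttlMs` stands in for `mcp.sessionIdleTtlMs`:

```typescript
// Sketch: cache session-scoped runtimes by session id, reap entries
// whose idle time exceeds the TTL. Hypothetical class, not OpenClaw.
class Reaper<T> {
  private entries = new Map<string, { value: T; lastUsed: number }>();
  constructor(private ttlMs: number) {}

  touch(id: string, value: T, now: number): void {
    this.entries.set(id, { value, lastUsed: now }); // refresh last-use time
  }

  reap(now: number): string[] {
    const dead: string[] = [];
    for (const [id, e] of this.entries) {
      if (now - e.lastUsed >= this.ttlMs) {
        this.entries.delete(id); // idle past TTL: drop the runtime
        dead.push(id);
      }
    }
    return dead;
  }
}

const r = new Reaper<string>(60_000);
r.touch("sess-1", "runtime", 0);
r.touch("sess-2", "runtime", 50_000);
console.log(r.reap(70_000)); // [ 'sess-1' ]
```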