Technical reference for the OpenClaw framework.
For quick start, QA runners, unit/integration suites, and Docker flows, see Testing. This page covers the live (network-touching) test suites: model matrix, CLI backends, ACP, and media-provider live tests, plus credential handling.
Source `~/.profile` first so exported provider keys are available:

```bash
source ~/.profile
```
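Before running the smokes below, it can help to confirm which provider keys actually made it into the environment. This preflight is generic shell; the function name and the variable list are ours, not an OpenClaw API — adjust the list to the providers you run:

```shell
# check_live_keys: report which of the given env vars are set.
check_live_keys() {
  for var in "$@"; do
    if [ -n "$(printenv "$var")" ]; then
      echo "$var: set"
    else
      echo "$var: MISSING"
    fi
  done
}

check_live_keys OPENAI_API_KEY ANTHROPIC_API_KEY GEMINI_API_KEY
```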
Safe media smoke:
```bash
pnpm openclaw infer tts convert --local --json \
  --text "OpenClaw live smoke." \
  --output /tmp/openclaw-live-smoke.mp3
```
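If the smoke run succeeds it should leave a non-empty MP3 at the path above. A tiny sanity check (the helper name and output format are ours, not part of OpenClaw):

```shell
# check_smoke_output: report whether a smoke-test artifact exists and is
# non-empty.
check_smoke_output() {
  if [ -s "$1" ]; then
    echo "OK: $1"
  else
    echo "MISSING: $1" >&2
    return 1
  fi
}

# check_smoke_output /tmp/openclaw-live-smoke.mp3
```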
Safe voice-call readiness smoke:
```bash
pnpm openclaw voicecall setup --json
pnpm openclaw voicecall smoke --to "+15555550123"
```
`voicecall smoke` supports `--yes` to skip its confirmation prompts.

The Android node capabilities suite (`src/gateway/android-node.capabilities.live.test.ts`, run via `pnpm android:test:integration`) exercises `node.invoke` against a paired device and is configured with:

- `OPENCLAW_ANDROID_NODE_ID`
- `OPENCLAW_ANDROID_NODE_NAME`
- `OPENCLAW_ANDROID_GATEWAY_URL`
- `OPENCLAW_ANDROID_GATEWAY_TOKEN`
- `OPENCLAW_ANDROID_GATEWAY_PASSWORD`

Live tests are split into two layers so we can isolate failures:
1. Model matrix, direct (`src/agents/models.profiles.live.test.ts`): talks to each provider directly, resolving keys via `getApiKeyForModel`. Run with `pnpm test:live` and:
   - `OPENCLAW_LIVE_TEST=1` to opt in.
   - `OPENCLAW_LIVE_MODELS=modern` or `OPENCLAW_LIVE_MODELS=all` for the built-in selections, or an explicit allowlist, e.g. `OPENCLAW_LIVE_MODELS="openai/gpt-5.5,openai-codex/gpt-5.5,anthropic/claude-opus-4-6,..."`.
   - `OPENCLAW_LIVE_MAX_MODELS=0` to remove the model cap.
   - `OPENCLAW_LIVE_TEST_TIMEOUT_MS` and `OPENCLAW_LIVE_MODEL_CONCURRENCY` to tune timeouts and parallelism.
   - `OPENCLAW_LIVE_PROVIDERS="google,google-antigravity,google-gemini-cli"` to filter by provider.
   - `OPENCLAW_LIVE_REQUIRE_PROFILE_KEYS=1` to fail rather than skip when profile keys are missing.
2. Gateway smoke (`src/gateway/gateway-models.profiles.live.test.ts`): drives an `agent:dev:*` session through the gateway, probing tool calls (`read`, then `exec+read`, e.g. an `exec` of `cat <CODE>` followed by a `read`) and the image probe from `src/gateway/live-image-probe.ts`. Run with `pnpm test:live` and:
   - `OPENCLAW_LIVE_TEST=1`
   - `OPENCLAW_LIVE_GATEWAY_MODELS=all`, or an allowlist like `OPENCLAW_LIVE_GATEWAY_MODELS="provider/model"`.
   - `OPENCLAW_LIVE_GATEWAY_MAX_MODELS=0`
   - `OPENCLAW_LIVE_GATEWAY_PROVIDERS="google,google-antigravity,google-gemini-cli,openai,anthropic,zai,minimax"`

The image probe (`src/gateway/live-image-probe.ts`) sends the `agent` request with `attachments: [{ mimeType: "image/png", content: "<base64>" }]`, which the gateway maps onto the model's `images[]` input (see `src/gateway/server-methods/agent.ts` and `src/gateway/chat-attachments.ts`).

To check which models and credentials are visible before running either layer:

```bash
openclaw models list
openclaw models list --json
```
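The allowlist variables above take a single comma-separated string, which gets unwieldy by hand. A tiny helper (ours, not part of OpenClaw) can assemble it:

```shell
# build_models: join `provider/model` arguments into the comma-separated
# string that OPENCLAW_LIVE_MODELS / OPENCLAW_LIVE_GATEWAY_MODELS expect.
build_models() {
  old_ifs=$IFS
  IFS=,
  printf '%s\n' "$*"   # "$*" joins args with the first char of IFS
  IFS=$old_ifs
}

MODELS=$(build_models openai/gpt-5.5 anthropic/claude-opus-4-6 zai/glm-5.1)
echo "$MODELS"   # openai/gpt-5.5,anthropic/claude-opus-4-6,zai/glm-5.1
# OPENCLAW_LIVE_GATEWAY_MODELS="$MODELS" pnpm test:live src/gateway/gateway-models.profiles.live.test.ts
```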
CLI backend (`src/gateway/gateway-cli-backend.live.test.ts`, backed by `cli-backend.ts`): run with `pnpm test:live` and:

- `OPENCLAW_LIVE_TEST=1` plus `OPENCLAW_LIVE_CLI_BACKEND=1` to opt in (default model: `claude-cli/claude-sonnet-4-6`).
- `OPENCLAW_LIVE_CLI_BACKEND_MODEL="codex-cli/gpt-5.5"` to pick a different backend model.
- `OPENCLAW_LIVE_CLI_BACKEND_COMMAND="/full/path/to/codex"` and `OPENCLAW_LIVE_CLI_BACKEND_ARGS='["exec","--json","--color","never","--sandbox","read-only","--skip-git-repo-check"]'` to override the spawned CLI.
- `OPENCLAW_LIVE_CLI_BACKEND_IMAGE_PROBE=1` with `OPENCLAW_LIVE_CLI_BACKEND_IMAGE_ARG="--image"` and `OPENCLAW_LIVE_CLI_BACKEND_IMAGE_MODE="repeat"` (or `"list"`, reusing `IMAGE_ARG`) for the image probe.
- `OPENCLAW_LIVE_CLI_BACKEND_RESUME_PROBE=1`, `OPENCLAW_LIVE_CLI_BACKEND_MODEL_SWITCH_PROBE=1`, and `OPENCLAW_LIVE_CLI_BACKEND_MCP_PROBE=1` for the resume, model-switch, and MCP probes.

Example:
```bash
OPENCLAW_LIVE_CLI_BACKEND=1 \
  OPENCLAW_LIVE_CLI_BACKEND_MODEL="codex-cli/gpt-5.5" \
  pnpm test:live src/gateway/gateway-cli-backend.live.test.ts
```
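To cover several backend models in one sweep, a wrapper loop over the documented env vars works; this is plain shell, and the function name is ours. It prints each invocation (a dry run) — pipe the output to `sh` to actually execute:

```shell
# cli_backend_cmd: print the documented invocation for one backend model.
cli_backend_cmd() {
  printf 'OPENCLAW_LIVE_CLI_BACKEND=1 OPENCLAW_LIVE_CLI_BACKEND_MODEL=%s pnpm test:live src/gateway/gateway-cli-backend.live.test.ts\n' "$1"
}

for m in claude-cli/claude-sonnet-4-6 codex-cli/gpt-5.5; do
  cli_backend_cmd "$m"
done
```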
Cheap Gemini MCP config smoke:
```bash
OPENCLAW_LIVE_TEST=1 \
  pnpm test:live src/agents/cli-runner/bundle-mcp.gemini.live.test.ts
```
This does not ask Gemini to generate a response. It writes the same system settings OpenClaw gives Gemini, then runs `gemini --debug mcp list` to verify the MCP server registers over `transport: "streamable-http"`.

Docker recipe:
```bash
pnpm test:docker:live-cli-backend
```
Single-provider Docker recipes:
```bash
pnpm test:docker:live-cli-backend:claude
pnpm test:docker:live-cli-backend:claude-subscription
pnpm test:docker:live-cli-backend:codex
pnpm test:docker:live-cli-backend:gemini
```
Notes:
- `scripts/test-live-cli-backend-docker.sh` builds a `node` image and installs `@anthropic-ai/claude-code`, `@openai/codex`, and `@google/gemini-cli`; `OPENCLAW_DOCKER_CLI_TOOLS_DIR` controls the tool cache (default `~/.cache/openclaw/docker-cli-tools`).
- `pnpm test:docker:live-cli-backend:claude-subscription` uses `~/.claude/.credentials.json` (checking `claudeAiOauth.subscriptionType`) or a `CLAUDE_CODE_OAUTH_TOKEN` from `claude setup-token`, and exercises `claude -p`.

ACP bind (`src/gateway/gateway-acp-bind.live.test.ts`): spawns a real agent via `/acp spawn <agent> --bind here` (including a cron-driven `/acp spawn ... --bind here` leg) and checks the reply flows back. Run with `pnpm test:live src/gateway/gateway-acp-bind.live.test.ts` and:

- `OPENCLAW_LIVE_ACP_BIND=1` to opt in (default agents: `claude,codex,gemini`; `claude` goes through the bundled `acpx` runtime).
- `OPENCLAW_LIVE_ACP_BIND_AGENT=claude`, `OPENCLAW_LIVE_ACP_BIND_AGENT=codex`, `OPENCLAW_LIVE_ACP_BIND_AGENT=droid`, `OPENCLAW_LIVE_ACP_BIND_AGENT=gemini`, or `OPENCLAW_LIVE_ACP_BIND_AGENT=opencode` to pick one agent, or `OPENCLAW_LIVE_ACP_BIND_AGENTS=claude,codex,gemini` for a set.
- `OPENCLAW_LIVE_ACP_BIND_AGENT_COMMAND='npx -y @agentclientprotocol/claude-agent-acp@<version>'` to override the spawned command instead of `acpx`.
- `OPENCLAW_LIVE_ACP_BIND_CODEX_MODEL=gpt-5.5` and `OPENCLAW_LIVE_ACP_BIND_OPENCODE_MODEL=opencode/kimi-k2.6` to pin agent models.
- `OPENCLAW_LIVE_ACP_BIND_REQUIRE_TRANSCRIPT=1` and `OPENCLAW_LIVE_ACP_BIND_REQUIRE_CRON=1` to fail (rather than skip) when the transcript or cron leg is missing.
- `OPENCLAW_LIVE_ACP_BIND_PARENT_MODEL=openai/gpt-5.5` for the parent agent that drives `chat.send`.

Example:
```bash
OPENCLAW_LIVE_ACP_BIND=1 \
  OPENCLAW_LIVE_ACP_BIND_AGENT=claude \
  pnpm test:live src/gateway/gateway-acp-bind.live.test.ts
```
Docker recipe:
```bash
pnpm test:docker:live-acp-bind
```
Single-agent Docker recipes:
```bash
pnpm test:docker:live-acp-bind:claude
pnpm test:docker:live-acp-bind:codex
pnpm test:docker:live-acp-bind:droid
pnpm test:docker:live-acp-bind:gemini
pnpm test:docker:live-acp-bind:opencode
```
Docker notes:
- `scripts/test-live-acp-bind-docker.sh` runs `claude`, `codex`, and `gemini` by default; narrow with `OPENCLAW_LIVE_ACP_BIND_AGENTS=claude`, `OPENCLAW_LIVE_ACP_BIND_AGENTS=codex`, `OPENCLAW_LIVE_ACP_BIND_AGENTS=droid`, `OPENCLAW_LIVE_ACP_BIND_AGENTS=gemini`, or `OPENCLAW_LIVE_ACP_BIND_AGENTS=opencode`.
- The container sources `~/.profile` and installs `@anthropic-ai/claude-code`, `@openai/codex`, the Factory CLI (https://app.factory.ai/cli), `@google/gemini-cli`, and `opencode-ai`, plus the `acpx` runtime.
- `droid` needs `~/.factory` credentials or `FACTORY_API_KEY` and runs `droid exec --output-format acp`.
- `opencode` reads `OPENCODE_CONFIG_CONTENT`; `OPENCLAW_LIVE_ACP_BIND_OPENCODE_MODEL` defaults to `opencode/kimi-k2.6`. Export keys in `~/.profile` before `pnpm test:docker:live-acp-bind:opencode`.

Codex harness: instead of spawning an external `acpx` `agent`, the `codex` runtime (`OPENCLAW_AGENT_RUNTIME=codex`) drives `openai/gpt-5.5` natively; `/codex status` and `/codex models` inspect it. The live suite is `src/gateway/gateway-codex-harness.live.test.ts`, gated by:

- `OPENCLAW_LIVE_CODEX_HARNESS=1` (default model `openai/gpt-5.5`).
- `OPENCLAW_LIVE_CODEX_HARNESS_IMAGE_PROBE=1`, `OPENCLAW_LIVE_CODEX_HARNESS_MCP_PROBE=1`, and `OPENCLAW_LIVE_CODEX_HARNESS_GUARDIAN_PROBE=1` for the optional probes.
- `OPENCLAW_AGENT_HARNESS_FALLBACK=none` to forbid silently falling back to another harness.
- Credentials come from `OPENAI_API_KEY` or `~/.codex/auth.json` / `~/.codex/config.toml`.

Local recipe:
```bash
source ~/.profile
OPENCLAW_LIVE_CODEX_HARNESS=1 \
  OPENCLAW_LIVE_CODEX_HARNESS_IMAGE_PROBE=1 \
  OPENCLAW_LIVE_CODEX_HARNESS_MCP_PROBE=1 \
  OPENCLAW_LIVE_CODEX_HARNESS_GUARDIAN_PROBE=1 \
  OPENCLAW_LIVE_CODEX_HARNESS_MODEL=openai/gpt-5.5 \
  pnpm test:live -- src/gateway/gateway-codex-harness.live.test.ts
```
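If the local recipe fails on auth, it's worth confirming the Codex credential sources are actually present before re-running. The paths match the list above; the helper itself is generic shell and its name is ours:

```shell
# check_codex_creds: report which Codex credential files exist.
# Takes an optional directory argument (defaults to ~/.codex).
check_codex_creds() {
  dir=${1:-$HOME/.codex}
  for f in "$dir/auth.json" "$dir/config.toml"; do
    if [ -f "$f" ]; then
      echo "found: $f"
    else
      echo "absent: $f"
    fi
  done
}

check_codex_creds
[ -n "${OPENAI_API_KEY:-}" ] && echo "OPENAI_API_KEY: set" || echo "OPENAI_API_KEY: not set"
```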
Docker recipe:
```bash
source ~/.profile
pnpm test:docker:live-codex-harness
```
Docker notes:
- `scripts/test-live-codex-harness-docker.sh` sources `~/.profile` for `OPENAI_API_KEY` and installs `@openai/codex`.
- The probes default off (`OPENCLAW_LIVE_CODEX_HARNESS_IMAGE_PROBE=0`, `OPENCLAW_LIVE_CODEX_HARNESS_MCP_PROBE=0`, `OPENCLAW_LIVE_CODEX_HARNESS_GUARDIAN_PROBE=0`) and the run sets `OPENCLAW_AGENT_HARNESS_FALLBACK=none`.

Narrow, explicit allowlists are fastest and least flaky:
Single model, direct (no gateway):

```bash
OPENCLAW_LIVE_MODELS="openai/gpt-5.5" pnpm test:live src/agents/models.profiles.live.test.ts
```

Single model, gateway smoke:

```bash
OPENCLAW_LIVE_GATEWAY_MODELS="openai/gpt-5.5" pnpm test:live src/gateway/gateway-models.profiles.live.test.ts
```

Tool calling across several providers:

```bash
OPENCLAW_LIVE_GATEWAY_MODELS="openai/gpt-5.5,openai-codex/gpt-5.5,anthropic/claude-opus-4-6,google/gemini-3-flash-preview,deepseek/deepseek-v4-flash,zai/glm-5.1,minimax/MiniMax-M2.7" pnpm test:live src/gateway/gateway-models.profiles.live.test.ts
```

Google focus (Gemini API key + Antigravity):

```bash
OPENCLAW_LIVE_GATEWAY_MODELS="google/gemini-3-flash-preview" pnpm test:live src/gateway/gateway-models.profiles.live.test.ts
OPENCLAW_LIVE_GATEWAY_MODELS="google-antigravity/claude-opus-4-6-thinking,google-antigravity/gemini-3-pro-high" pnpm test:live src/gateway/gateway-models.profiles.live.test.ts
```

Google adaptive thinking smoke:

```bash
source ~/.profile
pnpm openclaw qa manual --provider-mode live-frontier --model google/gemini-3.1-pro-preview --alt-model google/gemini-3.1-pro-preview --message '/think adaptive Reply exactly: GEMINI_ADAPTIVE_OK' --timeout-ms 180000
pnpm openclaw qa manual --provider-mode live-frontier --model google/gemini-2.5-flash --alt-model google/gemini-2.5-flash --message '/think adaptive Reply exactly: GEMINI25_ADAPTIVE_OK' --timeout-ms 180000
```

Notes:
`google/...`, `google-antigravity/...`, and `google-gemini-cli/...` are distinct providers with distinct credentials (the last routes through the `gemini` CLI). There is no fixed "CI model list" (live is opt-in), but these are the recommended models to cover regularly on a dev machine with keys.
This is the “common models” run we expect to keep working:
- `openai/gpt-5.5`
- `openai-codex/gpt-5.5`
- `anthropic/claude-opus-4-6`
- `anthropic/claude-sonnet-4-6`
- `google/gemini-3.1-pro-preview`
- `google/gemini-3-flash-preview`
- `google-antigravity/claude-opus-4-6-thinking`
- `google-antigravity/gemini-3-flash`
- `deepseek/deepseek-v4-flash`
- `deepseek/deepseek-v4-pro`
- `zai/glm-5.1`
- `minimax/MiniMax-M2.7`

Run gateway smoke with tools + image:

```bash
OPENCLAW_LIVE_GATEWAY_MODELS="openai/gpt-5.5,openai-codex/gpt-5.5,anthropic/claude-opus-4-6,google/gemini-3.1-pro-preview,google/gemini-3-flash-preview,google-antigravity/claude-opus-4-6-thinking,google-antigravity/gemini-3-flash,deepseek/deepseek-v4-flash,zai/glm-5.1,minimax/MiniMax-M2.7" pnpm test:live src/gateway/gateway-models.profiles.live.test.ts
```

Pick at least one per provider family:

- `openai/gpt-5.5`
- `anthropic/claude-opus-4-6` or `anthropic/claude-sonnet-4-6`
- `google/gemini-3-flash-preview` or `google/gemini-3.1-pro-preview`
- `deepseek/deepseek-v4-flash`
- `zai/glm-5.1`
- `minimax/MiniMax-M2.7`

Optional additional coverage (nice to have):

- `xai/grok-4`
- any configured `mistral/`, `cerebras/`, or `lmstudio/` model

Include at least one image-capable model in `OPENCLAW_LIVE_GATEWAY_MODELS`.

If you have keys enabled, we also support testing via:

- `openrouter/...` (discover models with `openclaw models scan`)
- `opencode/...` and `opencode-go/...` (needs `OPENCODE_API_KEY` or `OPENCODE_ZEN_API_KEY`)

More providers you can include in the live matrix (if you have creds/config): `openai`, `openai-codex`, `anthropic`, `google`, `google-vertex`, `google-antigravity`, `google-gemini-cli`, `zai`, `openrouter`, `opencode`, `opencode-go`, `xai`, `groq`, `cerebras`, `mistral`, `github-copilot`, plus anything declared under `models.providers` (e.g. `minimax`).

Live tests discover credentials the same way the CLI does. Practical implications:
If the CLI works, live tests should find the same keys.
If a live test says "no creds", debug the same way you'd debug `openclaw models list`.

Per-agent auth profiles:
`~/.openclaw/agents/<agentId>/agent/auth-profiles.json`

Config:
`~/.openclaw/openclaw.json` (override with `OPENCLAW_CONFIG_PATH`)

Legacy state dir:
`~/.openclaw/credentials/`

Live local runs copy the active config, the per-agent `auth-profiles.json`, and the legacy `credentials/` dir into a per-run sandbox (`workspace/sandboxes/`), pointing `agents.*.workspace` and the agent's `agentDir` at the copy.

If you want to rely on env keys (e.g. exported in your `~/.profile`), run `source ~/.profile` before the test command.

Deepgram (`extensions/deepgram/audio.live.test.ts`):

```bash
DEEPGRAM_API_KEY=... DEEPGRAM_LIVE_TEST=1 pnpm test:live extensions/deepgram/audio.live.test.ts
```

BytePlus (`extensions/byteplus/live.test.ts`; the coding model can be overridden with `BYTEPLUS_CODING_MODEL=ark-code-latest`):

```bash
BYTEPLUS_API_KEY=... BYTEPLUS_LIVE_TEST=1 pnpm test:live extensions/byteplus/live.test.ts
```

ComfyUI (`extensions/comfy/comfy.live.test.ts`; covers `music_generate` and reads `plugins.entries.comfy.config.<capability>`):

```bash
OPENCLAW_LIVE_TEST=1 COMFY_LIVE_TEST=1 pnpm test:live -- extensions/comfy/comfy.live.test.ts
```

Image generation runtime (`test/image-generation.runtime.live.test.ts`):

```bash
pnpm test:live test/image-generation.runtime.live.test.ts
# or:
pnpm test:live:media image
```

Keys are discovered from your `~/.profile` env or `auth-profiles.json`. Cases are named `<provider>:generate` and `<provider>:edit` across `deepinfra`, `fal`, `google`, `minimax`, `openai`, `openrouter`, `vydra`, and `xai`. Narrow the run with:

- `OPENCLAW_LIVE_IMAGE_GENERATION_PROVIDERS="openai,google,openrouter,xai"` (or a single provider, e.g. `OPENCLAW_LIVE_IMAGE_GENERATION_PROVIDERS="deepinfra"`)
- `OPENCLAW_LIVE_IMAGE_GENERATION_MODELS="openai/gpt-image-2,google/gemini-3.1-flash-image-preview,openrouter/google/gemini-3.1-flash-image-preview,xai/grok-imagine-image"`
- `OPENCLAW_LIVE_IMAGE_GENERATION_CASES="google:flash-generate,google:pro-edit,openrouter:generate,xai:default-generate,xai:default-edit"`
- `OPENCLAW_LIVE_REQUIRE_PROFILE_KEYS=1`

For the shipped CLI path (`infer`):

```bash
OPENCLAW_LIVE_TEST=1 OPENCLAW_LIVE_INFER_CLI_TEST=1 pnpm test:live -- test/image-generation.infer-cli.live.test.ts
openclaw infer image providers --json
openclaw infer image generate \
  --model google/gemini-3.1-flash-image-preview \
  --prompt "Minimal flat test image: one blue square on a white background, no text." \
  --output ./openclaw-infer-image-smoke.png \
  --json
```
This covers CLI argument parsing, config/default-agent resolution, bundled plugin activation, on-demand bundled runtime-dependency repair, the shared image-generation runtime, and the live provider request.
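After an `infer image generate` smoke you can sanity-check the output file locally. The PNG magic-byte check below is generic shell, not an OpenClaw command, and the helper name is ours:

```shell
# check_png: verify a file exists, is non-empty, and starts with the PNG
# signature (hex 89 50 4e 47).
check_png() {
  [ -s "$1" ] || { echo "missing or empty: $1" >&2; return 1; }
  sig=$(head -c 4 "$1" | od -An -tx1 | tr -d ' \n')
  if [ "$sig" = "89504e47" ]; then
    echo "PNG OK: $1"
  else
    echo "not a PNG: $1" >&2
    return 1
  fi
}

# check_png ./openclaw-infer-image-smoke.png
```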
Music generation providers (`extensions/music-generation-providers.live.test.ts`):

```bash
OPENCLAW_LIVE_TEST=1 pnpm test:live -- extensions/music-generation-providers.live.test.ts
# or:
pnpm test:live:media music
```

Keys come from `~/.profile` or `auth-profiles.json`. Cases cover `generate` and, where `capabilities.edit.enabled` is set, `edit`: `google` supports `generate` and `edit`, `minimax` supports `generate`, and `comfy` is covered by its own suite above. Narrow the run with:

- `OPENCLAW_LIVE_MUSIC_GENERATION_PROVIDERS="google,minimax"`
- `OPENCLAW_LIVE_MUSIC_GENERATION_MODELS="google/lyria-3-clip-preview,minimax/music-2.6"`
- `OPENCLAW_LIVE_REQUIRE_PROFILE_KEYS=1`

Video generation providers (`extensions/video-generation-providers.live.test.ts`):

```bash
OPENCLAW_LIVE_TEST=1 pnpm test:live -- extensions/video-generation-providers.live.test.ts
# or:
pnpm test:live:media video
```

Video runs are slow; `OPENCLAW_LIVE_VIDEO_GENERATION_TIMEOUT_MS` defaults to `180000`. Keys come from `~/.profile` or `auth-profiles.json`. By default only `generate` runs; `OPENCLAW_LIVE_VIDEO_GENERATION_FULL_MODES=1` also exercises `imageToVideo` (where `capabilities.imageToVideo.enabled`) and `videoToVideo` (where `capabilities.videoToVideo.enabled`). Provider notes:

- `fal`: select with `--video-providers fal` or `OPENCLAW_LIVE_VIDEO_GENERATION_PROVIDERS="fal"`.
- `vydra` (`veo3`, `kling`) supports `imageToVideo` and has its own suite: `OPENCLAW_LIVE_TEST=1 OPENCLAW_LIVE_VYDRA_VIDEO=1 pnpm test:live -- extensions/vydra/vydra.live.test.ts`.
- `runway` (`runway/gen4_aleph`) supports `videoToVideo`.
- `alibaba` (`qwen`), `xai`, `google`, and `openai` are also in the matrix; some inputs are passed as `http(s)` URLs.

Narrow the run with:

- `OPENCLAW_LIVE_VIDEO_GENERATION_PROVIDERS="deepinfra,google,openai,runway"`
- `OPENCLAW_LIVE_VIDEO_GENERATION_MODELS="google/veo-3.1-fast-generate-preview,openai/sora-2,runway/gen4_aleph"`
- `OPENCLAW_LIVE_VIDEO_GENERATION_SKIP_PROVIDERS=""`
- `OPENCLAW_LIVE_VIDEO_GENERATION_TIMEOUT_MS=60000`
- `OPENCLAW_LIVE_REQUIRE_PROFILE_KEYS=1`

The `pnpm test:live:media` entry point (`scripts/test-live.mjs`) sources `~/.profile` and drives the media suites:

```bash
pnpm test:live:media
pnpm test:live:media image video --providers openai,google,minimax
pnpm test:live:media video --video-providers openai,runway --all-providers
pnpm test:live:media music --quiet
```