🚀 Zero Polling • Free Models • AI-Powered

Code Smarter,
Not Harder

Run the OpenCode AI coding agent on OpenClaw without wasting tokens. Dispatch tasks, get results, build faster.

opencode-dispatch
$ ~/.openclaw/scripts/dispatch-opencode.sh \
  -p "implement auth API" \
  -n "auth-api" \
  -w "/project" \
  -m "opencode/glm-5-free"

🚀 Starting: auth-api
✅ auth-api → done
📄 /root/.openclaw/data/opencode-results/auth-api.json

Why OpenCode on OpenClaw?

Four key benefits for different workflows

Zero Polling

Dispatch tasks without wasting tokens on status checks. Results saved to JSON.

Free Models

Use opencode/glm-5-free or minimax-m2.5-free. No API costs.

Multi-Task DAG

Combine with ralph-team to run complex projects with parallel execution.

Background Run

Tasks run in the background. Continue chatting while the code gets written.

Quick Start

Three ways to use OpenCode on OpenClaw

METHOD 1

Dispatch (Recommended)

# Zero polling - save result to JSON
~/.openclaw/scripts/dispatch-opencode.sh \
  -p "implement auth API" \
  -n "auth-api" \
  -w "/project"
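
When the dispatch finishes, its output lands in the JSON file the script prints (see the example output above). A minimal sketch for picking the result up later; the fields inside the JSON are not documented here, so the example only checks for the file and pretty-prints it:

# Check whether the background task has written its result yet
RESULT="$HOME/.openclaw/data/opencode-results/auth-api.json"
if [ -f "$RESULT" ]; then
  # Pretty-print the saved result (requires jq); field names depend on your dispatch script
  jq . "$RESULT"
else
  echo "auth-api still running"
fi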
METHOD 2

Interactive PTY

# With pseudo-terminal
bash pty:true workdir:/project \
  command:"opencode run 'task'"
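
The pty:true form runs OpenCode inside OpenClaw's bash tool with a pseudo-terminal. If you just want to try the underlying command from a normal shell, the equivalent (assuming opencode is on your PATH) looks like this:

# Run the same task directly in a terminal, outside OpenClaw
cd /project
opencode run "implement auth API"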
METHOD 3

Team Tasks DAG

# Multi-task with dependencies
TM="team-tasks"
$TM init project -m dag
$TM add project "task1"
$TM add project "task2" -d "task1"
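
The same three commands scale to a larger graph. A hypothetical project with a short dependency chain, using only the init/add/-d syntax shown above (project and task names are made up for illustration):

# Build a three-step DAG: each task waits for the one before it
TM="team-tasks"
$TM init webapp -m dag
$TM add webapp "design the database schema"
$TM add webapp "implement the REST API" -d "design the database schema"
$TM add webapp "write integration tests" -d "implement the REST API"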

Available Models

GLM-5-Free: opencode/glm-5-free (✅ Recommended)
MiniMax M2.5: opencode/minimax-m2.5-free (Free)
Gemini Flash: google/gemini-1.5-flash (Free)
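
Pick a model per task with the dispatch script's -m flag, exactly as in the example at the top of the page. For instance, to try MiniMax M2.5 instead of GLM-5 (the prompt, task name, and workdir below are placeholders):

# Dispatch a task on the MiniMax free model
~/.openclaw/scripts/dispatch-opencode.sh \
  -p "refactor the payment module" \
  -n "refactor-payments" \
  -w "/project" \
  -m "opencode/minimax-m2.5-free"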

Ready to Code Smarter?

Install OpenCode and start building faster with AI assistance.

View on GitHub