# Using Jetty with AI Agents
Jetty workflows can be created, run, and monitored directly from your coding agent or editor — no context switching to a web UI required.
## Three Ways to Use Jetty with Agents
| Approach | Best For | Setup Time |
|---|---|---|
| Claude Code Plugin | Claude Code users who want /jetty commands | 3 minutes |
| MCP Server | Cursor, Windsurf, VS Code Copilot, Zed, Gemini CLI | 5 minutes |
| REST API from agent | Any agent that can run curl | Already set up |
## What Can You Do?
Once connected, your agent can:
- Create workflows — describe what you want in natural language, and the agent builds the JSON workflow definition
- Run workflows — kick off runs with custom parameters and monitor progress
- Inspect results — view trajectory outputs, download generated files, check step-by-step execution
- Evaluate and label — run LLM-as-judge evaluations, add labels to trajectory results
- Browse step templates — explore 40+ pre-built activities (LLM chat, image generation, text processing, evaluation, etc.)
## Quick Example
Ask your agent:
> "Create a Jetty workflow that takes a prompt, generates an image with Flux, then evaluates whether the image is cute using GPT-4o as a judge"
The agent will use Jetty's MCP tools (or REST API) to:
- Look up the `replicate_text2image` and `simple_judge` step templates
- Build a multi-step workflow JSON definition
- Create the task in your collection
- Run the workflow and poll for results
- Show you the generated image and judge's verdict
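The workflow definition the agent assembles in this example can be sketched in Python. Note that the exact Jetty workflow schema (field names like `steps`, `template`, `inputs`, and the `{{...}}` parameter syntax) is an assumption for illustration; only the step template names `replicate_text2image` and `simple_judge` come from this page.

```python
import json

def build_cute_image_workflow(prompt_param: str = "prompt") -> dict:
    """Sketch of a two-step workflow: generate an image, then judge it.

    The schema below is hypothetical -- check Jetty's workflow reference
    for the real field names before using it.
    """
    return {
        "name": "cute-image-judge",
        "parameters": [prompt_param],
        "steps": [
            {
                "id": "generate",
                "template": "replicate_text2image",
                # Hypothetical templating syntax for referencing a run parameter
                "inputs": {"prompt": "{{" + prompt_param + "}}"},
            },
            {
                "id": "judge",
                "template": "simple_judge",
                "inputs": {
                    "model": "gpt-4o",
                    "criteria": "Is the image cute?",
                    # Hypothetical reference to the previous step's output
                    "image": "{{generate.output}}",
                },
            },
        ],
    }

workflow = build_cute_image_workflow()
print(json.dumps(workflow, indent=2))
```

An agent with MCP access would build an equivalent structure from the step templates it looks up, then submit it as the task's workflow definition.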
## Agent Runtimes for Runbooks
When running runbooks on Jetty, the agent runtime determines which coding agent CLI executes your instructions inside the sandbox.
| Agent | Runtime ID | Default Model | API Key Required |
|---|---|---|---|
| Claude Code | `claude-code` | `claude-sonnet-4-6` | `ANTHROPIC_API_KEY` |
| Codex | `codex` | `gpt-5.4` | `OPENAI_API_KEY` |
| Gemini CLI | `gemini-cli` | `gemini-3.1-pro-preview` | `GOOGLE_API_KEY` |
Set the agent in your runbook's YAML frontmatter:
```yaml
---
agent: claude-code
model: claude-sonnet-4-6
snapshot: python312-uv
---
```
Or pass it via the Chat Completions API:
```json
{
  "model": "claude-sonnet-4-6",
  "jetty": {
    "agent": "claude-code",
    "snapshot": "python312-uv"
  }
}
```
Store the required API key in your collection's environment variables — the agent needs it to make LLM calls inside the sandbox.
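Building that Chat Completions request from code looks like the sketch below. The `jetty` extension block mirrors the JSON shown above; the endpoint URL in the comment is an assumption, not a documented address.

```python
import json

def build_runbook_request(instructions: str) -> dict:
    """Build a Chat Completions body that selects the claude-code runtime.

    The "model" and "jetty" fields mirror the example in the docs; the
    message content is a placeholder.
    """
    return {
        "model": "claude-sonnet-4-6",
        "messages": [{"role": "user", "content": instructions}],
        "jetty": {
            "agent": "claude-code",
            "snapshot": "python312-uv",
        },
    }

body = build_runbook_request("Run the steps in my runbook")
print(json.dumps(body, indent=2))

# To actually send it, POST the body to Jetty's Chat Completions endpoint
# (hypothetical URL -- substitute your deployment's address and token):
#   requests.post("https://<your-jetty-host>/v1/chat/completions",
#                 headers={"Authorization": f"Bearer {token}"}, json=body)
```

Because the `ANTHROPIC_API_KEY` lives in the collection's environment variables, the request body itself never needs to carry the LLM provider's key.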
## Choose Your Path
- New to Jetty? Start with the Claude Code Plugin for the fastest onboarding, or the MCP Server if you use a different editor.
- Already have a token? Jump straight to Agent Recipes for copy-paste workflows.
- Hitting weird errors? Check the Common Gotchas — there are a few parameter mismatches that trip up agents.