HolyCode - OpenCode in Docker

HolyCode

One container. Every tool. Any provider.

50+ Dev Tools · 10+ AI Providers · <2min Setup

Works with Anthropic, OpenAI, Gemini, Groq, Bedrock, Azure, Vertex AI, Ollama, and more. Free and open source.

Don't want to self-host? Try HolyCode Cloud

Quick Start

From zero to AI coding in under two minutes.

1

Pull the image

$ docker pull coderluii/holycode:latest
2

Create a compose.yaml

compose.yaml
services:
  holycode:
    image: coderluii/holycode:latest
    container_name: holycode
    restart: unless-stopped
    shm_size: 2g
    ports:
      - "4096:4096"
    volumes:
      - ./data/opencode:/home/opencode
      - ./workspace:/workspace
    environment:
      - PUID=1000
      - PGID=1000
      - ANTHROPIC_API_KEY=your-key-here
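Rather than hard-coding the key, you can let Compose interpolate it from a `.env` file next to `compose.yaml` (standard Compose behavior; the variable name matches the one used above):

```yaml
# compose.yaml (environment section only)
environment:
  - PUID=1000
  - PGID=1000
  - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
```

```text
# .env (same directory; keep this file out of version control)
ANTHROPIC_API_KEY=your-key-here
```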
3

Start and open

$ docker compose up -d

Open http://localhost:4096. You're in. Want the board or the meta-agent later? Expose ports 3100 and 8642, then flip ENABLE_PAPERCLIP or ENABLE_HERMES.
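Exposing those ports and flipping the flags happens in the same compose.yaml. A sketch, assuming the flags accept a truthy string value (the port numbers and variable names are from this page; check the image docs for the exact accepted values):

```yaml
services:
  holycode:
    ports:
      - "4096:4096"
      - "3100:3100"   # Paperclip board
      - "8642:8642"   # Hermes agent
    environment:
      - ENABLE_PAPERCLIP=true  # value assumed; flag name from this page
      - ENABLE_HERMES=true     # value assumed; flag name from this page
```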


What You Get

Everything you need. Nothing you don't.

OpenCode AI Agent

Built-in web UI on port 4096. Provider-agnostic. Bring any API key.

Persistent State

Settings, sessions, MCP configs, plugins, all stored in mounted volumes. Rebuild without losing anything.

50+ Dev Tools

git, ripgrep, pnpm, TypeScript, Prisma, Lighthouse, ffmpeg, and more. Pre-installed.

Paperclip Board

Flip one env var and get a local board on port 3100 that hires OpenCode workers and wakes them on heartbeat.

Hermes Agent

Start Hermes on port 8642 for a self-improving meta-agent with MCP, messaging adapters, and local OpenCode delegation.

Headless Browser

Chromium + Xvfb + Playwright. Screenshots, scraping, automation. Zero config.
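Since Playwright ships in the image, a quick smoke test is a screenshot from inside the running container. A sketch using Playwright's built-in CLI (`npx playwright screenshot` is a real Playwright command; the container and volume names come from the Quick Start above):

```shell
# Run inside the running container; writes shot.png into the mounted workspace
docker compose exec holycode npx playwright screenshot https://example.com /workspace/shot.png
```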

Node.js 22 + Python 3

Two runtimes ready for your projects. npm, pip, venv included.

s6-overlay

Process supervision. Auto-restart. Clean shutdown. No zombies.


Any Provider. Your Choice.

OpenCode is provider-agnostic. Bring your own key, use any model.

Anthropic

OpenAI

Gemini

Groq

AWS Bedrock

Azure OpenAI

Plus Vertex AI, GitHub Models, Ollama, and any OpenAI-compatible endpoint via OpenCode's provider system.

Set one key. Or set all of them. Switch providers anytime without rebuilding.
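Setting several keys just means more entries in the environment block. A sketch (only ANTHROPIC_API_KEY appears in the Quick Start above; the other variable names are the conventional ones for those providers and should be checked against OpenCode's provider docs):

```yaml
environment:
  - ANTHROPIC_API_KEY=your-anthropic-key
  - OPENAI_API_KEY=your-openai-key     # assumed conventional name
  - GEMINI_API_KEY=your-gemini-key     # assumed conventional name
```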


Why HolyCode

I built this because I was tired of re-doing the same setup every time. Installing OpenCode, configuring a headless browser in Docker, fixing UID permission issues, debugging process supervision. Every. Single. Time.

So I packaged everything into one container. Every tool configured. Every bug already fixed. You just pull and go.

|                     | HolyCode            | DIY                  |
|---------------------|---------------------|----------------------|
| Setup time          | Under 2 minutes     | 30-60 minutes        |
| Chromium + Xvfb     | Pre-configured      | Debug yourself       |
| Dev tools           | 50+ pre-installed   | Install one by one   |
| State persistence   | Automatic volumes   | Manual bind mounts   |
| UID/GID permissions | Built-in PUID/PGID  | Dockerfile hacks     |
| Multi-arch          | amd64 + arm64       | Build both yourself  |
| Updates             | docker pull         | Rebuild from scratch |
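The `docker pull` update path expands to the usual Compose sequence; state survives because it lives in the mounted volumes:

```shell
docker compose pull      # fetch the newest coderluii/holycode image
docker compose up -d     # recreate the container with the new image
```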

Your environment should follow you. Not the other way around.

Coming Soon

HolyCode Cloud

Same 50+ tools. Same 10+ providers. Same persistent state. No Docker. No setup. Just open your browser and code.

Get Early Access

Ready to stop rebuilding?

Pull one image. Get every tool. Keep everything between rebuilds. Your AI coding workstation, portable and ready.