How to Install OpenClaw: Complete Setup Guide for Mac, Linux and Windows
Welcome back to the OpenClaw Mastery series — Episode 2. As we covered in our previous guide, OpenClaw is an open-source AI agent framework that makes building autonomous workflows easier. In this episode you'll get a complete, hands-on installation walkthrough for Mac, Linux and Windows, covering every supported method:
- Docker (recommended) — quick, reproducible, and works on every platform (about 5 minutes)
- CLI onboard wizard — guided setup from your terminal using openclaw onboard
- ClawOneClick managed deploy — hosted, managed deployment (about 60 seconds)
- VPS (Hetzner) — low-cost 24/7 operation for $5–10/month
I'll also list prerequisites, show how to connect LLM providers (OpenAI, Anthropic Claude, and DeepSeek-compatible endpoints), and show how to verify your install. The tone is practical; follow the steps and you’ll be running OpenClaw quickly.
Prerequisites — what you need before you begin
Make sure you have these installed and configured:
- Node.js: Node 18+ is supported; Node 20 is a recommended runtime for modern tooling. Use nvm to manage versions if needed.
- Docker: Docker Engine or Docker Desktop. On Windows, Docker Desktop must be configured with WSL2 as the backend. Docker 20.10+ is compatible.
- A terminal/shell: Terminal.app, iTerm2, Windows Terminal, or a Linux shell.
- LLM provider account & API key (for production): OpenAI (OpenAI API key), Anthropic (Claude API key), or a DeepSeek-compatible endpoint. You'll store keys as environment variables or in OpenClaw's secret store.
- (Optional) Git: to fetch examples or the repository.
If you haven't installed Node or Docker yet, follow the official Node (nodejs.org) and Docker (docker.com) guides for your OS. On Windows, ensure WSL2 is enabled and you have a Linux distro installed from the Microsoft Store.
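Before moving on, it's worth confirming the tools above are actually on your PATH. This is a minimal check script — it just reports versions (or a "not found" hint) for the prerequisites listed above:

```shell
# Confirm prerequisite tools are installed; print each version or a "not found" hint.
for tool in node docker git; do
  if command -v "$tool" >/dev/null 2>&1; then
    printf '%s: %s\n' "$tool" "$("$tool" --version 2>/dev/null | head -n 1)"
  else
    printf '%s: not found\n' "$tool"
  fi
done
```

If node reports something below v18, switch versions with nvm before continuing.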

Method A — Docker (recommended, ~5 minutes)
Docker is the fastest and most portable way to run OpenClaw. This method isolates dependencies and is ideal for local testing and production containers.
Pull the latest image (example):
docker pull openclaw/openclaw:latest
Run the container with a mapped port and environment variables for your LLM keys:
docker run -d --name openclaw -p 8080:8080 \
  -e OPENAI_API_KEY="sk-xxx" \
  -e CLAUDE_API_KEY="claude-xxx" \
  openclaw/openclaw:latest
- Change the port mapping (8080) to your preference.
- Provide the API key(s) for the LLM provider(s) you plan to use.
Check container logs to confirm startup:
docker logs -f openclaw
Why Docker? It takes care of platform differences, simplifies upgrades (pull a new image), and is the recommended way to get started quickly.
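If you prefer Compose to a raw docker run, the same container can be described declaratively. This is a sketch built from the image name, port, and environment variables shown above — adjust to whatever your install actually uses:

```yaml
# docker-compose.yml — equivalent of the docker run command above
services:
  openclaw:
    image: openclaw/openclaw:latest
    container_name: openclaw
    ports:
      - "8080:8080"          # host:container — change the host port if 8080 is taken
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - CLAUDE_API_KEY=${CLAUDE_API_KEY}
    restart: unless-stopped
```

Start it with docker compose up -d and follow logs with docker compose logs -f.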
Method B — CLI onboard wizard (interactive)
If you prefer a guided setup in your terminal, use the onboard wizard. The CLI typically walks you through storage, LLM provider selection, and basic configuration.
Install the CLI if required (example package name — consult the official docs if yours differs):
npm install -g openclaw-cli
Run the wizard:
openclaw onboard
Follow prompts to:
- Choose deployment type (local, Docker, or remote)
- Enter your LLM provider and API key (OpenAI, Claude, DeepSeek endpoint)
- Set admin user/email and persistence options
The CLI will produce a ready-to-run command or a docker-compose file. If you selected Docker, it will offer to launch the container for you.
Method C — ClawOneClick managed deploy (60 seconds)
ClawOneClick is the simplest path if you want a hosted, managed deployment without server maintenance (good for demos and small production apps).
- Visit the ClawOneClick dashboard and sign in.
- Click Deploy -> New Instance, choose region and plan.
- Paste your LLM API key(s) and click Deploy.
In about a minute you'll receive an admin URL and credentials. ClawOneClick handles TLS, scaling, and backups for you. This is the fastest route, but it comes with managed-service costs; it's convenient if you prefer zero infrastructure work.

Method D — VPS setup (Hetzner) — cheap 24/7 operation ($5–10/month)
If you want a low-cost 24/7 instance, a small Hetzner Cloud server is a great option. Many community deployments use the smallest CX or basic shared CPU instance in the €4–8 / month range (roughly $5–10/month) depending on currency and options.
Create a Hetzner Cloud project and provision a server (Ubuntu LTS is a good choice).
SSH into the server and install Docker and Docker Compose:
sudo apt update && sudo apt install -y docker.io docker-compose
Pull or upload your OpenClaw docker-compose.yml (from the CLI or repo), inject your LLM keys as environment variables or use a secrets manager, then launch:
docker compose up -d
(If your distro's docker-compose package installed Compose v1, run docker-compose up -d instead.)
Configure a firewall (ufw) and set up a reverse proxy (nginx) with TLS (Let's Encrypt) for secure public access.
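For the reverse-proxy step, a minimal nginx server block could look like the following. The domain is a placeholder and the upstream port assumes the 8080 mapping used earlier — substitute your own values, then run certbot to add TLS:

```nginx
# /etc/nginx/sites-available/openclaw — minimal reverse-proxy sketch
server {
    listen 80;
    server_name openclaw.example.com;        # placeholder domain

    location / {
        proxy_pass http://127.0.0.1:8080;    # OpenClaw container port from above
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```

Enable it with a symlink into sites-enabled, verify with sudo nginx -t, reload nginx, then run sudo certbot --nginx -d openclaw.example.com for a Let's Encrypt certificate.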
Tips:
- Use environment variables or Hetzner's secret store for API keys — avoid committing keys to git.
- Monitor costs and bandwidth; the smallest instance typically handles light-to-moderate usage.
Connecting an LLM provider
OpenClaw supports multiple LLM backends via provider adapters. Typical providers:
- OpenAI — set OPENAI_API_KEY
- Anthropic / Claude — set CLAUDE_API_KEY (and endpoint if required)
- DeepSeek-compatible endpoints — set DEEPSEEK_API_KEY or a custom URL
How to configure:
- For Docker: pass environment vars in the docker run or docker-compose file.
- For CLI onboard: the wizard will ask for keys and set config files for you.
- For VPS: place keys in a .env file loaded by docker-compose, or use system secrets.
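For the .env approach, the file is just KEY=value lines next to your docker-compose.yml (the values below are placeholders):

```
# .env — loaded automatically by docker compose; add this file to .gitignore
OPENAI_API_KEY=sk-xxx
CLAUDE_API_KEY=claude-xxx
```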
Example docker-compose excerpt:
environment:
  - OPENAI_API_KEY=${OPENAI_API_KEY}
  - CLAUDE_API_KEY=${CLAUDE_API_KEY}
Always keep keys out of version control. Rotate keys regularly and use provider-specific rate limits and cost controls.
Verifying your installation
Once you’ve launched OpenClaw, verify it's healthy and connected to your LLM provider.
- Check process / container status:
  - Docker: docker ps, docker logs -f openclaw
  - Systemd (VPS): sudo systemctl status openclaw (if you set up a service)
Use the CLI status (if available):
openclaw status
Test the API or UI:
- If you have a local UI, open http://localhost:8080 (or the port you mapped). The CLI or container logs will usually show the access URL.
- Send a lightweight API request (use a test prompt) and verify a valid LLM response. The onboard wizard often runs an initial test message.
Confirm LLM connectivity:
- Check logs for successful API handshakes and for authentication or permission errors.
- If you see rate-limit or authentication errors, verify you're using the right key and that billing is active on the provider side.
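A quick port probe ties these checks together. The port and root path match the 8080 mapping used earlier in this guide — adjust them to your own mapping:

```shell
# Probe the mapped port; print a hint if nothing is listening.
if curl -fsS --max-time 5 http://localhost:8080/ >/dev/null 2>&1; then
  echo "OpenClaw is responding on port 8080"
else
  echo "No response on port 8080 - check 'docker logs -f openclaw'"
fi
```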

Troubleshooting — common issues
- Windows users: Docker Desktop must use WSL2. If containers fail to start, confirm WSL2 is enabled and Docker is set to use it.
- Auth errors: re-check API keys and environment variable names. Provider dashboards can show blocked or expired keys.
- Port conflicts: change host port mapping if 8080 (or another default) is already in use.
- Performance on cheap VPS: consider moving heavy workloads to managed GPUs or provider-hosted LLM endpoints if latency or throughput is important.
Next steps and best practices
- For production, secure your instance with HTTPS (Let's Encrypt), a reverse proxy, and network rules.
- Use provider quotas and request batching to control API costs.
- Back up configuration and persist storage outside ephemeral containers.
- Monitor system metrics and LLM request logs.
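For the backup and persistence point above, mount state onto the host so it survives container rebuilds. The container path here is an assumption — check the image documentation for where OpenClaw actually writes its data:

```yaml
# Compose excerpt: persist state on a host directory you can back up
services:
  openclaw:
    image: openclaw/openclaw:latest
    volumes:
      - ./openclaw-data:/app/data   # /app/data is a guess; verify against the image docs
```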
In the next episode of OpenClaw Mastery we'll build your first autonomous workflow and show how to author reliable prompts and checks. If you missed Episode 1, go back to “What Is OpenClaw? The Open-Source AI Agent Everyone's Talking About” for the high-level view before you dive deeper.
This guide focused on practical installation and was designed so you can get OpenClaw up and running quickly across platforms.
Happy deploying — and if you run into a specific error, drop the error text and your platform and I’ll help troubleshoot in the comments or the next article.
Frequently Asked Questions
Which install method should I choose?
Use Docker for most cases — it's fast, portable, and recommended for local testing and production containers. Choose ClawOneClick if you want a managed host, or Hetzner VPS for cheap 24/7 operation.
What LLM providers does OpenClaw support?
OpenClaw supports multiple providers via adapters; common choices are OpenAI and Anthropic Claude. You can also connect DeepSeek-compatible endpoints by supplying the provider endpoint and API key.
Do I need WSL2 on Windows?
Yes — if you plan to use Docker Desktop on Windows, WSL2 must be enabled and configured as Docker's backend for best compatibility.
How do I securely store API keys?
Avoid committing keys to git. Use environment variables, Docker secrets, or a secrets manager provided by your cloud/VPS provider to keep keys out of version control.
How much does a Hetzner VPS cost to run OpenClaw 24/7?
A small Hetzner instance suitable for light workloads typically falls in the $5–10/month range depending on region and plan; monitor resource use to pick the right tier.