I timed myself the last time I deployed OpenClaw from scratch. Four minutes and thirty-seven seconds — from clicking "Buy Now" to having a working AI assistant respond to my first Telegram message. And that included fumbling for my API key in a password manager.
The deployment itself is genuinely fast. What trips people up is the stuff that goes wrong after — daemon not starting, model not responding, pairing codes expiring. So this guide covers both: the quick deployment AND the troubleshooting notes I wish someone had given me on day one.
Go to the Tencent Cloud Lighthouse Special Offer page:
Pick at least 2 cores / 4 GB RAM. Choose an overseas region (Singapore, Silicon Valley) if you're connecting to international platforms like WhatsApp or Telegram; choose a mainland China region if you're using domestic models or platforms.
Once the instance is ready, go to the Tencent Cloud Console → your Lighthouse instance → Application Management.
Paste your LLM API key (DeepSeek, OpenAI, Claude, Gemini — whatever you prefer) into the Models section. Click Add and Apply. Wait for "in use" status.
# Alternatively, if you prefer command-line setup:
# SSH into your instance first, then:
clawdbot onboard
# The wizard will walk you through model configuration interactively
Do not hardcode API keys in any script files. Always use the console panel or environment variables.
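If you go the environment-variable route, here's a minimal sketch. The variable name below is an assumption for illustration; check your provider's docs for the exact name OpenClaw reads.

```shell
# Hypothetical variable name -- confirm the exact one OpenClaw expects.
# Set for the current session only; nothing is written to a script file.
export DEEPSEEK_API_KEY="sk-your-key-here"

# Verify it's set without echoing the whole secret (bash substring syntax):
echo "${DEEPSEEK_API_KEY:0:6}..."
```

For persistence across logins, put the export in a file with restricted permissions (e.g. `chmod 600`) rather than in any deployment script.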
In the Lighthouse web terminal (OrcaTerm), run:
clawdbot onboard
Follow the wizard's prompts to finish configuration. Then go to your messaging app, send a message to your bot, grab the pairing code, and run:
openclaw pairing approve telegram <your-code-here>
loginctl enable-linger $(whoami) && export XDG_RUNTIME_DIR=/run/user/$(id -u)
clawdbot daemon install
clawdbot daemon start
clawdbot daemon status
Status healthy? You're done. Close everything. Your AI assistant is live.
Now for the part that actually saves you hours. Here are the most common issues I've hit (and seen others hit) during and after deployment.
Symptom: You send a message to the bot; it acknowledges receipt but never replies with actual content.
Cause: Usually a region mismatch. If your Lighthouse instance is in mainland China but you're using an overseas model API (like OpenAI or Anthropic), the network connection may be blocked or throttled.
Fix: Either switch to a domestic model provider (DeepSeek, Tencent Hunyuan) or redeploy your instance in an overseas region. The Custom Model Tutorial has the full list of supported providers and their base URLs.
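To check whether your instance can reach an overseas API at all, run a quick probe from the server. OpenAI's public base URL is shown as an example; substitute your own provider's.

```shell
# Prints the HTTP status code (e.g. 401 means reachable, just unauthenticated).
# A timeout or connection error points to a network/region problem.
curl -sS -o /dev/null -w "%{http_code}\n" --max-time 10 \
  https://api.openai.com/v1/models || echo "unreachable from this region"
```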
Symptom: clawdbot daemon start returns an error or the status shows "inactive."
Fix: Make sure you ran the linger command first:
loginctl enable-linger $(whoami)
export XDG_RUNTIME_DIR=/run/user/$(id -u)
Then reinstall and restart:
clawdbot daemon install
clawdbot daemon start
If it still fails, check the logs:
clawdbot daemon logs
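Since the daemon runs as a systemd user service, it's worth confirming that lingering actually took effect; without it, the service dies when your SSH session ends. The unit name below is not guaranteed, so the sketch lists your user units rather than assuming one.

```shell
# Should print "Linger=yes"; "Linger=no" means enable-linger didn't stick.
loginctl show-user "$(whoami)" --property=Linger

# Find the daemon's actual systemd user unit (name is installation-dependent):
systemctl --user list-units --type=service | grep -i claw || true
```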
Symptom: You enter the pairing code but get a rejection.
Fix: Pairing codes expire quickly (they have a short time-to-live). Send a new message in your messaging app to generate a fresh code, then approve it immediately.
Symptom: Your API bill is higher than expected after a few days.
Cause: OpenClaw carries full conversation context with each request. Long conversations with many turns accumulate tokens fast.
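The arithmetic behind this is worth seeing once. Assuming roughly 500 tokens of new content per turn (an illustrative number), resending the full history each request makes billed tokens grow quadratically with conversation length:

```shell
total=0; context=0
for turn in $(seq 1 10); do
  context=$((context + 500))   # history grows by ~500 tokens each turn
  total=$((total + context))   # each request resends the entire history
done
echo "$total"   # 27500 tokens billed over 10 turns, vs 5000 if only new text were sent
```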
Fix: Start a fresh conversation for each unrelated task instead of continuing one long thread, and consider routing routine traffic to a cheaper model. Watch your provider's usage dashboard for the first few days to confirm spend is where you expect.
Symptom: Everything works while your terminal is open, but the bot goes silent after you disconnect.
Fix: You forgot the daemon setup. Go back and run the daemon commands from the deployment section above. On templates from 2026.1.29 and later, the daemon should be pre-configured, but it's worth verifying with clawdbot daemon status.
Symptom: You try to access the OpenClaw web interface via your server's public IP and get nothing.
Cause: The WebUI is intentionally not exposed to the public internet by default; exposing it would be a security risk.
Fix: Use the Lighthouse console's Application Management panel instead. If you absolutely need WebUI access, set up an SSH tunnel or follow the secure access method in the deployment guide.
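A minimal SSH tunnel sketch, assuming the WebUI listens on localhost port 8080 on the server (the actual port may differ; check your instance's configuration):

```shell
# Forward local port 8080 to the WebUI bound to localhost on the server.
# Replace the user and host; -N opens the tunnel without a remote shell.
ssh -N -L 8080:127.0.0.1:8080 ubuntu@<your-server-ip>
# Then browse to http://localhost:8080 on your own machine.
```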
Five minutes to deploy, six common issues to know about. That's the real-world OpenClaw experience — fast to start, and manageable to maintain once you know the gotchas.
Ready to try it yourself? Head to the Tencent Cloud Lighthouse Special Offer:
Happy deploying.