This note covers the setup I use to run [[OpenClaw]] on a [[Google Cloud Compute Engine]] instance, using [[VertexAI]] for authentication and model access, including [[Anthropic]] models. Since that combination is not supported at the moment, I use [[LiteLLM]] as a proxy between [[OpenClaw]] and [[VertexAI]].
This note covers the installation and setup; for the configuration and the skills included, see [[Clippy]], the assistant's page.
# Setting up [[OpenClaw]]
## Provisioning the instance
There is no need for a dedicated machine unless you plan to run the models locally. I don't, so a basic instance does the job. In my setup I forward two ports locally: one for [[OpenClaw]]'s gateway and one for the [[Setting up Headless Obsidian on a Remote Server|headless Obsidian]] instance running on it over VNC.
```bash
# SSH into the instance
gcloud compute ssh <YOUR_INSTANCE> --zone=us-central1-f
# SSH with port forwarding (OpenClaw + VNC)
gcloud compute ssh <YOUR_INSTANCE> --zone=us-central1-f -- -L 18789:localhost:18789 -L 5900:localhost:5900
```
# Installation
## Prerequisites
```bash
# Node.js 22
curl -fsSL https://deb.nodesource.com/setup_22.x | sudo -E bash -
sudo apt-get install -y nodejs
# Git
sudo apt-get install -y git
# Homebrew (Linux)
NONINTERACTIVE=1 /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
echo 'eval "$(/home/linuxbrew/.linuxbrew/bin/brew shellenv bash)"' >> ~/.bashrc
```
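A quick sanity check that everything landed on `PATH` (not part of the install itself; if `brew` is missing, re-source `~/.bashrc` first):

```shell
# Print the version of each prerequisite, or flag it as missing
for tool in node git brew; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: $("$tool" --version | head -n 1)"
  else
    echo "$tool: not found"
  fi
done
```

Node should report `v22.x` here; an older version means the NodeSource setup script didn't take effect.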
## OpenClaw
```bash
sudo npm install -g openclaw@latest
```
On my machine, this installed version 2026.2.1.
## [[LiteLLM]] Proxy
[[LiteLLM]] is used to proxy [[VertexAI]] models (including Claude via Vertex) to [[OpenClaw]], since [[OpenClaw]] doesn't natively support [[Anthropic]] via [[VertexAI]].
```bash
# Install
sudo apt-get install -y python3-pip
pip3 install 'litellm[proxy]' --break-system-packages
pip3 install -U 'google-cloud-aiplatform>=1.38' --break-system-packages
```
# Configuration
## [[VertexAI]] Authentication
[[VertexAI]] access uses **Application Default Credentials** (ADC).
```bash
# Set up user ADC (run interactively)
gcloud auth application-default login --no-launch-browser
```
The ADC file is stored at `~/.config/gcloud/application_default_credentials.json`.
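Before wiring up [[LiteLLM]], it's worth confirming the credentials actually work (this assumes `gcloud` is installed on the instance, which it is on Compute Engine images):

```shell
# The token print fails loudly if ADC is missing or expired
ls -l ~/.config/gcloud/application_default_credentials.json
if gcloud auth application-default print-access-token >/dev/null; then
  echo "ADC OK"
fi
```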
## [[LiteLLM]] Config
**Location**: `~/.litellm/config.yaml`
```yaml
model_list:
- model_name: claude-opus-4-5
litellm_params:
model: vertex_ai/claude-opus-4-5
vertex_project: <YOUR_GCP_PROJECT>
vertex_location: us-east5
- model_name: gemini-2.5-pro
litellm_params:
model: vertex_ai/gemini-2.5-pro
vertex_project: <YOUR_GCP_PROJECT>
vertex_location: us-central1
- model_name: gemini-2.5-flash
litellm_params:
model: vertex_ai/gemini-2.5-flash
vertex_project: <YOUR_GCP_PROJECT>
vertex_location: us-central1
litellm_settings:
drop_params: true
general_settings:
master_key: "<YOUR_MASTER_KEY>"
```
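With the config in place, I run the proxy in the foreground once and hit its OpenAI-compatible endpoint before handing it over to [[Systemd]]. Port `4000` matches the service unit further down; the model names returned should be the ones defined in `model_list`:

```shell
# Start the proxy in one terminal first:
#   litellm --config ~/.litellm/config.yaml --port 4000
# Then list the models it exposes:
LITELLM_PORT=4000
curl -s "http://localhost:${LITELLM_PORT}/v1/models" \
  -H "Authorization: Bearer <YOUR_MASTER_KEY>" || echo "proxy not reachable"
```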
## [[OpenClaw]] Config
**Location**: `~/.openclaw/openclaw.json`
```json
{
"agents": {
"defaults": {
"model": {
"primary": "litellm/claude-opus-4-5"
}
}
},
"models": {
"mode": "merge",
"providers": {
"litellm": {
"baseUrl": "http://localhost:4000/v1",
"apiKey": "<YOUR_MASTER_KEY>",
"api": "openai-completions",
"models": [
{
"id": "claude-opus-4-5",
"name": "Claude Opus 4.5 (via LiteLLM/Vertex)",
"contextWindow": 200000,
"maxTokens": 64000
},
{
"id": "gemini-2.5-pro",
"name": "Gemini 2.5 Pro (via LiteLLM/Vertex)",
"contextWindow": 1000000,
"maxTokens": 8192
},
{
"id": "gemini-2.5-flash",
"name": "Gemini 2.5 Flash (via LiteLLM/Vertex)",
"contextWindow": 1000000,
"maxTokens": 8192
}
]
}
}
},
"gateway": {
"mode": "local",
"auth": {
"token": "<YOUR_GATEWAY_TOKEN>"
}
}
}
```
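A malformed `openclaw.json` is an easy way to get a silently failing gateway, so I lint it before restarting the service. A minimal sketch using only the stdlib; the key names follow the config above, and the inline `sample` is just for illustration:

```python
import json

def check_openclaw_config(text: str) -> list[str]:
    """Return a list of problems found in an openclaw.json document."""
    problems = []
    cfg = json.loads(text)  # raises ValueError on broken JSON
    providers = cfg.get("models", {}).get("providers", {})
    if "litellm" not in providers:
        problems.append("missing models.providers.litellm")
    else:
        for key in ("baseUrl", "apiKey", "models"):
            if key not in providers["litellm"]:
                problems.append(f"litellm provider missing '{key}'")
    # The default model must point at an id the provider actually declares
    primary = cfg.get("agents", {}).get("defaults", {}).get("model", {}).get("primary", "")
    ids = {m.get("id") for m in providers.get("litellm", {}).get("models", [])}
    if primary and primary.removeprefix("litellm/") not in ids:
        problems.append(f"primary model '{primary}' not in provider model list")
    return problems

sample = """{"agents": {"defaults": {"model": {"primary": "litellm/claude-opus-4-5"}}},
 "models": {"providers": {"litellm": {"baseUrl": "http://localhost:4000/v1",
  "apiKey": "k", "models": [{"id": "claude-opus-4-5"}]}}}}"""
print(check_openclaw_config(sample))  # → []
```

In practice I point it at `~/.openclaw/openclaw.json` instead of the inline sample.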
# [[Systemd]] Services
All services run independently of SSH sessions and start on boot.
## [[LiteLLM]] Service
**File**: `/etc/systemd/system/litellm.service`
```ini
[Unit]
Description=LiteLLM Proxy Server
After=network.target
[Service]
Type=simple
User=dvicente
Group=dvicente
WorkingDirectory=/home/dvicente
Environment="PATH=/home/dvicente/.local/bin:/usr/local/bin:/usr/bin:/bin"
Environment="HOME=/home/dvicente"
Environment="GOOGLE_APPLICATION_CREDENTIALS=/home/dvicente/.config/gcloud/application_default_credentials.json"
ExecStart=/home/dvicente/.local/bin/litellm --config /home/dvicente/.litellm/config.yaml --port 4000
Restart=always
RestartSec=10
[Install]
WantedBy=multi-user.target
```
## [[OpenClaw]] Service
**File**: `/etc/systemd/system/openclaw.service`
```ini
[Unit]
Description=OpenClaw Gateway Server
After=network-online.target litellm.service
Wants=network-online.target litellm.service
[Service]
Type=simple
User=dvicente
Group=dvicente
WorkingDirectory=/home/dvicente
ExecStart=/usr/bin/node /usr/lib/node_modules/openclaw/dist/index.js gateway --port 18789
Restart=always
RestartSec=5
KillMode=process
Environment="HOME=/home/dvicente"
Environment="PATH=/home/linuxbrew/.linuxbrew/bin:/home/dvicente/.local/bin:/usr/local/bin:/usr/bin:/bin"
Environment="NODE_ENV=production"
Environment="GOOGLE_CLOUD_PROJECT=<YOUR_GCP_PROJECT>"
Environment="CLOUDSDK_CORE_PROJECT=<YOUR_GCP_PROJECT>"
Environment="OPENCLAW_GATEWAY_PORT=18789"
Environment="OPENCLAW_GATEWAY_TOKEN=<YOUR_GATEWAY_TOKEN>"
[Install]
WantedBy=multi-user.target
```
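After writing both unit files, I let systemd lint them before enabling anything; `systemd-analyze verify` catches typos in directive names and bad paths:

```shell
# Lint the unit files; prints nothing when they are clean
for unit in litellm openclaw; do
  systemd-analyze verify "/etc/systemd/system/${unit}.service" || true
done
```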
Note: This replaces any user-level [[Systemd]] service (`~/.config/systemd/user/openclaw-gateway.service`) that [[OpenClaw]] may have installed. If you see lock conflicts, remove the user service:
```bash
systemctl --user stop openclaw-gateway.service
systemctl --user disable openclaw-gateway.service
rm ~/.config/systemd/user/openclaw-gateway.service
```
## Service Management
```bash
# Pick up new or changed unit files
sudo systemctl daemon-reload
# Enable all services (auto-start on boot)
sudo systemctl enable litellm openclaw
# Start/Stop/Restart
sudo systemctl start litellm openclaw
sudo systemctl restart openclaw
# Check status
sudo systemctl status litellm openclaw
# View logs
sudo journalctl -u litellm -f
sudo journalctl -u openclaw -f
```
# Accessing [[OpenClaw]] Gateway
## Web UI
1. Port forward: `gcloud compute ssh <YOUR_INSTANCE> --zone=us-central1-f -- -L 18789:localhost:18789`
2. Open: `http://localhost:18789/?token=<YOUR_GATEWAY_TOKEN>`
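To confirm the gateway is actually up after forwarding the port (a plain HTTP check against the UI path; any `2xx`/`3xx` code means the service is listening):

```shell
# Expect an HTTP status code from the forwarded gateway port; 000 means no connection
curl -s -o /dev/null -w "%{http_code}\n" "http://localhost:18789/" || echo "gateway not reachable"
```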
# To-Do
- [x] Configure WhatsApp Business using a landline ⏳ 2026-02-05 📅 2026-02-07 ✅ 2026-02-05
- [x] Configure it to connect to mail/calendar 🔼 ➕ 2026-02-06 ⏳ 2026-02-06 ✅ 2026-02-06