Self‑hosted AI assistant on Linux with PicoClaw

Use PicoClaw to run a self‑hosted AI assistant on your own Linux server, VPS, or homelab node with predictable resource usage and full control of your data.

Instead of relying on SaaS bots that run in someone else's cloud, you can deploy PicoClaw on any small Linux box and keep prompts, logs, and workflows under your control. PicoClaw's tiny memory footprint makes it ideal for always‑on cron jobs, webhooks, and background agents.

1. Choose where to host PicoClaw

You can run PicoClaw on:

  • A low‑cost VPS
  • A homelab NUC or mini‑PC
  • An on‑prem server inside your organisation's network

As long as the host can run Go binaries or Docker containers, it can run PicoClaw as your self‑hosted AI assistant.
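Before downloading a binary, check the host's CPU architecture so you pick the matching release asset. An x86_64 result corresponds to the linux-amd64 binary used in the install step; ARM boards would need an arm64 build if one is published, so check the releases page:

```shell
# Print the host CPU architecture.
# x86_64  -> picoclaw-linux-amd64
# aarch64 -> an arm64 build, if the project publishes one
uname -m
```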

2. Install PicoClaw (binary or Docker)

On most distributions you can either download the binary or use the Docker image.

Binary install

curl -L "https://github.com/sipeed/picoclaw/releases/latest/download/picoclaw-linux-amd64" -o picoclaw
chmod +x picoclaw
sudo mv picoclaw /usr/local/bin/

picoclaw --help
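To keep the binary running as an always‑on service rather than a foreground process, you can wrap it in a systemd unit. This is only a sketch: the config path mirrors the Docker example below, and the dedicated picoclaw user and bare ExecStart invocation are assumptions, so check picoclaw --help for the flags your version expects.

```ini
# /etc/systemd/system/picoclaw.service
# Paths, the `picoclaw` user, and the plain ExecStart invocation are
# assumptions; adjust them to match your install and `picoclaw --help`.
[Unit]
Description=PicoClaw AI assistant
After=network-online.target
Wants=network-online.target

[Service]
ExecStart=/usr/local/bin/picoclaw
Restart=on-failure
# Run as an unprivileged user you have created for this purpose.
User=picoclaw

[Install]
WantedBy=multi-user.target
```

Enable it with "sudo systemctl enable --now picoclaw" and follow its output with "journalctl -u picoclaw -f".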

Docker

docker run --rm -it \
  -v "$PWD/config.yaml:/etc/picoclaw/config.yaml" \
  ghcr.io/sipeed/picoclaw:latest
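For an always‑on container, the same image and config mount can be expressed as a Compose file with a restart policy. A sketch (the service name is arbitrary):

```yaml
# docker-compose.yml
services:
  picoclaw:
    image: ghcr.io/sipeed/picoclaw:latest
    restart: unless-stopped
    volumes:
      - ./config.yaml:/etc/picoclaw/config.yaml
```

Start it in the background with "docker compose up -d".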

See the Docker page for more container examples.

3. Schedule jobs with cron

One of the strongest use cases for PicoClaw is cron‑driven AI automation: tasks that run every few minutes, hours, or days to summarise information and send results somewhere useful.

Typical patterns include:

  • Summarising logs or error reports into a daily digest
  • Generating status updates for teams or customers
  • Monitoring external APIs and turning raw responses into human‑readable summaries

Because PicoClaw is lightweight, you can run many such jobs on a single small server.
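As a sketch, a daily log digest can be wired up with a /etc/cron.d entry. The "picoclaw agent -p" style invocation and the log paths are assumptions, not documented flags; check picoclaw --help for the actual CLI on your version:

```
# /etc/cron.d/picoclaw-digest: run a daily error digest at 07:00.
# The `agent -p` flags and both log paths are placeholders; adjust for your setup.
0 7 * * * root /usr/local/bin/picoclaw agent -p "Summarise yesterday's errors from /var/log/myapp/error.log into a short digest" >> /var/log/picoclaw-digest.log 2>&1
```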

4. React to webhooks and events

For near‑real‑time automation, expose a small HTTP endpoint that triggers PicoClaw workflows when webhooks arrive from your tools or devices.

This lets you build a self‑hosted AI automation hub that connects chat, monitoring, CI, and other systems through LLMs.
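A minimal handler script illustrates the pattern, assuming a generic webhook receiver (a reverse proxy or a tool such as adnanh/webhook) invokes it with the JSON payload on stdin. The "picoclaw agent -p" invocation is hypothetical; check picoclaw --help for the real CLI:

```shell
#!/bin/sh
# Hypothetical webhook handler: the receiver passes the raw request body on
# stdin; we wrap it in a prompt and hand it to PicoClaw. The `agent -p`
# invocation is an assumption, not a documented flag.
payload=$(cat </dev/stdin)    # raw request body from the webhook receiver

# Wrap the payload in a prompt for the assistant.
prompt="Summarise this webhook event for the on-call channel: $payload"

if command -v picoclaw >/dev/null 2>&1; then
  picoclaw agent -p "$prompt"
else
  # Handy when dry-running the handler on a machine without PicoClaw.
  echo "picoclaw not found; would send prompt: $prompt"
fi
```

Point your webhook receiver at this script, and each incoming event becomes a PicoClaw run.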

5. Next steps

From here you can: