Why am I running Clawdbot on a $3,000 computer?

Jan 25, 2026 • 6 minutes to read

You don't seriously think AGI can run on a Raspberry Pi, do you?

It's only January 2026, and we've already caught a genuine glimpse of AGI—not in a sterile research lab, but in a cheeky little agent called Clawdbot. People are messaging it, voice-calling it, even yelling at it across the kitchen while cooking dinner. And it just … does stuff. Real stuff.

For everyday folks, Clawdbot is straight-up wizardry. You message it via Telegram, iMessage, or voice, and it handles things.

Take Alex Finn (@AlexFinnX), who installed Clawdbot on his Mac Mini and let it loose for a day. While he “lived his life,” it:

  • Wrote 3 YouTube scripts
  • Built a custom daily AI news brief
  • Created its own project management system
  • Assembled a full second-brain setup to replace Notion

Alex called it “AGI in practice.”

For the technical crowd, Clawdbot is essentially a souped-up, always-on wrapper around an LLM's agentic capabilities (the name “Clawd” is a dead giveaway of its origins in Claude and Claude Code). You give it natural-language commands, and it writes and executes code to get results, with no copying and pasting required. It can call APIs, control apps, crunch data, and loop until the job's done.
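To make that loop concrete, here is a minimal sketch of how an agent of this kind can work. Everything in it is illustrative: the `call_llm` helper, the message format, and the run-Python-and-feed-back-the-output action are assumptions for the example, not Clawdbot's actual internals.

```python
# Minimal agent-loop sketch (illustrative only, not Clawdbot's real code).
# `call_llm` stands in for whatever LLM backend you have configured; it is
# expected to answer with either a final reply or a snippet of code to run.
import subprocess


def call_llm(messages: list[dict]) -> dict:
    """Hypothetical helper: send `messages` to an LLM and parse the reply into
    {"action": "final", "answer": str} or {"action": "run_code", "code": str}."""
    raise NotImplementedError


def agent_loop(task: str, max_steps: int = 10) -> str:
    messages = [{"role": "user", "content": task}]
    for _ in range(max_steps):
        reply = call_llm(messages)
        if reply["action"] == "final":
            return reply["answer"]  # the job is done
        # The model asked to run code: execute it and feed the output back in.
        result = subprocess.run(
            ["python", "-c", reply["code"]],
            capture_output=True, text=True, timeout=60,
        )
        messages.append({"role": "assistant", "content": reply["code"]})
        messages.append({"role": "user", "content": result.stdout + result.stderr})
    return "Stopped after too many steps."
```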

But where do you run this personal AGI?

Clawdbot's creator, Peter Steinberger (@steipete), noted that you can run Clawdbot anywhere, even on a free-tier AWS virtual machine or a $10 Raspberry Pi. That is certainly true, especially since Clawdbot is technically a wrapper around AI services like OpenAI, Anthropic's Claude, and Google's Gemini.

But in reality, the Mac Mini has become many people's first choice for running Clawdbot. It has allegedly driven Mac Mini interest in January to the same heights as Black Friday! Even Google's lead for AI products, Logan Kilpatrick (@OfficialLoganK), announced he is buying a Mac Mini for this …

And there are good reasons for that!

  1. Clawdbot knows everything about you: API keys, calendars, emails, sometimes even sensitive logins. Running it in the cloud feels like handing your diary to a stranger. It could also hold your credit cards or crypto private keys (for @x402pay micro-payments) to make purchases on your behalf. I want the option to pull the power cord if the bot goes awry!
  2. The Mac ecosystem has deep hooks into the apps people use every day: Shortcuts, AppleScript, Messages, Photos, Health. Given the proper permissions on a Mac Mini, Clawdbot drives them all natively (see the sketch after this list).
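To give a flavour of what “driving macOS natively” means, the snippet below uses the built-in `osascript` command to trigger AppleScript from Python. It is only a sketch of the mechanism any shell-capable agent can use, not Clawdbot's actual macOS integration.

```python
# Any agent with shell access on a Mac can reach AppleScript through the
# built-in `osascript` tool. Illustrative sketch, not Clawdbot's integration.
import subprocess


def notify(title: str, body: str) -> None:
    """Pop a macOS notification via AppleScript."""
    script = f'display notification "{body}" with title "{title}"'
    subprocess.run(["osascript", "-e", script], check=True)


if __name__ == "__main__":
    notify("Clawdbot", "Your daily brief is ready.")
```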

One doctor (@jeffdavismd) asked about a local Clawdbot specifically for handling protected health information (PHI).

I have two Mac Minis in my house running personal services, a far smaller footprint than the “12 Mac Minis + 12 Claude Max Plans” setup of Jeff Tang (@jefftangx), but I am actually opting for even more capable machines to run Clawdbot.

The renaissance of the “personal cloud”, exemplified by Olares OS, lets me run a large number of open-source utilities and SaaS alternatives at home. All those services are properly isolated and containerized, so they do not create security issues for me. Some examples of the open-source utilities I run on Olares:

  • Ghost as an alternative to WordPress
  • NocoDB as an alternative to Airtable
  • OpenWebUI and LobeChat as alternatives to ChatGPT
  • Immich as an alternative to Google Photos
  • Perplexica as an alternative to Perplexity AI

And much more!

Best of all? I can easily run Olares OS on almost any personal computer, including the Olares One and the Mac Mini.

A private brain for your AGI

However, all these private apps and private data would count for little if every request and every conversation still had to be sent to a cloud AI. Fortunately, companies are now shipping absurdly powerful mini-workstations designed explicitly to host the local AI brain.

  • The $3,000 Olares One with an RTX 5090 can run SOTA open-source LLMs at usable speed. It is a runaway Kickstarter success.

  • Nvidia's $4,000 DGX Spark is basically a baby data-center in a shoebox.

  • Even Apple’s Mac Studio is in the game, though it's slower on raw inference.

Smart Clawdbot users are already doing hybrid setups: a fast local model acts as the router, handling any task that touches private data, while non-sensitive reasoning, web searches, and heavy coding get forwarded to frontier cloud models (Claude 4.5, Grok 4.1, GPT 5.2, Gemini 3.0, whatever is SOTA this week). You pay cloud API bills only when you actually need bleeding-edge capability, and your calendar, finances, and embarrassing voice memos never leave your house.
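As a rough sketch of that routing logic, here is what the decision could look like. The keyword heuristic and the `ask_local` / `ask_cloud` helpers are placeholders; a real setup could let the local model itself decide what counts as sensitive.

```python
# Hybrid routing sketch: private-looking requests stay on the local model,
# everything else is forwarded to a frontier cloud model. All names and the
# keyword heuristic are placeholders for illustration.
PRIVATE_HINTS = ("calendar", "email", "bank", "health", "password", "photo")


def looks_private(prompt: str) -> bool:
    """Naive stand-in for a real sensitivity classifier."""
    return any(hint in prompt.lower() for hint in PRIVATE_HINTS)


def ask_local(prompt: str) -> str:
    raise NotImplementedError  # e.g. an open-source LLM served on the Olares One


def ask_cloud(prompt: str) -> str:
    raise NotImplementedError  # e.g. whichever frontier API is SOTA this week


def route(prompt: str) -> str:
    return ask_local(prompt) if looks_private(prompt) else ask_cloud(prompt)
```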

Clawdbot is designed to work with multiple AI services: you pick a “primary” LLM and VLM, plus multiple backup choices in a whitelist. As Jensen Huang put it in his CES 2026 keynote: “AI can span from cloud services to desktop and edge applications.”
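A primary-plus-whitelist setup boils down to ordered fallback. The sketch below shows the idea with made-up provider names and a hypothetical `complete` helper; Clawdbot's real configuration format may look nothing like this.

```python
# Ordered-fallback sketch over a provider whitelist (primary first).
# Provider names and the `complete` helper are hypothetical.
PROVIDER_WHITELIST = ["local-llm", "anthropic", "openai", "google"]


def complete(provider: str, prompt: str) -> str:
    raise NotImplementedError  # call the given provider's API here


def ask_with_fallback(prompt: str) -> str:
    last_error: Exception | None = None
    for provider in PROVIDER_WHITELIST:
        try:
            return complete(provider, prompt)
        except Exception as err:  # provider down, rate-limited, etc.
            last_error = err
    raise RuntimeError("All whitelisted providers failed") from last_error
```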

He's right. Hybrid intelligence is the future. The last mile of AGI isn't more parameters — it's your private data, safely fused with shared public intelligence. That combination is worth far more than any hardware price tag.

My private data is the moat no cloud company can replicate. And for the cost of a decent gaming PC, I get to keep it.

So yeah, I bought the Mac Minis and Mac Studios. Then I bought the Olares One pre-order. Clawdbot now lives entirely on my local network, sipping electricity and occasionally roasting my music choices, which it can access using my private credentials.

Welcome to 2026. Your AI just became a roommate. It pays rent by saving you hours every day.
