How to Build Your Own Hermes + OpenClaw!
The video provides a step-by-step guide to building a free, local AI coding agent called 'Hermes plus OpenClaw' using PyAgent, Ollama, Gemma 4, and Parallel's free web search MCP. The setup requires no paid API keys, runs entirely on the user's machine, and includes web search capability. The presenter walks through installation, configuration, and practical use cases for the tool.
Summary
The video is a tutorial walkthrough for building a fully local, cost-free AI coding agent dubbed 'Hermes plus OpenClaw.' The presenter explains that most CLI agents require paid API subscriptions to services like OpenAI or Anthropic, but this setup avoids all fees by running Google's open-source Gemma 4 model locally via Ollama — a free tool that allows AI models to run on personal hardware without sending data to external servers.
The core components are broken down clearly: PyAgent serves as the CLI wrapper that ties everything together; Ollama handles local model execution; Gemma 4 is the underlying AI model; and Parallel's newly released free web search MCP (Model Context Protocol) plugin gives the local agent the ability to retrieve live information from the internet. The presenter emphasizes that MCP functions like a plugin system for AI agents, allowing extensibility to other tools and services.
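The video does not show code for this layer, but the division of labor above can be sketched: Ollama exposes a local HTTP API (by default on port 11434), and a CLI wrapper talks to it by posting prompts to the generate endpoint. The function names below are invented for illustration, and the network call only succeeds if Ollama is actually running with the model pulled:

```python
import json
import urllib.request

def build_generate_payload(prompt: str, model: str = "gemma4") -> dict:
    """Request body for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_model(prompt: str, host: str = "http://localhost:11434") -> str:
    """Send a prompt to a locally running Ollama server and return the reply."""
    payload = json.dumps(build_generate_payload(prompt)).encode()
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # Requires a running Ollama instance; the request never leaves localhost.
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Note that the request goes only to localhost, which is the mechanism behind the privacy claim made later in the video.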
The installation process is outlined in four steps:
1. Install Ollama from ollama.com.
2. Download Gemma 4 via the terminal command 'ollama pull gemma4'.
3. Install PyAgent from its GitHub page and configure it to point at the local Gemma 4 model.
4. Add Parallel's web search MCP to the PyAgent config file.
The presenter notes the initial setup may take about an hour but only needs to be done once.
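The config file itself is not shown on screen, so the fragment below is only a guess at its shape: the keys, layout, and placeholder server URL are invented for illustration, not PyAgent's documented schema. The idea is that one section points the agent at the local Ollama-hosted model and another registers Parallel's MCP server:

```json
{
  "model": {
    "provider": "ollama",
    "name": "gemma4",
    "base_url": "http://localhost:11434"
  },
  "mcp_servers": {
    "parallel_web_search": {
      "url": "https://example.invalid/parallel-mcp"
    }
  }
}
```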
Three major advantages of this setup are highlighted: privacy (prompts and data never leave the user's machine except during web searches), speed (no API latency or rate limits), and cost (completely free indefinitely). The presenter acknowledges that Gemma 4 is not as capable as top-tier paid models like GPT-4 or Claude but argues it is more than sufficient for everyday tasks including coding, writing, searching, summarizing, and automation.
Six practical workflows are demonstrated: cleaning customer data from CSV files, writing personalized outreach emails, building automation scripts, conducting web research using the MCP, building small web apps and landing pages, and debugging existing code. The presenter also offers five usage tips: write clear and specific prompts, build a reusable prompt library, integrate the agent with other tools via MCP, keep Ollama and PyAgent updated, and share workflows with the community. The video closes with promotions for the AI Profit Boardroom and the free AI Success Lab community.
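To make the first workflow concrete, here is a small sketch of the kind of script such an agent might produce for cleaning customer data from a CSV file. The column names and cleanup rules (trim whitespace, normalize emails, drop blanks and duplicates) are assumptions for illustration, not taken from the video:

```python
import csv
import io

def clean_customers(csv_text: str) -> list[dict]:
    """Normalize and deduplicate customer rows from raw CSV text."""
    seen = set()
    cleaned = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        email = (row.get("email") or "").strip().lower()
        if not email or email in seen:  # drop rows with no email, and duplicates
            continue
        seen.add(email)
        cleaned.append({
            "name": (row.get("name") or "").strip().title(),
            "email": email,
        })
    return cleaned

raw = "name,email\n  alice smith ,Alice@Example.com\nBob,\nalice,ALICE@example.com\n"
print(clean_customers(raw))
# → [{'name': 'Alice Smith', 'email': 'alice@example.com'}]
```

In the demonstrated workflow the agent writes a script like this from a plain-language prompt; the user then runs it locally against their own files.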
Key Insights
- The presenter argues that PyAgent, unlike most CLI agents, requires no paid API key because it can run entirely off a locally hosted model via Ollama, making the entire coding agent stack free and offline-capable once set up.
- The presenter explains that Parallel's newly released free web search MCP functions like a plugin for AI agents, allowing a fully local model to pull live data from the internet — effectively bridging the gap between offline local models and real-time information.
- The presenter claims that using ChatGPT or Claude online sends user data to external servers, which poses a privacy risk for businesses handling sensitive client information, whereas the Hermes plus OpenClaw setup keeps all prompts on the user's machine except during web searches.
- The presenter states that Gemma 4 is tuned for coding, writing, and reasoning, runs on consumer-grade hardware without requiring a dedicated GPU, and that larger versions can be run on more powerful machines for better results.
- The presenter concedes that local models like Gemma 4 are not as capable as the largest paid models, and recommends swapping in a paid model only for exceptionally complex one-off tasks while handling all routine daily work locally to save costs.