This Free AI Agent Can Do $900/Week Tasks While You Sleep (Hermes + Open WebUI)
This video introduces Hermes Agent combined with Open WebUI as a free, locally-run AI agent system that offers persistent memory, autonomous task execution, and over 40 built-in tools. The presenter argues this setup outperforms subscription-based AI tools and can be monetized by offering private AI setup services to businesses with data privacy needs. The video is partly a promotional pitch for the presenter's free AI cash flow masterclass.
Summary
The video opens by contrasting traditional chatbot-style AI usage — where users manually prompt and copy responses — with autonomous AI agents that execute tasks independently. The presenter introduces Hermes Agent, built by the research lab Nous Research and launched in early 2026, as a fully open-source, locally-run AI agent that addresses a core frustration with most AI tools: the lack of persistent memory across sessions. Unlike typical AI tools that reset after each conversation, Hermes maintains a persistent memory file injected into every new session, allowing it to remember past work, user preferences, and workflows. It also builds reusable skill libraries from successful past interactions, meaning it improves with use.
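The "memory file injected into every new session" pattern the presenter describes is straightforward to picture: before each conversation starts, the agent reads its memory file and prepends it as a system message. The sketch below illustrates the idea only — the function name, file layout, and memory contents are invented, not Hermes's actual implementation:

```python
import tempfile
from pathlib import Path

def build_session_messages(memory_path: Path, user_message: str) -> list:
    """Prepend the persistent memory file (if present) as a system
    message, so every new session starts with prior context."""
    messages = []
    if memory_path.exists():
        memory = memory_path.read_text()
        messages.append(
            {"role": "system", "content": f"Persistent memory:\n{memory}"}
        )
    messages.append({"role": "user", "content": user_message})
    return messages

# Demo with a throwaway memory file (contents are invented).
with tempfile.TemporaryDirectory() as d:
    mem = Path(d) / "memory.md"
    mem.write_text("User prefers briefings on Telegram at 7:00.")
    msgs = build_session_messages(mem, "Draft today's briefing.")
```

Because the memory travels as ordinary message content, any model behind an OpenAI-compatible endpoint can consume it without special support.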
Hermes comes with over 40 built-in tools including web search, file management, terminal access, image generation, scheduled tasks, and multi-platform messaging integrations (Telegram, Discord, Slack, WhatsApp, Signal). The presenter notes that AMD published an official guide for running Hermes on their hardware, citing this as evidence of serious industry backing rather than a hobbyist project.
The presenter then explains how Open WebUI — a free, self-hosted, ChatGPT-style browser interface with over 128,000 GitHub stars — solves the accessibility problem of Hermes living only in a terminal. Because Hermes uses an OpenAI-compatible API, connecting the two takes roughly 10 minutes: enable Hermes's API server, generate a key, add the connection in Open WebUI, and Hermes appears as a selectable model with all its autonomous features accessible through a clean chat interface. Critically, everything runs locally with no data leaving the machine and no subscription costs.
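The reason the connection takes only minutes is that "OpenAI-compatible" pins down the exact request shape Open WebUI sends: a POST to `/chat/completions` with a bearer token and a JSON body. As a rough sketch — the base URL, port, API key, and model name below are placeholders, not values from the video:

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str,
                       user_message: str) -> urllib.request.Request:
    """Construct an OpenAI-compatible /chat/completions request
    for a locally hosted server (not sent here)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Placeholder values -- these are the same three pieces of information
# Open WebUI asks for when you add an OpenAI-compatible connection.
req = build_chat_request(
    base_url="http://localhost:8000/v1",  # hypothetical Hermes API address
    api_key="sk-local-example",           # key generated by Hermes
    model="hermes-agent",                 # hypothetical model name
    user_message="Summarize yesterday's research notes.",
)
```

Since the traffic never leaves `localhost`, this is also where the "no data leaves the machine" claim becomes concrete: the interface and the model talk over a loopback address.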
The presenter emphasizes that the common mistake people make with agents is treating them like chatbots — asking questions and reading answers — rather than setting up autonomous, recurring workflows. Examples given include automated research pipelines, daily briefings delivered to Telegram, and recurring summarization workflows that flag items needing attention.
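A recurring workflow like the daily briefing boils down to "compute the next run time, wait until then, execute, repeat." The scheduling arithmetic can be sketched in a few lines — this is a generic illustration, not Hermes's scheduled-tasks feature:

```python
from datetime import datetime, timedelta

def next_run(now: datetime, hour: int, minute: int = 0) -> datetime:
    """Return the next occurrence of hour:minute strictly after `now` --
    today if that time hasn't passed yet, otherwise tomorrow."""
    candidate = now.replace(hour=hour, minute=minute,
                            second=0, microsecond=0)
    if candidate <= now:
        candidate += timedelta(days=1)
    return candidate

# A 7:00 briefing checked at 9:30 is scheduled for 7:00 the next day.
t = next_run(datetime(2026, 3, 1, 9, 30), hour=7)
```

An agent loop would sleep until `next_run(...)`, run the research-and-summarize pipeline, push the result to a messaging integration, and then recompute — which is exactly the shift from "ask and read" to a system that runs without you.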
The latter portion of the video pivots to monetization, framing the AI agent market (projected to grow from $12–15 billion in 2025 to $80–100 billion by 2030) as an opportunity for non-developers. The presenter highlights privacy-conscious businesses — law firms, medical practices, financial advisors, real estate teams — as a particularly motivated market for local, private AI setups. Suggested business models include one-time setup fees, monthly retainers, custom skill library licensing, and positioning as a local AI specialist. The video closes with promotion of the presenter's free AI cash flow masterclass and an affiliated AI software offering a 30-day free trial.
Key Insights
- The presenter claims Hermes Agent builds reusable skill libraries from successful past workflows — not just storing notes, but compressing effective processes into skills it can automatically reuse in future sessions, making it progressively more useful over time.
- The presenter argues that AMD publishing an official guide for running Hermes on their hardware is meaningful validation, stating that 'major hardware companies don't write setup guides for side projects.'
- The presenter contends that the fundamental mistake most people make with AI agents is treating them like chatbots — asking questions and moving on — rather than designing autonomous, recurring systems that run independently, which is where the real productivity leverage lies.
- The presenter identifies privacy-sensitive businesses — law firms, medical practices, financial advisors, and real estate teams — as a specifically motivated market for local AI setups, framing 'private AI' as a more targeted and valuable niche than general AI consulting.
- The presenter cites Ollama reaching 52 million monthly downloads as of Q1 2026 as evidence that locally-run AI has moved beyond hobbyist adoption into mainstream use, signaling a broader shift away from cloud-dependent, subscription-based AI models.