Why cultivating agency matters more than cultivating skills in the AI era | Max Schoening (Head of Product, Notion)

Max Schoening, Head of Product at Notion, discusses how AI is transforming product building, arguing that agency matters more than skills in the AI era. He explores malleable software, the merging of design/engineering/PM roles, how the 'first 10% of any project is now free,' and what makes great products succeed.

Summary

Max Schoening, Head of Product at Notion, joins Lenny's podcast to discuss the profound shifts happening in product building as AI becomes more powerful. Max has an unusually multidisciplinary background spanning engineering, design, and product management at companies like GitHub, Heroku, and Google, which gives him a unique perspective on where things are heading.

On the topic of role convergence, Max explains how Notion designers and PMs began coding not to ship production code, but to understand the material they design with. He created a simplified 'playground' codebase optimized for LLM-assisted prototyping, especially for chat interfaces, citing Bret Victor's 'Stop Drawing Dead Fish' talk as inspiration. His core argument is that coding forces designers and PMs to deeply understand the medium — particularly agent loops — rather than designing static mockups divorced from reality.

Max argues that the most important differentiator in the AI era is agency, not technical skill. He defines agency as the understanding that the world is malleable and that one can change things — a trait he sees as unevenly distributed. He distinguishes high-agency employees (like Brian Levin and Eric Liu at Notion) as those who redefine their roles based on what needs to happen rather than clinging to job descriptions. He connects this to the Steve Jobs quote about realizing the world is made by people no smarter than you.

On malleable software, Max argues that most software today serves the corporation that makes it rather than the people who use it. He envisions software that adapts to individual users the way a well-lived-in home adapts to its inhabitants over time. He connects Notion's success with AI agents to this philosophy — agents need open, connected environments to operate effectively, and Notion's structure provides that.

Max is skeptical of the 'SaaS apocalypse' narrative, arguing that people don't actually want to maintain their own software stacks. The 'as a service' model persists because it handles maintenance and specialized thinking, which users don't want to do themselves. He predicts tools will become more general (like 90s-era word processors and spreadsheets) but remain as-a-service.

On the future of product building, Max says the 'first 10% of every project is now free' — it's trivially easy to build a v0 prototype — but the last 10% remains 90% of the work. He sees AI expanding the ability to explore multiple product directions simultaneously rather than fundamentally changing the nature of product quality work. He's concerned that vibe coding has increased the volume of software without increasing its quality.

On taste, Max defines it as the ability to run a 'virtual machine in your head' — to predict how a specific in-group will react to an idea. He argues taste is built through repeated iterations with feedback, analogous to how models are trained. He notes this makes him somewhat skeptical that taste is uniquely human and uncopyable by AI.

Max shares his 'hot take' that knowledge work is already a form of universal basic income — that comfortable, air-conditioned cognitive labor is far more than people need to live, and that humans will always find ways to insert themselves into agentic loops regardless. He closes with advice for young people in Silicon Valley to reduce the frenetic anxiety about missing the 'last train' and instead focus on genuine curiosity and tinkering.

Key Insights

  • Max argues that agency — not technical skill — is the key differentiator in the AI era, because skills are now readily accessible through AI models, but the willingness to act and change things remains unevenly distributed.
  • Max claims that designing in code (even if it never ships to production) forces designers and PMs to deeply understand agent loops and the actual material they are designing with, which static Figma mockups cannot replicate.
  • Max contends that the 'first 10% of every project is now free' due to AI — building a v0 prototype requires almost no effort — but the last 10% of quality and polish remains 90% of the actual work.
  • Max argues that all great products have one tiny core superpower (the iPhone's multi-touch, GitHub's pull request, Heroku's 'git push heroku master') and that repeatedly adding features to compensate for a weak core never works.
  • Max defines taste as the ability to run a virtual machine in your head — to accurately predict whether a specific in-group will like an idea — and argues it is built through high-frequency iteration with feedback, analogous to model training.
  • Max claims that the SaaS apocalypse is greatly exaggerated because people fundamentally don't want to maintain their own software stacks; the 'as a service' value is paying specialists to think hard about a problem and handle maintenance.
  • Max argues that software engineering capabilities bleeding into other domains (HR, marketing, etc.) is the real transformation happening — not that AI is creating distinct new job categories, but that 'software is eating the world' is accelerating.
  • Max is skeptical that frontier model intelligence is what most knowledge work tasks actually need, arguing that for many tasks models are already 'good enough' and future optimization will shift to cost, speed, and local execution rather than raw intelligence.
  • Max claims that the Notion AI agent's success is partly explained by the fact that agents need open, connected environments to operate — and Notion's connected workspace structure provides that context without narrow 'orifices' between data silos.
  • Max argues that knowledge work is already effectively a form of universal basic income, and that humans will inevitably find new ways to insert themselves into agentic loops regardless of how capable AI becomes.
  • Max contends that being first to market is overrated — AirPods weren't the first Bluetooth headphones, and Anthropic started after OpenAI with less funding yet is now dominant — and that doing it right matters far more than timing.
  • Max warns that vibe coding has increased the volume of software produced in the last 12 months without a corresponding increase in software quality or reliability, and that the industry is missing an 'Apple-esque machined unibody' engineering discipline.

Topics

  • Agency vs. skills in the AI era
  • Role convergence of designers, PMs, and engineers
  • Malleable software philosophy
  • The future of SaaS
  • AI's impact on product building workflows
  • What makes great products succeed
  • Taste and how to develop it
  • Token spend and ROI on AI tools
  • Universal basic income and knowledge work
  • Prototyping in code vs. Figma
