How to Build an AI Native Team with Mike Cannon-Brookes
Atlassian co-founder Mike Cannon-Brookes discusses how enterprises are adopting AI, the importance of organizational context graphs for AI effectiveness, and Atlassian's strategy of meeting customers where they are while pushing toward AI-native workflows. He covers internal AI adoption challenges and new product announcements around the Teamwork Graph CLI and MCP server, and predicts that 2026 will be the year AI moves meaningfully beyond chat interfaces.
Summary
In this conversation, Atlassian co-founder and CEO Mike Cannon-Brookes speaks from two perspectives: as a leader internally adopting AI tools across a 13,000-person organization, and as a platform builder helping 300,000+ enterprise customers do the same. He opens by describing the real friction in enterprise AI adoption — not just deploying models, but navigating security requirements, compliance controls, and the cultural work of getting employees to actually use, share, and learn from AI tools. He uses the example of Atlassian's acquisition of The Browser Company (maker of the Dia browser), where initially only 4 of 13,000 employees could use Dia due to security concerns, illustrating how enterprise controls must be built before broad rollout is possible.
Cannon-Brookes frames enterprise AI value as 'intelligence multiplied by context,' arguing that while model intelligence is accelerating rapidly and visibly, context is the differentiating layer that Atlassian is uniquely positioned to provide. He describes Atlassian's Teamwork Graph — built over seven to eight years — as the best enterprise context graph available, now operating at 150 billion objects and connections for their largest customers. Major new capabilities announced at Team26 include a full semantic index of customer codebases, expanded people and org chart data (including calculated skill profiles), and physical asset tracking — all feeding into a richer context layer accessible via a new Teamwork Graph CLI and expanded MCP server integration.
On the product strategy side, Cannon-Brookes articulates a deliberate dual approach: helping customers work faster within existing workflows (e.g., AI writing assistance in Confluence, agent-assignable Jira tickets) while simultaneously introducing new paradigms (e.g., 'Create with Rovo' for turning documents into slide presentations, Rovo Studio for no-code agent and app creation). He notes Atlassian has passed five million agents created through Rovo Studio, and emphasizes that security and compliance controls are baked into all of these tools through the underlying Atlassian platform.
When discussing what separates AI leaders from laggards among enterprise customers, Cannon-Brookes observes that leading organizations are pushing for 20-30% improvements rather than incremental gains, asking platform-level questions about how AI tools connect across their stack, and focusing on quality of output and engineering throughput rather than raw token consumption. He highlights the DX acquisition as enabling engineering organizations to correlate AI tool adoption with productivity at scale.
He closes by predicting that the most exciting AI development of 2026 will be AI moving beyond chat interfaces into natural, embedded product experiences — making powerful AI capabilities accessible to everyday users without requiring prompt engineering expertise.
Key Insights
- Cannon-Brookes argues that enterprise AI value is best understood as 'intelligence multiplied by context,' and that while model intelligence is commoditizing rapidly, organizational context — not model choice — is becoming the primary differentiator for enterprise AI outcomes.
- Cannon-Brookes claims that leading enterprise AI adopters are distinguishing themselves not by tool usage volume but by asking platform-level integration questions and targeting 20-30% workflow improvements, while also shifting focus from token consumption to quality of output and engineering throughput.
- Cannon-Brookes describes a structural bottleneck in enterprise AI adoption where security and compliance requirements create significant delays — citing Atlassian's own Dia acquisition where only 4 of 13,000 employees could initially use the tool — and argues that embedding enterprise controls directly into AI platforms is a prerequisite for scale.
- Cannon-Brookes contends that Atlassian's Teamwork Graph CLI and MCP server — now incorporating full codebase semantic indexing, org chart data, skill profiles, and physical assets — reduce the token cost and latency of agentic tasks by pre-computing context rather than requiring models to perform expensive multi-hop reasoning at runtime.
- Cannon-Brookes predicts that 2026 will be defined by AI moving beyond chat interfaces into embedded, design-led product experiences that meet everyday users in their existing workflows, arguing that the current chat-and-terminal paradigm is a transitional phase rather than the end state of AI interaction.
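The pre-computation argument in the insights above can be made concrete with a toy sketch. None of this is Atlassian's actual API — the graph edges, the `escalation_contact` index key, and the function names are all illustrative — but it shows why pre-computing a common multi-hop answer into a context index turns a chain of per-hop model/tool round-trips into a single lookup:

```python
# Hypothetical sketch: pre-computed context vs. runtime multi-hop reasoning.
# The graph, edge names, and index are illustrative, not Atlassian's API.

GRAPH = {
    ("ticket-42", "belongs_to"): "project-payments",
    ("project-payments", "owned_by"): "team-billing",
    ("team-billing", "lead"): "alice",
}

def resolve_at_runtime(start, hops):
    """Simulate an agent chasing edges one call at a time.

    Each hop stands in for a separate model/tool round-trip,
    so token cost and latency grow with path length."""
    node, calls = start, 0
    for edge in hops:
        node = GRAPH[(node, edge)]
        calls += 1
    return node, calls

def build_index(graph):
    """Pre-compute the answer to a common multi-hop question once,
    offline, so agents later pay a single lookup per query."""
    index = {}
    for (node, edge), project in graph.items():
        if edge == "belongs_to":
            team = graph[(project, "owned_by")]
            index[(node, "escalation_contact")] = graph[(team, "lead")]
    return index

INDEX = build_index(GRAPH)

def resolve_from_index(node):
    """Answer the same question with one lookup (one 'call')."""
    return INDEX[(node, "escalation_contact")], 1
```

Under these assumptions, `resolve_at_runtime("ticket-42", ["belongs_to", "owned_by", "lead"])` pays three round-trips for the same answer that `resolve_from_index("ticket-42")` retrieves in one — the shape of the cost reduction Cannon-Brookes attributes to the Teamwork Graph.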