Electronic Arts · product

Project AIR

Autonomous World Simulation

Agentic AI · Knowledge Graphs · System Design · Go · Swift
Project AIR is the foundational architecture for an autonomous world simulation. The core idea: instead of scripted NPCs that follow decision trees, build agents with independent agency, long-term memory, and a shared understanding of the world they inhabit.
The system is built on a decentralized Agent SDK where each NPC possesses its own reasoning loop, memory store, and goal hierarchy. Agents don't wait for triggers — they observe the world state, form intentions, and act. When a player interacts with one agent, the ripple effects propagate through the world because all agents share a centralized Knowledge Graph.
The Knowledge Graph, powered by Neo4j, serves as the persistent World-State model. Every character, location, relationship, event, and piece of lore is stored as nodes and edges. Agents query the graph to understand context — who they are, what they know, what's happened nearby — and write back to it when they act. This creates genuine emergence: events that designers didn't script but that make narrative sense.
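The read-then-write-back pattern agents run against the graph can be shown with a small in-memory stand-in. The production store is Neo4j; this Go sketch only mirrors the `(node)-[relation]->(node)` shape, and the `Graph`, `Query`, and `Record` names are assumptions for illustration.

```go
package main

import "fmt"

// Edge is a typed, directed relationship between two named nodes,
// mirroring the (node)-[relation]->(node) shape the world-state graph stores.
type Edge struct {
	From, Rel, To string
}

// Graph is a minimal in-memory stand-in for the Neo4j world-state model.
type Graph struct {
	Edges []Edge
}

// Query returns everything a node points to via a given relation —
// the "who am I, what do I know, what happened nearby" lookup agents run.
func (g *Graph) Query(from, rel string) []string {
	var out []string
	for _, e := range g.Edges {
		if e.From == from && e.Rel == rel {
			out = append(out, e.To)
		}
	}
	return out
}

// Record writes an event back into the shared graph so every other agent
// sees it on its next query; this is what lets effects ripple outward.
func (g *Graph) Record(from, rel, to string) {
	g.Edges = append(g.Edges, Edge{From: from, Rel: rel, To: to})
}

func main() {
	world := &Graph{}
	world.Record("blacksmith", "LOCATED_IN", "market square")
	world.Record("blacksmith", "WITNESSED", "theft at the stall")

	// An agent queries the shared graph for context before acting.
	fmt.Println(world.Query("blacksmith", "WITNESSED"))
}
```

Because reads and writes go through one shared store, an unscripted event one agent records becomes context for every other agent's next decision.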
The backend is written in Go for low-latency inference routing, and the iOS frontend in Swift using Metal for real-time rendering. I personally authored the Swift frontend and integrated MCP servers to enable secure, bi-directional data flow between agents and local/cloud data sources.
I led a lean squad of 3 PMs, 1 Design Engineer, and 1 Lead Product Designer, managing cross-functional delivery with 20+ engineers across iOS, Backend, and DevOps.