At Sphere, I led user research to turn 20,000-person audiences into active players—using phones and massive screens for real-time games. With no precedent, I built a research system from scratch: defining design theses, running large-scale playtests, and prioritizing high-impact unknowns to guide development at scale.
In 2021, I joined Sphere’s R&D team to explore how technology could transform passive viewers into active participants. We set out to design live multiplayer games for crowds of up to 20,000, exploring a range of different formats.
While the game mechanics varied, my research process remained consistent.
What follows isn’t a detailed case study (to respect confidentiality), but a breakdown of my research approach—one built for navigating ambiguity and applicable to any product with high stakes and unknowns.
1
When starting with a blank canvas, our first move was alignment. I gathered key stakeholders—the VP of Interactive, Creative Director, and Technical Director—for focused workshops to get clear on purpose, scope, and what success would look like.
2
From our early brainstorming, we defined what a great experience should feel like. I turned these into clear benchmarks—giving teams a shared language to design, test, and measure against as we built.




3
To focus our efforts, I created a Research Impact Matrix—a simple tool to align on what to tackle first. It helped us prioritize high-impact unknowns, deprioritize low-value tasks, and guide sprint planning and milestone decisions.
Confidence (X): How certain are we already about the answer?
Impact (Y): How much could the insight shift design or strategy?
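As an illustration only, the matrix logic above can be sketched in a few lines of code: score each open question on the two axes, then surface high-impact, low-confidence items first. The questions, scales, and weighting below are hypothetical, not taken from the actual project.

```python
# Hypothetical sketch of a Research Impact Matrix.
# High impact + low confidence = the unknowns worth tackling first.

from dataclasses import dataclass

@dataclass
class Unknown:
    question: str
    impact: int      # 1 (minor) .. 5 (could shift design or strategy)
    confidence: int  # 1 (pure guess) .. 5 (already well understood)

def prioritize(unknowns):
    # Sort by impact (descending), then by confidence (ascending),
    # so the biggest, least-understood questions come first.
    return sorted(unknowns, key=lambda u: (u.impact, -u.confidence), reverse=True)

# Example backlog (illustrative questions, not real research items):
backlog = [
    Unknown("Will 20,000 phones stay connected at once?", impact=5, confidence=2),
    Unknown("Preferred button size on handheld screens", impact=2, confidence=4),
    Unknown("Do players understand the shared-screen mapping?", impact=5, confidence=1),
]

for u in prioritize(backlog):
    print(f"[impact {u.impact} / confidence {u.confidence}] {u.question}")
```

In practice the matrix was a workshop artifact rather than software, but the same ordering rule guided sprint planning: deprioritize what is low-impact or already well understood.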
4
To design effectively, we segmented our audience into key groups. This not only grounded our approach in real player behavior, but also helped us validate assumptions through targeted testing.
Unlike typical digital experiences, Sphere’s audience wasn’t hypothetical—it filled a stadium. Replicating this during testing was a core challenge, so we broke the research into three key stages.
To turn insights into action, I combined storytelling with evidence—mixing quotes, data, and visuals. This helped leadership spot patterns fast and make confident decisions.


Post-playtest interviews


Another powerful way of humanizing data was creating behavioral personas, which helped the team design from a player-first perspective rather than from assumptions.


Turbo Tim

Low-Key Linda