The Data Firehose: Why Your TV Still Looks Like 2012
Every NFL, MLB, and NBA game captures millions of data points per play. Broadcast shows almost none of them. Here's what's happening in the gap and why it matters.

By 2030, courtside might mean VR. By 2035, the broadcast could follow your eye movements. By 2050, the definition of "watching sports" will have changed entirely.
The way we watch live sports hasn't fundamentally changed since the invention of instant replay. But right now, largely out of sight of the average fan, the technology underlying the broadcast is being rebuilt from the ground up.
This series models how mixed reality, AI personalization, and spatial computing will collapse the distance between the fan and the field over the next three decades.
Part 1 grounds you in the *current* data infrastructure. Parts 2–3 show what's shipping now (AI overlays and spatial tech). Parts 4–6 model the mid-term shift to volumetric, immersive, and personalized viewing.
Part 7 covers the business layer—rights fragmentation and why distribution matters. Part 8 speculates: what could sports look like in 2036 if all these trends converge?
Part 1: Every NFL, MLB, and NBA game captures millions of data points per play. Broadcast shows almost none of them. Here's what's happening in the gap and why it matters.
Part 2: Real-time overlays, win probability graphics, auto-generated highlights, and virtual cameras. The next layer of broadcast technology is already here, deployed by Amazon, Stats Perform, and the networks. Here's how it works.
Part 3: Vision Pro headsets in living rooms. Courtside seats projected into your kitchen. The shift from screens to spatial computing changes everything about where the action happens next.
Part 4: Amazon Prime and Sportradar already tested simultaneous multi-angle feeds and AI commentary in 2025. By 2035, AI will generate a custom broadcast for every viewer in real time: same play, radically different experiences. Here's what's actually being built.
Part 5: COSM's venues. The Sphere in Vegas. Massive LED environments where fans gather to experience games on 40-foot screens with synchronized haptic feedback. The stadium becomes the broadcast.
Part 6: Every fan gets their own AI-generated broadcast tailored to their preferences, following their team, their favorite player, or the stats they care about most. Personalization at scale, and the privacy trade-offs that come with it.
Part 7: Rights holders are splintered across platforms. Leagues are experimenting with direct-to-consumer models. Who controls the broadcast in 2030? And what does that mean for how you watch?
Part 8: Pure speculation built on Parts 1–7. Neural interfaces. Codec avatars of athletes. AI-generated commentary. A thought experiment about how far this can go, and whether we'll actually want it to.
Parts 1–4 are live. Parts 5–8 drop weekly through April 17. Subscribe to get each one as it publishes — no noise, just signal.
Clarity over clickbait. Insight over hype. Unbiased analysis over partisan spin. Join curious readers who want to understand what's really happening.