Animator explains how ESPN turned an NFL game into a Pixar-animated football game in real time using AI

ESPN’s collaboration with the NFL, Pixar and an AI developer to create a real-time animated simulcast football game is the culmination of several pieces of cutting-edge technology, says award-winning animator and Northeastern professor Jason Donati. Photo by ESPN/Disney

Sunday’s clash between the NFL’s Atlanta Falcons and Jacksonville Jaguars took place in London’s Wembley Stadium, but that wasn’t the only iconic venue where viewers could see the two teams face off. The NFL also simulcast a version of the game on a much smaller scale: inside Andy’s room from Pixar’s “Toy Story.” 

Toy-like representations of the players competed in a real-time, computer-animated version of the game, as characters like Woody, Buzz Lightyear, Bo Peep, Rex and Forky watched from the sidelines. “Toy Story Funday Football” was the result of a first-time collaboration between the NFL, ESPN’s Creative Studio, Beyond Sports, a Sony-owned AI company, and Pixar, which provided its character and environment assets. Meanwhile, Silver Spoon Animation helped create a motion-captured, animated trio of color commentators.

Viewers watching the animated collisions on Sunday were probably wondering the same thing: How did this all come together? 

Jason Donati, an award-winning animator and teaching professor of art and design at Northeastern University, says the technology to make this possible has been in place for a few years. However, this is one of the first times all this cutting-edge technology –– the NFL’s stat tracker, Beyond Sports’ active tracking system and the Unity game engine –– has been used in concert.

“You’ve seen pieces of it, but you’ve not really seen it all together, and certainly not all together for something in real time or really close to real time,” Donati says.

Screen capture of “Toy Story Funday Football.” Photo by ESPN/Disney

It all starts with the NFL’s Next Gen Stats, a statistic-tracking tool powered by Amazon Web Services that can monitor players’ locations, movements and speeds in real time. Each NFL player has a motion-tracking chip embedded in their gear that helps the NFL and its coaches track performance. Amazon also integrates that data into its Thursday Night Football broadcasts on Prime Video.
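The NFL hasn’t published the feed’s exact schema, but as a rough illustration of what a downstream system might consume, here is a minimal Python sketch of that kind of per-player sample. Every field name and unit below is a hypothetical stand-in, chosen only to mirror what the article says the system captures: each player’s location, movement and speed in real time.

```python
from dataclasses import dataclass

@dataclass
class TrackingSample:
    """Hypothetical shape of one per-player tracking record."""
    player_id: str
    timestamp: float   # seconds since kickoff
    x: float           # yards from the home end zone
    y: float           # yards from the left sideline
    speed: float       # yards per second

def to_scene_units(sample: TrackingSample, yards_to_meters: float = 0.9144):
    """Convert a field-coordinate sample into the units a 3D scene expects."""
    return (sample.x * yards_to_meters, sample.y * yards_to_meters)

# One frame of the feed might carry a sample for every player on the field:
frame = [TrackingSample("ATL_11", 1803.2, 42.5, 26.7, 6.1)]
positions = [to_scene_units(s) for s in frame]
```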

Donati says ESPN also used state-of-the-art limb motion-tracking data collected by Beyond Sports’ AI active tracking system to create its animated players.

Donati says technology like this is a massive step forward for animation and motion capture.

“Think back to ‘The Lord of the Rings’ where we’ve got the motion-capture suits and the little [motion-tracking] markers on them,” Donati says. “That certainly wasn’t in real time. You’re capturing that data, and you’re delivering it to animators to apply onto a CG [computer generated] character.”

“Now, with the Hawk-Eye technology, that can happen in real time without any of the trackers,” Donati continues. “From triangulating camera angles, we can place limbs in space and, in real time, capture that data and feed it into a source.”
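Hawk-Eye’s production pipeline is proprietary, but the triangulation Donati describes is textbook multi-view geometry: if the same joint is detected in two calibrated camera views, its 3D position falls out of a small linear system. A minimal sketch, assuming the cameras’ 3x4 projection matrices are already known:

```python
import numpy as np

def triangulate(P1, P2, pt1, pt2):
    """Recover a 3D point from matching 2D detections in two views
    via linear (DLT) triangulation.

    P1, P2   : 3x4 camera projection matrices (assumed calibrated)
    pt1, pt2 : (u, v) pixel coordinates of the same joint in each view
    """
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.vstack([
        pt1[0] * P1[2] - P1[0],
        pt1[1] * P1[2] - P1[1],
        pt2[0] * P2[2] - P2[0],
        pt2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                 # null vector of A, up to scale
    return X[:3] / X[3]        # de-homogenize to (x, y, z)
```

In practice, a system like Hawk-Eye would combine many more than two views and smooth the result over time, but the geometric core is the same.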

In this case, the source is Unity, one of the most popular video game engines. Once all of this motion- and limb-tracking data is fed into Unity, animators and artists can render high-quality imagery in real time.

Donati says ESPN’s latest foray into real-time simulcast animation is a lot more complex than its previous collaboration with Beyond Sports, a live-animated NHL game themed after the “Big City Greens” animated show. The sheer number of collisions taking place on the field –– where limbs are obscured by other players –– is an animator’s worst nightmare. But he says systems like Beyond Sports’ AI can correct for all the chaos in ways that weren’t possible even a few years ago.

“In this game you’ve got 20-plus people all pig piling on a ball, so you’ve got AI, happening in real time too, filling in all of these motion and limb data blanks where a player is covered by another player,” Donati says. “It’s figuring it out along the way in real time. ‘That limb was moving this way at this speed before it got blocked, so it must be going this way and picks up this way.’” 
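Beyond Sports’ actual gap-filling is a learned system, but the heuristic in Donati’s quote (keep a hidden limb moving at its last observed velocity, then reconcile with wherever it reappears) can be sketched in a few lines of Python. Everything here is an illustrative toy, not the production model:

```python
import numpy as np

def fill_occlusion(positions, visible):
    """Fill gaps in one joint's trajectory while the joint is occluded.

    positions : (T, 3) array of joint positions; unreliable where occluded
    visible   : (T,) boolean array, False while the joint is hidden

    Only interior gaps with two prior visible frames are handled, for brevity.
    """
    filled = positions.copy()
    T = len(positions)
    t = 0
    while t < T:
        if visible[t]:
            t += 1
            continue
        start = t                        # first hidden frame
        while t < T and not visible[t]:
            t += 1
        end = t                          # first visible frame after the gap
        if start >= 2 and end < T:
            # Pre-occlusion velocity: "it was moving this way at this speed."
            v = positions[start - 1] - positions[start - 2]
            for i in range(start, end):
                steps = i - start + 1
                extrapolated = positions[start - 1] + v * steps
                alpha = steps / (end - start + 1)   # 0 -> 1 across the gap
                # Blend from pure extrapolation toward the re-detected position.
                filled[i] = (1 - alpha) * extrapolated + alpha * positions[end]
    return filled
```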

Donati says this is likely not the last time ESPN does something like this, and he’s excited to see how this technology will create more immersive experiences for fans.

Even now, the technology can go beyond just replicating a traditional NFL broadcast. Once all this data is plugged into Unity, animators can manipulate the “field” like they would any video game. While a real NFL broadcast is bound by where a physical camera can fit or reach, there are no such limits in a 3D-rendered environment like this. 
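In rendering terms, a broadcast “camera” in a scene like this is just a view matrix, so it can be placed anywhere, including spots no physical rig could reach. A minimal sketch of such a free-floating virtual camera, with made-up coordinates:

```python
import numpy as np

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    """Build a right-handed view matrix for a virtual camera at `eye`
    aimed at `target`. No tripod, crane or cable rig required."""
    f = target - eye
    f = f / np.linalg.norm(f)        # forward
    r = np.cross(f, up)
    r = r / np.linalg.norm(r)        # right
    u = np.cross(r, f)               # corrected up
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = r, u, -f
    view[:3, 3] = -view[:3, :3] @ eye
    return view

# Hypothetical: hover a yard off the turf in the middle of a huddle.
view = look_at(eye=np.array([50.0, 1.0, 26.6]),
               target=np.array([55.0, 1.2, 26.6]))
```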

“It’s a first major step in breaking down the barriers between this thing happening there and me experiencing it on a flat screen over here,” Donati says. “I think the next frontier is some sort of photorealistic version. You’ve got photogrammetry happening potentially in real time, so it doesn’t look like Woody or Buzz but looks very similar to what the Madden games look like –– but now I’m in it. What if I want to be on the field? What if I want to be in the huddle? What if I want to be on the line? … I think that’s maybe where this is all headed.”

Cody Mello-Klein is a Northeastern Global News reporter. Email him at c.mello-klein@northeastern.edu. Follow him on Twitter @Proelectioneer.