Baseball is back, but COVID-19 is keeping fans safely at home. To help television audiences adjust to the new normal during the pandemic-shortened baseball season, some teams are seating cardboard cutouts of their fans, for about the cost of a ticket. FOX Sports is taking a higher-tech approach by filling empty seats with virtual fans during its game broadcasts. The technology behind the computer-generated crowd has been used for nearly a decade in various parts of sports broadcasts, but the unprecedented circumstances are pushing it into creative new deployments.
To understand how FOX Sports is building its virtual sell-outs this season and what this expanded use of computer graphics could mean for the future of sports broadcasts, the Drexel News Blog turned to Nick Jushchyshyn, an assistant professor of digital media in the Westphal College of Media Arts & Design and director of Drexel's Immersive Research Lab, for some insight.
First off, this seems like the best collision between video games and sports – how is FOX Sports making it happen?
This type of "collision" has been happening at FOX Sports and other major broadcasters for several years now. When cameras transitioned from film and videotape to digital capture, the image data they produce became essentially the same as the graphics used to create video game visuals.
The tools that allow a game like "Fortnite" to be played by thousands of players at once, all controlling their own character's actions in a shared world, are largely the same tools that allow a live camera feed to be merged with thousands of "fans," or a critical moment of a race to be explained through a detailed, computer-modeled 3D race car.
FOX Sports has been one of the broadcasters at the leading edge of using this technology for sports broadcasts. Much of the dramatic computer graphics and animations you see in their “NASCAR Race HUB” show and Super Bowl broadcast are created this way.
Outside of sports, the Weather Channel is another broadcaster that is leveraging these same tools, which they refer to as “immersive mixed reality.”
What are the challenges it might present technologically and visually during the broadcast?
The interesting thing is that these tools range in price from free to very expensive and can be scaled depending on need. These days, I use the very same technology to host virtual live events for our school, participate in Zoom meetings and host online panel discussions and academic symposiums from within a virtual studio — without even needing to be on campus. Much of this was achieved by leveraging existing video and software technology we already had in our department.
Things get more complicated and challenging as productions become more elaborate, involving a coordinated performance of camera movement, on-screen talent and computer animations.
The movements of a real-world camera in a studio or stadium have to be precisely tracked and matched in the computer as they happen live, and even the shadows and reflections of on-screen talent need to be calculated and simulated thousands of times per minute.
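As a rough illustration of that tracking-and-matching step (the class and numbers here are hypothetical, not FOX Sports' actual pipeline), the core loop is simple in concept: every frame, read the tracked pose of the physical camera and copy it onto the virtual camera before rendering the graphics, so the CG elements stay locked to the live image.

```python
from dataclasses import dataclass


@dataclass
class CameraPose:
    """Position (meters) and orientation (degrees) of a camera."""
    x: float
    y: float
    z: float
    pan: float
    tilt: float


def sync_virtual_camera(tracked: CameraPose) -> CameraPose:
    """Copy the live tracked pose onto the virtual (CG) camera.

    In a real pipeline this runs every frame (e.g. 60 fps, i.e.
    3,600 updates per minute), so shadows, reflections and crowd
    graphics are re-rendered from the same viewpoint as the
    physical broadcast camera.
    """
    return CameraPose(tracked.x, tracked.y, tracked.z,
                      tracked.pan, tracked.tilt)


# Simulate one second of a slow pan at 60 fps.
fps = 60
virtual_frames = []
for frame in range(fps):
    physical = CameraPose(0.0, 12.0, -40.0, pan=frame * 0.05, tilt=-8.0)
    virtual_frames.append(sync_virtual_camera(physical))

print(len(virtual_frames))
```

In production, the tracked pose would come from encoders on the camera head or optical tracking markers rather than a simulated loop, but the per-frame synchronization idea is the same.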
How do the designers go about creating “generic fans”?
The approach to creating "generic fans" generally starts with creating 3D representations of several different people in a computer. These can be created through scanning technology (similar to digital photography, but capturing three-dimensional shape), or they can be "sculpted" by artists in a digital process similar to creating a statue with clay. We call this "geometry" or "meshes." The different colored clothing is added by applying photographed or digitally "painted" wardrobes onto these meshes.
Finally, the movements of the fans can be created through the work of skilled animators or motion capture artists. The difference between these is that animators create motion through direct artistic input — like the animators who would draw and paint the frames of classic cartoons — whereas motion capture artists use advanced camera technologies to record the motions of live actors into the computer.
To combine all this information into a large crowd, digital artists use a variety of tools to make thousands of unique, art-directed combinations of these shapes, clothing and motions positioned into stadium seating, similar to using "copy/paste" over and over and over again. Lastly, specialized camera tracking is used to match up the digital crowd with the real-life scene.
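A minimal sketch of that "copy/paste with variation" idea, using made-up asset names (a real production would use dedicated crowd tools, not this toy code): each seat gets an independently chosen body mesh, outfit and motion clip, so no two neighbors look identical even though they all come from a small library.

```python
import random

# Hypothetical asset libraries; a real production would have many more.
MESHES = ["body_a", "body_b", "body_c", "body_d"]
OUTFITS = ["home_jersey", "away_jersey", "tshirt_red", "hoodie_gray"]
MOTIONS = ["clap", "wave", "cheer", "sit_idle"]


def fill_section(rows: int, seats_per_row: int, seed: int = 42):
    """Populate a seating section with randomized fan variants.

    Each seat gets an independent combination of mesh, outfit and
    motion -- effectively "copy/paste" with controlled variation.
    """
    rng = random.Random(seed)  # deterministic, so renders are repeatable
    crowd = []
    for row in range(rows):
        for seat in range(seats_per_row):
            crowd.append({
                "row": row,
                "seat": seat,
                "mesh": rng.choice(MESHES),
                "outfit": rng.choice(OUTFITS),
                "motion": rng.choice(MOTIONS),
            })
    return crowd


section = fill_section(rows=20, seats_per_row=30)
print(len(section))  # 600 seated "fans"
unique_variants = {(f["mesh"], f["outfit"], f["motion"]) for f in section}
print(len(unique_variants))  # dozens of distinct combinations from a small library
```

Even with only four options per category (64 possible combinations), the random mixing plus staggered animation timing is usually enough variety to read as a crowd at a distance.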
These are all some of the techniques and tools my colleagues and I teach to students in our Immersive Media, Animation and Game Design programs.
What details do you think they’ll need to consider in order to strike a balance between the fans looking “too real” and not real enough?
As long as the digital people are not too large in the video image and they're grouped into large crowds, it's actually not too hard to make digital characters look real these days. Viewers just need to see an authentic-looking variety of motion and textures to get the impression of a large crowd.
Digital characters tend to become less effective if the camera gets so close that you can start to recognize faces. At that point, since we really spend our entire lives looking at real human faces, it takes much more effort to create something completely convincing. At this level, even subtle inconsistencies can result in the digital character being perceived as “creepy.”
In the industry, we call this the "uncanny valley." Cartoonish character faces look fine. Extremely well executed digital faces can also work. But in between is this low point psychologically, where we just don't accept the result as real or appealing.
What other areas of sports broadcasts do you think we might see experimentation with graphics or VR/AR technology this season?
These approaches, collectively labeled “virtual production,” are finding their way into more and more live broadcasts and episodic work today.
On Disney+, for example, "The Mandalorian" series used a variant of the same tools to create fantastic sets depicting exotic "Star Wars" locations and displayed them on massive LED video walls to be seen by the camera filming scenes. On camera, it looked no different than a genuine, physical set made of wood and plaster, but it could be far more flexible, accommodating instant changes to positioning and lighting. This saved a significant amount of production cost, while providing creative freedom to the production team. We'll certainly see this used again in the next season of the show.
In addition to episodic entertainment and sports, we'll also see this technology appearing more and more frequently in music videos, in live broadcasts covering major events like elections and severe weather, and even in major corporate event internet streams, such as product launches and annual conferences.
Jushchyshyn is a visual effects expert whose resume includes work on “The Curious Case of Benjamin Button,” “The Last Airbender,” and “The Girl with the Dragon Tattoo,” and recent immersive 3D projects that brought to life dinosaurs, ghosts and ghouls.
Media requests for Jushchyshyn should be directed to Britt Faulstick, assistant director of media relations, at email@example.com or 215.895.2617.