How Flume and Unreal Engine Brought Coachella to the Metaverse

A version of this article originally appeared in TIME's Into the Metaverse newsletter. Sign up for a weekly guide to the future of the internet. You can find previous editions of the newsletter here.

If you were watching Coachella’s YouTube livestream on Saturday night, you might have done a double take at the sight of giant leafy trees and a Godzilla-sized parrot slowly rising above the stage of electronic artist Flume. Were they giant inflatable structures? A mirage on a 200-foot LED screen? A hallucination?

None of the above. This year, Coachella teamed up with Unreal Engine, Epic Games’ 3D software development tool, which I wrote about in this newsletter two weeks ago, to create what organizers say is the first live broadcast to add augmented reality (AR) technology to a music festival performance. The Unreal Engine team worked with Flume’s creative and technical collaborators to create massive, psychedelic 3D visuals that blended seamlessly with the stage design and its surroundings, floating around the artist and across the skies of Indio.

But no one at the festival itself could see the giant parrots; only viewers at home could. The result, though it lasted only a few minutes, serves as a blueprint for how live-event organizers can use metaverse technology to create unique experiences for home viewers. Many metaverse creators believe that live events will increasingly be hybrids of digital and real-world components, and that immersive tools can help make each version of an event special and desirable in its own right. “It doesn’t make sense to recreate the live music experience by default,” says Sam Schoonover, Coachella’s chief innovation officer. “Instead, you have to give fans something new and different that they can’t do in real life.”

Over the past couple of years, augmented reality graphics have made their way into live broadcasts, albeit mostly in the form of brief gimmicks. Riot Games brought a giant dragon to the opening ceremony of the 2017 League of Legends Worlds finals; broadcast cameras tracked the creature as it flew around the stadium crowd. Last September, a giant panther prowled the Carolina Panthers’ stadium in the same way. (The panther was also created with Unreal Engine.)

Schoonover has been trying to bring similar effects to Coachella’s livestream for years in an effort to expand its audience beyond the confines of the Empire Polo Club. “The audience for online shows is so much bigger that there are probably 10 or 20 times more people watching a set via the live stream than at the festival,” Schoonover says. “Because the home experience is never going to compare to the festival experience, we want to give artists new ways to express themselves and grow their viewership around the world.”

However, previous attempts to experiment with augmented reality at Coachella were hampered by production costs and a lack of interest from performers. This year, it took a partnership with Epic, which aims to lower the barriers to entry for 3D creators, and the participation of Flume, an electronic musician who has long emphasized visuals in his live shows, to bring the project to life. Key players in the process included the artist Jonathan Zawada, who has worked extensively with Flume on audiovisual projects, including NFTs, and the director Michael Healy, who directed Flume’s latest music video, “Say Nothing.”

The result was a flock of giant Australian birds (Flume is Australian), brightly colored flowers, and lush trees swaying in the wind above the stage and the packed crowd. Three broadcast cameras outfitted with additional tracking hardware allowed the production team to insert the 3D graphics into the video feed in real time.
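To make the mechanics a little more concrete, here is a minimal, hypothetical sketch of the general technique: when a broadcast camera reports its pose and lens parameters, a renderer can project virtual set pieces into the frame and alpha-blend them over the live video. The Python below is purely illustrative and is not Coachella’s or Epic’s actual pipeline; every name and number in it is an assumption.

```python
# Illustrative sketch of broadcast AR compositing: project a 3D anchor point
# into the camera frame using a tracked pose, then alpha-blend a rendered
# overlay onto the live video. All values here are made up for the example.
import numpy as np

def project_points(points_3d, camera_pose, intrinsics):
    """Project Nx3 world-space points to Nx2 pixel coordinates."""
    rotation, translation = camera_pose
    cam_space = points_3d @ rotation.T + translation   # world -> camera coordinates
    pixels_h = cam_space @ intrinsics.T                # pinhole projection (homogeneous)
    return pixels_h[:, :2] / pixels_h[:, 2:3]          # perspective divide -> pixels

def composite(frame, overlay_rgba):
    """Alpha-blend a rendered RGBA overlay (H x W x 4) onto a video frame (H x W x 3)."""
    alpha = overlay_rgba[..., 3:4] / 255.0
    return (overlay_rgba[..., :3] * alpha + frame * (1.0 - alpha)).astype(np.uint8)

# Hypothetical 1080p broadcast camera: focal length and principal point in pixels.
intrinsics = np.array([[1500.0,    0.0, 960.0],
                       [   0.0, 1500.0, 540.0],
                       [   0.0,    0.0,   1.0]])
pose = (np.eye(3), np.zeros(3))              # identity rotation, camera at the origin
anchor = np.array([[0.0, -10.0, 80.0]])      # a point 80 m ahead of and 10 m above the camera
print(project_points(anchor, pose, intrinsics))  # pixel where a virtual bird would be drawn

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)      # stand-in for a live video frame
overlay = np.zeros((1080, 1920, 4), dtype=np.uint8)    # stand-in for the rendered AR layer
blended = composite(frame, overlay)                    # what would go out on the stream
```

In a real production this work is handled by dedicated real-time tools fed by hardware camera tracking, but the project-then-composite structure above is the basic idea.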

Schoonover says the graphics are just the beginning of what could be created in augmented reality for live concerts. Future performers could have light effects that follow their faces throughout a set, for example, or synchronize their dance moves with those of surrounding virtual avatars. It’s easy to imagine production designers adding, in real time, the kind of effects that ubiquitous music-video director Cole Bennett layers onto his videos in post-production, or a Snoop Dogg performance in which he is surrounded by characters from his Sandbox metaverse world.

Schoonover says AR experiences like this one will be taken to another level when AR glasses become mainstream. Eventually, you might be able to watch the concert from the festival grounds in 3D, surrounded by the birds, plants, and whatever else the artists dream up. “For people who want to get the Coachella experience from their couch, this is the entry point,” he says.
