
THE BROADCAST AND LIVE EVENTS FIELD GUIDE
By Will Freeman
Presented by Epic Games

© 2022 Epic Games / All Rights Reserved.


Contributors
Author: Will Freeman
Editor in Chief: Bernt Johannessen
Editors: Michele Bousquet and Jill Ramsay
Layout and Illustrations: Oliver Morgan and Carys Norfor

Acknowledgments
We would like to thank everyone we interviewed for this guide for generously sharing their time to give us insights
into how they use Unreal Engine in broadcast and live event projects that employ real-time technology, and for
their support in helping to make this guide happen. We would also like to thank all of the team at Epic Games for
providing additional details about broadcast and live event productions that use Unreal Engine in their pipelines,
including Bernt Johannessen, Sevan Dalkian, and Patrick Wambold.

Cover image courtesy of Co-op Live


Image courtesy of Moment Factory


Contents
Chapter 1: Introduction
  Who is This Guide For?
  The Rise of Real-Time
  Why Real-Time Content Creation?
  About Epic Games

Chapter 2: Evolution of Real-Time Technology
  Technology and Entertainment
  Why Real-Time Technology Matters
  The Evolution of Pipelines
  Case Study: The Famous Group | Carolina Panthers | The Panther Project
  Q&A: Carolina Panthers | The Panther Project

Chapter 3: Building a Real-Time Pipeline
  Unreal Engine for Real-Time Experiences
  Flexibility as a Standard
  Contrasting Pipeline Approaches
  Why Games Are Relevant
  A ‘Storytelling First’ Mindset
  Building Custom Tools
  About Version Control
  Case Study: Illuminarium | SPACE: A Journey to the Moon & Beyond

Chapter 4: Real-Time Graphics Content for Live Events
  Key Considerations
  Case Study: Creative Works London | Guns N’ Roses | 2021 Tour
  Q&A: disguise
  Augmented and Mixed Reality for Live Productions
  AR/MR and Virtual Production
  Practice Makes Perfect
  Case Study: Myreze | Valve Corporation | The International 10
  Case Study: The Weather Channel | IMR Studio
  Practical and Technical Considerations
  Case Study: Psyonix | RLCS Hype Chamber | Capacity Studio and Partners

Chapter 5: Summary
  New Skills and Retraining Staff
  Establishing New Roles and Production Hierarchies
  Adopting Unknown and New Technologies
  Time-Saving and Budgetary Misconceptions
  New Hardware Choices
  Does it Have to be Real-Time?
  Profile: Moment Factory | AT&T Discovery District
  Profile: XR Studios
  It’s Showtime!
  Glossary
  Links


CHAPTER 1:

Introduction

Broadcast and live events are both well-established mediums. Live broadcast goes back almost 100 years, while in-person events have existed throughout human history. As we’ll see in this guide, both are incredibly powerful in their ability to connect people with moments in time through shared experiences, and both remain highly relevant and popular today.

Image courtesy of Myreze.com


Who is This Guide For?


The long history of broadcast and live events doesn’t mean that the space is free from innovation and progress—far from it, in fact. Both live broadcasts and live events each provide a foundation on which to build new experiences, and could even be considered blank canvases for innovation.

In recent years, real-time technologies have facilitated an explosion in innovation around broadcast and live events, giving creators a means to mix digital content with reality, seamlessly and in a single space.

Real-time technologies have also enabled a tremendous increase in the breadth of what a broadcast or live event can be. A real-time-embellished event might reinvent familiar forms such as concerts or theater, but could equally see a group woodland walk transformed into an immersive educational experience, or cities reimagined as galleries where buildings are draped in dynamic, evolving digital artworks.

Over in broadcast, a similar revolution is underway, also powered by real-time technology. Traditional entities such as live sports and video journalism are finding new ways to connect and communicate with audiences, while realms like esports are using real-time to forge entirely new broadcast concepts made from the virtual and real.

The creative and commercial opportunity there is immense. But a real-time project requires a quality real-time pipeline. Fundamentally, a real-time pipeline is like many others, bringing together the technology and workflows required to take a project from early concept to final delivery. However, given the exact technologies and experience required, real-time pipelines can be highly distinct, and feel deeply complicated.

At Epic Games, we understand that complexity, having provided the video game industry with a real-time solution that sits at the heart of thousands of game development pipelines. Over the years, we’ve seen increasing numbers of sectors from outside games adopt that technology, Unreal Engine, to harness the potential of real-time. With a profound belief in that opportunity, over time we’ve built up specialist internal teams and developed Unreal Engine features to directly support sectors like broadcast, live events, architecture, automotive, virtual production, and more.

Establishing new skill sets, technologies, pipelines, and processes for real-time broadcast and live events has been a highly collaborative process, with shared learning at its heart, as technologists, studios, hardware manufacturers, storytellers, and more come together to establish collective knowledge.

This guide shares a great deal of that knowledge, being shaped by expert interviews with those involved in successful real-time broadcast and live events projects, together with insights from Epic’s teams and veterans of the wider world of broadcast and events.

This guide also offers insight on the opportunity, challenges, and future of real-time in broadcast and live events, as well as practical guidance on establishing pipelines and workflows, rethinking team hierarchies and skill sets, and best practices for the craft.

Throughout, we’ve also spotlighted many of the best and most impactful real-time broadcast and live events projects, sharing interviews with the teams and individuals behind them.

This guide is not intended to be a piece of technical documentation or a highly detailed guide on the machinations of pipelines, but rather a welcoming, thorough overview of the broad technologies, themes, and approaches that make this possible. It is designed to help you or your organization identify the reality of the real-time opportunity, while informing decision-making and strategy—and sharing a bounty of practical tips along the way.

This guide explores best practices in building, using, and maintaining an effective pipeline for successful broadcasts and live events that embrace real-time elements—and the critical role of the game engine in those pipelines.

However, it is not a piece of instructional documentation, nor does it require deep technical knowledge or hands-on experience in the field. Through interviews and analysis, the guide also explores themes related to pipelines, such as training, the relevance of existing skill sets, and other considerations that impact the planning, production, and delivery of live real-time experiences.

Epic’s Broadcast and Live Events Field Guide is for those looking for a starting point in understanding the typical industry challenges and the opportunity present, or who need to deepen their familiarity with the space. While it will provide value and insight to technically minded readers, including technical artists and software developers, it equally serves founders, C-suite executives, team managers, producers, and directors looking to understand the opportunity while future-proofing their businesses or projects. Contractors, freelancers, and specialists will also take a great deal of practical insight from this guide.

If you feel intimidated by the technology, or unsure about the reality, there is even more reason to read on. While pragmatic and frank, you will also find realistic encouragement informed by experience. The real-time opportunity is immense, and you are very likely more equipped to embrace the technology than you might imagine.

Put another way, if you work in any specialty or discipline related to broadcasting or live events and wish to keep pace with the frontrunners in your industry, consider this an essential read.

Image courtesy of Capacity Studios


The Rise of Real-Time


Live music and theatrical performances have existed since the earliest days of civilization, bringing audiences and performers together for experiences unique to the moment they occurred. Centuries on, when television arrived, so too did the modern counterpart of those ancient forms; namely the live broadcast, which has remained a popular method of entertainment and presentation despite the proliferation of pre-recorded options.

Today, the evolution of real-time technology has reached a point where it can facilitate live experiences well beyond these modest beginnings, both reinventing in-person live events with large-scale interactive animated displays and bringing up-to-the-minute, dynamic graphics to live television.

We are seeing interactive real-time elements become increasingly commonplace at concerts, art exhibitions, sports arenas, and other events, and also permanent installations such as building facades or corporate lobbies. Those in-person real-time experiences can empower visitors to be active participants as well as passive audience members, bringing profound opportunities to artists, performers, brands, and those that host events. Over in the live broadcast space, real-time technology enables teams to embellish live action with visuals that make tuning in more memorable, personal, or impactful.

Meanwhile, the rise of devices like VR and AR in the consumer domain, new pipeline and R&D approaches, adoption of technologies from other industries, and fresh paradigms for blending the digital and physical worlds are bringing a bounty of creative and commercial opportunities.

While the production and delivery of live real-time experiences is inherently complex, the potential is significant—and the challenges have become far from insurmountable. Emerging standards, technological refinements, and user-centric approaches equip teams with the ability to work efficiently, creatively, and ambitiously. Understanding how to establish an appropriate high-quality, robust pipeline lays the foundation for increased success in broadcast and live events empowered by real-time technology.

Game engines have emerged as an ideal and significant element of those pipelines. Simply put, a game engine is a software framework with tools for both creating video games, and powering them at the point of consumption. Why would a tool from an entirely different industry be relevant to real-time content in events and live broadcast? As interactive entities, video games are defined by their real-time nature. They have to respond instantly to a constant stream of user inputs, providing highly polished experiences that dynamically adapt while remaining a cohesive whole. As such, the video game industry has spent decades establishing tools, techniques, and conventions, today serving an audience of over 3 billion users.

As we are seeing live events and broadcasts embrace real-time technology, the expertise and practice of game development has become more relevant than ever to those fields. At the same time, game engine providers like Epic Games have not only adapted their tools to better serve broadcast and live events, but have built up support and internal teams to serve those sectors.

Look across real-time event and broadcast outfits such as Myreze, Creative Works, Moment Factory, XR Studios, disguise, and The Famous Group, and you'll see that game engines—specifically, Epic Games’ Unreal Engine—are already a mainstay of real-time production in those spaces. Today, game engines offer a highly relevant element of those pipelines, coming from an industry defined by the interactive, real-time nature of its medium. Equally, understanding the skill sets, studio models, and collaborative techniques of game development paves the way forward for the use of real-time technology in other sectors.

Broadcast and live events have always been about innovation, experimentation, and working with the cutting edge, and that remains the case. What has changed is the capacity for impact, engagement, and success.

In this field guide, Epic Games discusses the elements that constitute an efficient and workable pipeline all the way through design, content production, and delivery; various approaches to implementing them; the place and value of a game engine; and the tools and techniques required to evaluate and optimize. The guide also explores how best to seize the broader opportunity, bringing in expert insight and advice from practitioners that are deeply experienced in the craft of real-time broadcast and live events.

Image courtesy of Illuminarium Experiences


Why Real-Time Content Creation?

by Laura Frank
Founder & Creator Advocate at frame:work

Courtesy of XR Studios

The language of digital production technology is constantly adapting to new developments and practices. In a landscape of terms like XR, virtual production, and in-camera visual effects (ICVFX), it’s hard to know where to invest one’s efforts and which process is best for your project. Workflows developed for one style of creative video production find their way into other disciplines, adapting the terms used and further disrupting the establishment of a clearly shared paradigm. However, there is one common technology that ties all our production lexicon together no matter the production style, and that is real-time content creation.

Real-time content creation is the result of generating visual media from some combination of image assets, code, and external data on demand. We often talk about real-time content creation as if it were new, but video content rendered in this manner is as old as computer history. As long as there have been computer displays, motion graphic imagery has been generated in real time. The first video games and generative video artists have their origins in the 1950s. Compositing those graphics with live camera feeds for broadcast television has been in use since the late 1950s. So, what drives the rush to adopt real-time right now?

Currently, we are experiencing a collision of technologies across many entertainment production disciplines that redefines what can be achieved with real-time content creation. In the last decade, we’ve seen incredible advances in position tracking, GPU development, 3D model generation, media servers, and LED screen display technology. These tools embody the best capabilities of many entertainment technologies, and when combined, they fundamentally alter creative video content production.

There are a number of ways to combine these exciting technologies. Successful application of real-time content creation is knowing when it’s the best process for your production, and which aspects of its use best serve your needs.

Let’s look at the motivation to use real-time content creation. There are already many software applications currently in use to create motion graphics and narrative video content. What is the impetus to switch from your current content production workflow to a real-time solution? A real-time solution does the following:

Shortens rendering time during production. Real-time content creation platforms eliminate the rendering process. This time-consuming factor of traditional content creation software can add hours for a team to see the result of a small change. Instead, the real-time content file is executed on demand, generating imagery instantly even after a content edit.

Provides resolution- and platform-independent output. Since it’s not rendered at a specific resolution, real-time content is resolution-independent and can be scaled or re-framed live without the need to re-render. Its output is platform-independent, and works well for transmedia applications. The same assets could be used to provide visuals for the stage, overlay graphics on the live stream, and create a mobile companion application.

Gives more creative flexibility. Working in a real-time content platform gives the creative team the ability to make detailed adjustments to content and see the results rendered immediately at full quality, and distribution of the update is smaller and faster. Content variables can be altered as the content is playing, with the results visible immediately. Compare with a traditional process, where change notes are delivered to the motion graphic artists, and the result is re-rendered, re-delivered, re-distributed, and played again for review. Ultimately, real-time content creation tools allow for faster iteration and more opportunities to refine the creative.

Simplifies interactive content production. The result of an interactive video display is dependent on external triggers to inform image creation. Just like a video game controller defines what is displayed on the game viewing screen at 60+ times per second, a person, object, or other external input seeds generative content creation with action (see the sketch following this list). While there are a number of software programming languages that have dominated interactive content creation for decades, their use has been largely limited to installations that can support long project development timelines. Real-time content platforms are purpose-built to be interactive, making them faster to use and more amenable to entertainment production schedules.

Augments live video signals. To alter a live camera feed with minimal delay, a real-time content platform must be employed. As the signal is processed, the image content is augmented frame by frame. The entire frame can be reimagined with artistic effects, or graphics can be composited to the signal feed. Image analysis of the camera feed alone produces sophisticated results through tools like facial recognition and color keying. You need only to check your favorite video conferencing software to see the potential. When combined with interactive content data sources like depth sensing cameras and position location systems, we introduce the ability to enhance the camera signal with spatially reactive content.

Responds to environmental input. There are few experiences that so universally elicit a state of childlike wonder as visiting an immersive interactive video installation. Projection mapping is a form of real-time content manipulation, where video content applied to a 3D surface in a virtual representation of the working environment corrects the image for use in the real world. As long as the model of the projection surface and the projector locations are accurately described to the computer, we get perfectly aligned imagery. But what happens when the projection surface moves or the viewing perspective changes? The more real-world information we feed into the model of our working environment, the more types of location-sensitive content we can create. When this content is combined with interactive inputs, we can track moving scenery or a performer’s position on a stage, or interact with a viewer in a creative video installation in real time.

Builds worlds. When we combine all the technologies discussed, we get to one of the most exciting aspects of real-time content creation: content generated from the perspective of the camera, which is the cornerstone of the world of virtual production. Virtual production is a real-time content creation process defined by the relationship between a physical camera and a virtual camera. The virtual camera exactly mimics the attributes and behavior of a physical camera in a simulated 3D environment, generating perspective-sensitive content. The generated content is then separated into foreground and background imagery relative to the live action captured by the camera. This can be used to generate scenery or performers composited over a live broadcast signal, or to replace backgrounds for capture in camera or composited with chroma key.
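Several of these points, from the eliminated render step to the instant response to external triggers and the camera-driven world building, come down to one structural idea: a loop that regenerates the image every frame from whatever the inputs are at that instant. The Python sketch below is a minimal illustration of that loop, not any particular product's API; the scene object and the read_camera_pose, read_sensor_trigger, render_frame, and present helpers are hypothetical stand-ins.

    import time

    TARGET_FPS = 60
    FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame at 60 fps

    def run_realtime_loop(scene, read_camera_pose, read_sensor_trigger,
                          render_frame, present):
        """Minimal real-time content loop: inputs in, pixels out, every frame."""
        last_time = time.perf_counter()
        while True:
            now = time.perf_counter()
            dt = now - last_time              # elapsed time keeps motion smooth
            last_time = now

            # 1. Sample external inputs for this frame.
            scene.camera.pose = read_camera_pose()  # e.g. a tracked physical camera
            if read_sensor_trigger():               # e.g. a performer hits a mark
                scene.trigger_event("burst")        # content reacts on this frame

            # 2. Advance animated content by dt; nothing is baked in advance.
            scene.update(dt)

            # 3. Generate this frame's imagery on demand and hand it to the display.
            present(render_frame(scene))

            # 4. Sleep off any remaining budget to hold the target frame rate.
            elapsed = time.perf_counter() - now
            if elapsed < FRAME_BUDGET:
                time.sleep(FRAME_BUDGET - elapsed)

Because the frame is rebuilt on every pass, a changed content variable or a moved camera simply shows up on the next pass, which is why there is no render queue to wait on.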


This process has applications across the spectrum of live entertainment production disciplines, and will continue to be adapted in ways yet to be imagined. All one needs is the creative vision, time, and expertise to realize the next advancements in the use of real-time content creation.

Laura Frank is an entertainment production technologist specializing in content production workflow for multiscreen spaces. Her career spans work in film, broadcast, theater, concert touring, and art installations. She is currently building frame:work, a community platform for live entertainment and virtual production pixel professionals, and is the author of two textbooks on screens producing and real-time content.

About Epic Games

Epic Games was founded in 1991 by Tim Sweeney, a passionate technologist who still stands as CEO to this day. At that time, the video game industry was nearing a pivotal point in its history, where the arrival of widespread internet access and the sector’s tremendous growth saw it begin to shape and influence other mediums and industries around it.

What was happening in games by the mid-1990s served as a prelude to today’s online communities, social media, concepts of the metaverse, the rise of esports, and XR. Meanwhile, technologies and techniques forged in games were starting to shape VFX, computer animated films, advertising, and broadcast.

Thirty years later, Epic now has a long and rich history of not only developing games, but also of building and supporting online communities, providing robust creativity tools like Unreal Engine, and facilitating a wide range of content creation pipelines for both interactive digital experiences and linear content.

Epic is also behind the global phenomenon of popular culture that is Fortnite, a game that has over 350 million player accounts and 2.5 billion friend connections.

Since its founding, Epic has grown with and shaped the game industry it is part of—at the same time that the game industry has increasingly influenced the wider technological and creative landscape. As such, it is only natural that Epic’s content creation powerhouse Unreal Engine has seen adoption across film and television, live events, architecture, automotive, manufacturing, and simulation.

Today, Epic Games proactively supports all those sectors and more, with Unreal Engine being used in numerous ways by myriad different creative professionals. As new opportunities emerge in real-time-powered broadcast and live events, we’re excited to help shape the journey forward in these amazing spaces.

Image courtesy of The Weather Channel


CHAPTER 2:

Evolution of Real-Time Technology

We humans, at our most fundamental level, are social beings with an inner desire to be part of something big—to connect with one another through extraordinary and memorable shared experiences, the kind that inspire us and create long-lasting memories.

Image courtesy of CVP


Live theater, concerts, and sporting events have been with us for centuries. It might seem that technology is at odds with the “live” aspect of such experiences—a film adaptation of a stage play might, for example, seem to replace the stage play itself—but audiences have shown us, through their enthusiasm for such events, that technology employed to enhance live experiences simply gives them an additional avenue to enjoy entertainment. From the deus ex machina of fifth-century Greek theater to jumbotron displays at sporting events, technology can enhance, involve, and engage audiences beyond their physical attendance at an event.

Today’s live events fall into very broad and diverse categories, ranging from rock concerts with digital action projected on enormous screens, to broadcast television that engages audiences with mixed reality visuals updated on the fly.

Image courtesy of Capacity Studios

Technology and Entertainment

Audiences, after experiencing increasingly compelling technology-infused events, now expect more than ever from the entertainment sector. To be successful, creators currently face unique and challenging requirements with regard to system performance and stability as well as visual impact and creative experiences.

To keep up with audience demand, studios are tasked with producing large and complex projects that very often require an armada of display technologies, ever-increasing pixel density, and enhanced realism, all driven by synchronous media servers working together seamlessly as a whole, with no downtime. The old industry saying “The show must go on” now translates to a 24/7 mindset where the screen can never go black.

Until recently, the development of any such experience was dominated by offline rendering workflows and video playback. Now, things are different. These are exciting times, as we witness the merging of two enormous worlds, where game engines and real-time graphics play hand-in-hand with video playback and live installations.

We are seeing more and more teams and projects all around the world embracing real-time technology as a conduit for both pre-rendered and real-time content delivery. Some creators are even integrating completely live-controlled gameplay mechanics into their offerings. Media server companies, the backbone of any serious professional execution, are embracing and adopting the support and integration of game engines, facilitating exponential and industry-wide capabilities never seen before.

Why Real-Time Technology Matters

Real-time graphics, at a most fundamental level, can be understood to emphasize the live nature of an event or broadcast.

We all know live experiences are powerful. Whether it’s taking in a monumental sporting event via a live broadcast or pushing to the front of a crowd at a concert, it is seeing events unfold before you that makes you feel part of them. Add a little mixed reality magic that dynamically adapts to what’s happening before you, and that sense of being involved in the moment is greatly enhanced.

That’s true whether you’re using real-time elements to deliver additional information and context, or simply to set tone and atmosphere. As long as it’s applied thoughtfully, mixed reality applications give audiences more reason to feel engaged in a happening.

The Emergence of Real-Time in Entertainment

The arrival of widespread computer graphics in the 1990s would eventually give us the real-time concepts we now consider so familiar that they hardly stand out—displaying score and time on a sports broadcast, for example, or updates on a news ticker.

As technology powering digital on-screen information advanced, influential companies like Vizrt emerged in the late 1990s, giving studios the means to add ever-more lavish real-time assets into broadcasts.

Around the same time, advances in other fields were beginning to usher in a new era for 3D graphics and real-time delivery. Much credit must go to the techniques, technology, and artistry developed in pre-rendered spaces like VFX and computer-animated cinema. But it is the history of video game development that really accelerated what is possible with real-time technology.

The Role of Video Games in Real-Time History

A video game is essentially a real-time environment where the player gets final say in how events play out. Narratives are carefully plotted, character assets are meticulously shaped and rendered, and environments are sculpted long in advance of any public release. And yet the game itself must be able to meet the player's expectation to interact, giving them various degrees of control over what they see and do at any given moment. Watch 100 players make their way through an identical game, and each will see and experience something different.

As consumer expectations for games rose, new methods emerged in game development, giving animation, lighting, music, in-game weather systems, and character motion the ability to adapt on the fly as players triggered events. Over time, the concept of a “game engine” emerged to keep all these various real-time elements working as a harmonious whole.
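To picture that coordinating role in miniature, consider the hedged Python sketch below, in which one player-triggered event fans out to several subsystems on the same frame. The GameEngine class and the subsystem callbacks are invented for illustration and do not correspond to any real engine's API.

    class GameEngine:
        """Toy 'harmonious whole': subsystems subscribe to named game events."""

        def __init__(self):
            self.handlers = {}  # event name -> list of subsystem callbacks

        def subscribe(self, event, handler):
            self.handlers.setdefault(event, []).append(handler)

        def trigger(self, event, **context):
            # One trigger adapts many real-time systems at once.
            for handler in self.handlers.get(event, []):
                handler(**context)

    engine = GameEngine()
    engine.subscribe("storm_begins", lambda **c: print("weather: rain density up"))
    engine.subscribe("storm_begins", lambda **c: print("lighting: dim to overcast"))
    engine.subscribe("storm_begins", lambda **c: print("audio: crossfade to storm bed"))

    # A player's action fires the trigger; at a live event, an operator
    # or a sensor could fire the very same kind of event.
    engine.trigger("storm_begins")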
A game engine is a software framework for both creating and delivering games—it encompasses not only the tools for game developers to construct a world and program gameplay, but also a means to compile these games into a format that can be served to consumer devices for gameplay.

Game development is a deeply collaborative process, where animators, coders, writers, artists, audio specialists, and many more come together to build interactive worlds. A game engine serves as a central hub that brings their contributions together.

Real-Time Technology in Broadcast and Live Events

As game engines advanced, it became clear that they were perfect for the emerging field of real-time graphics in broadcast and live events. For example, when The Famous Group unleashed a giant digital panther on the Carolina Panthers NFL team’s stadium, they created a real-time experience that was thoughtfully mixed with reality. While that highly collaborative process involved disciplines like camera operation and live streaming to stadiums, it remains fundamentally comparable to making a video game, highlighting the reason that game engines are applicable to such projects.

And so it is that we find ourselves at a point where broadcasts and events can inherit much of the real-time potential of games, mixing up the reality of a wide range of live experiences.

The opportunity here is creative and commercial, empowering everything from science communication and brand experiences to sports broadcast and musical performance, all with a view to attracting, engaging, and retaining ever larger and more devoted audiences.


Today, these sectors find themselves looking at a tremendously promising, sometimes intimidating future where media convergence is finally happening in a meaningful way.

“The main building block of this new future is the game engine, which means Unreal Engine.”

Björn Myreze, Founder and CEO, Myreze

The Evolution of Pipelines

Every creative process, to be efficient and produce the best end result, requires a pipeline: the process and technological chain that facilitates an early concept’s journey through production to final pixels. Technological pipelines have been a mainstay of broadcast for some time, and where live events include digital assets, such pipelines are also a must.

Traditionally, broadcasters—and, to a lesser extent, those who produce events—have taken a two-pronged approach to pipelines. The most detailed, memory-draining pre-rendered motion graphics that might be made for background plates or “scripted real-time” are processed through robust motion graphic pipelines that need time to bake and render. For lighter real-time assets, specific pipelines offered by real-time pioneers such as Vizrt, Chyron, and Ross Video have long been used, and have done a very workable job of managing earlier real-time broadcasts.

Over time, however, real-time ambitions have grown in broadcast, supported by remarkable technological advances. In tandem with that movement, the video game space has blossomed into a giant industry and creative force. Thanks to video games’ common focus on works that blend real-time interactive experiences with scripted, pre-rendered cinematic CG scenes, pipelines like those founded on game engines like Unreal Engine have evolved to serve both sides of the craft.

Put another way, game engines now offer pipelines—or cores for pipelines—that are perfectly suited for the new generation of real-time projects seen at the forefront of broadcast and live events. This development has garnered much interest in Unreal Engine from those sectors in recent years—an interest that has complemented Epic Games’ effort to build out features and provide support that precisely serve the needs of broadcast and live event users.

We are at a point where Unreal Engine now stands at the center of pipelines for so many production companies that exist far from games. There are those doing truly live, real-time mixed reality work that can harness the side of Unreal Engine forged to serve the interactive elements of video games. Others commonly use Unreal Engine to produce the types of elaborate and highly detailed pre-rendered assets once served by traditional motion graphics pipelines. And then there is another class of user—particularly in the events space—working to deliver “pre-rendered for live” assets that can use the dynamic and adaptable nature of Unreal Engine to create and deliver those hybrid entities.

That’s not to say that the solution is simply to pick up a game engine and work like a game developer. Rather, over time, industry shifts and technological evolution have seen a trend play out that now means game engines are commonly the most suitable option for broadcast and live events. It is essential, however, that the providers of such engines offer support and functionality that specifically serves those fields.

Case Study: The Famous Group | Carolina Panthers | The Panther Project

Project type: Live event/on-site broadcast

When the NFL team, the Carolina Panthers, took their first foray into real-time mixed reality, it was with fairly straightforward goals: they wanted to boost spectator energy on game days, and establish new traditions that would further cement the relationship fans have with the team.

These goals would ultimately lead the Panthers to partner with The Famous Group, a self-described “fan experience company” specializing in virtual events and mixed reality productions. Together, they would envision and deliver a remarkable experience for fans at the Panthers’ opening game of the 2021 season.

Before the game started, a giant mixed reality panther was let loose in the Bank of America Stadium. The virtual beast scaled the video board, tore down the flag of the visiting team New York Jets, leaped down to the field, and then bounded away with a roar. The spectacle unfolded on the site’s big screens before the 70,211 spectators, showing the panther moving through live footage of the event. Beyond courting a riotous response on the day, a video of the spectacle went viral online, attracting over six million views on Twitter alone.

“Right now, mixed reality has a big ‘wow' factor,” says Greg Harvey, CIO and Co-Founder of The Famous Group. “Fans have that ‘What just happened!?’ moment. People don’t really get how these digital creatures interact with the physical world.

“The fans get super fired up now, and start yelling and screaming as soon as they hear its roar,” Harvey adds. “The impact is much more powerful than running traditional content on the video boards.”

The project took about seven weeks to complete. A scan of the entire Bank of America Stadium gave The Famous Group a 3D model for anchoring the panther to real-world surfaces during the animated sequence. On the day itself, that model sat invisibly inside the live footage broadcast from the venue to its own big screens. As camera operators tracked the panther’s pre-set trajectory in the stadium, the live footage from those cameras was sent to Unreal Engine and matched to the invisible model, the big cat was added to the footage, and the result was broadcast to the event screens.
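The per-frame flow described above, with a live frame and a tracked camera pose going in and an augmented frame coming out, can be sketched roughly as follows. This is an illustrative outline only, not The Famous Group's actual implementation; read_tracked_pose, render_virtual_layer, composite, and send_to_screens are hypothetical helpers.

    def augment_feed(frames, read_tracked_pose, render_virtual_layer,
                     composite, send_to_screens):
        """Illustrative per-frame mixed reality pass over a live camera feed."""
        for frame in frames:                     # live video frames, in order
            pose = read_tracked_pose()           # where the physical camera points
            # Render the virtual character from a matching virtual camera,
            # using the invisible stadium model for grounding and occlusion.
            layer, mask = render_virtual_layer(pose)
            out = composite(frame, layer, mask)  # virtual pixels over live pixels
            send_to_screens(out)                 # kept within a tight latency budget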
The panther was created with a traditional process, where it was modeled in Maya and exported to Unreal Engine via the FBX file format. There, the team created their own motion sequences and motion clips for the panther’s movements.


Q&A: Carolina Panthers | The Panther Project

The panther, having been built to work with real-time When the football team the Carolina Panthers let a giant real-time vision of their mascot character loose at the
environments, can now be deployed in all kinds of Bank of America Stadium in Charlotte, NC, they also delivered one of the most high-profile instances of mixed
ways. “It's very much like a video game character, reality seen in the public domain.
which is the opposite of limiting,” says Erik Beaumont,
The Famous Group’s Head of Mixed Reality. “The Envisioned as a way to introduce a new game day tradition, the Panther project also served to demonstrate
panther’s technical nature is letting us consider the technology to millions of viewers, and has since become a touchpoint for understanding, explaining, or
making a gamified version of this, where operators evangelizing the potential of real-time technology at live events. It wasn’t just that the project worked at a
can more freely control the panther in real time like functional level—it thrived in terms of engagement and reach, from delighting fans on the day to attracting
a game character, rather than just having prepared millions of views and engagements across social media.
linear sequences.”
To get an inside view, we spoke to the Panther’s Senior Director/Executive Producer of Game Presentation &
In mixed reality, The Famous Group and the Carolina Production, Mike Bonner, who drove the ambition of the project from the football team’s side.
Panthers found the perfect medium through which to
give fans an amplified sense of being part of a special What were the Carolina Panthers looking to achieve by embellishing live action with real-time assets?
moment, fueling their support and devotion on the day
and beyond. And what they’ve achieved is only the The Carolina Panthers are committed to bringing new and innovative live experiences to our passionate fans
start—in the future, fans might see the panther moving at Bank of America Stadium. We are extremely excited to be one of the first teams to incorporate unique mixed
to different parts of the stadium, and interacting with reality content into our game production. So far, the response has been excellent.
players on the field or groups of fans in the stands.
The live element was extremely important in creating an incredible game-day atmosphere. Our fans love to see
“There is a seismic shift happening right now,” Harvey the panther on the video board, and it turned into a truly viral moment with how it was embraced on social media.
concludes. “Immersive is the future of the live events
industry, and we will see over time what form that
will take—whether it is augmented reality, virtual
reality, mixed reality, a combination, or something
completely new.”

Images courtesy of The Famous Group Image courtesy of The Famous Group


What were some of the challenges you faced with this project?

Calibration, calibration, calibration. We utilize mixed reality with multiple cameras, all of which need to be
calibrated, and it takes quite a bit of time to get it looking just right. To make the project work, we had to make
the commitment to leave the cameras in place for the entire season.

We also learned that with mixed reality, it’s okay to push the envelope of believability—we don’t need to only
show the sequence when the field is empty. In the first game, we ran it 13 minutes prior to kickoff, and in the
second game, during the game with players on the field. The fans erupted when it was played during the game.
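For readers wondering what the calibration Bonner describes involves in practice, the sketch below shows a classic checkerboard-based intrinsic calibration using OpenCV. It is offered as a generic illustration of the step, assuming a standard printed checkerboard target; it is not the Panthers' actual toolchain.

    import glob
    import cv2
    import numpy as np

    # One 3D reference point per inner checkerboard corner (board lies on Z = 0).
    PATTERN = (9, 6)  # inner corners per row and column of the printed target
    objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

    obj_points, img_points = [], []
    for path in glob.glob("calib_frames/*.png"):  # stills of the target in view
        gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, PATTERN)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    assert obj_points, "no checkerboard views found"

    # Solve for the lens parameters that map 3D space to this camera's pixels.
    rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None
    )
    print("reprojection error:", rms)  # lower is better; recalibrate if it drifts

Multiply that by every camera in the building, plus the work of locating each camera within the stadium model, and the "quite a bit of time" above is easy to believe.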

Do you see this kind of deployment starting to have a profile in sports beyond the NFL?

We actually started the conversations about real-time mixed reality with the Charlotte Football Club [FC]—
the MLS [Major League Soccer] team owned by David Tepper, who also owns the Panthers—with an initial
conversation around activating at their games. It just so happened that the NFL [National Football League]
season started before the MLS season, so the Panther project came first.

The incredible success and magic of the Panther project only reinforced the excitement to do something like
this with Charlotte FC. The mixed reality panther proved to be an incredible season-long test run for what we
were planning for our Charlotte FC broadcasts. The results, and the reaction from fans, make us proud of the
unique mixed reality deployments as part of both professional teams within Tepper Sports and Entertainment.

What would you tell other production teams looking to embrace mixed reality?

Don’t be afraid to try new things. We’ll continue to see an increase in these types of experiences, especially as
the technology continues to improve, and we see more interoperability with existing equipment like SkyCam
and handheld cameras.

Image courtesy of The Famous Group


CHAPTER 3:

Building a Real-Time Pipeline

When it comes to the contemporary real-time opportunity in broadcast and live events, pipeline is everything. A quality pipeline enables and connects every element of a production from ideation to final pixel, while bringing efficiency and performance to the entire process.

Image courtesy of The Weather Channel


With that in mind, to successfully embrace today’s real-time opportunity, you’ll need to establish a pipeline constructed with real-time in mind. If that same pipeline is also capable of handling pre-rendered elements, all the better. Fortunately, assembling such a pipeline isn’t likely to push traditional broadcast and events teams too far beyond familiar territory.

As we’ve already seen in this guide, game engines have now emerged as a particularly fitting foundation for real-time production pipelines.

Unreal Engine for Real-Time Experiences

As games have become more ambitious both as technological entities and narrative works, the long-predicted concept of media convergence is finally happening in meaningful ways, as live news, esports events, concert performances, sports broadcasting, immersive experiences, and games all borrow from and inform one another.

Myreze, a motion graphics, branding, virtual production, and real-time studio, sees this convergence as an exciting opportunity. “The main building block of this new future is the game engine, which means Unreal Engine,” says Björn Myreze, Founder and CEO. “Unreal Engine can adapt to all kinds of uses as we see all these new opportunities and approaches emerge.”

He adds that Myreze sees Unreal Engine right at the heart of every pipeline for their real-time experiences, so much so that the studio is ready to try all manner of new projects, each of which uses these technologies differently.

“That's exactly why we love what game engines bring to these new opportunities,” Myreze continues. “Whatever I need to build, Unreal Engine will be the main ingredient in our vision.”

Flexibility as a Standard

Myreze himself has touched on a key point echoed by almost every real-time professional we spoke to for this field guide: few real-time productions are alike, which means there is no one-size-fits-all approach when it comes to pipeline. Fortunately, that doesn’t mean constantly rebuilding pipelines on a per-project basis. Rather, real-time pipelines for broadcast and live events need to be flexible.

This sentiment is echoed by Erik Beaumont, Head of Mixed Reality at The Famous Group. Because not all customers are the same, he says, the pipeline has to be highly flexible. In addition, clients all want to push things a little bit further than what they’ve seen others do with real-time before, and that’s just not possible if the pipeline is too rigid.

“They want something like our Panthers project, but better,” says Beaumont. “That ambition is great. It means we’re not doing the same thing over and over. We can’t work by formula because every single client is very different, from the visual style they pursue to their willingness to experiment.”

That logic extends to differences in how given clients prefer to collaborate, provide assets, and maintain brand and style with bespoke assets.

Contrasting Pipeline Approaches

At this point, it is worth briefly comparing and contrasting a traditional pre-rendered pipeline with an example of an adaptive real-time pipeline.

At a glance, these two pipelines can appear very similar in terms of required source assets, needed expertise, and, increasingly, the fidelity of their visual output. However, they significantly differ in their overarching philosophies. Primarily, one is geared towards the offline while the other is completely real-time.

A traditional linear pipeline focuses on pre-rendered assets, meaning it better serves projects or parts of projects that demand much more precise control over final asset details, where rapid iteration or interactivity is not required. Real-time, meanwhile, offers immediate results and iterations, reducing the time to render to almost zero, while allowing for interactivity of content. As such, real-time pipelines introduce a significant shift in design approach, content creation possibilities, and the approval process for changes. That means content can be rapidly and reliably adjusted right up until delivery, enabling, for example, assets to be reworked to fit atypical or changing display hardware.

Figure 1: Comparison of pipelines with traditional rendering and real-time rendering (in the traditional pipeline, changes require re-rendering)

That’s certainly been the case over at disguise, a studio that provides turnkey solutions for broadcast and live events, where Unreal Engine is deeply integrated into a platform that puts flexibility at the forefront. And that flexibility is simply a reflection of how much variety real-time technology enables.

disguise’s projects might include projection, LED screens, or both; might fall into the categories of AR, MR, or both; and might be for broadcast or live events. For each client, disguise produces a flexible platform that works for that client’s specific needs.

The common theme across all these types of projects, says Solutions Manager Peter Kirkup, is that they come down to complex pixel manipulation. “Often, we’re working on projects where pixels take a weird configuration on a stage, something very different from a standard 16 x 9 monitor,” he says. “These non-standard setups are where we really add value. And we can’t really add that value with a rigid or ‘standard’ pipeline.”

disguise serves clients that might need to send pixels to a vast moving stage space as seen on the likes of the Eurovision Song Contest or the BRIT Awards, where highly unusual aspect ratios may change on the fly. It’s a striking example of how much variety modern real-time pipelines need to adapt to.

Kirkup explains that the many ways disguise can work with content in Unreal Engine reflect the way the artistic teams view pipelines themselves.

“For us, a pipeline is just a combination of layers in the timeline, layers that we composite and blend.”

“We think of Unreal Engine as a layer on that timeline, a layer that we can comb for assets and even textures, and bring them into the worlds we support,” Kirkup says. Building out assets might include mapping content onto surfaces, or cutting pixels out from a render that's happening in Unreal Engine in real time and putting those pixels on a different part of the stage. The team might also bring in other layer types as textures, such as pre-rendered imagery or web-based content from an HTML5 source.


Image courtesy of creativeworks.london

disguise offers an encouraging example of how a modern, flexible real-time pipeline can function in a way that should be familiar to anyone who’s done digital video editing. Kirkup helpfully frames the disguise pipeline as fundamentally comparable to a real-time video editing tool that just happens to be working on big-pixel canvases with complex shapes.

Kirkup describes disguise as “very focused on pixel delivery.” As a way to help users understand the real-time process, disguise encourages them to think of it as a layered composition, with layers stacked on top of one another. Kirkup gives the example of a possible real-time setup for a full LED display: the setup could bring in an Unreal Engine layer rendered out of a particular cluster into a different world, then apply a mask on top of it, blend something on top of that, and multiply that by a factor that gives a soft vignette around a circular LED screen. Then the system could push out the resulting imagery to a full LED display.
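Kirkup's layer-stack example maps naturally onto straightforward image math. The NumPy sketch below shows the general idea of mask, blend, and a soft vignette multiply; it is a generic illustration of layered compositing, not disguise's actual implementation.

    import numpy as np

    H, W = 1080, 1920

    def vignette(h, w, softness=0.5):
        """Radial falloff factor: 1.0 at the center, fading toward the edges."""
        yy, xx = np.mgrid[0:h, 0:w]
        cy, cx = (h - 1) / 2, (w - 1) / 2
        r = np.sqrt(((yy - cy) / cy) ** 2 + ((xx - cx) / cx) ** 2)
        return np.clip(1.0 - softness * r, 0.0, 1.0)[..., None]

    # Stand-in layers; per frame these might come from a real-time render,
    # pre-rendered video, or a web-based (HTML5) source.
    engine_layer = np.random.rand(H, W, 3)  # e.g. pixels from a real-time render layer
    overlay = np.random.rand(H, W, 3)       # e.g. a pre-rendered graphics layer
    mask = np.zeros((H, W, 1))
    mask[:, W // 4 : 3 * W // 4] = 1.0      # keep only a band of the render

    frame = engine_layer * mask             # 1. cut pixels out with a mask
    frame = 0.7 * frame + 0.3 * overlay     # 2. blend another layer on top
    frame = frame * vignette(H, W)          # 3. multiply in a soft vignette
    # 'frame' would then be mapped onto the LED surface for output.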
At a company like disguise, the pipelines themselves are consistent, but utilize a workflow that can be applied in many different ways.

“However you approach it, that’s what pipelines need to be in this space,” Kirkup states.

Why Games Are Relevant

If the sound of new pipeline approaches built around tools from the game industry is starting to sound a little too alien, be assured that the roots of platforms like Unreal Engine in game development only strengthen their suitability for real-time production, while being more welcoming than you might imagine.

Many thousands of games, from big-budget hits to brief, experimental artworks, have used Unreal Engine as the foundation for their pipeline—from ideation to final product—and none of those projects are the same.

This is part of why teams like The Famous Group have started to think of real-time production in terms of game development, including steps like production and post-production that closely mirror their game development counterparts. The difference, says The Famous Group’s Beaumont, comes in when you consider the aesthetics and visual quality of the output. “You need to combine these two worlds where all your design and creative is living in the broadcast space,” he says, “but your technology is all living in a space similar to game development.”

Game engines’ core ability to manage both true real-time and pre-rendered assets means they offer a highly suitable solution here. Adopting even just a few game development practices will give a significant boost to any effort to work with the kinds of real-time projects this field guide focuses on. That could mean hiring from games—with roles like Game Producer or Director bringing in all-around experience. Equally, you might want to speak with engine specialists or tech leads from the gaming space—perhaps in a consulting or recruiting capacity.

But the way forward isn’t only about hiring. In an era where game development has become so welcoming that small or single-person teams make and release commercially triumphant titles, many broadcast and event teams have taken the route of building their own game as an internal project. It’s a powerful way to enable your entire team to explore the fundamentals of real-time process, practice, and technology. And as the audience of people making games has grown vast and diverse—with Unreal Engine now serving millions of individual users—it has also become a necessity that thorough, detailed documentation and clear tutorials (including those at the entry level) are provided and constantly updated.

Game development, as a craft, isn’t something one picks up overnight, but today it’s a great deal more friendly and familiar than you might imagine. That internal game project might be more powerful than you think, and can provide an impactful way to empower the wider team as they move towards working with a real-time pipeline.

A ‘Storytelling First’ Mindset

Some teams find it useful to think of real-time technologies as just another means for storytelling, often a simpler one than they’d previously been using. Myreze takes this approach, making use of the widely varied technical skill sets the company’s team possesses along with a shared understanding of narrative and storytelling.

Jørgen Steinheim, Partner and President at Myreze, explains that while the opportunity in real-time includes the embracing of new technologies, at the same time you need to ensure that teams continue to focus on storytelling as the main goal, rather than leaving them to get lost in the technical details. “Remember, those traditional skills—that understanding of your audiences—that’s something you already have, and that’s going to carry you through the technical challenges,” he reminds us.

You’ll need real-time specialists on your team, and a basic familiarity with the core concepts across the creative and technological workforce. But it’s equally important, Steinheim says, that you have an individual or small team that can expertly talk to the client about real-time opportunities and limitations in a clear, welcoming, and engaging way.

Facilitating such attention to storytelling is the maturing of real-time technology, where we now see integration between tools in the ecosystem.

Björn Myreze, Founder and CEO at Myreze, reports that the company is now able to create a variety of projects, from opening sequences to idents to virtual studios, inside a single pipeline. He sees Unreal Engine as more than just an interesting option—the company has found that Unreal Engine truly works in harmony with the tools the team is familiar with, and with other real-time specialist solutions.

“What’s exciting is that we’re seeing all these technologies that struggled to work well together in the past, now become a full, efficient, powerful pipeline—a place where everything really works together.”

“That’s an important development that makes it much easier to build these pipelines,” Myreze points out.

Image courtesy of creativeworks.london


Building Custom Tools

A game engine can also be used as a means to develop your own internal tools, or extensions that add abilities to the engine itself or to your pipeline in general. This approach presents another way to keep your real-time pipeline flexible, adaptable, and ready for the dynamic production and delivery that defines real-time.

For real-time mixed reality projects, studios find that custom tools provide a necessary bridge to the physical world. For example, studio Moment Factory, which specializes in real-time and mixed reality experiences hosted in public spaces, might need to work with multiple real-world and digital elements for a single event—elements like cameras, trackable displays, moving objects, and projection. To facilitate the process, the team creates tools to connect with the lights, sound systems, digital content, and sensors, resulting in an integrated toolset for the event. "It's really about creating tools that enable visualization, being able to create the tools to interlink multimedia such as audio, light, and real-time content—that lets us really work well in the hybrid space that mixed reality presents," explains Céline Mornet, Innovation Producer at Moment Factory.

Building a real-time pipeline doesn't require a dramatic pivot into the unfamiliar. Rather, by embracing a little from the mindset of games, familiar concepts and technologies that bear comparison with your existing pipeline will empower your effort to embrace real-time. And you can be sure most of your team's existing experience, pipeline, and process will be deeply relevant to this opportunity.

About Version Control

Version control is a fundamental part of almost any collaborative creative process. In its simplest form, it offers a means to track and manage changes to a project or its assets. In many software applications, it is also referred to as "source control."

To gain an understanding of version control, let's look at a simple example: collaboration on documents. If two writers are co-authoring a single article, Google Docs provides ample version control—each author can see updates, and even work simultaneously without spawning multiple new versions.

On other straightforward projects, such as creating illustrations for a book, a basic spreadsheet might suffice for tracking the filename and folder path of the latest version.

Real-time projects, of course, are many times more complex than a written article or a set of illustrations. When a single project involves live broadcast, physical sets, real-time assets, pre-rendered content, and elements of virtual production, robust version control is critical.

As an example, consider two different specialists working on the same animated character model. Without version control, a texture artist and a facial animator might simultaneously make changes, leading to two distinctly updated versions. Version control brings efficiency while maintaining quality, preventing the chaos of multiple files with spiraling version numbers.

Version control frameworks currently used for broadcast and live events are built to serve more traditional pipelines. As there are considerable parallels between real-time production and game development, well-established version control tools from the games industry bring highly applicable options to real-time content makers.

Perforce Helix Core is a common choice for game development, with its ability to precisely manage a central database and a master repository of file versions. It's also highly compatible with Unreal Engine—and widespread use of Perforce means that beyond official documentation, a vast community exists that is eager to help with problems and blockers. Helix Core is emerging as the Google Docs of real-time—currently, 44% of Unreal Engine broadcast users have adopted it into their pipeline.

As an alternative to Perforce, Git is a free and open source distributed version control system that focuses on a branching and merging workflow. Popular platforms for Git include GitHub and GitLab.

A detailed guide to enacting version control or getting the most from your chosen platform is beyond the scope of this guide, but rest assured that putting just a little effort into understanding version control will save significant amounts of time overall, help deliver better results, and let you focus on the work that matters most.
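For a sense of what this looks like day to day, here is a small, hypothetical Python sketch that automates the basic Helix Core loop with the p4 command-line client: sync, open for edit, submit. It assumes a configured workspace, and the depot path and description are illustrative only. Git follows the same rhythm with pull, commit, and push, with merging in place of exclusive checkouts.

```python
import subprocess

def run(cmd):
    """Run a version control command, echoing it and failing loudly on error."""
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=True)

def update_asset(depot_path, description):
    # Pull the latest file versions from the central Helix Core depot
    run(["p4", "sync"])
    # Open the asset for edit; exclusive-checkout filetypes stop a second
    # artist from editing the same binary file at the same time
    run(["p4", "edit", depot_path])
    # ... the artist edits the asset in their DCC tool or Unreal Engine here ...
    # Submit the change with a description, creating a new tracked version
    run(["p4", "submit", "-d", description])

# Hypothetical depot path and changelist description, for illustration only
update_asset("//depot/Project/Content/Characters/Hero.uasset",
             "Update hero facial animation textures")
```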

Image courtesy of Illuminarium Experiences


Case Study: Illuminarium | SPACE: A Journey to the Moon & Beyond


Project type: Location-based immersive experience

The team at Illuminarium employs a blend of pre-rendered and real-time technology to pursue a goal that is, at
least in principle, fairly straightforward.

“We create experiences with the goal of eliciting an emotional response,” offers Brian Allen, Executive Vice
President of Technology and Content Integration at Illuminarium Experiences. “This could be triggering a
memory, or coming away with a question or just a feeling of wonder. We aim to democratize the world's most
extraordinary experiences.”

There, Allen has got to the crux of one of real-time's greatest powers: in mixing realities, the distant, unreachable, or fantastic can fire up imaginations and give audiences an experience that stays with them well after it's done.

Image courtesy of Illuminarium Experiences

That's very much evident in the latest experience available at Illuminarium venues, called SPACE: A Journey to the Moon & Beyond. The SPACE experience sends audiences to the inside of a nebula, to the moon, or to the middle of an asteroid field, all from the comfort of the company's current sites in Atlanta and Las Vegas.

Illuminarium ultimately provides venues that audiences can visit and walk through in an exploratory manner, much like a museum or gallery. Here, however, visitors step into a world spun from real-time and pre-rendered assets, cutting-edge sound technology, and even haptic and scent-based elements. Using systems such as 4K projection, LiDAR-based movement sensors, and spatial audio, the walls, floors, and scenery within an Illuminarium theater can be filled with highly realistic visuals and sensory triggers to create a remarkably immersive experience. The venues are also entirely reprogrammable, meaning they can be refitted with new content and experiences as often as needed.

For the foreseeable future, pre-rendered content will be key to Illuminariums, as it allows for high fidelity on large projection surfaces that audiences can view close up. But blending those pre-rendered elements with real-time, game engine-generated content has always been part of the Illuminarium plan, with a view to providing increasing levels of dynamism and even personalized content. As such, real-time rendering is part of the team's pipelines, workflow, and infrastructure, with Unreal Engine sitting close to the heart of each project.

"Unreal Engine has been a great fit for us," Allen says. "Moving into interactive and generative development at our scale was no easy task. We are still learning with every new build and show we are introducing.

"It's a vastly different process to producing pre-rendered content at Illuminarium sizes. Using Unreal Engine within the disguise architecture allows us to manipulate parameters in real time from our media server playback software, which is great for quick changes and the creative process.

"We can have editor machines running and making new builds while we are testing the current build, all within the venue. Once up and running, the process is straightforward. We have solved the scale problem with what may be the largest Unreal Engine nDisplay cluster in the world. We can generate pixels at a scale in the hundreds of millions. The possibilities have widened considerably for us because of this technology."

As we've seen throughout this guide, the tools should always serve the story, and never the other way around. In the case of Illuminarium, however, it works both ways—building the right pipeline to enable their vision has, in turn, led them to envision new stories to tell.

"For Illuminarium, most challenges are caused by our scale," explains Allen. "Some issues can involve multiple manual operations across many machines. Our solution to this has been to automate as much as possible while keeping a healthy balance.

"Rendering for our space has also been challenging," he continues. "In the beginning, we quickly realized that we needed to develop a custom nDisplay Mesh Policy to get the images to render correctly for our space. After delving deeper into that, we realized that the Mesh Policy is heavily dependent on what the creative is—different visuals require different configurations for projection."

Illuminarium believes the future of immersive entertainment is real-time and interactive, so much so that the company plans to establish a dozen or more venues globally in the next five years, with increased interactivity and more personalized experiences. "Are we making video games?" Allen asks rhetorically. "No, but we have a large-scale immersive and communal space where you can step into any world that we can create, along with your friends and family.

"To have characters that understand who you are, and that can react with audience members, will create new opportunities for narratives to unfold that the world has never seen before."

Image courtesy of Illuminarium Experiences


CHAPTER 4:
Real-Time Graphics Content for Live Events

When it comes to live events, the real-time opportunity is about making audience experiences more dynamic, engaging, and interactive by embellishing reality with digital content. It's a commercially and creatively exciting prospect for anyone in the space, and today a number of advancements in the field mean much greater ambition is possible, while barriers of complexity fall away.

Image courtesy of Illuminarium Experiences


First, a new generation of LED displays and projection methods has emerged that allows for new levels of innovation in terms of set design and performance. Robust, compatible pipeline ecosystems have simultaneously emerged, and the companies already working in the field have established much collective knowledge in the space, all while conventions continually emerge that bring more consistency and efficiency to working processes.

At the same time, as the pandemic put live events on hold across the world, audiences became considerably more familiar and comfortable with the concept of mixed reality, and the value digital, real-time assets bring to live experiences. From concerts hosted in Fortnite to music festivals that take place within games, the general public has not just become used to mixed reality performances, but even expects some real-time elements as part of the live experience.

Many more venues and sets are now also furnished with adaptable LED screens and projection systems, and even moving stage elements, unconventional screen shapes, and LED floor displays, presenting extraordinary canvases and opportunities for real-time experiences across the globe.

There will always be something special about the traditional unembellished on-stage performance. The magic of seeing emerging talent in a tiny, dark, low-ceilinged venue, for example, might never be replaced by technology. But that is not the aim of the real-time movement. Rather, modern technology is allowing for a far greater variety of live experiences, and a richer diversity of creative expression.

We are not only talking about traditional stage crafts such as live music and theater. As consumer interest in mixed reality and on-site interactivity increases, entirely new live experiences that put real-time front and center of their offering are proving themselves to be commercially successful and popular. A prime example is the Illuminarium venues, which welcome audience members to reprogrammable immersive theaters. Using a combination of 4K laser projection, LIDAR-based movement sensors, and even digital scent technology, Illuminarium drapes the interior walls of its theaters with a variety of interactive experiences that visitors can explore together.

Brian Allen, Executive Vice President of Technology and Content Integration at Illuminarium Experiences, explains that the company's offerings spring from audience demand: in this ever-evolving world of immersive experiences, he says, people have come to expect them to be interactive, personalized, and refreshed often. The company has found that real-time content generation is the best way to accomplish this, especially from a user journey standpoint.

"Real-time means we can deliver a fully interactive, ever-changing, and even narrative-driven experience to our guests. We will continue to produce large-scale, pre-rendered shows, but the path forward for us is the flexibility and opportunity to have audiences truly interact with our content that comes from game engine-generated content."

Elsewhere, popular esports events, where professional competitive players compete in front of live arena audiences while thousands more tune in live via streaming platforms, have become commonplace globally. Their blending of sports broadcast convention, live audiences, and video-game content makes them especially fitting for real-time embellishment. Such broadcasts have become so familiar that it's hard to imagine these types of events without an impressive mixed reality presentation.

Images courtesy of creativeworks.london


Real-time for events is now more achievable and approachable than ever before, as the standards of what is possible constantly climb. That, however, does not mean there are no challenges. As processing power grows, so does the ambition to offer ever more grand experiences on vast screens—event hosts always want content at the cutting edge. Screen and technology setups can differ from venue to venue, presenting difficulties for touring shows. And until very recently, the pipelines that serve these experiences continually struggled to communicate consistently.

Orchestrating an event that might include live performers, ceiling projection, fully interactive real-time assets, and moving LED screens is a complicated business. And yet that complexity is a symptom of the profound opportunity here.

The first questions, then, are "Does my live event need real-time?" and "What kind of real-time is required?" Answering those involves a little more understanding of the technology and platforms involved here, which we'll continue in this chapter, after a closer look at how Creative Works added new dimensions to performances by rock icons Guns N' Roses—and what they've learned about working with clients and getting real-time live events right.

Key Considerations

As client and consumer interest in real-time live events continues, there are increasing motivations to augment those experiences.

Should your or your client's next live event project be real-time? Unless there's value in leaning into the traditional live experience, the answer is increasingly "Yes."

Some theater productions thrive through minimalism, and there real-time may be a distraction. At a given point, a band as a brand might have much to gain from asserting that it still plays traditional concerts. More and more, though, there is a place for adding digital assets to live experiences.

As seen in the case of Creative Works and Guns N' Roses, animated scenes played out on towering LED screens serve to extend the narrative of a show while celebrating the performers' heritage. The same animations served as cues for the band and a means to build excitement and energy in the crowd before a performance. Across many other large-scale gigs, mixing realities can emphasize key moments in the music, and generally serve to give audiences a more engaging, engrossing, and memorable experience.

In an esports context, such assets can bring informational value and analysis, while in those and many more contexts, the opportunity for memorable brand experience is plain to see.

And yet with all that considered, the question, "Is real-time right for this project?" still needs to be asked. By going the route of delivering pre-rendered, linear content to screens and projectors via a familiar media player, you get reliability and predictability; everything is set in place, and you get the foundational simplicity of "playing a video." On simpler, smaller, and lower-budget projects, that may well suffice.

For a moment, let's forget about the real-time, deeply interactive live experiences that let the performer or audience dynamically and immediately affect the digital assets around them. Without those forms of interactivity, it might appear that simply sending pre-rendered content to screens via a media player is the best way forward.

And yet real-time brings many other strengths beyond dynamic interaction. It allows much faster iteration and rapid prototyping at the start of a workflow. It brings the same speed advantages to blocking out early iterations to guide creative teams or inform clients. It's ideal for building VR experiences that creators or clients can visit to get a better sense of a project as it might be seen on site. Real-time is also significantly more economical when it comes to memory and server use. Consider that the pre-rendered approach increasingly demands tremendously large files that need to be stored on media servers, potentially making for great expense in terms of memory and cost—and suddenly real-time offers another advantage.

Real-time content and engines are also more accepting of multiple users working on single assets or projects, as multiple changes from multiple users or machines can be simultaneously or near-simultaneously updated.

Additionally, while the digital content for an arena show by a touring artist might not be interactive in itself, venues themselves can be considered dynamic spaces, and as such present a challenge. As touched on in the case study on Illuminarium, not all venue screens and projection canvases are the same. Special events at atypical venues may also mean working with temporary and highly distinct display ecosystems. In those cases, content made in a real-time pipeline based around a game engine like Unreal Engine can adapt quickly to a unique setup.

"It used to be a real challenge when working with local venues' display setups," says Dan Potter, Creative Works Co-Founder and Executive Creative Director. "Now that we're using Unreal and disguise, we don't worry about that anymore. We don't have to reconfigure anything because we know the disguise pipeline can handle whatever screens we throw the content to."

As an example, Potter cites a show at The Colosseum at Caesars Palace in Las Vegas, where the team found out on the day that they had to get some 16:9 content ready for a full wall with an entirely different aspect ratio, and they had only minutes to sort it out. "We were standing there—me, and Jeremy, and the Guns N' Roses manager—and the manager tells us he wants to move all the tour's usual lights out the way so we can make use of this wide, full screen."

Images courtesy of creativeworks.london


Three or four years ago, that would have meant degraded, pixelated imagery. Instead, a technician from disguise was able to reconfigure the system to high-resolution 24:9 imagery—all within three minutes. "It was pixel-perfect in this super-wide, huge format, with the show soon to start," says Jeremy Leeor, Co-Founder and Managing Director at Creative Works. "That is why we are using Unreal Engine. That is why real-time is so powerful."
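Reconfigurations like that come down to scale-and-crop arithmetic, executed under pressure. Here is a small, hypothetical Python sketch of the basic math involved in re-fitting content to a new aspect ratio; in practice, the actual remapping of pixels is handled by the pipeline.

```python
def fit_content(src_w, src_h, dst_w, dst_h, mode="fill"):
    """Scale a source frame to a destination canvas.

    mode="fill" scales until the canvas is covered (cropping the overflow);
    mode="fit"  scales until the whole frame is visible (leaving bars).
    Returns the scaled size and the offset at which to place it.
    """
    scale_x = dst_w / src_w
    scale_y = dst_h / src_h
    scale = max(scale_x, scale_y) if mode == "fill" else min(scale_x, scale_y)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    # Center the scaled frame on the destination canvas
    offset_x, offset_y = (dst_w - out_w) // 2, (dst_h - out_h) // 2
    return (out_w, out_h), (offset_x, offset_y)

# 16:9 UHD content onto a 24:9 wall of the same height
size, offset = fit_content(3840, 2160, 5760, 2160, mode="fill")
print(size, offset)  # (5760, 3240), (0, -540): scaled to cover, cropped top and bottom
```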

Over on projects like Illuminarium—and at live events using interactive output—there is a case to be made that that is the ultimate realization of real-time. Not knowing exactly how a show or event will play out each time will certainly feel nerve-wracking to some practitioners, but video games have been getting interactivity right for decades—and it is Unreal Engine's heritage as a game engine that makes it fitting here.

Finally, one can fully leverage the capabilities of real-time pipelines for design and content production, and bake the end result as high-quality video assets for final delivery. At the end of the day, it's a question of figuring out the best strategy for your project, deciding on what matters most to you.

Image courtesy of creativeworks.london

Technical Challenges

There are three important technical factors in the successful deployment of real-time in live events:

• Adaptation to multiple displays of different sizes, shapes, and curvatures.
• Synchronization, so that all displayed content and other elements of a live event arrive at once, with millisecond-accurate timing.
• Implementation of redundancy, so if a single machine fails, the show will go on.

Addressing those factors starts with a brief look back at the evolution of the science of sending real-time content to displays.

Around 20 years ago, as the complexity of real-time video games increased, greater power was needed to render at a rate of around 16 ms/frame to achieve a playback speed of at least 60 fps, a standard speed for realism and believability.

This advance was facilitated by, and stimulated improvements to, the power of graphics processing units (GPUs), with machines imbued with more cores and ever greater on-board memory. This evolution continues, giving creators the means to drive graphical fidelity to stunning levels. However, GPU advances are ultimately focused on the performance of one host machine, and as such do not address the problem of driving simultaneous displays at increased resolutions. Equally, multi-GPU solutions have their own limits—they do not properly distribute and scale real-time rendering beyond the scope of one PC.

Even a handful of years ago, those issues presented a serious challenge. In a relatively short time, though, much progress has been made, mainly in terms of pipeline capability, software compatibility, and standardization around displays.
Media Server Integration

Integration with media players and media servers is now much more reliable and frictionless, with Unreal Engine and many other pipeline tools now better supporting a wide range of platforms and products.

Within the industry, we are seeing an increase in high-quality integrations leveraging various techniques, such as transferring real-time UE content through IP video or using Texture Share mechanics to have both applications' processes cohabit on the same PCs. These newly available solutions are paving the road to the future by providing solid products to the marketplace which are not only real-time enabled, but also scalable to never-before-seen complexity.

nDisplay Technology

For driving complex and large display systems requiring clustered rendering, Epic Games has developed the nDisplay system for real-time practitioners. nDisplay technology extends Unreal Engine by distributing the rendering of a camera view over any number of machines and then displaying the rendered images on any number of display mechanisms. nDisplay was designed to address issues introduced by the limitations of GPUs and of older cluster-based rendering methods in pushing synchronized imagery to multiple displays in real time. The system supports proper frame/time synchronization, correct viewing frustum based on the topology of the screens in world space, and deterministic content that is identical across the visualization system.
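One way to picture the clustering idea: treat the full output as one large canvas, give each machine in the cluster responsibility for a tile of it, and keep every tile on the same synchronized frame. The sketch below computes per-node tile regions; it is a hypothetical illustration of the concept only, not nDisplay's actual configuration format or API.

```python
from dataclasses import dataclass

@dataclass
class Viewport:
    node: str      # which render machine draws this region
    x: int         # left edge of the region on the full canvas, in pixels
    y: int         # top edge of the region
    width: int
    height: int

def split_canvas(canvas_w, canvas_h, cols, rows):
    """Divide one large canvas into per-node viewports for a render cluster.

    Each node renders only its own tile of the shared camera view; a sync
    mechanism (genlock plus a shared frame number) keeps the tiles aligned.
    """
    tile_w, tile_h = canvas_w // cols, canvas_h // rows
    viewports = []
    for row in range(rows):
        for col in range(cols):
            viewports.append(Viewport(
                node=f"node_{row}_{col}",
                x=col * tile_w, y=row * tile_h,
                width=tile_w, height=tile_h))
    return viewports

# A 15,360 x 2,160 LED wall driven by four machines side by side
for vp in split_canvas(15360, 2160, cols=4, rows=1):
    print(vp)
```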
DMX

Additionally, standards born in the live events sectors—such as Open Sound Control (OSC) and DMX—have become established over time as standards for communicating between light fixtures, sound, and video systems in real time. Conceived and used as a means to send control information between instruments, computers, and lighting consoles, they now offer a base of convention that real-time content can and should plug into.

You can view these protocols as the main language or base communication layer that makes Unreal Engine part of a seamless live-broadcast ecosystem. By supporting DMX, you can use Unreal Engine to help design and previs an entire live show. Alternatively, you can control and trigger gameplay mechanics via DMX or OSC cues (or vice versa), or use Unreal Engine to control external devices such as lighting fixtures. The list goes on and on.

Equally, disguise offers OSC and DMX support either natively or via extension apps. With the whole tool ecosystem now fully supporting OSC and DMX, the display and synchronization of real-time content with other systems or devices is increasingly straightforward.
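As a taste of how simple an OSC cue can be, the sketch below sends two messages over UDP using the third-party python-osc package. The addresses and arguments are hypothetical; the receiving application (an engine plugin, a media server, a lighting desk) decides what each cue means.

```python
# pip install python-osc
from pythonosc.udp_client import SimpleUDPClient

# The machine and port where the show-control listener is running
client = SimpleUDPClient("192.168.1.50", 8000)

# An OSC message is just an address pattern plus arguments. Here, a
# hypothetical cue asks the receiver to start scene 3 with a 2.0s crossfade.
client.send_message("/show/scene/start", [3, 2.0])

# Cues can also carry continuous values, such as a fader level that a
# listener might map to a light's intensity.
client.send_message("/lighting/front_wash/intensity", 0.75)
```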
Redundancy

With regard to redundancy, that remains a hardware issue. disguise and others have moved to counter the problem, and now offer systems whereby backup machines can immediately and seamlessly step in if any prime machines fail during an event. disguise, AV Stumpfl, 7thsense, Smode, and others now also offer their own blend of powerful physical media servers and software tools that operators can use to prepare, arrange, rehearse, and orchestrate the delivery of shows.


Setup and implementation of media servers, synchronization, and redundancy are complex matters, but there is no doubt that through technological improvement, increases in power, and the establishment of new standards and conventions, delivering real-time events now comes with much more consistency and robustness. In short, real-time is something users can rely upon because it is now scalable to display setups of arbitrary size and complexity.

Real-Time Approach as a Spectrum

At the start of this chapter, we discussed the idea of choosing whether to use real-time technology as a basis for a project, and when to consider a more traditional pre-rendered approach. Practically speaking, it isn't as simple as making a binary choice. There are several hybrid models that you may want to consider; in this way, it's helpful to think of the real-time opportunity as existing across a spectrum.

Some may use real-time technology for a preparatory step because of its rapid iteration, collaboration, and flexibility, before developing and baking final assets to be delivered as a pre-rendered, static video. That may be ideal for first forays into real-time live event development and production, especially where you can be absolutely certain of the display setup for a show that will not change with each delivery. Bear in mind, however, that as your ambitions around visual fidelity and screen size increase, so will the need for considerable storage space.

Another approach is to deliver a project made up of both pre-rendered and real-time content at the point of consumption by the audience, letting you harness the gains of both approaches. It may mean slightly more complex pipelines, media server structures, and display approaches, but considering so many technologies in this space serve both real-time and pre-rendered content, you wouldn't necessarily need to double up on technical intricacy.

The hybrid approach is put to stunning effect by Moment Factory's AT&T Discovery District project, which uses the power of Unreal Engine to explore how buildings could adopt new ways to set or enhance the tone of the local area. The project saw a vast 104-foot-tall, 6K media wall and an LED-powered trellis wrapped around the corner of a real Dallas building, which displays gradually evolving visual content designed to inspire, relax, or engage. The display hosts blended pre-rendered and real-time content—the pre-rendered visuals have been rendered with high visual fidelity, while real-time content is fed in to reflect changes in the time, weather, or season. Moment Factory's work of "media architecture" is deeply impressive and affecting, and showcases what a real-time live event can be.

With the industry still experimenting with the many ways real-time content can engage viewers, we've only scratched the surface of the real-time spectrum. We're eager to see what the creative teams of the future dream up to entice, inspire, and astonish us.

Image courtesy of Illuminarium Experiences

Case Study: Creative Works London | Guns N' Roses | 2021 Tour

Project type: Live performance

Emmy-nominated Creative Works' tagline, "Make It Live," amply sums up the focus of the UK-based agency: brand-centric, narrative-driven, real-time content for live events. To the team in London, that might mean building video game-worthy assets, plotting out short-form animations, shaping XR content, or bringing mixed reality to broadcast. Having integrated the core of their pipeline with disguise's RenderStream Plugin and Unreal Engine, the London team has become something of a pioneer in working with "as live" real-time projects.

Among Creative Works' most celebrated projects is the show opener and backdrop the agency produced for the 2021 Guns N' Roses live tour We're F'N' Back!. The mad laboratory-themed opener, presented on a vast LED screen behind the stage, is designed to grab the crowd's attention, build hype and noise in the moments before the gig begins, and even cue the band onto the stage. The goal was to make each Guns N' Roses show feel thoroughly contemporary and deeply engaging.

At each show, a live operator controls what fans see on the LED screens, choosing visuals that sync to or augment the band's performance on the stage. This not only creates a unique fan experience at each show, but also takes advantage of Unreal Engine's ability to adapt its output to virtually any LED screen's shape or size.

Guns N' Roses typically tours with its own hardware, including LED screens, and the entire event team endeavors to replicate the same stage environment at each show. But that isn't always possible, and there are cases when content needs to be sent to a screen of a completely different size and aspect ratio than originally expected. Within the narrow window allotted for setting up for a live show, the Creative Works team, in collaboration with disguise, have been able to adapt content to new constraints in as little as three minutes.

Dan Potter, Creative Works Co-Founder and Executive Creative Director, adopted a game development approach to building the content for the show, where the team handled everything from initial ideas and concepts to producing final pixels, and making sure everything is running smoothly on the day.

Image courtesy of creativeworks.london


"We're working with venue and tour teams, and embracing both the visual strength of offline and the flexibility of real-time," Potter reveals. "The parallels with game development, and what live events can learn from that process, are key here." He adds that the team used a lot of the same tools a game developer would, such as 3ds Max, Blender, Cinema 4D, Substance Painter, and ZBrush. Once Creative Works has built out the content, the team can import it into Unreal Engine to get real-time assets that work well with, and adapt to, the live stage environment.

Creative Works also stands as a key proponent of educating clients in the real-time opportunity from the earliest days of the project, with the end goal of delivering the best results. While there are clients who are happy to trust the team to work autonomously, Potter says, there can be a strong advantage in involving the client at every stage of a project's evolution. In the case of Guns N' Roses, Creative Works had built up a considerable relationship with the band after partnering on smaller projects as far back as 2012. As the real-time opportunity for the 2021 tour began to take shape, Creative Works found that involving the Guns N' Roses team helped guide the narrative, style, tone, and aims of the project.

Potter describes such an approach as "highly collaborative, playful, conversational, and rapidly iterative," and recommends that creators go this way whenever possible—there is so much space for innovation and new ideas, he says, and the client might not initially understand the opportunity or its potential.

For the Creative Works team, Unreal Engine has proved ideal for engaging clients early on to gain insights for steering the project toward its goals. The Guns N' Roses project is a prime example of how this synergy works between agency and client. Kristin Oldershaw, Creative Works Technical Lead and the one responsible for deploying real-time assets, explains: "The world-building capabilities of our creative team using Unreal Engine accelerate the process of bringing big ideas to life in a rapid and iterative way. We can, in a matter of days, build a visual blueprint for the entire project to work further ideas into, and quickly and confidently share these visions with clients to keep them in the conversation at every stage."

"We want to push the audience's expectations, but to do that in the right way, we need to understand them," explains Jeremy Leeor, Co-Founder and Managing Director at Creative Works. "This means we need to understand the musicians, understand where key points in the songs are, and understand where the musicians feel that the show is. All that is key to understanding what the audience is coming for."

For Creative Works, Unreal Engine is at the center of this type of collaborative effort. "With a game engine in our pipeline, we can do that rapid iterating," Potter adds. "We can get content made up and blocked out very fast, and give the client a sense of what's possible, what we're planning, and how it matches their aims. In that way, a game engine is really central to what we do."

In fact, for Creative Works, the devotion to inheriting practice and workflows from the game industry goes a little deeper. In 2021, they released their internally developed game Daisy's Revenge, which explores the themes and music of rock supergroup The Dead Daisies.

"If you want to understand real-time and how to get it right, make a game," asserts Leeor. "Even if it's just an internal exercise, it's a really powerful process."

Q&A: disguise

disguise is a software and hardware platform that enables creatives and technical producers to deliver real-time live events, virtual productions, and location-based experiences. Founded in 2000 and with over 10 established locations around the world including London, Hong Kong, Los Angeles, and Shanghai, the company has powered numerous real-time productions for music artists such as Katy Perry and Billie Eilish, film and episodics for Netflix and Amazon Prime, corporate presentations for Siemens and Verizon, and live broadcast programs from Eurosport, MTV, and ITV. disguise has also contributed significantly to knowledge sharing, community building, and the establishment of conventions for delivering deeply engaging experiences.

Here, Solutions Director Peter Kirkup sheds some light on what he and his colleagues strive to achieve. Kirkup's passion for event production began at age seven, when he would operate the lights during school plays. It's fair to say that what he does—and what is possible—has come a long way since that formative experience.

How do you describe what disguise does?

We are a manufacturer of service and software workflows that enable people to deliver amazing events in spectacular locations around the world.

These days, we transcend a lot of sectors, but we actually came from the music concert touring world—we used to do shows for the bands U2 and Massive Attack back in the day, visuals-to-screen for stadium shows and arena shows. We started out as a creative agency making the content for those screens, and then we started making our own software to deliver that content in shows.

Over time, we worked in theater, on corporate events, on product launches, and on other types of projects. We saw this blurring of the lines in so many spaces, which is when we started our move to support broadcast and virtual production. We're currently working with LED screens, projection, installations, and setups on the exteriors of buildings, all kinds of things, and we essentially do all that through a software-on-hardware model.

In terms of the overall project workflows, where does disguise fit in?

We make our own software that does all of the last-mile delivery. This means making pixels light up in time and in sync across an almost unlimited scale, which involves all those hard technical bits—pixel control, EDID [Extended Display Identification Data] management, genlock, and so on.

Scale is becoming an increasingly important issue as projects grow in ambition. For example, we've been doing projects involving 200 projectors and the world's tallest buildings, with lots of interlinked servers to serve those large-scale ambitions.

Image courtesy of disguise


Unreal Engine is an important content creation platform in the space, so we’ve made sure everything we do
on the hardware and software side can be deeply integrated into the engine. We have servers that do the
connectivity to the outside world, and we also have a render farm that we built, which basically turns Unreal
Engine into a content source for our system. And we leverage nDisplay to do that, and it all works over video-
over-IP. It taps into various different bits of the engine, but also uses our own custom plugins.

"The cool thing about this setup is that it leaves artists and animators free to focus on the creative side of content, using Unreal Engine as a creative tool without having to get bogged down with the technical details of displaying the content."

In software, and through infrastructure, we handle decisions about which pixel map an LED processor needs in order to sit on this screen in this stadium. That's a key part of what disguise does—we do the heavy technical lifting so the creative people can focus on using the game engine to create content.

In fact, when we're talking to potential clients about a new project, we don't call Unreal Engine a game engine. We call it a "content engine," because that's really what it's doing—scene building and animation, and real-time rendering and output, which is content creation. This can be a really helpful way to communicate about the technology, especially to people new to these types of opportunities.

How does disguise address the unique pressures of live events?

Away from the technical side of things, what we're actually doing at disguise—particularly on the live event side of things—is addressing the fact that the stakes are higher. At a live event, a technical error on the display might be seen by 70,000 people simultaneously, and a serious technical failure at a major event might make the front page of a newspaper.

That's why we've built this ecosystem of dependability and reliability of hardware and software, and we focus heavily on making sure those reliability mechanisms are in place. We surround it with SLAs [synchronous line adapters], and we build redundancy into the systems so if something goes down, something else takes over.

Image courtesy of disguise

How is disguise able to support so many different types of projects?

At disguise, we've produced a platform that is incredibly flexible and will work for whatever the needs of the client are. We work on projection projects, LED-based projects, VR projects, AR broadcast projects, and more.

Sometimes the project calls for a rigid screen that sits there throughout the whole show. Other times, we're dealing with something like the Eurovision Song Contest or the Brit Awards, where the stage is moving apart and then coming together, and things are flying out.

Each of these types of projects is different, but the common theme is that it's all complex pixel manipulation—it's not a standard-sized monitor showing a standard HD video file, it's pixels in a unique configuration on the stage. That complex pixel manipulation is generally where we really add value. Our system is designed to just handle all these situations, and pretty much whatever else our clients can dream up.
Where does Unreal Engine fit into the picture?

We think of Unreal Engine as a layer in our timeline—it’s a layer of content that we can comp, and bring into our
world and manipulate.

We can map it onto surfaces, or we can cut pixels out from a render from Unreal and put them somewhere else
on a different part of the stage. We can bring in other layer types so we can play our traditional pre-rendered
video, or we can bring in web-based content from an HTML5 source, or we can bring in content from any
number of other content creation platforms. In a way, it's all just layers in a timeline—layers that we compose
and blend, just as with traditional photo editing and video editing.

It's complex stuff, but we handle it all. The opportunity for real-time content is really just about using real-time
video editing tools that just happen to be working on big-pixel canvases with complex shapes. At disguise,
we're very much focused on that pixel delivery, and on making it possible for clients to deliver engaging and
entertaining content in these new ways, so audiences and viewers can enjoy it.

Image courtesy of disguise


Augmented and Mixed Reality for Live Productions

Today, we regularly see cinematic and televised works created through virtual production (VP). The technique empowers storytellers to build worlds that are extremely realistic or believable, and present information in deeply engaging ways. But VP is really about the day of a shoot, using various methods to blend live-action and computer-generated content on set, with real-time technology providing immediate feedback to the cast and crew so as to inform performances and production decisions.

In the era of Star Wars: Episode I - The Phantom Menace (1999), the film industry first started to explore the benefits real-time technology could bring. Things have come a very long way since then, and now The Mandalorian series stands as a shining example of how seamlessly real-world performers can be convincingly placed in virtual worlds via VP.

This chapter focuses on live productions that mix realities, whether AR or MR, broadcast or streamed, and also those that include an on-site audience.

Courtesy of XR Studios

AR/MR and Virtual Production

Live mixed and augmented reality productions share considerable parity with virtual production—they can reasonably be framed as part of a singular movement within film, television, broadcast, and streaming.

Live mixed reality productions are distinct from cases like The Mandalorian, in that the latter is freed from the pressures of being shown to audiences live. The teams creating the series work to meticulously plotted deadlines, but also have the luxury of polishing and correcting work, or even reshooting when required.

Meanwhile, live is live, where there's no room for any of that polishing. However, real-time is increasingly affording creators, agency staff, operators, and AV teams ways to adapt or adjust content during the delivery of such productions.

Ultimately, in the case of live mixed reality projects, there is no hiding of errors. Everything must be final pixel—meaning delivering final image quality live, "in-camera," and without requiring subsequent visual effects work. And yet the proximity of the well-established virtual production approach and newer live mixed reality production methods does mean the former is informing the latter in various ways.

Over in virtual production, for example, there is a move to "in-camera visual effects" (ICVFX) models. In those cases, a shoot might be based around large LED volumes that display realistic output from real-time engines behind performers. Images on the screens move in synchronization with real-world camera tracking as actors are followed. This approach can produce final-pixel imagery completely in camera—the current state-of-the-art method for virtual production, and a means to help cast and crew on set visualize the digital world around them, rather than have to make guesses in a green-screen volume. In those cases, though, much would still be done in post; the physical complexity of ICVFX favors productions that get generous attention in post, rather than straightforward linear live broadcasts. Further into the future, though, ICVFX may have some role to play in live productions.

Elsewhere, motion capture (mocap) methods such as performance, full-body, or facial capture are regularly used in virtual production, lending performers' movements to digital entities, or embellishing actors with virtual elements. Real-time mocap models are becoming more practically workable, which presents an option for live productions where a performer's movements could be captured in real time, guiding the movements of an animated character during an event.

Meanwhile, LED walls and motion capture are sometimes used in live and broadcast real-time applications as a visualization tool in the early stages of a workflow. Technological advances will likely make LEDs and mocap more commonplace in live contexts. However, the fact that live broadcasts need to be technically robust and truly final pixel in the moment of the shoot or performance means a more significant reliance on more proven approaches such as green screen.

Equally, with live, a great deal more planning is required to capture everything in the single moment of a shoot or performance. That means camera changes, animations, and performer hand-overs are precisely plotted and timed, right down to sub-second levels. Long before the final live broadcast, cable wattages will also have been mapped out and set in stone. Anyone with experience of such a shoot might tell you there's no such thing as being over-prepared, and that no detail is too granular to be considered at the planning stage.

Courtesy of XR Studios


Practice Makes Perfect

Rehearsals also play a more significant role for live mixed reality productions—not just in terms of giving performers and crew an opportunity to prepare to work with precision and timing, but also to establish and refine tracking data for guiding the live shoot itself. In these cases, rehearsals are as much about data generation as they are practice.

It might be something of an oversimplifying contrast, but it remains helpful to consider that at a fundamental level, the effort of live virtual production exists in pre-production up until the shoot, while virtual production for entities like film places a far greater emphasis on post-production.

But what of the actual opportunity of real-time live virtual production? It provides a means to make live broadcasts more distinct, creative, or engaging. The Weather Channel has used such approaches to make education and even critically important science education more welcoming and effective.

The real opportunity, though, is in facilitating and delivering a new form of hybrid live spectacle; a form that has significantly risen in presence and popularity as a result of the COVID-19 pandemic. Only a few years ago, broadcasts of concerts, event TV, fashion shows, sporting occasions, or esports competitions typically focused on capturing the experience of attending in person. While viewers watching remotely would make up far greater numbers than those attending personally, such events were always framed and structured as in-person experiences first and foremost.

Then the pandemic put an end to most large in-person gatherings. As a result, new ways of presenting major events had to be concocted. Billie Eilish playing to an empty physical arena or theater simply wouldn't cut it. Mixed reality and virtual sets initially offered a means through which to compensate for the lack of large crowds and the sense of performances or events taking place in empty spaces. Eilish's empty room could be embellished with all manner of dynamic real-time content.

Now, however, with the pandemic's impact significantly less present as it stands, a new medium or format has emerged; one that leans into concepts of mixed reality and frees itself from the old conventions of physical events. The pandemic just happened to give creators, performers, operators, and—importantly—audiences a chance to become familiar with the new way of doing things.

Beyond fan-centric events, live mixed reality production got its truly mainstream moment as teams all over the world used the approach in their coverage of the Beijing 2022 Winter Olympics. There, COVID-19 restrictions meant a dramatic shift in how pundits and commentators could report on live sports, which in turn afforded an opportunity to do new things. Presenters on a virtual set could suddenly be placed halfway up a mountain, right beside the action—delivering insights while surrounded by dynamic elements that communicate data and results from live competitions.

Esports provides a defining example of the potential of mixed reality production for a live audience, where an already successful format has been reinvented in a way that is powerfully fitting for its subject matter.

Case Study: Myreze | Valve Corporation | The International 10

Project type: Esports broadcast

On the heels of its successful esports events during the pandemic, gaming giant Valve wanted to put all the learnings into a single project that would set a new standard for delivery of competitive gaming to global audiences.

The event was The International 10 in October 2021, which brought together the world's best players of video game Dota 2 for a share of a record-breaking $40+ million prize pool.

To meet their ambition for The International 10, Valve charged Norwegian agency Myreze with the task of establishing a virtual set and live stream production workflows at a grand scale. It's fair to say the results were exceptional—in addition to attracting millions of viewers, the production earned an Outstanding Esports Coverage Emmy for Valve in 2022.

As part of the event, 125 hours of live content were delivered to audiences globally, culminating in a grand final viewed or rewatched by 100 million individuals (numbers do not include China's viewership, the game's largest single market). The event was also the third-most-watched video ever on streaming platform Twitch.

To understand why and how so many fans were engaged—and why real-time matters to the success—it's best to start by taking a look back to a time before COVID-19.

A traditional esports competition plays out in a sports arena or other large venue in front of energetic on-site crowds. While presenters and commentators—"casters" in esports parlance—augment the audience experience with observations and commentary, the real focal point is the gameplay. The venue includes rows of gaming hardware where players and teams sit down to compete, and the live gameplay is fed to large screens so the audience can see the action. Camera operators capture the entire event for broadcast or live streaming, with the broadcast team switching between live-action and in-game footage. To the audience at home—often made up of loyal fans of a particular team or player—the event unfolds like a live sports broadcast, right down to on-screen data visualization of statistics and results.

Image courtesy of Myreze.com


From early esports events like Nintendo PowerFest '94 to the huge Dota 2 and League of Legends competitions
of 2019, the crowd in attendance has always contributed significantly to the event’s energy.

And then the pandemic happened. The International was postponed in 2020 due to COVID-19 concerns, and
Valve looked to October 2021 as a safe date to reboot the traditional model of holding the tournament before a
large in-person crowd. But when more concerns about COVID-19 cropped up less than a week before the first
round, Valve elected to pivot to a virtual-only event.

Appetites for esports competitions certainly didn’t wane during the pandemic, but without the energy of the live
crowd, something new was needed. Real-time virtual production quickly emerged as a solution to that problem.

Mixed and augmented reality are not new for esports—in fact, streaming coverage of The International in 2018
had featured AR game characters on an LED stage. But for the 2021 tournament, the concept was one that
Myreze—in partnership with virtual production specialists Pixotope—were able to take to dazzling new places,
using a real-time production pipeline to deliver a seamless experience that placed casters, competitors, and
live performers in a single Dota-inspired world, despite the fact that all were in different physical locations.

The International for 2021 was held at Bucharest's Arena Națională stadium, as originally planned, but without
an audience in attendance. Instead, casters presented from a main AR stage and four green-screen studios
within the event’s venue. Two additional remote caster green-screen stages were hosted in Brazil and Peru,
while one more with an LED screen was used in China for that vast local audience.

Image courtesy of Myreze.com

In Bucharest, the workflow was shaped to display assets such as a player’s in-game character in the volume Across the stages, 15 Grass Valley cameras required precise tracking, which is where rehearsals proved their
alongside real-world casters. Meanwhile, an on-site LED screen gave casters and players a means to visualize value. With each rehearsal, tracking data could be carefully refined and realigned. In fact, those rehearsals were
the position of those assets, giving them an impression of the perspective enjoyed by viewers at home. just as valuable when replayed as when enacted; an essential process and a strength of the real-time approach,
where so much can be captured beyond performances.

Myreze chose to bring together Blackmagic Design HyperDeck Studio 4K Pro broadcast decks and Teranex
Mini Audio to SDI 12G converters as a way to capture audio and video data from those rehearsals, ultimately
generating reliable timecodes. The audio and video recordings could then be played back on the broadcast decks,
and carefully aligned with tracking data in Pixotope, where the virtual set and on-screen graphics were created in
a pipeline that also included Unreal Engine, 3ds Max, and Blender.
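That alignment works because every device stamps its recordings against a shared clock. As a simple, hypothetical illustration of the underlying arithmetic (not Myreze's actual tooling), a non-drop-frame SMPTE-style timecode converts to an absolute frame count like this:

```python
def timecode_to_frames(tc: str, fps: int = 25) -> int:
    """Convert a non-drop-frame timecode 'HH:MM:SS:FF' to a frame count."""
    hours, minutes, seconds, frames = (int(part) for part in tc.split(":"))
    return ((hours * 60 + minutes) * 60 + seconds) * fps + frames

# Two recorders stamped the same moment; the shared timecode reveals how far
# apart their streams are, so one can be shifted to line up with the other.
video_start = timecode_to_frames("10:00:05:12")
tracking_start = timecode_to_frames("10:00:05:00")
offset = video_start - tracking_start  # shift tracking data by 12 frames
print(offset)
```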

On the day, after so much preparation, it came down to the effort of teams from Myreze, Valve, Pixotope, the
hardware operators, the professional players, and the casters, who collaborated under a carefully arranged
hierarchy and structure. Reworking the conventions of on-set hierarchies, and welcoming new roles while evolving
the responsibilities of AV providers and creative agencies, is where much of the innovation and progress is being
made in this space.

The result of the collaborating companies’ effort was remarkable in meaningfully blending so many elements
virtual and physical, and has set an inspiring new standard for what is possible through live mixed reality
production. The International 10’s award-winning presentation has successfully asserted the case that this new
format is not just novel or interesting, but in many cases improves on traditional approaches by bringing more
meaningful and relevant ways to share live happenings with global audiences.
Image courtesy of Myreze.com


Case Study: The Weather Channel | IMR Studio


Project type: Live mixed reality production

Weather broadcasting has long been a pioneering form when it comes to mixing realities. The earliest examples
of green-screen-based forecasting may seem primitive by today’s standards, but they were an important part of
the early history of mixed reality and real-time embellished broadcasting.

Over at The Weather Channel, which provides forecasting and weather information to a global audience, that
pioneering spirit has never faded. When Michael Potts joined the team in 2013, he came with a clear vision of
employing immersive storytelling to deliver engaging weather information. By 2018, the broadcaster’s Vice
President of Design had spearheaded a project with The Future Group to create special immersive mixed reality
segments on lightning and tornadoes, and later in the year, on storm surges and wildfires—all by using Unreal
Engine, real-time rendering, and virtual production methods. These pre-recorded segments certainly impressed
their audience, with the tornado section winning an Emmy award.

Image courtesy of The Weather Channel

But Potts and his team wanted to do more. They set out to use live immersive mixed reality (IMR) to tell stories in the moment, bringing viewers up-to-the-moment data from a virtual set where the presenter could use dynamic computer-generated assets to visualize and communicate information.

"We all use our phones today to glance at the weather, and get a little information on highs and lows and precipitation. Those are great tools, but they don't tell you the full story," Potts explains. "There's no room to communicate the nuance of the weather, or explain why things happen, or even communicate weather safety. We knew there was this whole communication ecosystem missing where we could fill in the gap, and that's what led us to committing to live immersive mixed reality."

Weather is also complex, and can feel abstract or extremely hard to visualize. What happens inside a cloud formation or under the surface of water, for example, are concepts that aren't easily described with traditional reporting.

This is where the power of The Weather Channel's IMR studio comes in. Building on the virtual set established for the pre-recorded real-time work, and using Unreal Engine at the core of its pipeline, The Weather Channel established a volume that is half physical and half virtual, where presenters and experts can move between the two worlds seamlessly. From there, real-time visualizations of meteorological phenomena, live forecast data, and more can be broadcast live. Essentially, the system uses visual storytelling to make it easier for a vast and diverse audience to understand the weather, its impacts, and its systems.

The highly realistic outdoor environment that surrounds the studio's virtual element can also change to reflect the current topic on air, shifting from stormy mountains to a coastal or snowy area as needed. That's not just a gimmick, but a means to help viewers imagine the experience of the conditions, making the broadcast more engaging and the information more useful and consumable.

Potts says the opportunity real-time brings is about leveraging technology at the intersection of weather data and science to share things that can't simply be captured as video—such as the inside of something incredibly small, a bird's-eye view of a large-scale event, or the impact of a potential future catastrophe—one example is the use of a city model to show what six feet of flood water actually looks like. With real-time and live immersive mixed reality, The Weather Channel can present such visuals so its audience can be better prepared for such a situation.
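As a hedged illustration of that data-to-visuals idea—the endpoint, field names, and parameter names below are invented for the example, not The Weather Channel's actual system—a segment tool might poll a forecast feed and translate it into scene parameters such as the height of the virtual flood water:

import json
import urllib.request

def fetch_surge_feet(url: str) -> float:
    """Poll a (hypothetical) forecast endpoint for the predicted storm surge."""
    with urllib.request.urlopen(url) as response:
        return float(json.load(response)["surge_ft"])

def flood_scene_params(surge_ft: float) -> dict:
    """Translate the forecast into scene parameters: raise the virtual water
    plane in the city model and scale foam/debris effects with severity."""
    return {
        "water_height_cm": surge_ft * 30.48,          # feet to centimeters
        "foam_intensity": min(surge_ft / 10.0, 1.0),  # normalized 0..1
    }

params = flood_scene_params(fetch_surge_feet("https://example.com/api/surge"))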

The workflow for pulling together a given piece of storytelling for a segment starts with identifying stories that
will benefit from an IMR delivery. From there, a collaborative process begins, involving science editors, a director,
presenters, artists, creative directors, technical specialists, and Potts himself. Once they’ve chosen the story,
they begin work on a storyboard and script.

“At this point, we can start gathering the data and information and get our artists to work building things that are
realistic, informed by science, and captivating for viewers,” Potts explains. He adds that this stage of production
is a lot like traditional broadcast, but that things really start to feel different at rehearsals.

“What you see on your computer often feels different when it’s suddenly in the studio,” he says. “There we can
really use one of real-time’s strengths, which is editing and adjusting on the fly.”
Image courtesy of The Weather Channel


The whole process, from initial concept to delivery, takes four to six weeks. Potts says that the talent is key here,
because they bring not only meteorological knowledge, but also those all-important science communication skills
and the ability to be really engaging. “And that’s important,” Potts says. “All this technology should really serve
your storytelling and talent, and not the other way around.”

In terms of technical pipeline, everything is very much founded on The Weather Channel’s broadcast heritage,
using cameras and a green-screen set, which run in tandem with Zero Density’s Reality Engine, an Unreal
Engine-based real-time broadcast compositing system. The team also harnesses the power of Zero Density’s
Reality Keyer, an innovative real-time image-based keyer that runs on a GPU. Rather than assuming a single
color value for the entire background as seen with chroma keying methods, image-based keying contrasts the
captured video with a clean plate, enabling subtle transparent details and shadows to be retained, upping the
realism and fusion between the virtual and physical.
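To illustrate the principle—this is a minimal generic sketch of image-based keying, not Zero Density's proprietary Reality Keyer—the alpha for each pixel can be derived from how far the live frame departs from a clean plate of the empty set:

import numpy as np

def image_based_key(frame, clean_plate, lo=0.02, hi=0.15):
    """Estimate per-pixel alpha from the difference between the live frame and
    a clean plate of the empty green screen (both float RGB arrays in 0..1).
    Pixels matching the plate stay transparent, strong differences become
    opaque, and the soft ramp in between preserves shadows and transparency."""
    diff = np.linalg.norm(frame - clean_plate, axis=-1)  # per-pixel color distance
    return np.clip((diff - lo) / (hi - lo), 0.0, 1.0)

def composite(frame, alpha, background):
    """Layer the keyed foreground over the rendered virtual background."""
    return frame * alpha[..., None] + background * (1.0 - alpha[..., None])

Because the alpha is a continuous ramp rather than a hard color-distance cutoff, semi-transparent details such as glass, hair, and contact shadows can survive the key.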

"Beyond establishing all the technology," Potts concludes, "you have to ask yourself a lot of questions before committing too much. Do you want to do something as a one-off event or a regular spot? Are you going to integrate things with broadcast infrastructure? What are the key milestones and deadlines you need to hit? What are the story highlights that really deserve attention? Why does the story even matter? Who is the right talent? Why is immersive and real-time the right choice?"

A great many of those things, Potts is quick to point out, are not in any way real-time specific, but that is the point here. IMR and real-time should never be adopted for their own sake, and much of the craft of telling visual stories remains unchanged.

"Getting this right starts with good storytelling and before that, it starts with a vision," Potts says. "You have to get those things in place to succeed with this opportunity. To us, Unreal Engine is a tool that makes going immersive and real-time effortless and efficient. But good storytelling and good talent remain the most important things we have at our disposal."

Practical and Technical Considerations

Live mixed reality production is still an emerging form, which means processes and best practices are still taking shape. Having explored the evolution and potential of live mixed reality productions, let's consider some of the practical and technical considerations.
Image courtesy of The Weather Channel
Image courtesy of Moment Factory


New Roles, New Skills

The very fact that this remains an emerging form means there is a way to go in terms of establishing convention in the space. First, the roles and hierarchies of this space are yet to solidify.

For now, many live mixed reality productions still adopt a structure loosely based on traditional approaches that sees three broad teams collaborating to pull off an event on the day. There's the performer and their support staff, a creative agency handling any video or computer graphics presented on screens, and the AV team who take responsibility for hardware. At a fundamental level, that trio of elements—performance, visuals, and hardware—are still at the foundation of the delivery of live mixed reality productions. But introducing new technology or more complexity—multiple simultaneous locations, LED screens of various shapes and complexities, 4K projectors, tracking systems, and new pipelines—means there is a great deal of change to consider.

Training up your existing team to understand the practice and delivery of these new forms is key to embracing this opportunity. And that might not be as intimidating as you expect. In almost every interview conducted for this field guide, it came up that traditional broadcast and event roles and skills still remain highly relevant, and retraining is only about augmenting and tuning existing staff's expertise.

The team at Myreze, for example, comes from wide and varied production backgrounds. Knowing about game engines and new technologies is important, says Jørgen Steinheim, Partner & President at Myreze, but at the same time, it's still all about storytelling, and building an engaging piece of viewing.

"It's just ways to do more with storytelling, and it's important to remember that," says Steinheim. "With the skills across our team we can make the most of Unreal—we have the technology people, those with traditional broadcast experience, and the understanding of narrative and storytelling informed by that."

Steinheim adds that traditional skills, and the ability to understand audiences, are key to being successful in this space. "You need a diversity of skills and experiences in your team," he advises, "so it might be about retraining and rehiring."

For the artists and designers using DCCs, the core craft remains largely unchanged. Those employees are likely going to need to reconsider assumptions around factors such as screen sizes and resolutions, and embrace the idea that real-time can make things far less scripted in some instances. But it is the teams working on rehearsals and the final live production that may see the most profound change.

We are, however, seeing new roles appear in tandem with the growing real-time opportunity, although there is little established agreement on how to frame or title those roles.

Figure 2: Life cycle of a live event project, with Unreal Engine as the central hub for assets and controls.

Images courtesy of Myreze.com


Many speak to the value of securing a "system integrator" or similar team who can handle the technical side of connecting content with infrastructure, while also serving as a go-between who smooths the interactions between hardware people, performers, and content providers.

Elsewhere, many virtual mixed reality productions are happening with a "screen producer" sitting high in the food chain, an individual effectively charged with taking responsibility for what appears on screen, and thus coordinating the related technology and teams.

Others spoken to for this field guide put forward the idea of a generalist "Unreal Engine specialist" or even "Unreal Engine wizard," meaning somebody that can understand the game engine and its interaction and integration with the wider ecosystem both in rehearsals and on the go-live day.

None of those roles are established or recognized as including a specific skill set or authority within production hierarchies. The best way forward may simply be to give more time than you might imagine to establish a team hierarchy while pursuing specialist hires. Hire through skill set rather than job title, and consider engaging with the games industry to employ hires or freelancers that are well versed in the wider world of real-time production and delivery. It can work particularly well if you pair up gaming engine specialists and senior AV staff, as that combination covers much of the skills needed in a real-time live mixed reality production.

Equally, be aware that as much as you may need to retrain, educate, or augment your existing team, educating customers to manage expectations is equally key here.

Testing, Calibration, and Assumptions

Away from considering how your team and hardware communicate and integrate, it is important to note that—relative to traditional production—rehearsal, calibration, and testing matter a great deal. While the granular nuance of camera tracking technology and technique is beyond the scope of this guide, quality camera tracking for mixed reality broadcast in particular forms the keystone of quality. With that in mind, considerable thought, time, and effort needs to be put into factors such as time of day and weather for outdoor events, or crowd size and how that may interfere with the robustness of a connection.

Geography can also be a factor—at a music festival in the middle of a desert, for example, remote access to one-server tech such as version control and storage may be put under significant strain. In short, thoroughly test and calibrate absolutely everything, particularly where it involves a wireless or internet connection or data flow such as camera tracking data, and consider variables such as weather conditions. In addition, schedule ample time not just for multiple rehearsals, but also post-rehearsal sessions to break down, interpret, and test your data.

Image courtesy of Capacity Studios

LED Screens

Working on real-time live mixed reality productions isn't just about overcoming challenge and complexity. The shift to real-time here also brings major gains in efficiency and even simplifications of process.

When live productions send pre-rendered content as video to on-site screens, the process is more familiar. While the opportunity for dynamic, adaptive content is not available, many will see it as an inherently less risky approach. But going with pre-rendered content often means much of your effort is duplicated—where artists can block out and prototype assets in simpler DCC software, high-resolution versions of those assets often have to be reconstructed from scratch in CAD software.

Moving to work in a real-time pipeline via a game engine, meanwhile, means using a single solution from early concept to final pixel. Real-time is often a much more expedient model when it comes to live mixed reality production, but the greatest time savings might actually be found in the long term. Real-time assets made in a game engine can still have a considerable lifetime after the live production wraps, thanks to being easy to rework or apply to other types of media.

Should a customer later want a mobile game for the same brand, or an entirely new live production, the real-time nature of the assets means they can readily be adapted and reused. In other words, you can consider the real-time workflow for a live mixed reality production as simultaneously establishing a funnel that could be used to generate new content from existing assets, for weeks, months, or years into the future.

Turnkey Studios

New pipelines, software, engines, and formats are still emerging, and in terms of the physical interconnected hardware systems required on the day, things are not yet entirely standardized. Things are progressing fast, with companies like disguise offering a line of fully integrated media server hardware, software, infrastructure, training, community, and support that endeavors to provide a complete ecosystem for such events—Unreal Engine being part of that ecosystem.

Figure 3: Sample setup for a system utilizing LED panels and a tracked camera. You can find more technical details and setups in the white paper nDisplay Technology: Limitless scaling of real-time content.
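To give a flavor of the math such a setup handles every frame, here is a sketch of the standard off-axis (asymmetric) projection technique that tracked-camera-and-LED-panel systems like the one in Figure 3 depend on—a generic implementation following Kooima's generalized perspective projection, not nDisplay's internal code:

import numpy as np

def offaxis_frustum(pa, pb, pc, eye, near=0.1):
    """Compute frustum extents for a flat panel viewed from a tracked camera.
    pa, pb, pc are the panel's lower-left, lower-right, and upper-left corners
    in world space; eye is the tracked camera position."""
    pa, pb, pc, eye = (np.asarray(p, dtype=float) for p in (pa, pb, pc, eye))
    vr = pb - pa; vr /= np.linalg.norm(vr)           # panel right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)           # panel up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)  # panel normal, toward eye
    va, vb, vc = pa - eye, pb - eye, pc - eye        # eye-to-corner vectors
    d = -va.dot(vn)                                  # eye-to-panel distance
    left = vr.dot(va) * near / d                     # extents at the near plane
    right = vr.dot(vb) * near / d
    bottom = vu.dot(va) * near / d
    top = vu.dot(vc) * near / d
    return left, right, bottom, top  # feeds a glFrustum-style projection matrix

# Example: a 4 m x 2 m panel, camera 3 m back and 1 m up/right of the corner.
print(offaxis_frustum([0, 0, 0], [4, 0, 0], [0, 2, 0], [3, 1, 3]))

As the camera moves, the frustum skews so the image on the fixed panel stays perspective-correct from the camera's point of view—the effect behind any convincing LED-volume shot.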


However, we are also seeing studios emerge that cater specifically to broadcast and live events clients who want a production experience that closely resembles familiar workflows. Such studios will mitigate some of the pain points and challenges of acquiring hardware and training staff that clients might face if attempting to create their own real-time setups and teams.

XR Studios has taken this approach, offering a full-service production studio built for XR and MR, with the team supporting everything from concept to final delivery, and a highly flexible, permanent LED volume at the heart of things. By providing the location, hardware, pipeline, and creative process as a single integrated ecosystem, XR Studios removes many of the challenges of adapting a traditional live production setup to serve as a live mixed reality production offering, and frees up customers and creatives to focus on performers, performance, and creativity.

XR Studios' goal is to provide the most complete canvas for XR creativity, explains Scott Millar, XR Studios' CTO, to ensure that when a client comes in with their idea for the LED volume, there are no technical barriers or concerns about integrations. To that end, the studio is building its tools and pipeline, and integrating with disguise, to establish an environment that feels just like a broadcast studio.

"That's our number one priority," says Millar. "The studio has the same controls and familiar setup, but it just happens to have a massive, multi-million-dollar LED screen at the end with specific media servers running it."

Millar adds that he sees more studios going with such a solution in the future. "It's a way to help creative teams embrace this opportunity, without them having to relearn nearly everything."

XR Studios' LA-based location is able to host everything from single performers to full orchestras via esports events and even game shows. That kind of blank-canvas concept may be critical to the future of live mixed reality production, and to the wider broadcast and live events real-time movement. But some productions will always need to be site-specific—not every project will be able to take place at an XR location.

Case Study: Psyonix | RLCS Hype Chamber | Capacity Studio and Partners

Project type: Esports broadcast

The Rocket League Championship Series (RLCS) Hype Chamber is a remarkable project that deftly captures the potential of real-time technology to intersect opportunities in broadcast, live events, gaming, mixed reality, and even virtual production.

Developed by game studio Psyonix, Rocket League has emerged as a star of esports' global rise. In offering an automotive take on soccer, the game follows the great arcade tradition of being easy to pick up and difficult to master. Its knack for delivering dramatic clashes has drawn huge audiences to esports broadcasts like the RLCS esports tournament—with all eyes fixed on the in-game arena in which each match plays out.

Long-time Psyonix collaborator and experienced creative agency Capacity Studios saw an opportunity to extend that arena and deliver a digital broadcast concept that brought fans, professional players, and the game closer together. The RLCS Hype Chamber project is a joint collaboration between many forces from Epic Games, Psyonix, Capacity, and DreamHack.

"The Hype Chamber was designed as a launching point to reimagine what a sports broadcast might look like for a digital-first audience," explains Jasveer Sidhu, Art Director at Capacity Studio.

The Hype Chamber offers an imagined extension to the game's motorsports arena, presented as a virtual stage where vehicles are displayed before rolling out onto the pitch and taking their places. Its role is to present a transitional space between a studio broadcast and the gameplay itself, where teams can be introduced, hype can be built, and winners can be announced. In other words, it serves a multifunctional role in RLCS broadcasts—and it proved so popular that it has now been reimagined as a physical space.

The design of the physical iteration consists of a 360-degree set of curved LED screens. Those screens are fed outputs from the Unreal Engine version of the scene, recreating the digital Hype Chamber environment on stage as a real-time entity. Competing teams can sit in real seats on either side of the Hype Chamber while game assets are displayed between and around them.

Image courtesy of Capacity Studios


Image courtesy of Capacity Studios

Offering a dynamic virtual stage with reconfigurable geometry, the physical Hype Chamber can display screen content that adjusts automatically according to image composition and the teams present in the space. The Capacity team is even exploring ways to make the graphics on the physical stage interactive, where viewers can influence the content.

"There are a couple of dynamics specific to RLCS that make the real-time nature of interacting with the Hype Chamber special," says Ellerey Gave, Executive Creative Director at Capacity Studios. "The open format of the league means that there are usually new teams competing each week. In traditional pipelines, this would either equate to a ton of additional work and rendering each time a new team is added, or it would lead to the add-on teams receiving a less special treatment than the established teams. With the way the Hype Chamber is wired up in Unreal Engine, we can swap out logos and team colors very quickly, and instantly have the same high-quality assets ready to go for that weekend's broadcast."

The digital version of the Hype Chamber can be used in a number of additional ways, thanks to real-time technology's capacity to rapidly generate content for a reworked environment. It can serve as a storytelling platform, or even as a showroom for team car skins available to fans within the game.

The physical version is similarly flexible. "Because the Hype Chamber was built in Unreal Engine, there are a number of ways these same assets can be leveraged for live events," explains Benji Thiem, Creative Director and Partner at Capacity Studios. "Since we developed an entire scene that exists in 360 degrees, we were able to map the portions of the space we wanted to feature onto a set of real LED screens, creating a dynamic backdrop for the live event, which already had much of the functionality for team customization built in. We further expanded on this package by including custom graphics, as well as a toolkit of video loops that could drive other smaller screens in the space."

The project has been made available to Unreal Engine users as a sample project, giving them the opportunity to get under the skin of the design, development, and delivery of the RLCS Hype Chamber.

Ultimately, Capacity has created combined real-world and virtual spaces that serve as bridges to seamlessly move between different realities, fusing concepts of events, broadcast, and gaming.

Image courtesy of Capacity Studios
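As a hedged sketch of the kind of data-driven swap that enables—the OSC addresses, port, and team table below are invented for illustration; Capacity's actual rig is not public—an operator tool could push a new team's branding into the running Unreal Engine scene, for instance via the engine's OSC plugin:

from pythonosc.udp_client import SimpleUDPClient

# Hypothetical branding table; in production this might come from a league
# database or spreadsheet rather than being hard-coded.
TEAMS = {
    "spacestation": {"color": [0.95, 0.55, 0.10], "logo": "T_Spacestation_Logo"},
    "vitality":     {"color": [1.00, 0.85, 0.00], "logo": "T_Vitality_Logo"},
}

def apply_team(client: SimpleUDPClient, slot: str, team: str) -> None:
    """Send one team's color and logo texture name to the engine, which maps
    these OSC addresses onto material parameters in the Hype Chamber scene."""
    branding = TEAMS[team]
    client.send_message(f"/hypechamber/{slot}/color", branding["color"])
    client.send_message(f"/hypechamber/{slot}/logo", branding["logo"])

client = SimpleUDPClient("127.0.0.1", 8000)  # address of the engine's OSC server
apply_team(client, "team_a", "spacestation")
apply_team(client, "team_b", "vitality")

Because the scene's materials read colors and textures from parameters rather than baked-in assets, a change like this lands instantly, with the same high-quality result for every team.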


CHAPTER 5: Summary

Throughout this guide and its interviews and case studies, several challenges to adopting real-time technology in broadcast and live events have presented themselves. In this section, we will highlight and consider some of the most significant of those challenges, and the available solutions. This section is intended to summarize threads that pass through the wider guide, providing an "at a glance" list of key practical considerations for those looking to embrace the real-time opportunity in broadcast and live events.

Image courtesy of Capacity Studios


New Skills and Retraining Staff


In many interviews for this guide, users expressed concerns around hiring suitably experienced specialists when the talent pool is small, and also retraining existing staff to adopt real-time approaches. This is admittedly a complex challenge, as many real-time specialties and roles are still emerging.

The Solution

While the problem here is surmountable, the challenges are a reality, and there are no quick and easy universal solutions. Rather, there is nuance—many interviewed for this guide pointed out that most "traditional" skills in broadcast and live events remain highly relevant, and can be adapted to real-time technology without too much friction or disruption. The fundamentals of understanding audience, storytelling, and engagement in existing TV, broadcast, film, video, and event sectors can be broadly applied to real-time.

Despite this fact, embracing real-time skills can't be rushed. Give it time: start with a smaller or exploratory project, hire specialists when needed, and immerse yourself in the events and online activities of the real-time broadcast and live events community.

That community can be key. With a shared interest in growing the real-time space and establishing convention around technology, technique, and hiring, there is much interest in knowledge sharing, partnerships, and formal and informal collaboration. Connecting with that community is a strong starting point. Ultimately, a more immediate solution may come from hiring a real-time specialist; if you go this route, base your job listings on needs and skills, rather than hiring for a specific title or existing role.

Establishing New Roles and Production Hierarchies

The established hierarchies, structures, and roles found across more traditional broadcast and events production often need adjusting to better serve real-time; and yet the roles required to plug gaps are yet to solidify or be consistent. Ultimately, many arrangements can lack a liaison between the creative team providing assets; the A/V team managing hardware, displays, and media servers; and entities such as the artist and venue team.

The Solution

Hiring is a likely solution here, and the real-time community may help fill those roles. Many productions thrive when served by a floating liaison who is adept with real-time technology and Unreal Engine, and can move between creative, hardware, artist, and venue teams. Elsewhere, screen producers with comparable skills have a senior position and take responsibility for everything appearing on screen, an alternative framing of a go-between for various on-site teams.

Many speak to the value of securing a "system integrator" who can handle the technical side of connecting content with infrastructure, while also serving as a go-between who smooths the interactions between hardware people, performers, and content providers. Ultimately, and at this stage in the industry's forward journey with real-time technology, hiring experts to complement your team is likely more sensible than restructuring or reinventing your existing team.

Adopting Unknown and New Technologies

With so much of timeline-based production software and motion graphics well established, and with many practitioners having used the tools needed to create them for years, moving to work with new real-time technology and platforms can be intimidating. It can be easy to assume such tools are unwelcoming, or will take months or years of training to embrace and understand.

The Solution

Adopting a game engine does not mean a wild leap from working with other design, editing, and production software. Many of the fundamental principles will apply, and you may be surprised to realize how much the Unreal Engine Editor compares to other software environments you or your team use on a daily basis. Beyond immersing yourself in the broadcast and live events real-time community, you will find a vast support network and a large quantity of documentation across the Unreal Engine community, provided by both Epic Games and users on their own channels.

Having spent years serving game development teams from vast operations with hundreds of staff to tiny studios literally based in garages, Epic has learned much about developing training and documentation that serves a range of needs and experience levels. Self-teaching may not be the ultimate solution at scale, but many in the space are self-taught.

In other cases, the documentation and community around Epic (and related technologies that sit in a real-time pipeline) are about individual users empowering themselves enough to start to explore the wider opportunity, while making them literate enough in the technology and practice to engage more meaningfully with conversations and learning across the sector.

If you're interested in finding out about premium support and custom, private training, contact us and we'll let you know about all of the available options.

Time-Saving and Budgetary Misconceptions

Many mistakenly believe that real-time may be cheaper, while also saving time. In reality, that perception oversimplifies the situation.

Sometimes real-time broadcast and live events productions can go from initial idea to completion in staggeringly short time frames. In the case of broadcasters who have established real-time-ready studios with everything in place, incredible things can repeatedly be achieved in very short windows of time.

Other real-time projects may take months of planning and implementation—for example, a real-time-embellished music tour might require updates and maintenance constantly, and for years. The most ambitious projects can require long periods of planning simply to get started.

Equally, while some real-time productions cost relatively little, others might demand vast custom stages, or towering bespoke LED screens or projection systems.

The solution here, as much as there is one, is to free yourself from assumptions that real-time always means time and money savings. A more accurate statement is that going with real-time technology offers more options for engaging your target public, and can lead to stronger relationships with audiences that benefit you in other ways—sold-out shows, great reviews, returning customers, or word-of-mouth marketing.

New Hardware Choices

Traditional video-based media servers and display systems offer a known, familiar entity: the projection or display of canned footage or pre-rendered animations to a screen in a linear fashion. Conversely, a real-time workflow often involves displaying dynamic content to contemporary options such as LED screens or high-resolution, multi-projection systems.

Image courtesy of Moment Factory

Additionally, real-time's strengths mean it has expanded interest in displays of non-traditional ratios, shapes, and sizes—and even moving or uneven display surfaces, and display on the sides of buildings. It can feel like embracing these opportunities demands an intimidating change or upgrade to unfamiliar infrastructure, which in turn may impact efficiency, quality, or budget.

The Solution

Although a fundamental understanding of real-time media servers' function, role, and essential workings is helpful, for now the reality of such productions likely means partnering with an individual provider who will deliver both physical infrastructure and expertise—and even on-site staff in many cases.

The same is often true in live events, so you can consider the solution here as comparable to hiring any A/V provider. Practically, that currently means working with a provider of real-time capable media servers. Many providers of that kind of infrastructure can also provide insight, consultation, on-site staff, and more, letting you embrace the potential of real-time without a significant investment in internal infrastructure, skills, and staff.

Equally, we are likely to see more models in the vein of that provided by XR Studios, where a provider offers a flexible, adaptable studio space made for real-time, meaning all hardware and hardware integration is offered as part of the service. Once more, partnership and collaboration are key.

Does it Have to be Real-Time?

This last challenge is the most defining and persistent. It can be hard to know if real-time is right for your project or aims. Oftentimes clients, artists, or brands might push for real-time because they have seen rivals succeed with it. But to adopt real-time methodologies and approaches simply for the sake of it—practicing "theater of real-time"—is never advisable, and could lead to less impact, more time and budgetary sink, or even failure to deliver on a brief.

The Solution

Be open-minded to game engines and real-time workflows, and endeavor to explore their potential through practice, but never pick a real-time approach just to tick a box or impress your clients with your toolset.

There are so many reasons to be excited about real-time, not only because it is a new or innovative way of producing or delivering content, but because it is a completely new way of thinking. Once you embrace it, you will understand the added values it brings, and in what capacity it will fit your project. It's not an all-or-nothing approach—you get to pick exactly where, or in what capacity, it will fit your project. Should you do your project entirely in traditional pre-rendered workflows, or leverage real-time platforms only when a project calls for high levels of interactivity or reactivity, remember that the choice is entirely up to you. Essentially, pick real-time in the desired capacity because it is adding value to the experience, to the workflow, or to your time-cost analysis considerations.

Should you or your team be concerned about the required knowledge and learning curve, we suggest you jump right at it and start internal, experimental, or exploratory real-time projects to test the water and learn practically. Some even advocate for making a simple video game to understand the fundamental concepts—consider that today, code-free game development platforms exist that are aimed at children under the age of 10—and such a project may be more welcoming and achievable than you assume.

Consider the gains game engines and real-time workflows can bring, and endeavor to explore their potential through practice, because once you or your team get the hang of it, it's very hard to go back!

Profile: Moment Factory | AT&T Discovery District

Project type: Live multimedia experience

AT&T and global architecture firm Gensler were looking to establish an immersive and engaging multimedia experience in the heart of downtown Dallas. That meant mixing the realities of physical buildings and digital works—which, in turn, meant real-time was the perfect solution. With that in mind, AT&T and Gensler enlisted Moment Factory, a multimedia studio specializing in creating experiences that bring people together in public spaces.

The mandate included overseeing strategic project development and experiential design, including content and digital platform integration.

Moment Factory delivers a very distinct form of live mixed reality experience. With over 20 years of experience, the team has learned a lot about the value of established skills, the power of storytelling, and the potential in using a game engine to make your own tools—all of which proved powerful in delivering content for what became known as the AT&T Discovery District.

It was a project that perfectly demonstrated Moment Factory's playful mantra: "We do it in public".

Image courtesy of Moment Factory


"Everything we do is about bringing people together," explains Annie Leclerc-Casavant, Communications Advisor at Moment Factory. "We work to create a form of entertainment that offers people new experiences they can share together. We use multimedia to create a context where people can experience what we call "modern day gathering places". It could be a concert, or a brand event at a flagship store, or something at an airport, in a forest, or at a theme park. In all those contexts, what's really important for us is to inspire connections and a collective sense of wonder. And everything we do is about the physical, real world. There might be a virtual element, but the real world is always part of it."

The AT&T Discovery District is one such gathering place in the real world. At its heart is the AT&T Dallas headquarter building, fitted with a towering 104-foot tall, 6K media wall and LED-powered trellis that displays evolving visual content designed to fit the tone of the local area, helping those passing through to feel relaxed, connected, or inspired. The screen can even output sporting content or film screenings, reinterpreting a corporate building as a cultural destination.

The content is delivered as a blend of pre-rendered and real-time content deftly adapted to time of day, weather, and season, with attention to the visual fidelity required for such a vast screen space.

Moment Factory has also reimagined spaces in the Tokyo subway, the Notre-Dame de Reims cathedral, the Resorts World Las Vegas on the city's famed strip, and other distinct and iconic locations.

Crucially, while Moment Factory has completed many truly innovative and technologically cutting-edge projects since its founding in 2001, the team sees what they do in places like Dallas as simply taking a different approach to the well-established craft of storytelling.

"We are essentially trying to understand how to engage people, and the basic nature of people engaging with each other," explains Moment Factory's Multimedia Director Alberto Ramirez. "The answers there aren't just about technology. What works in storytelling, and what works with human nature—those things are the same whether we are talking about a real-time display on the side of a building, or more traditional art forms."

Half of what Moment Factory does, he says, is still done with traditional skills and methods. "But the other half," he continues, "is working towards giving people this connection with different mediums, different formats, or different types of interaction, the kinds that are only possible through current and new technologies."

Achieving these connections means a rethinking of many of the old rules and approaches, but not necessarily learning everything all over again, Ramirez tells us. "It's not as intimidating as people might think," he says. "It's more like we're discovering this change, and new ways to think about old things like storytelling."

It's also about picking the stories that need to be told through those technologies. "We don't just use technology because it is there," says Moment Factory Innovation Producer Céline Mornet, who worked on the AT&T project. "We are not looking to find ways to use some new AI algorithm just because we can, or build new hardware when it isn't required. Our purpose as a team is to be multiskilled and understand many different ways to be creative and use technology, and then scan any industry to understand the creative needs. From there, we can find the right tool to achieve those creative needs.

"We like to say we 'hijack technology,' and that means we use and repurpose any technology that helps deliver the creative vision."

In other words, it should never be about starting with a technology and then concocting a means to use it. As we've heard throughout this guide, story should always come first.

A game engine, however, can serve as a means to connect skills both emerging and established, while providing a consistent content creation platform that is highly compatible with most other tools required for a given project. So while Moment Factory is platform-agnostic, Unreal Engine has provided a reliable foundation for so many of their pipelines, enabling artists to work in familiar environments while serving highly atypical end destinations—such as a display on a large Dallas building.

Another strength the game engine brings is the ability to use it to build tools—something game makers have expected of engines for many years. Those game developers might build custom tools in Unreal Engine to connect various elements of a game's environment and interactive systems. In the case of Moment Factory, that proved highly useful because of the wide variations in their projects, some involving custom-built or oddly shaped displays, or highly unusual display spaces.

"It's really about being able to make the tools to create an interface with the physical world," Mornet confirms. "We work with a game engine to create the tools to connect with the lights, the sound systems, the augmented reality, and the sensors. With Unreal Engine, everything becomes an integrated process."

"We can also use Unreal to simply facilitate our daily lives, to keep things running more smoothly," adds Ramirez. "We've built our own library of assets and systems, and so on. But Unreal is also this huge sandbox and immense toolbox, and we're still finding new ways to take advantage of its tools."

Image courtesy of Moment Factory
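To make the "interface with the physical world" idea concrete—a minimal sketch of driving DMX lighting over Art-Net from a custom tool, written independently of Moment Factory's unpublished toolset—the packet a lighting node expects can be assembled and sent in a few lines:

import socket

def artdmx_packet(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build an ArtDMX packet: up to 512 DMX channel values for one universe."""
    data = channels.ljust(512, b"\x00")[:512]
    return (
        b"Art-Net\x00"                      # packet ID
        + (0x5000).to_bytes(2, "little")    # OpCode: ArtDMX
        + (14).to_bytes(2, "big")           # protocol version
        + bytes([sequence, 0])              # sequence counter, physical port
        + universe.to_bytes(2, "little")    # target universe
        + len(data).to_bytes(2, "big")      # channel data length
        + data
    )

# Example: set a fixture's dimmer and RGB channels to a warm amber.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(artdmx_packet(0, bytes([255, 255, 140, 20])), ("192.168.1.50", 6454))

In practice a tool like this would live behind a friendlier interface—cues, timelines, or game-engine events—but the underlying exchange with the hardware is this simple.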


Profile: XR Studios
Project type: Turnkey studio for broadcast and live events projects

XR Studios is a full-service, turnkey production studio in Los Angeles, CA, specializing in the merging of real and virtual worlds for AR, MR, and virtual production projects. With a client list that includes Billie Eilish, Riot Games, Katy Perry, Twitch, Amazon, TikTok, and Post Malone, XR Studios represents a powerful example of where the technology has got to today, and where the future of production might take us.

The company provides both in-studio and remote services, for projects ranging from small corporate event videos to commercials, massive concerts, and major livestreams of awards shows.

XR Studios features a stage with a calibrated LED volume, effectively offering a blank canvas that is primed for XR projects, and providing an environment where clients can be free to explore creative ideas. "We hide the complexities as much as possible," explains XR Studios CTO Scott Millar. "We want people to feel free to work toward the end vision, and not worry about the technology."

That core location in Los Angeles presents a striking example of what a modern production studio can be. Central to the concept is that XR Studios can handle every element of a project, from conception to production logistics to technical execution, giving clients unfamiliar with the nuance of mixed reality projects a complete service for delivering powerful work, all without having to retrain their teams or rebuild their pipelines. It's an approach built to eradicate the technical barriers typically present when a creative agency or similar group decides they want to embrace mixed reality, and work with virtual production methodologies, LED screens, and projection to atypical displays.

"We're also a creative agnostic technical solutions provider, meaning that we partner with different content, virtual teams, project agencies, and so on, to make sure the creative team gets to the finish line," explains Executive Producer Francesca Benevento. "That flexible and collaborative approach is really central to what we do, and central to making XR projects more workable and welcoming."

The team at XR Studios is mindful of the potential to have a wider influence around mixed reality, and feels a sense of responsibility to the industry as a whole.

"With our permanent studio, we're trying to not only deliver quality and reliability, but also help set a new standard in the industry when it comes to working with this technology," states XR Studios President J.T. Rooney. "We want to lead by example, and create a space for everyone—from clients and record labels, to companies and creative agencies—to not feel rushed with their projects or feel like they are left with poor-quality content. We understand the importance of investing our time, resources, and care in executing first-rate experiences; that is what's going to make a huge impact on the industry, and help the space overall."

The work can be difficult and challenging, he admits, particularly because it's new and complex. Rooney encourages companies looking to get involved with real-time projects to carefully consider what their vision or project requires, and to speak to experienced teams specializing in XR to get a realistic idea of these challenges.

As for XR Studios, the permanent, highly adaptable location provides the company with a built-in opportunity to improve and refine, taking learnings from one shoot and implementing them before the next. "That's a big part of our approach," he explains. "When we're not doing shoots, we're very focused on making our offering better, and working to further it."

Courtesy of XR Studios


This focus on R&D in between productions, says Rooney, is part of the company's contribution to the XR space at large: "We have had years and years of buildup and learning. There is more shared knowledge and experience in some areas, such as live concerts with real-time elements, but there is still so much for us to explore and expand on. Working with an advanced setup like real-time in a volume with tracked cameras gives you a whole set of different challenges. And this really excites us."

Even before choosing a facility for your XR project, Rooney advises, one should really consider whether extended reality is the right medium for the job.

"A lot of people, particularly in the world of live tours, ask for real-time content without having a real reason to go that way," Rooney reveals. "And there has to be a reason—there has to be a foundation. Entertainment with live music is a great example of when real-time makes sense, because real-time can happen with the performance. It's happening with something that is reactive, that is unpredictable, and that is changing, so real-time can really achieve something that complements or extends the performance."

Millar points out that real-time technology works well even for content that isn't intended to be photoreal. "The best-looking real-time content on show is the content that's not trying to look real," he says. "It's trying to be interesting and engaging, but it uses stylization and other aesthetics away from pure realism."

To finish, Benevento offers something fascinating and encouraging in equal measure. While any team working on a real-time project would love to have months to work on it, the reality is that XR Studios usually gets four to six weeks, or even less, to prepare. That, however, can be understood as an opportunity as much as a challenge.

"Sometimes, the limitations of the schedule can be a way to harness creative freedom," Benevento says. "If you have only two weeks to turn something around, and don't have the opportunity for preparation or post-production that other approaches can bring, you can see that as an opportunity to get very creative. So we talk with our clients about these fast projects essentially being scientifically-minded projects, exploring what we can achieve or discover with real-time, when it has this amazing ability to create content fast."

Benevento cites the recent example of XR Studios delivering six music performances to TikTok in two days. "The end result was, we gave TikTok a new format for sharing music that really worked for their platform," she says. "So, again, it's about managing client expectations, being open-minded to finding new ways, and really going with real-time when it's right."

It's Showtime!

In this guide, we have aimed to demystify real-time workflows for broadcast and live events. We hope this guide has shown you what's possible with real-time technology, and has encouraged you to explore the possibilities with Unreal Engine.

We encourage you to download Unreal Engine so you can start building your own pipelines and infrastructure.

Epic Games remains committed to the broadcast and live events community, and we look forward to you continuing the journey with us. As you delve into the world of real-time workflows, be sure to connect with our vast and diverse community of users to get tips and connect with others doing similar projects. You can also find more resources at our Broadcast and Live Events hub, or reach out to the Unreal Engine team directly via our contact form.


Glossary

Augmented reality (AR)
A technology that integrates CG elements into a physical environment.

Blueprint
Script created from the Blueprint visual scripting language in Unreal Engine—defines how an asset interacts.

Central processing unit (CPU)
The main computer chip that performs a wide variety of calculations. Compare with GPU.

Cinematic
A pre-rendered, noninteractive sequence in an otherwise interactive experience. Also called "cutscene".

Cluster
A group of PCs on a network designated for a specific task, forming a single system.

Clustered rendering
Rendering by a group of PCs that have been designated as part of the same cluster (group).

Distributed rendering
Multiple instances of an engine processing the same scene in parallel to achieve much higher total resolution.

DMX (Digital Multiplex)
A protocol for digital control of lighting and effects. Originally developed for real-world stages, the standard is now used for all-digital lighting and effects as well.

Extended reality (XR)
An umbrella term for VR, AR, and MR, and all future realities such technology might bring.

Final pixel
Final images at a quality sufficient for the purpose, without the need for additional post-production.

Frustum
The region of a virtual world that appears as a viewport to the camera.

Game engine
A software-development environment designed for the creation of real-time interactive content, initially for video games, but now used in many other applications.

Graphics processing unit (GPU)
A specialized type of microprocessor optimized to display graphics and do very specific computational tasks. Modern real-time engines rely heavily on GPUs for performance.

Head-mounted display (HMD)
A device used to display CG content for VR, AR, or MR.

Latency
The delay between when a signal is sent and when it is received at its destination; experts consider under 10 milliseconds of latency to be critical for real-time camera operation.

LED display
A panel that emits lights and colors as pixels for a video display, using an array of light-emitting diodes (LEDs).

Media server
Hardware and/or software for storing and delivering digital media such as video, audio, or images.

MIDI (Musical Instrument Digital Interface)
A long-established protocol for connecting electronic musical instruments, computers, and other audio devices for playing, editing, and recording music.

Mixed reality (MR)
The process of anchoring virtual objects to the real world and enabling users to interact with them.

Motion capture (mocap)
The process of recording actions of human actors, and using that information to animate digital character models.

nDisplay
A system within Unreal Engine that distributes the rendering of content across a network of computers, generating images to displays with proper frame/time synchronization and viewing frustum based on real-world topology of screens. See the white paper nDisplay Technology: Limitless scaling of real-time content for more information.

OSC (Open Sound Control)
A protocol for networking sound synthesizers, computers, and multimedia devices for shows and performances.

Performance capture
An advanced form of motion capture that includes faces and/or fingers, and captures subtle expressions. For more information, see the white paper Choosing a real-time performance capture system.

Ray tracing
A rendering technique for generating an image by tracing the path of light as pixels in an image plane and simulating the effects of its encounters with virtual objects.

Real-time rendering
The translation of a scene into display pixels fast enough for instantaneous playback at real-time (live) speeds. In contrast, traditional offline rendering may take minutes or even hours to produce each frame, with 24 frames required to display a second's worth of animation.

Refresh rate
The frequency with which an electronic display is refreshed, usually expressed in hertz (Hz). Higher refresh rates can make motion appear smoother.

Timecode
A sequence of numeric codes that record the exact time a signal was generated, such as the exact date and time when video was recorded. Timecode is used for synchronization in video production and live show control to synchronize various actions.

Virtual camera (Vcam)
A camera in a game engine that can be driven using a device such as an iPad.

Virtual production (VP)
The cross section between physical and digital worlds, using real-time technology to view and interact with virtual environments and characters.

Virtual reality (VR)
An immersive experience using reality headsets to generate the realistic sounds, images, and other sensations that replicate a real environment or create an imaginary world.

Volume
A physical, enclosed space that houses any of various systems for virtual production such as LED walls for in-camera visual effects, cameras to record performance capture, etc.

XR
See Extended Reality.

Links

11 The Famous Group https://www.thefamousgroup.com/#1
11 disguise https://www.disguise.one/
11 XR Studios https://www.xrstudios.live/
11 Moment Factory https://momentfactory.com/home
11 Creative Works https://www.creativeworks.london/
11 Myreze https://myreze.com/
12 frame:work https://framework-community.com/
14 frame:work https://framework-community.com/
18 deus ex machina https://en.wikipedia.org/wiki/Deus_ex_machina
21 The Panther Project http://www.panthers.com/news/mixed-reality-panther-a-hit-at-season-opener
21 Carolina Panthers https://www.panthers.com/
21 The Famous Group https://www.thefamousgroup.com/#1
21 Carolina Panthers Tweet https://twitter.com/Panthers/status/1437103615634726916?s=20&t=usoLykNN5YjCP30_rL1vFQ
23 The Panther Project http://www.panthers.com/news/mixed-reality-panther-a-hit-at-season-opener
23 Carolina Panthers https://www.panthers.com/
28 The Famous Group https://www.thefamousgroup.com/#1
28 Myreze https://myreze.com/
29 disguise https://www.disguise.one/
31 Myreze https://myreze.com/
32 Moment Factory https://momentfactory.com/home
33 GitLab https://about.gitlab.com
33 GitHub https://github.com
33 Git https://git-scm.com/
33 Perforce Helix Core https://www.perforce.com/products/helix-core
34 Illuminarium Atlanta https://www.illuminarium.com/atlanta/space/
34 Illuminarium http://www.illuminarium.com
41 Creative Works https://www.creativeworks.london/
41 Illuminarium http://www.illuminarium.com
43 disguise https://www.disguise.one/
43 nDisplay system https://cdn2.unrealengine.com/Unreal+Engine%2FnDisplay-Whitepaper-V1.8B-9d99c6448fbd96bcd5a8d0770c12c22387683778.pdf
44 Moment Factory https://momentfactory.com/home
45 disguise https://www.disguise.one/
45 2021 Tour https://www.gunsnroses.com/news/title/test
45 Guns N' Roses https://www.gunsnroses.com/
45 Creative Works London https://www.creativeworks.london/ue
47 Extended Display Identification Data https://en.wikipedia.org/wiki/Extended_Display_Identification_Data
47 disguise https://www.disguise.one/
53 The International 10 https://www.youtube.com/watch?v=ecHbnqUuqeE
53 Valve Corporation https://www.valvesoftware.com/en/
53 Myreze https://myreze.com/
56 IMR Studio https://www.unrealengine.com/en-US/spotlights/the-weather-channel-s-new-studio-brings-immersive-mixed-reality-to-daily-live-broadcasts
56 The Weather Channel http://www.unrealengine.com/en-US/spotlights/the-weather-channel-s-new-studio-brings-immersive-mixed-reality-to-daily-live-broadcasts
61 Myreze https://myreze.com/
62 nDisplay Technology: Limitless scaling of real-time content https://cdn2.unrealengine.com/Unreal+Engine%2FnDisplay-Whitepaper-V1.8B-9d99c6448fbd96bcd5a8d0770c12c22387683778.pdf
63 disguise https://www.disguise.one/
64 XR Studios https://www.xrstudios.live/
65 Capacity Studio and Partners https://capacity.gg/
65 RLCS Hype Chamber https://capacity.gg/project/rlcs-hype-chamber/
65 Psyonix https://www.psyonix.com/
66 Sample project https://www.unrealengine.com/en-US/spotlights/enter-the-rocket-league-hype-chamber-a-new-sample-for-broadcast-and-live-events
71 Contact us https://www.unrealengine.com/en-US/license#contact-us-form
72 XR Studios https://www.xrstudios.live/
73 AT&T Discovery District https://discoverydistrict.att.com/
73 Moment Factory https://momentfactory.com/home
76 XR Studios https://www.xrstudios.live/
79 Contact us https://www.unrealengine.com/en-US/license#contact-us-form
79 Broadcast and Live Events hub https://www.unrealengine.com/en-US/solutions/broadcast-live-events
79 Epic Games dev community https://www.unrealengine.com/en-US/community
79 download Unreal Engine https://www.unrealengine.com/download
80 Choosing a real-time performance capture system https://cdn2.unrealengine.com/Unreal+Engine%2Fperformance-capture-whitepaper%2FLPC_Whitepaper_final-7f4163190d9926a15142eafcca15e8da5f4d0701.pdf
80 nDisplay: Limitless scaling of real-time content https://cdn2.unrealengine.com/Unreal+Engine%2FnDisplay-Whitepaper-V1.8B-9d99c6448fbd96bcd5a8d0770c12c22387683778.pdf

Courtesy of Moment Factory

There’s a shift happening in the broadcast and
live events industry, one that engages and
excites audiences in ways never seen before—a
shift to real-time content and experiences.
From rock concerts and sports/esports broadcasts
to weather reporting, interstellar journeys, and
more, real-time changes the way broadcast and live
events are designed, delivered, and consumed. By
immersing viewers in a fresh, new world, real-time
brings new ways to produce interactive or mixed
media that will keep audiences coming back for more.
With this guide, join Epic Games on a journey to
developing real-time immersive experiences, with
Unreal Engine at the core of this new type of pipeline.

© 2022 Epic Games / All Rights Reserved.
