Drone Racing League
Conceptualized, designed, & built a cloud-based live production environment capable of network-quality programming.
Under 1 Second of Latency.
At the beginning of the pandemic, the Drone Racing League challenged Prusik Media to design and build a remote broadcast system capable of producing live, network-quality television while keeping all crew and talent in the safety of their own homes.
Pivoting from what is traditionally a taped event, with racing drones in stadiums around the world, to a fully remote production that allowed pilots to realistically compete apart from one another required answering every question in a new way. The technical hurdles included home network infrastructure, broadcast control surfaces, audio mixing, and low-latency multiviews delivered to crew as far away as a producer in Budapest and a director in the UK.
Prusik's team conceptualized, designed, and built a cloud-based system that supports over 16 on-camera talent and 50 discrete sources, operated by 36 crew across 4 countries with no "traditional" physical broadcast equipment, at a latency from lens to network delivery of under 1 second.
VIDEO
Video is transmitted from talent into the Production Environment (PE) using redundant SRT streams from dual talent-transmission computers with auto-failover ISPs, converted to a UDP stream and then to NewTek's NDI. Once the incoming video is NDI, it is merged into an auto-failover unified NDI source that is routed through the PE for the multiple submixes this graphically heavy show requires.
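The auto-failover merge can be sketched as a simple selection rule: forward the active stream until it goes silent for too long, then switch to a redundant stream that is still alive. This is a minimal illustration, not DRL's or Prusik's actual implementation; the class name, timeout, and message model are assumptions.

```python
import time

FAILOVER_TIMEOUT = 0.5  # seconds of silence before switching streams (illustrative)

class FailoverSource:
    """Merge redundant incoming streams into one unified output source."""

    def __init__(self, names):
        self.last_seen = {name: 0.0 for name in names}
        self.active = names[0]
        self.order = list(names)

    def on_packet(self, name, now):
        """Record a packet arrival from one of the redundant streams."""
        self.last_seen[name] = now

    def select(self, now):
        """Return the stream to forward: stick with the active stream
        unless it has been silent longer than FAILOVER_TIMEOUT, in which
        case fail over to the first stream that is still delivering."""
        if now - self.last_seen[self.active] > FAILOVER_TIMEOUT:
            for name in self.order:
                if now - self.last_seen[name] <= FAILOVER_TIMEOUT:
                    self.active = name
                    break
        return self.active
```

Because both SRT paths carry the same content, the switch is invisible downstream: the unified NDI source keeps flowing regardless of which ISP or transmission computer is currently healthy.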
Based on PSDs from DRL Creative, we built several multibox layouts that include up to 8 routable live video sources with individually switchable graphic overlays to accommodate the rankings the pilots are seeded into during the semifinals and finals. These layouts are generated across several AWS instances to balance the processing power required to move the necessary volume of video.
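The geometry behind a multibox layout is straightforward: given N routable sources, compute a grid of normalized box rectangles, each with its own overlay flag. This sketch is hypothetical; the real layouts were built to DRL Creative's PSD designs, not a uniform grid.

```python
import math

def multibox_layout(n_sources, overlays=None):
    """Return one dict per source: a normalized (x, y, w, h) box plus an
    individually switchable graphic-overlay flag. Grid placement only;
    real show layouts follow designed artwork rather than a square grid."""
    cols = math.ceil(math.sqrt(n_sources))
    rows = math.ceil(n_sources / cols)
    overlays = overlays or [False] * n_sources
    boxes = []
    for i in range(n_sources):
        r, c = divmod(i, cols)
        boxes.append({
            "source": i,
            "x": c / cols, "y": r / rows,   # top-left corner, 0..1
            "w": 1 / cols, "h": 1 / rows,   # box size, 0..1
            "overlay": overlays[i],
        })
    return boxes
```

Splitting layouts like these across several instances works because each submix only needs the handful of NDI sources routed into its boxes, keeping per-machine video bandwidth and GPU load bounded.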
AUDIO
The next largest hurdle was capturing and mixing the audio sources required to hear all the pilots and talent during race call, afterglow, and interviews. The pilots wear JBL gaming headsets, and their audio follows the video path via SRT into the PE. Using the audio channels embedded in NDI allowed us to keep the audio and video flowing seamlessly throughout the PE, with reference video always attached to the audio stream to allow for an easy lip-sync check even on channels not routed to program. Using a combination of vMix on AWS, USB tunneling, and VPNs, we were able to mate a MIDI audio console directly to the PE, allowing motorized fader feedback, mixing automations, audio tracking, and cueing directly from the A1's home desk. For redundancy, the A2 has an identical configuration, and the audio consoles mirror each other: if for any reason the A1 loses internet connection, the A2 can take over right where the A1 left off.
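The console-mirroring idea can be sketched as follows: every fader move is a MIDI-style control message applied to a shared mix state and echoed to the peer console, so the backup's motorized faders always sit where the primary's do. The class and channel names here are illustrative assumptions, not the actual vMix/MIDI integration.

```python
class MirroredConsole:
    """Keep two audio consoles (e.g. the A1's and A2's desks) in lockstep
    by echoing every fader move to the mirrored peer."""

    def __init__(self):
        self.faders = {}   # channel name -> level (0-127, MIDI CC range)
        self.peers = []    # linked consoles that mirror this one

    def link(self, other):
        self.peers.append(other)
        other.peers.append(self)

    def move_fader(self, channel, level):
        """Apply a local fader move and echo it to every mirrored console,
        driving their motorized faders to the same position."""
        self._apply(channel, level)
        for peer in self.peers:
            peer._apply(channel, level)

    def _apply(self, channel, level):
        self.faders[channel] = max(0, min(127, level))  # clamp to CC range
```

Because state is continuously mirrored rather than synced on failover, the handoff is instant: the A2's console already holds the A1's mix the moment it needs to take over.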