Use Cases

Use Case 1: Live audio production


This use case covers four main areas of a wireless production scenario:

- Capturing of live audio data

Producing and capturing a live event for further exploitation involves many wireless audio streams.

- Temporary spectrum access

During the production time inside the studio, each wireless application requires a different spectrum occupancy setup and, accordingly, different spectrum access.

- Automatic setup of wireless equipment

After receiving a grant for the usable spectrum, the setup and configuration of all wireless PMSE equipment is done automatically; a minimal sketch of this request-and-configure flow is given after this list.

- Local high-quality network

A typical studio setup is limited in coverage and in the number of wireless devices (UEs).
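To make the interplay between temporary spectrum access and automatic equipment setup concrete, the following Python sketch models a request to a hypothetical spectrum access service and the subsequent automatic configuration of the granted carrier on each wireless PMSE device. All names, interfaces and values (SpectrumGrant, request_spectrum, the 1785 MHz carrier) are illustrative assumptions, not part of any standardized API.

```python
from dataclasses import dataclass


@dataclass
class SpectrumGrant:
    """Result of a (hypothetical) temporary spectrum access request."""
    centre_frequency_mhz: float
    bandwidth_mhz: float
    valid_until: str  # ISO 8601 timestamp of grant expiry


def request_spectrum(application: str, bandwidth_mhz: float) -> SpectrumGrant:
    """Placeholder for the temporary spectrum access step.

    A real deployment would contact a spectrum access / licensing system;
    here a fixed grant is returned purely for illustration.
    """
    return SpectrumGrant(centre_frequency_mhz=1785.0,
                         bandwidth_mhz=bandwidth_mhz,
                         valid_until="2030-01-01T18:00:00Z")


def configure_pmse_devices(devices: list, grant: SpectrumGrant) -> None:
    """Push the granted carrier settings to every wireless PMSE device."""
    for device in devices:
        # A real system would use the device's management interface here.
        print(f"{device}: tuned to {grant.centre_frequency_mhz} MHz, "
              f"{grant.bandwidth_mhz} MHz bandwidth, valid until {grant.valid_until}")


if __name__ == "__main__":
    grant = request_spectrum(application="live audio capture", bandwidth_mhz=20.0)
    configure_pmse_devices(["mic-01", "mic-02", "iem-01"], grant)
```

In this sketch the grant is obtained once per production slot and then applied to all devices, matching the request-then-auto-configure sequence described above.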

Figure 1. Live audio production use case: simple demonstrator setup.

Use Case 2: Multiple camera wireless studio

Figure 2 shows a general overview of UC2. Leveraging the transition of many parts of content production environments towards IT and IP-based infrastructures, the project will take an additional step forward by using 5G technology to provide a fully integrated IP scenario with one or several wireless components. The best of IP in the studio will thus be combined with fast, low-latency and highly reliable 5G wireless connections. This will create opportunities for media companies to face the continuous challenge of producing more content with fewer resources, along with the automation of some of their processes, opening new ways to increase efficiency and effectiveness in production.

The wireless IP component, based on 5G, is key to improving technical and operational efficiency, increasing flexibility and reducing production costs.

Under this scenario, 5G non-public networks (NPNs) play a key role in enabling a self-operated environment that does not depend on the network conditions of any underlying MNO. The scenario also envisages making 5G-enabled equipment transparently usable on both NPN and public networks, and even able to move between them seamlessly during a production while continuing to transmit, thus maximizing interoperability between different systems and components through a common IP-based infrastructure.
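As a minimal sketch of the intended NPN/public-network behaviour, the following Python snippet prefers the self-operated NPN whenever it is available and falls back to the public network so that a device can keep transmitting during a production. The LinkStatus structure, the RSRP field and the selection rule are assumptions made for illustration; in a real system the decision would involve the 5G modem, SIM/credential handling and operator policies.

```python
import time
from dataclasses import dataclass


@dataclass
class LinkStatus:
    """Hypothetical link-quality reading for one candidate network."""
    network: str      # "NPN" or "public"
    available: bool
    rsrp_dbm: float   # reference signal received power


def select_network(npn: LinkStatus, public: LinkStatus) -> str:
    """Prefer the self-operated NPN; fall back to the public network
    so that the device keeps transmitting during a production."""
    if npn.available:
        return npn.network
    if public.available:
        return public.network
    raise RuntimeError("no usable 5G network")


if __name__ == "__main__":
    # Simulated sequence: the NPN drops mid-production and the stream
    # continues over the public network without interruption.
    samples = [
        (LinkStatus("NPN", True, -85.0), LinkStatus("public", True, -95.0)),
        (LinkStatus("NPN", False, -130.0), LinkStatus("public", True, -96.0)),
    ]
    for npn, public in samples:
        print("transmitting via", select_network(npn, public))
        time.sleep(0.1)
```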

Figure 2. 5G wireless studio: use case overview.

This use case also contemplates an outdoor production scenario with the additional deployment of two or more 5G-enabled cameras and sound-capture devices still connected to the NPN, which acts as an extension of the indoor TV studio. Here, cameras will be controlled from the broadcast centre located in the studio. Multiple TV cameras, microphones, intercom systems and monitoring devices (provided by Sennheiser) will be connected over radio links to the 5G gNB or via device-to-device direct communication.


Use Case 3: Live immersive media


This use case considers a real-time, end-to-end free viewpoint video (FVV) system that includes capturing, 5G contribution, virtual view synthesis on an edge server, 5G delivery and visualization on users’ terminals. The system will generate, in real time, a synthesized video stream from a free-moving virtual position.

The video workflow will be based on three main stages:

- Capturing

A set of high-resolution cameras will partially surround the object of interest, and synchronization will be provided by the 5G network.

- Encoding and transmission

Real-time encoding of the camera streams (color and depth) will require specific processing and lossless delivery of the depth information, as it represents the scene structure.

- Synthesis and visualization

A live FVV stream will be generated with minimum latency by taking into account only the reference cameras that are closest to the virtual viewpoint at any given moment (see the selection sketch after this list).
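As a rough illustration of the selection rule above, the sketch below picks, for a given virtual viewpoint, the k reference cameras with the smallest Euclidean distance to that viewpoint; only their streams would then feed the view synthesis. Camera identifiers, positions and the value of k are illustrative assumptions.

```python
import math


def closest_cameras(camera_positions: dict, virtual_viewpoint: tuple, k: int = 2) -> list:
    """Return the ids of the k reference cameras nearest to the virtual viewpoint."""
    return sorted(camera_positions,
                  key=lambda cam: math.dist(camera_positions[cam], virtual_viewpoint))[:k]


if __name__ == "__main__":
    cameras = {
        "cam-1": (0.0, 0.0, 0.0),
        "cam-2": (2.0, 0.0, 0.0),
        "cam-3": (4.0, 0.0, 0.0),
    }
    # A virtual viewpoint between cam-2 and cam-3 (slightly closer to cam-3)
    # selects those two cameras as references for synthesis.
    print(closest_cameras(cameras, (3.2, 0.5, 0.0), k=2))  # ['cam-3', 'cam-2']
```

Recomputing this selection as the virtual viewpoint moves keeps the number of active reference streams, and hence the synthesis latency, low.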

In turn, the use case deployment considers three main subsystems, summarized in the sketch after this list:

- Media acquisition

A set of media acquisition sub-units, each connected to the 5G network and equipped with a local video processor to manage the output streams of multiple high-resolution cameras.

- Content production

An FVV renderer and encoder for live or on-demand requests, where the operation will be managed through several Virtual Production Consoles (local or remote). 

- Media delivery

Delivery to third parties (content producers, broadcasters) in contribution quality and delivery to event attendees in streaming quality (transcoding and caching and/or multicast).
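The sketch below summarizes the assumed deployment as plain data structures: acquisition sub-units feeding the FVV renderer over 5G, and delivery targets served at contribution or streaming quality. The three groups mirror the subsystems listed above, while all concrete names are illustrative assumptions.

```python
from dataclasses import dataclass, field


@dataclass
class AcquisitionSubUnit:
    """Media acquisition: local video processor handling several camera streams."""
    name: str
    cameras: list = field(default_factory=list)


@dataclass
class ProductionSite:
    """Content production: FVV renderer/encoder managed from production consoles."""
    renderer: str
    consoles: list = field(default_factory=list)


@dataclass
class DeliveryTarget:
    """Media delivery: contribution quality for third parties, streaming quality for attendees."""
    audience: str
    quality: str


if __name__ == "__main__":
    acquisition = [AcquisitionSubUnit("unit-A", ["cam-1", "cam-2"]),
                   AcquisitionSubUnit("unit-B", ["cam-3", "cam-4"])]
    production = ProductionSite("fvv-renderer", ["console-local", "console-remote"])
    delivery = [DeliveryTarget("broadcaster", "contribution"),
                DeliveryTarget("attendees", "streaming (transcoded/cached or multicast)")]

    for unit in acquisition:
        print(f"{unit.name} ({', '.join(unit.cameras)}) -> {production.renderer} over 5G")
    for target in delivery:
        print(f"{production.renderer} -> {target.audience} [{target.quality}]")
```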

Figure 3. Live immersive media services: use case overview.