

AR-VR at the intersection of SDN-NFV, 5G and Mobile Edge Computing - a practitioner's perspective

Author: Balaji Thangavelu, Principal Consultant, Engineering Services

In this blog post, I share my thoughts on how 5G, Software-Defined Networking and Network Functions Virtualization (SDN-NFV), and Mobile Edge Computing (MEC) are integral to AR-VR (Augmented Reality and Virtual Reality) use cases, by taking a closer look at one of the more interesting AR-VR use cases. I also highlight key factors that make it imperative for ecosystem players to engineer these technology areas with due consideration of some critical parameters.

A sneak peek into some interesting AR-VR use-cases

Let us begin by understanding the market size for AR-VR. As per a 2019 report from Statista, the augmented and virtual reality market is expected to reach a size of 209.2 billion U.S. dollars by 2022.

Enterprises are coming up with innovative AR-VR use cases which can improve user experience to drive business growth. With 5G deployments going mainstream this year, we are on course to witness even more AR-VR use cases come to life. I have listed some of them here which I believe are poised to become game changers.


·         AR in public safety to help both first responders and individuals needing assistance

·         AR based field repair and maintenance of complex machinery

·         360-degree immersive VR video experience for live sports streaming

·         VR experience for tourists planning their vacation and city tours

·         3D fitting rooms in the retail industry, which allow consumers to try on any number of clothing items virtually.


What makes an AR-VR use case complex and computationally intensive?

If you are an avid reader or follower of AR-VR use cases and solutions, I am sure you have come across statements like 'AR-VR video processing is complex and computationally intensive'. To understand this better, I thought of taking a closer look at one of the VR use cases by examining the critical functions involved. To illustrate, I have picked the '360-degree VR video with immersive experience for live streaming' use case.


Before we unpack the internals of the use case, let us first understand a bit about 360-degree video, VR video and 6-DoF which are of relevance here.


360-degree video vs VR video

VR video, at its full implementation, allows the viewer to move around and interact with virtual objects generated in the simulated environment. A 360-degree video, on the other hand, allows viewers to look around but not to traverse depth, as the position is controlled and determined by the person who recorded the video.


Six Degree of Freedom (6-DoF)

6-DoF allows the VR user to move in six different directions within the video: three translational (forward/backward, left/right, up/down) and three rotational (yaw, pitch, roll). The figure below depicts the head movements in these six directions. This technique lets the viewer traverse depth in the video, thereby improving the immersive experience.
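The six degrees of freedom can be sketched as a simple pose structure. The sketch below is illustrative (the class and function names are my own, not from any VR SDK); it shows the one thing 6-DoF adds over a plain 360-degree video: translating the viewer through the scene.

```python
from dataclasses import dataclass
import math

@dataclass
class Pose6DoF:
    # Three translational degrees of freedom (metres)
    x: float = 0.0  # left / right
    y: float = 0.0  # up / down
    z: float = 0.0  # forward / backward
    # Three rotational degrees of freedom (radians)
    yaw: float = 0.0    # turning the head left / right
    pitch: float = 0.0  # nodding up / down
    roll: float = 0.0   # tilting side to side

def move_forward(pose: Pose6DoF, step: float) -> Pose6DoF:
    """Translate the viewer along the current yaw heading -
    the depth traversal a plain 360-degree video cannot offer."""
    return Pose6DoF(
        x=pose.x + step * math.sin(pose.yaw),
        y=pose.y,
        z=pose.z + step * math.cos(pose.yaw),
        yaw=pose.yaw, pitch=pose.pitch, roll=pose.roll,
    )

pose = move_forward(Pose6DoF(), 1.0)  # facing straight ahead (yaw = 0)
print(round(pose.z, 3))               # viewer has moved 1 m forward
```

A 3-DoF (rotation-only) headset would expose only the yaw, pitch and roll fields; 6-DoF adds the translation.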


Now that we understand what makes a simple 360-degree video into a more immersive VR video content, let us start to look at critical functions in the workflow.


·      Capture / Record is the process of capturing the event using a 360-degree camera rig (e.g., GoPro Omni). This produces multiple raw video streams from the different cameras mounted on the rig.

·       Video stitching is the process where the multiple raw video streams are stitched together in real time to form a 360-degree VR video. Stitching plays a crucial role in the pipeline, and newer techniques like depth-based stitching and region-of-interest stitching improve video quality. Video calibration is a sub-function where the stitching software analyzes the video streams and identifies how each stream relates to the others before stitching begins.

·   Video transcoding is the process of converting video from one format (codec) to another. Encoding can be software-based or hardware-based (GPU-accelerated) depending on the codecs used. The most widely supported codecs are H.264, H.265, VP8 and VP9.

·   Video container format support - A container is like a box that holds the video, audio, metadata and other information embedded along with the content. Depending on the end-user devices or soft clients, several container formats need to be supported and made available. For video streaming, formats like HTTP Live Streaming (HLS), MPEG-DASH (adaptive streaming) and Adobe HTTP Dynamic Streaming (HDS) need to be made available to stream the content.
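The stages above form a simple chain: capture, calibrate, stitch, transcode, package. The sketch below strings them together with placeholder implementations (the function names and string outputs are illustrative only; a production system would drive a stitching engine and an encoder such as FFmpeg at each step):

```python
# Illustrative sketch of the 360-degree VR live-streaming pipeline.
# Each stage is a stub that tags the data flowing through it.

def capture(rig_cameras):
    """Capture: one raw stream per camera mounted on the rig."""
    return [f"raw-{cam}" for cam in rig_cameras]

def calibrate(streams):
    """Calibration: work out how each stream relates to the others."""
    return {s: [t for t in streams if t != s] for s in streams}

def stitch(streams, calibration):
    """Stitching: merge the calibrated raw streams into one 360 video."""
    return "stitched-360:" + "+".join(streams)

def transcode(video, codec="H.264"):
    """Transcoding: convert the stitched video to a delivery codec."""
    return f"{video}|{codec}"

def package(video, container="HLS"):
    """Packaging: wrap the encoded video in a streaming container."""
    return f"{video}|{container}"

streams = capture(["cam1", "cam2", "cam3"])
out = package(transcode(stitch(streams, calibrate(streams))))
print(out)  # stitched-360:raw-cam1+raw-cam2+raw-cam3|H.264|HLS
```

The real-time stages (calibration, stitching, and often transcoding) are the computationally intensive ones, which is exactly what makes edge offload attractive later in this post.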

So, what is required to deliver high-degree user experience in AR-VR use-cases?

In order to achieve the right user experience for an AR-VR use case, your solution needs to cater to three critical requirements:

·    High network bandwidth: While solution providers, industry forums and other players in the AR-VR ecosystem are continuously inventing next-generation encoding techniques to compress video content, the demand for high network bandwidth is here to stay. For extreme VR use cases, this will be in the Gbps range.

·   End-to-end low latency: The latency requirement varies by AR-VR use case. For extreme VR use cases like 360-degree live streaming, motion-to-photon (MTP) latency determines the underlying latency requirements across the solution, spanning sensor, display, computing and communication. Studies suggest an MTP latency of 10 ms to 15 ms is ideal for such use cases.

·   Computation at edge: Given the complex functions and their computing requirements in AR-VR, it is not always cost-effective to carry large amounts of content to a centralized datacenter. This implies that some of the real-time functions in the pipeline (like those we saw in the 360-degree VR streaming pipeline) need to be offloaded to the edge, closer to the source or the customer.
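A rough budget check shows how the second and third requirements interlock. All component figures below are illustrative assumptions, not measurements; only the 10-15 ms MTP target comes from the discussion above.

```python
# Rough motion-to-photon (MTP) budget check.
# Component figures are illustrative assumptions.
MTP_BUDGET_MS = 15.0  # upper end of the 10-15 ms target

components = {
    "sensor sampling":   2.0,
    "display refresh":   4.0,  # roughly a quarter of a 60 Hz frame
    "rendering/compute": 4.0,
}
fixed_ms = sum(components.values())

# Whatever remains is the allowance for the network round trip.
network_allowance_ms = MTP_BUDGET_MS - fixed_ms
print(network_allowance_ms)  # 5.0

# A distant centralized datacenter (say ~40 ms RTT) cannot fit in
# that allowance, while a nearby edge site (~2 ms RTT) can:
print(40.0 <= network_allowance_ms, 2.0 <= network_allowance_ms)
```

Under these assumptions the network leg gets only a few milliseconds, which is precisely why the real-time pipeline stages must run at the edge rather than in a faraway datacenter.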

Technology pillars that make up an AR-VR solution

Now that we have unpacked some critical aspects of an AR-VR solution, let us see the various technology pillars that make up the solution. While there are many emerging technologies required to orchestrate the solution, the technologies that I believe would bear the weight are 5G, SDN/NFV and MEC.


Let us take a look at how these technology areas fulfill the key requirements of an AR-VR use case.

MEC - MEC stands for Mobile Edge Computing (now evolving into Multi-access Edge Computing). MEC provides edge solutions to host, control and manage mobile-edge applications, referred to as ME apps. In the context of AR-VR, functions such as video stitching, video calibration and video transcoding can be hosted as ME apps and services. MEC also provides other aspects including management, control, virtualization and integration with the orchestrator. (Refer to the ETSI MEC standards to know more.)

SDN - When more ME apps and sessions deployed on ME hosts compete for shared bandwidth, bandwidth management becomes an important aspect. MEC provides a bandwidth management function and APIs which the apps can consume to optimize the available bandwidth. Integrating the bandwidth management function with an SDN solution provides added flexibility through dynamic scaling, steering and traffic-engineering capabilities. Beyond this, SDN switch-based deployments at MEC locations deliver datacenter-economy benefits.
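To make the bandwidth management interaction concrete, here is a sketch of the request body an ME app might submit. The endpoint path and field names are modelled loosely on ETSI's MEC Bandwidth Management API (GS MEC 015) from memory, and the host name is hypothetical; treat the whole contract as an assumption, not verified against the spec.

```python
import json

# Hypothetical MEC host; the /bwm/v1/bw_allocations path follows the
# general shape of the ETSI MEC 015 Bandwidth Management API.
BWM_ENDPOINT = "https://mec-host.example/bwm/v1/bw_allocations"

def bandwidth_request(app_instance_id: str, mbps: float) -> str:
    """Build a fixed-bandwidth allocation request for one ME app
    session. Field names are assumptions modelled on MEC 015."""
    payload = {
        "appInsId": app_instance_id,
        "fixedAllocation": str(int(mbps * 1_000_000)),  # bits/second
        "allocationDirection": "00",  # assumed code for downlink
    }
    return json.dumps(payload)

body = bandwidth_request("vr-stitcher-01", 50.0)
print(body)
```

An SDN controller integrated behind such an API is what turns these per-app requests into actual flow steering and dynamic re-scaling on the switches.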

NFV - Both MEC and NFV are based on virtualization technology and complement each other. In terms of infrastructure, one can choose to deploy them co-located or separately. Some integral components of AR-VR plus MEC platforms, like DNS, can be offered as VNFs from NFV. Apart from this, 5G RAN network functions like the RAN-DU will be deployed as VNFs in the same location as MEC.

5G - In my view, 5G extends the boundary of benefits from MEC and SDN/NFV. In the context of AR-VR, the extreme bandwidth and low-latency requirements are fulfilled by 5G. In particular, the network slicing function provided by 5G will be important for addressing the multi-tenant requirements of AR-VR use cases.


While SDN/NFV and MEC started their deployment journeys much earlier, with 5G deployments now happening across the globe we will start witnessing the adoption of even more AR-VR use cases. These end-to-end AR-VR solutions will involve advanced engineering in the following areas:

·         5G-RAN, 5G-Core

·         Standards based NFV deployment

·         Standards based MEC deployment

·         Integration of SDN/NFV infrastructure with MEC

·         Onboarding AR-VR solution components into MEC and NFV

·         Cloud Engineering for Mobile and Telcos

·         Engineering Media and Content Delivery Network (CDN)

·         Deployment Enablers and Automation

Infosys Engineering Services is actively building this ecosystem and investing in solutions and innovation labs such as SDN/NFV and 5G. Our recent innovations include the demonstration of '360-degree live and immersive media streaming of the event' at the VR experience booth at the Australian Open 2019, as part of our digital innovation partnership with the AO.

You can reach out to Infosys Engineering Services to know more. 
