Welcome to the world of Infosys Engineering! It is a half-a-billion-plus organization that takes pride in shaping our engineering aspirations and dreams and bringing them to fruition. We provide engineering services and solutions across the lifecycle of our clients' offerings, ranging from product ideation to realization and sustenance, catering to a cross-section of industries - aerospace, automotive, medical devices, retail, telecommunications, hi-tech, financial services, and energy and utilities, to name a few major ones.


January 30, 2019

AR-VR at the intersection of SDN-NFV, 5G and Mobile Edge Computing - a practitioner's perspective

Author: Balaji Thangavelu, Principal Consultant, Engineering Services


In this blog post, I share my thoughts on how 5G, Software-Defined Networking and Network Functions Virtualization (SDN-NFV), and Mobile Edge Computing (MEC) are integral to Augmented Reality and Virtual Reality (AR-VR) use cases, by taking a closer look at one of the more interesting AR-VR use cases. I also highlight key factors that make it imperative for ecosystem players to engineer these technology areas with due consideration of some critical parameters.

A sneak peek into some interesting AR-VR use-cases

Let us begin by understanding the market size for AR-VR. As per a 2019 report from Statista, the augmented and virtual reality market is expected to reach a size of 209.2 billion U.S. dollars in 2022.


Enterprises are coming up with innovative AR-VR use cases which can improve user experience to drive business growth. With 5G deployments going mainstream this year, we are on course to witness even more AR-VR use cases come to life. I have listed some of them here which I believe are poised to become game changers.

 

·         AR in public safety to help both first responders and individuals needing assistance

·         AR based field repair and maintenance of complex machinery

·         360-degree immersive VR video experience for live sports streaming

·         VR experience for tourists planning their vacation and city tours

·         3D fitting rooms in the retail industry, which allow consumers to virtually try on any number of garments

 

What makes an AR-VR use case complex and computationally intensive?

If you are an avid reader or follower of AR-VR use cases and solutions, I am sure you have come across statements like 'AR-VR video processing is complex and computationally intensive'. To understand this better, I thought of taking a closer look at one of the VR use cases by examining the critical functions involved. To illustrate, I have picked the '360-degree VR video with immersive experience for live streaming' use case.

 

Before we unpack the internals of the use case, let us first understand a bit about 360-degree video, VR video and 6-DoF which are of relevance here.

 

360-degree video vs VR video

VR video at its full implementation allows the viewer to move around and interact with virtual objects generated in the simulated environment. 360-degree video, on the other hand, allows viewers to look around but not traverse depth, as the position is controlled and determined by the person who recorded the video.

 

Six Degree of Freedom (6-DoF)

6-DoF allows the VR user to move within the video in six independent ways: translation along three axes and rotation about them. The figure below depicts the head movements in these six directions. This technique allows one to traverse depth in the video, thereby improving the immersive experience.



[Figure: head movements in the six degrees of freedom (6DoF.png)]

Now that we understand what turns a simple 360-degree video into more immersive VR content, let us look at the critical functions in the workflow.


[Figure: 360-degree VR video processing pipeline (360DegVideoPipeline.png)]

·      Capture / Record is the process of capturing the event using a 360-degree camera (e.g., GoPro Omni). This produces multiple raw video streams from the different cameras mounted on the rig.

·       Video stitching is the process where the multiple raw video streams are stitched together into a 360-degree VR video in real time. Video stitching plays a crucial role in the pipeline, and new techniques like depth-based stitching and region-of-interest stitching improve the quality of the video. Video calibration is a sub-function in which the stitching software analyzes the video streams and identifies how each stream relates to the others before stitching begins.

·   Video transcoding is the process of converting video from one format (codec) to another. Encoding can be software-based or hardware (GPU) accelerated, depending on the codecs used. The most widely supported codecs are H.264, H.265, VP8 and VP9.

·   Video container format support - A container is like a box that holds the video, audio, metadata and other information embedded along with the content. Based on the end-user devices or soft clients, several container formats need to be supported. For video streaming, formats like HTTP Live Streaming (HLS), MPEG-DASH (adaptive streaming) and Adobe HTTP Dynamic Streaming (HDS) need to be made available to stream the content.
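The four stages above can be sketched as a minimal pipeline. This is an illustrative sketch only: the stage names, payload handling and defaults are my own assumptions for exposition, not any vendor's actual API.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    camera_id: int
    data: bytes

def capture(num_cameras: int) -> List[Frame]:
    # Each camera on the rig produces its own raw stream.
    return [Frame(camera_id=i, data=b"raw") for i in range(num_cameras)]

def stitch(frames: List[Frame]) -> bytes:
    # Calibration + stitching merge the per-camera streams into a
    # single 360-degree frame (here, trivially concatenated).
    return b"".join(f.data for f in frames)

def transcode(stitched: bytes, codec: str = "h265") -> bytes:
    # In practice this is ffmpeg or a GPU encoder; here we just tag the payload.
    return codec.encode() + b":" + stitched

def package(encoded: bytes, container: str = "HLS") -> dict:
    # Wrap the encoded stream for delivery in a streaming container format.
    return {"container": container, "payload": encoded}

segment = package(transcode(stitch(capture(num_cameras=6))))
```

In a real deployment each stage is a separate service with its own hardware profile, which is exactly why parts of this chain are candidates for edge offload, as discussed below in general terms.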

So, what is required to deliver a high-quality user experience in AR-VR use cases?

In order to achieve the right user experience for an AR-VR use case, your solution needs to cater to three critical requirements:

·    High network bandwidth: While solution providers, industry forums and other players in the AR-VR ecosystem are continuously inventing next-generation encoding techniques to compress video content, the demand for high network bandwidth is here to stay. For extreme VR use cases, this will be in the Gbps range.

·   End-to-end low latency: The latency requirement varies by AR-VR use case. For extreme VR use cases like 360-degree live streaming, motion-to-photon (MTP) latency determines the underlying latency budget for the solution, spanning sensor, display, computing and communication. Studies suggest an MTP latency of 10 ms to 15 ms is ideal for such use cases.

·   Computation at the edge: Given the complex functions in AR-VR and their computing requirements, it is not always cost-effective to carry large amounts of content to a centralized datacenter. This implies that some of the real-time functions in the pipeline (like those in the 360-degree VR streaming pipeline above) need to be offloaded to an edge closer to the source or the customer.
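The MTP target is an end-to-end budget across sensor, display, compute and transport, which is what makes edge offload necessary. A back-of-the-envelope check makes this concrete; all component figures below are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope motion-to-photon (MTP) budget check.
# All per-component latencies are illustrative assumptions.
MTP_BUDGET_MS = 15.0

def within_budget(components_ms: dict) -> bool:
    return sum(components_ms.values()) <= MTP_BUDGET_MS

central_cloud = {"sensor": 2.0, "display": 5.0, "compute": 4.0,
                 "network_rtt": 30.0}   # round trip to a distant datacenter
mec_edge = {"sensor": 2.0, "display": 5.0, "compute": 4.0,
            "network_rtt": 3.0}        # round trip to a nearby MEC host

print(within_budget(central_cloud))  # False - transport alone blows the budget
print(within_budget(mec_edge))       # True  - edge offload fits within 15 ms
```

Even with generous assumptions for the fixed sensor, display and compute costs, a long-haul round trip consumes the entire budget by itself, so the only remaining lever is moving the computation closer to the user.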

Technology pillars that make up an AR-VR solution

Now that we have unpacked some critical aspects of an AR-VR solution, let us see the various technology pillars that make up the solution. While there are many emerging technologies required to orchestrate the solution, the technologies that I believe would bear the weight are 5G, SDN/NFV and MEC.


[Figure: AR-VR solution with SDN/NFV, MEC and 5G (AR-VR-with-SDNNFVMEC.png)]


Let us take a look at how these technology areas fulfill the key requirements of an AR-VR use case.

MEC - MEC stands for Mobile Edge Computing (now evolving into Multi-access Edge Computing). MEC provides edge solutions to host, control and manage mobile-edge applications, referred to as ME apps. In the context of AR-VR, functions such as video stitching, video calibration and video transcoding can be hosted as ME apps and services. MEC also provides management, control, virtualization and integration with an orchestrator (refer to the ETSI MEC standards to learn more).

SDN - When multiple ME apps and sessions deployed on ME hosts compete for shared bandwidth, bandwidth management becomes an important concern. MEC provides a bandwidth management function and APIs that apps can consume to optimize the available bandwidth. Integrating this bandwidth management function with an SDN solution adds flexibility through dynamic scaling, traffic steering and traffic engineering capabilities. Beyond this, SDN switch-based deployments at MEC locations deliver datacenter-economics benefits.
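As a concrete sketch, an ME app might request a fixed allocation through the platform's bandwidth management API. The field names, enum label and endpoint path below are assumptions modeled loosely on the ETSI MEC 015 Bandwidth Management API; check the current spec before relying on any of them.

```python
import json

# Build a bandwidth-allocation request in the spirit of the ETSI MEC 015
# Bandwidth Management API. Field names, the enum label and the endpoint
# path are assumptions for illustration - verify against the actual spec.
def build_bw_allocation(app_instance_id: str, bits_per_second: int) -> str:
    body = {
        "appInsId": app_instance_id,           # ME app instance requesting bandwidth
        "requestType": "APPLICATION_SPECIFIC_BW_ALLOCATION",
        "fixedAllocation": str(bits_per_second),
        "allocationDirection": "DOWNLINK",     # assumed label
    }
    return json.dumps(body)

# e.g., the stitching ME app asking for 200 Mbps of downlink capacity;
# the app instance name is hypothetical.
payload = build_bw_allocation("stitcher-01", 200_000_000)
# This payload would then be POSTed to the MEC platform's bandwidth
# management endpoint (path assumed, e.g. .../bwm/v1/bw_allocations).
```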

NFV - Both MEC and NFV are based on virtualization technology, and they complement each other. In terms of infrastructure, they can be deployed co-located or separately. Some integral components of AR-VR and MEC platforms, such as DNS, can be offered as VNFs from the NFV infrastructure. In addition, 5G RAN network functions like the RAN-DU will be deployed as VNFs in the same location as MEC.

5G - In my view, 5G extends the boundary of benefits from MEC and SDN/NFV. In the context of AR-VR, the extreme bandwidth and low-latency requirements are fulfilled by 5G. In particular, the network slicing function provided by 5G will be important for addressing the multi-tenant requirements of AR-VR use cases.
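To illustrate how slicing could map to AR-VR tenants, the sketch below uses 3GPP S-NSSAI identifiers (SST 1 = eMBB, SST 2 = URLLC). The tenant names, SD values and requirement fields are my own assumptions, not drawn from any real deployment.

```python
# Per-tenant slice descriptors keyed by use case. S-NSSAI = SST + SD,
# where SST 1 (eMBB) targets throughput and SST 2 (URLLC) targets latency.
# Tenant names, SD values and requirement fields are illustrative assumptions.
slices = {
    "ar-vr-streaming": {
        "sst": 1, "sd": "0000A1",        # eMBB: bandwidth-heavy streaming
        "min_downlink_mbps": 1000,
    },
    "ar-remote-assist": {
        "sst": 2, "sd": "0000B2",        # URLLC: latency-sensitive assistance
        "max_latency_ms": 10,
    },
}

def pick_slice(use_case: str) -> dict:
    # A tenant's traffic is steered onto the slice matching its use case.
    return slices[use_case]
```

The point of the sketch is that two AR-VR tenants on the same physical network can receive very different service guarantees, which is precisely the multi-tenancy requirement slicing addresses.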

Conclusion

While SDN/NFV and MEC started their deployment journey much earlier, now with 5G deployments happening across the globe, we will start witnessing the adoption of even more AR-VR use cases. These end-to-end AR-VR solutions would involve advanced engineering along the following areas:

·         5G-RAN, 5G-Core

·         Standards based NFV deployment

·         Standards based MEC deployment

·         Integration of SDN/NFV infrastructure with MEC

·         Onboarding AR-VR solution components into MEC and NFV

·         Cloud Engineering for Mobile and Telcos

·         Engineering Media and Content Delivery Network (CDN)

·         Deployment Enablers and Automation

Infosys Engineering Services is actively building this ecosystem and investing in solutions and innovation labs for areas like SDN/NFV and 5G. Our recent innovations include the demonstration of 360-degree live and immersive media streaming of the event at the VR experience booth at the Australian Open 2019, as part of our digital innovation partnership with the AO.

You can reach out to Infosys Engineering Services to learn more.


January 9, 2019

5G - Small Cells 'Steal the Thunder' in the 5G Era

Author: Balaji Thangavelu, Principal Consultant, Engineering Services


In this blog post, I share my thoughts on the significance of small cells in 5G, and on how we need to gear up our innovation capabilities to address some unique challenges associated with small cell infrastructure deployment and RF planning.



What's in store for 5G in 2019....

 

If 2018 was the year of hype, marketing and planning for 5G, 2019 looks more attractive and promising in terms of actual deployment of the technology. And I believe the ecosystem is looking very strong and encouraging for this.

 

Mobile operators like AT&T, Verizon and Telstra have a lot planned for early 5G deployments. 5G Americas predicts that there will be 336,000 5G connections in North America by the end of 2019. That will be about 47% of global 5G connections at that point.

 

A pertinent question to ask here is whether all these planned deployments cater to the 5G use cases defined by the IMT-2020 vision. The answer is no, because mobile operators are not planning full-scale deployment on the millimeter-wave frequency spectrum, though speeds will certainly go up with 5G riding on sub-6 GHz spectrum.

 

How far is the deployment of mmWave, which carries interesting IoT and high-bandwidth use cases, from reality...

 

Let us first understand what millimeter waves and small cells are.

 

Millimeter wave:

·         High-frequency (>24 GHz spectrum) millimeter waves will greatly increase wireless capacity and speeds for 5G networks. They are called mmWave for short.

·         They carry game-changing 5G use cases, like 4K video, AR-VR, connected vehicles and massive IoT.

·         The propagation characteristics of millimeter-wave bands are very different from those below 4 GHz. Typically, only short distances can be covered, and the signals do not pass through walls and other objects in buildings.
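The short range follows directly from free-space path loss, which grows with the square of frequency. A quick comparison makes the gap concrete (free-space only; real mmWave links fare worse once wall and foliage blockage are included, and the 200 m range figure is an assumption):

```python
import math

# Free-space path loss in dB for distance d (km) and frequency f (GHz):
#   FSPL = 20*log10(d) + 20*log10(f) + 92.45
def fspl_db(distance_km: float, freq_ghz: float) -> float:
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

d = 0.2                          # 200 m, an assumed small-cell range
loss_sub6 = fspl_db(d, 3.5)      # mid-band 5G
loss_mmwave = fspl_db(d, 28.0)   # mmWave 5G

extra_loss = loss_mmwave - loss_sub6
# 20*log10(28/3.5) = 20*log10(8), i.e. roughly 18 dB of extra loss at the
# same distance - and that is before blockage, which hits mmWave hardest.
```

An 18 dB deficit at identical range means mmWave cells must either sit much closer to the user or lean heavily on beamforming, which is why mmWave and dense small-cell deployment go hand in hand.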

 

Small cells:

·         Small cells are low-power, short-range wireless transmission systems (base stations) that cover a small geographical area for indoor or outdoor applications.

·         Small cells can be attached to street furniture, including lampposts and the sides of buildings.

·         Thousands of small cells can be needed to cover a metro area of a few square kilometers.
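That 'thousands of cells' claim is easy to sanity-check with simple geometry. The coverage radius below is an assumed figure, and the estimate ignores cell overlap, so real counts run higher still:

```python
import math

# Rough count of small cells needed to blanket a metro area, assuming
# each cell covers a circle of radius r. Figures are illustrative, and
# overlap between neighboring cells is ignored (real plans need more).
def cells_needed(area_km2: float, cell_radius_km: float) -> int:
    per_cell_km2 = math.pi * cell_radius_km ** 2
    return math.ceil(area_km2 / per_cell_km2)

# A 100 km^2 metro footprint with an assumed 150 m mmWave cell radius:
n = cells_needed(100.0, 0.15)   # well over a thousand sites
```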

 

 

While there are several technical and business reasons why operators may want to go slow on mmWave deployments, I believe the primary ones are the following:

 

·         The obvious one is the release of spectrum through auction processes, which are determined by local conditions and the processes followed by the governing bodies in the respective countries.

·         Then there are those associated with small cell infrastructure buildout: local permitting processes, lengthy engagement and procurement exercises with local authorities, high fee structures on street furniture assets, and human exposure to RF EMF.

·         Mobile operators want to build out small cell infrastructure at strategic locations first and then scale out with the learnings.

 
Small cells may be tiny but they are massive in numbers...

 

Clearly, mobile operators have a massive effort in front of them to build the small cell infrastructure. Add to this what is called fiberization: running fiber to the small cells to enable the fronthaul network infrastructure.

 

We can imagine this situation as close to a greenfield deployment scenario. Traditional network planning and construction tools and processes may not be of much help given the scale, speed and agility the deployment requires. I believe a fundamental shift will be required in the way we plan, construct and manage the radio access network infrastructure for 5G if we want to accelerate the deployments.

 

Innovation plays a key role in getting there faster....

 

5G calls for innovative methods, tools and processes that would help accelerate the small cell deployment and fiberization effort. I believe innovations like the ones listed below would create a bigger impact in accelerating small cell deployments and service assurance. Some of these are just a reload from the past, but others need new thinking to meet the ground realities.

·         Zero-touch provisioning and configuration for small cells with a PID scan using a mobile app

·         5G Use-case driven modeling and configurations

·         3D/GIS-driven network modeling and simulation for RF planning aspects like beamforming and line-of-sight analysis

·         AR/VR enabled RF planning on 3D/GIS enabled drawings.

·         AR/VR enabled installations and support for remote technicians.

 

This plethora of innovative tools and enablers would require partnering between mobile operators, OEMs, RF planning/design vendors and system integrators. Infosys is actively building this partner ecosystem and developing solutions in our 5G microsite labs spread across global locations.

