Dynamic Adaptive Streaming for Augmented Reality Applications

Proceedings of IEEE International Symposium on Multimedia (IEEE ISM)

Publication date: December 9, 2019

Stefano Petrangeli, Gwendal Simon, Haoliang Wang, Vishy Swaminathan

Augmented Reality (AR) superimposes digital content on top of the real world, to enhance it and provide a new generation of media experiences. To provide a realistic AR experience, objects in the scene should be delivered with both high photorealism and low latency. Current AR experiences are mostly delivered with a download-and-play strategy, where the whole scene is considered a monolithic entity for delivery. This approach results in high start-up latencies and therefore a poor user experience. A similar problem in the video domain has already been tackled with the HTTP Adaptive Streaming (HAS) principle, where the video is split into segments, and a rate adaptation heuristic dynamically adapts the video quality based on the available network resources. In this paper, we apply the adaptive streaming principle from the video to the AR domain, and propose a streaming framework for AR applications. In our proposed framework, the AR objects are available at different Levels-Of-Detail (LODs) and can be streamed independently from each other. An LOD adaptation heuristic is in charge of dynamically deciding which objects should be fetched from the server, and at which LOD level. Our proposed heuristic prioritizes content that is more likely to be viewed by the user and selects the best LOD to maximize the object’s perceived visual quality. Moreover, the adaptation takes into account the available bandwidth resources to ensure a timely delivery of the AR objects. Experiments carried out over the Internet using an AR streaming prototype developed on an iOS device allow us to show the gains brought by the proposed framework. In particular, our approach can decrease start-up latency by up to 90% with respect to a download-and-play baseline, and decrease the amount of data needed to deliver the AR experience by up to 79%, without sacrificing the visual quality of the AR objects.
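The adaptation idea described above — fetch every object at some minimal LOD first, then spend the remaining bandwidth upgrading the objects the user is most likely to view — can be illustrated with a minimal sketch. This is not the paper's actual heuristic; it is a simplified greedy illustration, and the names (`ARObject`, `view_probability`, `select_lods`, the byte budget) are assumptions introduced here for clarity.

```python
from dataclasses import dataclass

@dataclass
class ARObject:
    name: str
    view_probability: float  # estimated likelihood the user views this object (assumed input)
    lod_sizes: list          # size in bytes of each LOD level, in ascending quality

def select_lods(objects, budget_bytes):
    """Greedy sketch of LOD adaptation under a bandwidth budget:
    every object gets its cheapest LOD so the whole scene renders,
    then the leftover budget upgrades the most-likely-viewed objects first."""
    # Baseline: lowest LOD for each object.
    choice = {o.name: 0 for o in objects}
    spent = sum(o.lod_sizes[0] for o in objects)
    # Upgrade in order of view probability, one LOD step at a time.
    for obj in sorted(objects, key=lambda o: o.view_probability, reverse=True):
        while choice[obj.name] + 1 < len(obj.lod_sizes):
            step = obj.lod_sizes[choice[obj.name] + 1] - obj.lod_sizes[choice[obj.name]]
            if spent + step > budget_bytes:
                break
            choice[obj.name] += 1
            spent += step
    return choice, spent

# Example: with a 1000-byte budget, the likely-viewed "statue" is upgraded
# to its highest LOD, while the rarely viewed "sign" stays at its lowest.
objects = [ARObject("statue", 0.9, [100, 300, 900]),
           ARObject("sign", 0.2, [50, 200])]
choice, spent = select_lods(objects, budget_bytes=1000)
print(choice, spent)  # → {'statue': 2, 'sign': 0} 950
```

In a real system the view probabilities would come from the user's pose and gaze, and the budget from a bandwidth estimator, as the abstract indicates.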