MARVEL: Enabling Mobile Augmented Reality with Low Energy and Low Latency

Hyung-Sin Kim · Intelligent Real-Time Systems

This paper presents MARVEL, a mobile augmented reality (MAR) system that provides an annotation display service with imperceptible latency (<100 ms) and low energy consumption on regular mobile devices. In contrast to conventional MAR systems, which recognize objects using image-based computations performed in the cloud, MARVEL mainly utilizes a mobile device’s local inertial sensors for recognizing and tracking multiple objects, computing local optical flow and offloading images only when necessary. We propose a system architecture that uses local inertial tracking, local optical flow, and cloud-based visual tracking synergistically. On top of that, we investigate how to minimize the overhead of image computation and offloading. We implemented and deployed a holistic prototype system in a commercial building and evaluated MARVEL’s performance. This efficient use of a mobile device’s local capabilities lowers latency and energy consumption without sacrificing accuracy.
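The hybrid tracking approach described in the abstract might be sketched, in highly simplified form, as the following decision logic. All function names, variables, and the threshold below are illustrative assumptions for exposition, not details from the paper:

```python
# Hypothetical sketch of a MARVEL-style offloading policy: track annotations
# locally with inertial (IMU) data, use local optical flow as a cheap
# cross-check, and offload a camera image to the cloud only when the two
# estimates disagree (i.e., the local pose estimate has drifted).

DRIFT_THRESHOLD_PX = 10.0  # assumed tuning parameter, in pixels


def should_offload(imu_displacement_px: float, flow_displacement_px: float) -> bool:
    """Return True when inertial tracking disagrees with optical flow by
    more than the threshold, so a cloud-side visual correction is needed."""
    drift = abs(imu_displacement_px - flow_displacement_px)
    return drift > DRIFT_THRESHOLD_PX


def annotation_position(last_cloud_pose: float, imu_delta: float) -> float:
    """Between cloud updates, annotations move with the locally integrated
    IMU estimate, which keeps display latency low and avoids per-frame
    image offloading."""
    return last_cloud_pose + imu_delta
```

The design intuition is that inertial integration is nearly free in energy and latency but drifts over time, while image offloading is accurate but expensive; gating offloads on observed drift spends the expensive operation only when it pays off.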

Published On: November 4, 2018

Presented At/In: The 16th ACM Conference on Embedded Networked Sensor Systems (SenSys ’18)

Download Paper: http://bets.cs.berkeley.edu/publications/2018sensys_marvel.pdf

Authors: Kaifei Chen, Tong Li, Hyung-Sin Kim, David Culler, Randy Katz