Professors Michael Zink of electrical and computer engineering and Ramesh Sitaraman of the College of Information and Computer Sciences invite you to imagine watching a Super Bowl game from the comfort of your own living room, but experiencing it as if you were on the field with the quarterback.
Such an experience is now possible through a form of virtual reality known as “360 video,” which allows viewers wearing a headset to experience media content in an immersive fashion. But creating and delivering 360 videos to a large audience over the Internet remains an unsolved scientific problem, they acknowledge.
Now the two, together with computer science professor Klara Nahrstedt of the University of Illinois at Urbana-Champaign, have received a $1.2 million grant from the National Science Foundation to develop a new system for creating and delivering 360 videos at scale over the Internet.
If successful, this work will transform 360 video creation and delivery and enable new and much richer educational, training and entertainment experiences, they say. It will also help train a new class of multimedia systems researchers and practitioners.
As they explain, in contrast to traditional video, 360 video is recorded with a special camera that captures the surroundings in almost all directions. Viewers can select their viewing direction, either with a pointing device on a regular display or through head movement with a head-mounted device, and can change that direction at any point while watching. This means, for example, that a viewer can watch a sporting event from several perspectives on the field.
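To make this concrete: 360 frames are commonly stored in an equirectangular projection (an assumption here, since the article does not specify a format), and a viewing direction then maps to a spot in the recorded frame, as in this minimal Python sketch:

```python
import math

def viewport_center(yaw_deg, pitch_deg, frame_w, frame_h):
    """Map a viewing direction (yaw, pitch in degrees) to pixel
    coordinates in an equirectangular 360-video frame.

    The equirectangular projection and the angle conventions
    (yaw in [-180, 180), pitch in [-90, 90]) are illustrative
    assumptions, not details from the funded project.
    """
    x = (yaw_deg + 180.0) / 360.0 * frame_w   # longitude -> column
    y = (90.0 - pitch_deg) / 180.0 * frame_h  # latitude  -> row
    return int(x) % frame_w, min(int(y), frame_h - 1)

# Example: looking 45 degrees to the right and 10 degrees up
# in a 3840x1920 frame.
print(viewport_center(45.0, 10.0, 3840, 1920))  # -> (2400, 853)
```

The viewer's headset only ever displays the small window of the frame around that point, which is what makes delivering the full panorama at uniform quality so wasteful.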
“In recent years,” Sitaraman points out, “virtual and augmented reality applications have seen a significant increase in popularity. However, despite these technological advances, major challenges remain with respect to effective representation, storage and distribution of 360 videos on the Internet.” He is known for pioneering content delivery networks that currently deliver much of the world’s online videos to billions of users.
Zink, who is known for his work on multimedia systems, future internet architectures and sensor networks, adds that an additional challenge is cyber sickness, which occurs when a viewer’s interaction with a virtual environment triggers symptoms similar to motion sickness. “A time lag of more than 20 milliseconds between the head movement and the rendering of the new scene may cause cyber sickness for the viewer,” he notes.
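What that 20-millisecond budget means for a playback loop can be sketched in a few lines of Python. The `get_head_pose` and `render_scene` callables below are hypothetical stand-ins for a real VR runtime, used only to show where the clock starts and stops:

```python
import time

CYBERSICKNESS_BUDGET_S = 0.020  # the 20 ms motion-to-photon limit cited above

def render_on_head_move(render_scene, get_head_pose):
    """Render one frame for the latest head pose and report whether
    the motion-to-photon budget was met."""
    t0 = time.perf_counter()
    pose = get_head_pose()   # read the newest head orientation
    render_scene(pose)       # draw the viewport for that orientation
    elapsed = time.perf_counter() - t0
    if elapsed > CYBERSICKNESS_BUDGET_S:
        print(f"budget exceeded: {elapsed * 1000:.1f} ms > 20 ms")
    return elapsed

# Example with stand-in functions; a real system would hook into
# the headset's tracking and rendering pipeline.
render_on_head_move(render_scene=lambda pose: None,
                    get_head_pose=lambda: (0.0, 0.0))
```

The hard part is that over the Internet, fetching new video data can alone take far longer than 20 ms, so the system must anticipate head movements rather than react to them.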
Their proposed project incorporates three major research thrusts. The first is video creation that enables personalized viewing, generating new navigation graphs and cinematographic rules while maintaining a high quality of experience and reducing cyber sickness.
The second focuses on scalable distribution of 360 videos to a global set of diverse viewers using navigation graphs and cinematographic rules for highly efficient delivery over the Internet.
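As a rough illustration of the idea (the project's actual graph structure may differ), a navigation graph can record how often viewers move from one viewport to another, so that a streaming client can prefetch the most likely next views at high quality and fetch the rest at low quality:

```python
from collections import defaultdict

class NavigationGraph:
    """A minimal sketch: nodes are viewports (e.g., tile indices per
    video segment) and edge weights count how often viewers moved
    from one viewport to the next. Illustrative only; not the
    project's actual design."""

    def __init__(self):
        self.transitions = defaultdict(lambda: defaultdict(int))

    def record(self, current_viewport, next_viewport):
        # Learned from traces of earlier viewers' head movements.
        self.transitions[current_viewport][next_viewport] += 1

    def likely_next(self, current_viewport, k=3):
        # The k most probable next viewports, which a client would
        # prefetch at high quality to save bandwidth.
        nxt = self.transitions[current_viewport]
        return sorted(nxt, key=nxt.get, reverse=True)[:k]

g = NavigationGraph()
for move in [("tile_5", "tile_6"), ("tile_5", "tile_6"), ("tile_5", "tile_9")]:
    g.record(*move)
print(g.likely_next("tile_5"))  # -> ['tile_6', 'tile_9']
```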
The third focuses on quality of experience: devising new metrics and evaluation methods to assess cyber sickness. The researchers explain that they will extensively evaluate their system architectures and algorithms through simulation, emulation and benchmarking on testbeds to assess the success of the proposed research.
In addition to the project's obvious industrial impact, Sitaraman and Zink say it will have a major impact on students, since it will educate a new class of multimedia systems researchers who will be in high demand in academia and the media industry.