High-end VR needs to render each frame within about 16 milliseconds, a rate of roughly 60 frames per second. However, the CPU of a top smartphone such as Google’s Pixel XL takes around 110 milliseconds to render a frame of that quality. Today’s leading VR systems therefore use headsets tethered to powerful computers, because wireless networks cannot transmit fully rendered frames fast enough.
“Today’s mobile hardware and wireless networks are about 10 times too slow for high-quality, immersive VR,” said Y Charlie Hu, a Purdue University professor of electrical and computer engineering.
“A key observation we made is that waiting for next-generation wireless networks such as 5G will not help because packet processing at 10 times higher data rate will exhaust the CPU on today’s smartphones.”
To deliver high-end, untethered VR via a smartphone, Furion splits the rendering workload. A large part of the computational load in a VR app is rendering the background of each frame. But the background – a landscape or the inside of a room – doesn’t vary much from frame to frame, and when it does change, it is generally in response to the user’s position.
“But the user’s position doesn’t change randomly,” said Hu. “You move continuously and in a very predictable way. So that means we can predict how the background will change based on the user’s position and pre-render the background.”
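The prediction Hu describes can be sketched as simple dead reckoning: because head movement is continuous, the next position can be extrapolated from the last two samples. This is a minimal illustration, not the Furion paper’s actual predictor; the function name and sample values are assumptions.

```python
# Sketch: dead-reckoning prediction of the user's position, assuming motion
# is roughly continuous. Illustrative only; not Furion's actual predictor.

def predict_position(prev, curr, dt_ahead, dt_between):
    """Linearly extrapolate the next position from the last two samples."""
    velocity = tuple((c - p) / dt_between for p, c in zip(prev, curr))
    return tuple(c + v * dt_ahead for c, v in zip(curr, velocity))

# Example: the user moved from x=0.0 m to x=0.1 m over one 16 ms frame;
# predict where they will be one frame (16 ms) ahead.
nxt = predict_position((0.0, 1.6, 0.0), (0.1, 1.6, 0.0), 0.016, 0.016)
# Continuing the motion, the predicted x is about 0.2 m.
```

A server can render the background for this predicted position one frame early, so the frame is already on the phone when it is needed.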
Furion uses a PC to render the predictable background frames and transmits the data to the smartphone over WiFi. The background is rendered as a panoramic image and split into four slices, each decoded on one of the Pixel XL’s four microprocessor cores. The panorama can then be automatically cropped to match the user’s changing viewing angle. By predicting movement within the VR app, Furion ‘pre-fetches’ the required rendered frames from the server, resulting in a frame latency of 14 milliseconds.
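The decode-and-crop step might look like the following sketch: the four panorama slices are decoded in parallel (one worker per core), stitched, and then a window matching the viewing angle is cropped out. The decoder here is a stand-in for a real video/image codec, and all function names are illustrative assumptions.

```python
# Sketch: parallel decode of a four-slice panorama, then crop to the viewing
# angle. Illustrative; a real client would decode compressed video tiles.
from concurrent.futures import ThreadPoolExecutor

def decode_slice(compressed):
    # Stand-in decoder: a real implementation would decode H.264/JPEG data.
    return list(compressed)

def decode_panorama(slices):
    """Decode the four panorama slices in parallel, one worker per core."""
    with ThreadPoolExecutor(max_workers=4) as pool:
        decoded = list(pool.map(decode_slice, slices))
    return [px for part in decoded for px in part]  # stitch back together

def crop_to_view(panorama, yaw_deg, fov_deg=90):
    """Return the part of a 360-degree panorama visible at `yaw_deg`."""
    n = len(panorama)
    start = int(yaw_deg / 360 * n) % n
    width = int(fov_deg / 360 * n)
    return [panorama[(start + i) % n] for i in range(width)]
```

For example, with a 12-element panorama, `crop_to_view(panorama, 90, 90)` returns the quarter of the image starting a quarter of the way around.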
“Prefetch means you predict the fetch and you start asking the server to get it before the VR game logic actually asks for it,” Hu said.
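The idea Hu describes can be sketched as a small prefetch cache: the client asks the server for frames at predicted poses in the background, so a later request from the game logic is usually answered from local data. The class and function names here are assumptions for illustration; `render_on_server` stands in for the networked server call.

```python
# Minimal prefetching sketch: request frames for predicted poses before the
# game logic asks for them. Illustrative names; not Furion's actual API.
from concurrent.futures import ThreadPoolExecutor

def render_on_server(pose):
    # Stand-in for a round trip to the rendering server over WiFi.
    return f"frame@{pose}"

class Prefetcher:
    def __init__(self):
        self._pool = ThreadPoolExecutor(max_workers=2)
        self._pending = {}

    def prefetch(self, pose):
        """Start fetching the frame for a predicted pose in the background."""
        if pose not in self._pending:
            self._pending[pose] = self._pool.submit(render_on_server, pose)

    def get(self, pose):
        """Return the frame; served from the prefetch if it was predicted."""
        future = self._pending.pop(pose, None)
        if future is not None:
            return future.result()       # usually already finished: no wait
        return render_on_server(pose)    # misprediction: fetch on demand
```

When the prediction is correct, `get` returns an already-fetched frame; a misprediction falls back to an on-demand fetch, which is where the latency cost would reappear.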
A provisional patent for the software has been filed through the Purdue Research Foundation’s Office of Technology Commercialisation, and the researchers have published a paper outlining the work.