LG Develops AI-based Tech to Reduce Latency & Motion Blur in VR Displays

South Korean tech giant LG Display and a team from Sogang University in Seoul have collaborated on a new AI-based content creation technology designed to address latency and motion blur in VR headsets, just as the race toward ever-higher display resolutions heats up.

For good reason, VR hardware developers are adamant about low motion-to-photon latency: the time between an input movement, like a head turn, and the moment the screen updates to reflect it. High latency between what the user does and what they see can cause nausea.
Ideally that latency should be under 20ms, and while current consumer VR headsets have mostly solved the problem, a new wave of ever-higher resolution headsets presents the same engineering challenge all over again. Display latency and motion blur, the smearing that occurs when a display's pixels don't switch on and off quickly enough as the image moves, are the two big targets for LG and Sogang's new AI tech.
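To give a rough sense of what that 20ms target means in practice, motion-to-photon latency is just the sum of the pipeline stages between sensing a movement and scanning out the frame. The stage timings below are illustrative assumptions for a hypothetical 90Hz headset, not figures from the article:

```python
# Illustrative motion-to-photon budget for a hypothetical 90 Hz headset.
# All stage timings are assumed values, not numbers from LG/Sogang.
def motion_to_photon_ms(sensor_ms, render_ms, scanout_ms):
    """Total time from a head movement to the photons that reflect it."""
    return sensor_ms + render_ms + scanout_ms

total = motion_to_photon_ms(sensor_ms=2.0, render_ms=11.1, scanout_ms=5.0)
print(f"{total:.1f} ms")  # 18.1 ms -- just under the ~20 ms comfort target
```

The point of the arithmetic: at higher resolutions the render stage grows, and the whole budget blows past 20ms unless something else in the pipeline gets cheaper, which is the system-load problem LG and Sogang are attacking.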
Motion blur caused by full persistence display – Image courtesy Oculus
“The core of the newly developed technology is an algorithm that can generate ultra-high resolution images from low-resolution ones in real time. Deep learning technology makes this conversion possible without using external memory devices,” the team told Business Korea.
LG and the Sogang University team’s technology is said to both boost power efficiency and make high resolutions possible on mobile headsets. The team says their AI-based setup “cuts motion to photon latency and motion blurs to one fifth or less the current level by slashing system loads when operating displays for VR.”
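The team hasn't disclosed the algorithm's internals, but real-time super-resolution networks commonly do most of their work at low resolution and upscale only at the very end with a sub-pixel (pixel-shuffle) rearrangement, which costs no extra multiplications. A minimal NumPy sketch of that rearrangement, following the common ESPCN-style convention rather than anything confirmed about LG's implementation:

```python
import numpy as np

def pixel_shuffle(x, r):
    """Rearrange a (C*r*r, H, W) feature map into a (C, H*r, W*r) image.

    Each group of r*r low-resolution channels supplies the r*r sub-pixels
    of one high-resolution output block -- the upscaling step itself is
    only a memory reshuffle, not a computation.
    """
    c_rr, h, w = x.shape
    c = c_rr // (r * r)
    x = x.reshape(c, r, r, h, w)
    x = x.transpose(0, 3, 1, 4, 2)   # -> (C, H, r, W, r)
    return x.reshape(c, h * r, w * r)

# Four low-res 2x2 channels become one 4x4 high-res channel (r = 2)
feat = np.arange(16, dtype=np.float32).reshape(4, 2, 2)
hi = pixel_shuffle(feat, r=2)
print(hi.shape)  # (1, 4, 4)
```

Keeping the heavy convolutions at low resolution and deferring the upscale to this cheap final step is one plausible way a design like this could cut system load, and with it latency.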

To test the system’s latency and motion blur, the team also created a device sporting a precision motor that simulates human neck movements and an optical system based on the human visual system.
“This study by LG Display and Sogang University is quite meaningful in that this study developed a semiconductor which accelerates with low power realized through AI without an expensive GPU in a VR device,” said Professor Kang Seok-ju, who leads the research team and has been carrying out the study since 2015.
LG recently partnered with Google to produce an 18-megapixel, 4.3-inch, 1,443 PPI, 120Hz OLED display made for wide field-of-view VR headsets, showing off the panel at SID Display Week a few days ago. Google claims it is the “world’s highest resolution OLED-on-glass display.”
The post LG Develops AI-based Tech to Reduce Latency & Motion Blur in VR Displays appeared first on Road to VR.