
Science Column

[VR/AR] How to Pass the Visual Turing Test in AR/VR (1)


I studied the development of VR/AR through Meta's blog post. This post is written mostly in English.

 

1. What is the ultimate display?
2. Visual Turing Test
3. What are the key challenges?
    -resolution
    -dynamic range
    -distortion
    -depth of focus
4. Varifocal and the unexpected role of hands

 

 

Demo or Die: How Reality Labs’ Display Systems Research Team Is Pushing the VR Industry Toward the Future | Meta Quest Blog

www.meta.com

What is the ultimate display?

Ivan Sutherland, “The Ultimate Display,” 1965

The ultimate display would, of course, be a room within which the computer can control the existence of matter. A chair displayed in such a room would be good enough to sit in. Handcuffs displayed in such a room would be confining, and a bullet displayed in such a room would be fatal. With appropriate programming such a display could literally be Wonderland into which Alice walked.

 


Visual Turing Test

: evaluates whether what’s displayed in a VR headset can be distinguished from the real world.

While VR already creates a strong sense of presence, of being in virtual places in a genuinely convincing way, it’s not yet at the level where anyone would wonder whether what they’re looking at is real or virtual.

 


What are the key challenges?

For realism → resolution, field of view, dynamic range, color gamut, occlusion, passthrough

For comfort → form factor, accommodation, distortion, prescription, passthrough

Resolution

The problem is that VR headsets have much wider fields of view than even the widest monitor, so whatever pixels are available have to be applied across a much larger area than for a 2D display, resulting in lower resolution for a given number of pixels. And not only are a lot more pixels required, but the quality of those pixels needs to increase.

• Resolution that approaches and ultimately exceeds 20/20 human vision
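The "20/20 vision" target above can be made concrete with a little arithmetic. A minimal sketch, assuming pixels are spread evenly across the field of view (a simplification; real optics concentrate detail unevenly) and using the common rule of thumb that 20/20 acuity resolves about 1 arcminute, i.e. roughly 60 pixels per degree. The headset numbers below are hypothetical, not from the article.

```python
def pixels_per_degree(h_pixels: int, h_fov_deg: float) -> float:
    """Approximate angular resolution, assuming pixels are spread evenly over the FOV."""
    return h_pixels / h_fov_deg

# 20/20 vision resolves about 1 arcminute, i.e. roughly 60 pixels per degree.
RETINAL_PPD = 60.0

def pixels_needed(h_fov_deg: float, v_fov_deg: float, target_ppd: float = RETINAL_PPD) -> int:
    """Total pixels required to hit target_ppd across the whole field of view."""
    return int(h_fov_deg * target_ppd) * int(v_fov_deg * target_ppd)

# Hypothetical headset: a 2000-px-wide panel stretched over a 100-degree FOV.
print(pixels_per_degree(2000, 100))   # 20 ppd, well below the ~60 ppd target
print(pixels_needed(100, 100))        # 36,000,000 pixels, vs ~8.3M on a 4K monitor
```

This is why "more pixels than even the widest monitor" is an understatement: covering a 100° × 100° field at retinal resolution needs several times the pixel count of a 4K display.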

Dynamic range

Today’s VR headsets have substantially lower brightness and contrast than laptops, TVs, and mobile phones. As such, VR can’t yet reach the level of fine detail and accurate representation that we’ve become accustomed to with our 2D displays.

• And high dynamic range (HDR) technology that expands the range of color, brightness, and contrast you can experience in VR

Distortion

Additionally, the lenses used in current VR displays often distort the virtual image, reducing realism unless the distortion is fully corrected in software — which is challenging because the distortion varies as the eye moves to look in different directions.

 

• Distortion correction to help address optical aberrations, like color fringes around objects and image warping, that can be introduced by viewing optics

Depth of focus

the ability to focus properly at any distance

• “Varifocal” technology that provides correct depth of focus (versus a single fixed focus), thereby enabling clearer and more comfortable vision within arm’s length for extended periods of time


Varifocal and the unexpected role of hands

Varifocal (variable focus): a technology that adjusts the focus of the display based on what you’re looking at.

A key insight was that to use your hands most effectively in VR, you have to be able to focus on them, and hands are typically held within arm's length.

 

Vergence-accommodation conflict (VAC): the mismatched cues you receive in VR between the simulated distance of a virtual 3D object and the focusing distance, which again is fixed at roughly 5 to 6 feet in today's headsets. VAC is a well-known phenomenon in the VR field that may lead to temporary fatigue and blurry vision, and can be one source of discomfort when spending extended periods of time in VR.
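The size of the mismatch is usually measured in diopters (the reciprocal of distance in meters). A minimal sketch, assuming the fixed focal plane sits at about 1.7 m (roughly the 5 to 6 feet mentioned above); the 1.7 m figure and the example object distances are illustrative choices, not values from the article.

```python
def diopters(distance_m: float) -> float:
    """Optical power demand in diopters: the reciprocal of distance in meters."""
    return 1.0 / distance_m

def vac_diopters(object_m: float, focal_m: float = 1.7) -> float:
    """Vergence-accommodation mismatch, assuming a fixed focal plane at ~1.7 m
    (about 5.5 feet, in line with today's headsets as described above)."""
    return abs(diopters(object_m) - diopters(focal_m))

# A virtual object at arm's length (0.5 m) vs the fixed ~1.7 m focal plane:
print(round(vac_diopters(0.5), 2))  # 1.41 diopters of conflict
# The same object at 2 m sits close to the focal plane, so the conflict is small:
print(round(vac_diopters(2.0), 2))  # 0.09 diopters
```

This makes the hands observation concrete: near objects, exactly where you use your hands, produce the largest vergence-accommodation mismatch on a fixed-focus display.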

 

One path to addressing VAC is to dynamically adjust the focal depth in VR to match the distance of the object of interest, enabling our eyes to focus at the right distance. One potential way to do that, known as “varifocal,” is to move the lenses as the viewer changes what they’re looking at.
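How far do the optics actually have to move? A minimal sketch using the standard thin-lens relation (1/s_o + 1/s_i = 1/f), assuming a hypothetical 40 mm focal-length viewing lens; neither the focal length nor the distances come from the article.

```python
def display_distance(f_m: float, image_m: float) -> float:
    """Thin-lens relation 1/s_o + 1/s_i = 1/f, solved for the lens-to-display
    distance s_o that places the virtual image image_m in front of the viewer."""
    return 1.0 / (1.0 / f_m + 1.0 / image_m)

F = 0.040  # hypothetical 40 mm focal-length viewing lens (not from the article)

far = display_distance(F, 1.7)   # virtual image at the ~1.7 m fixed plane
near = display_distance(F, 0.5)  # virtual image at arm's length
print(f"travel: {(far - near) * 1000:.2f} mm")  # travel: 2.04 mm
```

Under these assumptions, sweeping the focal plane from 1.7 m down to arm's length takes only about 2 mm of mechanical travel, which is why a moving-lens varifocal design is feasible at all, though it still demands fast, quiet, precisely tracked actuation.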

 

DSR’s first complete varifocal prototype: created in 2016, it integrated all the necessary components for a compelling experience, including variable focus, robust eye tracking, real-time distortion correction that updated with changes in display focus, and rendered blur that increased away from the focal plane, as it does in the real world.

 

⭐ DSR (Display Systems Research) Team: led by Douglas Lanman

 
