The Peripheral Vision and Depth perception of the human eye

Hi! Last time we introduced the factors that affect the visual experience of AR glasses.

(If you're interested, check it out here: Factors affecting the visual effect of AR glasses.)

Peripheral Vision

Although the human eye has a limited range of focus, it can see a wide area out of the corner of the eye. The human eye's natural field of vision is about 150°×120° for one eye and 220°×120° for both eyes combined. Installing a display in front of your eyes naturally creates an additional barrier, so an important design goal is to keep this barrier to a minimum.
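To get a feel for how small today's AR displays are relative to these numbers, here is a minimal sketch that compares the FOV areas, treating all fields as rectangles as the article's figure does. The 40°×30° display FOV is an assumed, typical value for current waveguide-based AR glasses, not a figure from the article.

```python
# Rough comparison of an AR display's field of view (FOV) with the eye's
# natural FOV, treating both as rectangles (the article's simplification).

NATURAL_MONOCULAR = (150.0, 120.0)  # degrees (horizontal, vertical), from the article
NATURAL_BINOCULAR = (220.0, 120.0)  # degrees, both eyes combined, from the article
DISPLAY = (40.0, 30.0)              # degrees, ASSUMED typical AR display FOV

def rect_area(fov):
    """Area of a rectangular FOV in square degrees (simplified model)."""
    horizontal, vertical = fov
    return horizontal * vertical

coverage_mono = rect_area(DISPLAY) / rect_area(NATURAL_MONOCULAR)
coverage_bino = rect_area(DISPLAY) / rect_area(NATURAL_BINOCULAR)

print(f"Display covers {coverage_mono:.1%} of the monocular field")
print(f"Display covers {coverage_bino:.1%} of the binocular field")
```

Under these assumptions the display covers only a few percent of the natural visual field, which is why the unobscured (red) region matters so much in the figure below.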

Ideally, an AR optical display system should therefore occlude the eye's view as little as possible, so that peripheral vision remains smooth and unobstructed.

The figure below shows how the natural view (green), the unobscured view (red) and the augmentable view (blue) roughly relate in size on today’s devices. For simplicity all the areas are drawn as rectangles.

Figure: green is human FOV (monocular), red is peripheral vision that is not blocked by glasses/head-mounted display, and blue is FOV of AR optics.

Therefore, the design should maximize not only the augmentable field of view (blue above) but also the unobscured field of view (red above). Anything that blocks the view needs to move out of the way.

Unlike our simplified visualization above, the real visual field is not rectangular.

As the figure below shows, the visual field is mainly limited by the eyebrows, nose, and cheeks. The combined red and yellow area depicts the left eye's visual field. Similarly, the combined green and yellow area depicts the right eye's visual field. The yellow area depicts the binocular overlap, the field that both eyes can observe.

Figure: Human visual field (left). The image on the left is generated by a ray projection using a virtual head model (image on the right)

Depth perception

The human eye can perceive depth through a variety of visual cues. For an AR optical display system, the two most important cues are vergence (our eyes rotating to look at the same object) and accommodation (our eyes' lenses focusing on an object), which are coordinated at the neural level. When vergence and accommodation do not match, the resulting vergence-accommodation conflict (VAC) causes discomfort to the user.

Figure: Vergence and Accommodation in normal viewing conditions (left), Virtual Reality (middle) and Augmented Reality (right).

The image above highlights the difference between normal viewing, virtual reality and augmented reality:

In the case of normal viewing, vergence and accommodation are in sync -- both adjust to the same distance.

In the case of Virtual Reality, accommodation is always at the same distance (set by the headset optics, usually about two meters), while vergence depends on the stereo rendering of the screen content.

In the case of Augmented Reality, the conflict can be even greater: a real object and the virtual content augmenting it may share the same vergence, but the accommodation distances to the real and virtual objects can be very different.

You may have experienced VAC while watching a 3D movie. Because the distance between the movie screen and your seat is fixed, your actual focus plane is fixed as well. Watching a dynamic 3D movie, you'll find that you cannot judge depth by refocusing your eyes, and only content rendered at the screen's depth looks natural.

For AR optical display systems, a focal plane at about two meters suits most scenarios and rarely feels unnatural. Ideally the focal plane is flat and all colors share the same focal distance, but most AR optical display systems on the market do not achieve this alignment.
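The mismatch described above is commonly quantified in diopters (the reciprocal of distance in meters). As a minimal sketch, the function below computes the conflict between a fixed 2 m focal plane (the value mentioned in the text) and virtual content rendered at various depths; the content distances are purely illustrative.

```python
# Quantifying the vergence-accommodation conflict (VAC) in diopters,
# assuming a fixed 2 m focal plane as discussed in the text.

FOCAL_PLANE_M = 2.0  # fixed accommodation distance of the display

def vac_diopters(content_distance_m, focal_plane_m=FOCAL_PLANE_M):
    """Mismatch between where the eyes converge (the virtual content's
    depth) and where they must focus (the display's focal plane)."""
    return abs(1.0 / content_distance_m - 1.0 / focal_plane_m)

for d in (0.5, 1.0, 2.0, 5.0, 100.0):
    print(f"content at {d:>5.1f} m -> VAC = {vac_diopters(d):.2f} D")
```

Note that content from roughly one meter to optical infinity stays within about half a diopter of the 2 m plane, while close-up content (0.5 m) produces a much larger conflict, which is consistent with the claim that a 2 m focal plane works for most scenarios.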

 
