Lecture 23: Virtual Reality (45)
nickjiang2378

How do VR headsets track your eyes? This article might shed some light: https://imotions.com/blog/learning/best-practice/vr-eye-tracking/.

kujjwal

Following up on this article: the headset shines infrared light at the eye, and a camera tracks the displacement between the pupil center and the light's reflection on the cornea (the corneal "glint"); this offset can then be averaged across both eyes. From there, a few different computer vision techniques are used to estimate gaze direction. This process seems somewhat computationally expensive, so it's surprising to me that VR headsets can do it in real time.
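The pupil-center/corneal-reflection (PCCR) idea described above can be sketched roughly as follows. This is an illustrative toy, not a real headset API: the function names and the linear per-axis calibration gains are assumptions, and real systems use full 3D eye models and learned mappings rather than a linear fit.

```python
def pccr_offset(pupil_xy, glint_xy):
    """Displacement vector (pixels) from the corneal reflection (glint)
    to the pupil center, as seen by the eye-facing IR camera."""
    return (pupil_xy[0] - glint_xy[0], pupil_xy[1] - glint_xy[1])

def estimate_gaze(left_pupil, left_glint, right_pupil, right_glint,
                  gain=(0.5, 0.5)):
    """Average the per-eye offsets, then map pixels to gaze angles with a
    toy per-axis gain (degrees per pixel, obtained during calibration)."""
    lx, ly = pccr_offset(left_pupil, left_glint)
    rx, ry = pccr_offset(right_pupil, right_glint)
    avg_x, avg_y = (lx + rx) / 2, (ly + ry) / 2
    return (avg_x * gain[0], avg_y * gain[1])

# Example: both pupils sit 4 px right of and 2 px above their glints.
print(estimate_gaze((104, 58), (100, 60), (204, 58), (200, 60)))
# → (2.0, -1.0)
```

The per-eye image processing (finding the pupil ellipse and the glint) is the expensive part; the mapping itself is cheap, which is partly how headsets manage this at high frame rates.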

agao25

https://www.tobii.com/blog/eye-tracking-in-vr-a-vital-component This article also mentions how the cameras and illuminators of the VR headset might analyze pictures of your eyes/reflections of images to produce a better and more calibrated experience via ML training. I had an opportunity to go through a space-themed VR activity and this question reminded me of how before we started, they needed to calibrate everything according to our eyes.

carolyn-wang

I'm curious why there are still so many issues with fatigue/nausea after using VR headsets for prolonged periods of time. Is it because the display is just too close to the eyes, creating eye strain? Or are there other causes?

saif-m17

This article explains a little bit about why people experience nausea while using VR headsets - one big reason being eye strain. Interestingly, the article mentions that somewhere between 22 and 80 percent of people might experience it. https://www.cbc.ca/news/canada/kitchener-waterloo/university-waterloo-virtual-reality-vr-cybersickness-why-study-1.6840293

carolyn-wang

Update to my own question: I think the professor actually covers the causes of nausea later in the lecture (the accommodation-vergence conflict).

S-Muddana

I never thought about how much analysis of the human eye and its field of view would be needed for VR displays to be good. I now understand why Apple implemented eye tracking in the Vision Pro: it's not just to unlock new capabilities and features, but also to ensure the display still mimics a human's point of view.

saif-m17

I noticed that the slide mentions the human FOV per eye is about 160 degrees, not accounting for the eye's ability to rotate in its socket. I'm curious how variance in this figure across people affects VR design, and how the eye's rotation in the socket changes both the effective FOV and the design.
