I wonder how cloud computing could be used to reduce latency. Would it be possible to use content delivery networks to cache real-world data?
vivek3141
I'm curious to learn more about whether there exist CPUs designed specifically for VR headsets, and if so, what kinds of optimizations are done to reduce latency as much as possible. I found it fascinating how good Apple's pass-through is in the Vision Pro, so I'm curious what's going on under the hood there.
A question I have is whether using FPGAs here would improve latency. I could see them being used to speed up the processing of sensor data, but I'm not sure.
aishikbhattacharyya
When trying to reduce end-to-end latency, I wonder what the cost-efficiency trade-off of upgrading sensors would be. In other words, how much latency improvement is a better gyroscope worth if it costs a lot more money?
ShivanPatel2025
What are the major technological barriers to achieving the ideal latency range of 10-25 milliseconds?
omijimo
Seeing all the things a VR headset must take into account when rendering images, how well does Apple's Vision Pro do in terms of latency? I've heard it is pretty good at tricking people into thinking the image is real.
keeratsingh2002
It’s pretty wild to think about the fine line VR systems walk to convince our brains they’re real. Nailing that split-second timing between moving your head and what your eyes see is a technological dance that’s as much about feeling as it is about specs.
agao25
https://broadbandlibrary.com/the-near-future-of-latency/ @ShivanPatel2025 This article goes into the specifics of breaking that barrier right now, and it seems like we are very close. More work is needed on streamlining digital traffic queues and on research into 5G and 10G networks, and there's even a current grant proposal from some Duke researchers who are hoping to look into this further. https://researchfunding.duke.edu/ideas-lab-breaking-low-latency-barrier-verticals-next-g-wireless-networks
saif-m17
How did we come up with the latency goal of 10-25 ms listed on the slides? Since this has to do with "tricking the brain into thinking what it sees is real," is the estimate based on how quickly we're able to perceive changes in movement?
randyen
Given the difficult latency requirements in VR, I wonder if that plays a role in why many VR devices are so expensive, requiring lots of pricey hardware and software. I'm also curious about how producers manage to achieve lower latency, and what makes 10-25 ms a good threshold.
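Since several comments ask where the 10-25 ms target comes from and which stages make it hard to hit, here is a minimal sketch of a motion-to-photon latency budget. All stage durations below are hypothetical, illustrative numbers (not measurements of any real headset); the point is just that sensing, processing, rendering, and display each consume part of the budget, so every stage has to be fast for the total to stay under the perceptual threshold.

```python
# Illustrative motion-to-photon latency budget for a VR headset.
# Every number here is an assumption for illustration, not a real measurement.
stages_ms = {
    "IMU sampling": 1.0,       # reading the gyroscope/accelerometer
    "sensor fusion": 2.0,      # estimating head pose from sensor data
    "app + render": 11.0,      # one frame at 90 Hz takes ~11.1 ms
    "display scan-out": 5.0,   # pixel switching and scan-out to the panel
}

total = sum(stages_ms.values())
print(f"motion-to-photon: {total:.1f} ms")
for stage, ms in stages_ms.items():
    print(f"  {stage}: {ms:.1f} ms ({100 * ms / total:.0f}%)")
```

With these made-up numbers the total lands inside the 10-25 ms window, and it also shows why rendering dominates the budget: at a 90 Hz refresh rate, a single frame already costs about 11 ms, which is why techniques that reuse a rendered frame (rather than rendering a new one) are attractive for hitting the target.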