> After all, our eye really is at different distances from different parts of a straight wall, so it sounds logical that we would see the fisheye effect described in the article.
It's not the distance to the points in the scene that determines where they appear in a perspective projection; it's the angle. Any single point on the screen/retina/projective plane can correspond to any distance from the camera/eye (i.e., a whole ray of possible points).
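To make that concrete, here's a minimal pinhole-projection sketch (the function name and focal length are just for illustration): every point along a single ray from the eye lands on the same screen position, no matter how far away it is.

```python
# Pinhole perspective: a 3D point projects onto the z = focal plane.
def project(point, focal=1.0):
    """Project (x, y, z) onto the screen plane at distance `focal`."""
    x, y, z = point
    return (focal * x / z, focal * y / z)

# Three points on the same ray from the eye (direction (1, 2, 4)),
# at distances 1x, 2.5x, and 10x:
for t in (1.0, 2.5, 10.0):
    print(project((1.0 * t, 2.0 * t, 4.0 * t)))
# All three print the same screen coordinates (0.25, 0.5),
# even though the points are at very different distances.
```

Only the direction (angle) of the ray matters; distance along the ray cancels out of the division.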
>Really? You can’t set up more than one monitor in your field of view?
Just because we can see movement at the outer edges of our field of view (to spot predators) doesn't mean I can comfortably stare at code out of the corner of my eye. Field of focus != field of view. Yes, I can see nearly 180 degrees, but I can only comfortably focus for work on what's in the center of my FoV.
> In front of the imaging devices there are high speed mask device (LCD?) that blocks off portions of the image that are not at the current focal depth.
It would be interesting to see how this works in practice. The eye can't focus on something that close. For example, imagine a speck of dirt on your glasses: it shows up as a blob in your vision, not as a dot.
> but the focal length afforded by the lenses is fixed
That seems weird to me, because when I look at a screen at a different distance from the one I'm using, I can see the other screens get blurry, just like in real life.
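The amount of that blur can be estimated with the standard thin-lens circle-of-confusion approximation (the numbers below are rough human-eye values I'm assuming, not from the thread): an object far from the focused distance produces a larger blur circle on the retina.

```python
# Thin-lens circle-of-confusion sketch: how blurry an object at
# distance d (meters) looks when the eye/camera is focused at
# distance s (meters).
def blur_diameter(d, s, focal=0.017, aperture=0.004):
    """Blur circle diameter on the retina/sensor, in meters.
    focal ~17 mm and aperture ~4 mm are rough human-eye values."""
    return aperture * (focal / (s - focal)) * abs(d - s) / d

# Focused on a screen at 0.6 m; another screen at 1.2 m:
print(blur_diameter(1.2, 0.6))   # noticeably blurred
print(blur_diameter(0.6, 0.6))   # 0.0: in focus
```

The farther the second screen is from the focused distance, the larger the blur circle, which matches the everyday experience described above.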
That's not actually the case, from what I heard. Most of our peripheral vision is smudged 720p at best, and we can only see a tiny focused area at slightly beyond 4K density. We also can't look at two things at once.
>But the input to the visual system is still an image on your retina, which obeys the same laws as camera optics.
The thing that makes the difference is that the resolution of the retina drops as you move away from the center. Which I suppose is what is being simulated here. Or at least it could be simulated efficiently that way, like foveated rendering, except the fovea stays at the center and the rest of the image is rendered with the pixels packed coarser instead of interpolated.
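A toy version of that idea (names and parameters are hypothetical, not any real renderer's API): keep full resolution inside a "fovea" radius around the gaze point and snap everything outside it to a coarse grid, imitating the retina's falling resolution.

```python
# Toy foveated sampler over a grayscale image stored as a list of lists.
def foveate(image, gaze_row, gaze_col, fovea_radius, block=4):
    """Return a copy of `image` where pixels outside the fovea are
    replaced by their block's top-left pixel (coarse sampling)."""
    rows, cols = len(image), len(image[0])
    out = [row[:] for row in image]
    for r in range(rows):
        for c in range(cols):
            if (r - gaze_row) ** 2 + (c - gaze_col) ** 2 > fovea_radius ** 2:
                # Outside the fovea: sample on a coarse grid, simulating
                # the lower effective resolution of peripheral vision.
                out[r][c] = image[(r // block) * block][(c // block) * block]
    return out
```

For example, with a 16x16 gradient image and gaze at the center, the center pixels survive unchanged while the corners collapse into 4x4 blocks of a single value.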
Consider foveation as well — only a small part of that is truly in “focus”.
Some up-and-coming glasses take this into account and expend fewer resources on the part outside the fovea: they don't render that part at high resolution, so it's left blurry, which is how our eyes work anyway.
Here's an optical illusion which illustrates how surprisingly tiny your fovea's FOV is: https://www.shadertoy.com/view/4dsXzM .