There already exist half-8K VR headsets though (a full 16:9 4K panel per eye), and you can keep seeing a difference there up through around 16K per eye and at higher refresh rates than they showed.
I think the visual quality of current high-end VR headsets is limited by their optical systems. 2K/eye already hits that limit, even at the center of the view (Pimax released a 4K/eye headset, but it also has a wider FOV).
When I first read the report, I thought 8K/eye was nonsense because it ran into a physical limitation rather than a technical problem. Isn't that true?
I don't get why you, or the other person who replied, concluded that I thought 8K per eye was feasible now. I only said that's when I'll go back and try VR. I'm positive we'll have at least that resolution in 10 years. I'm also perfectly aware of all the challenges involved in outputting such a resolution and manufacturing such a display.
It gets more interesting when you think of use cases beyond an 8K mono screen. "8K VR" typically means 3840×2160 per eye, for example. It could even theoretically unlock scenarios like simultaneous 4K to a local display and 1080p to a handheld device.
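For scale, a quick pixel-budget comparison (my own back-of-the-envelope arithmetic, not from any leak) shows that both the per-eye split and the dual-output scenario fit well inside a single full 8K frame:

    # Pixels per frame for each scenario, compared against one full 8K frame.
    full_8k = 7680 * 4320
    scenarios = {
        "full 8K mono": [(7680, 4320)],
        '"8K VR" (3840x2160 per eye)': [(3840, 2160), (3840, 2160)],
        "4K local display + 1080p handheld": [(3840, 2160), (1920, 1080)],
    }
    for name, outputs in scenarios.items():
        px = sum(w * h for w, h in outputs)
        print(f"{name:35s} {px / 1e6:5.1f} MP/frame ({px / full_8k:.0%} of full 8K)")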
The current leaks and analyst claims point to an 8K screen per eye [1], with some kind of advanced eye tracking and foveated rendering to keep the GPU processing requirements under control. PSVR2 has already shown that at least the foveated-rendering part is entirely practical, even at a much lower price point.
And all that money is for VR that doesn't fully replicate normal vision; the pixel count just isn't there yet. I doubt even quadrupling the number of pixels in current VR systems would get there.
It's certainly an interesting concept. I'm not an expert, but from what I've come across, it's currently very expensive to match the resolution of a 27" 4K monitor that's 2 or 3 feet away from your eyes. That's because the headset would need a much higher pixel density than 4K on the very small screens sitting a few inches from the user's eyes. I'm sure someone has done the calculation somewhere, but I couldn't find it. This page says we're at 2,000 pixels per eye but need 4,000 to get "Immersive VR" (not sure if that fits this use case): https://www.makeuseof.com/tag/virtual-reality-desktops-save-...
So it might happen eventually. People are actually leveraging "Virtual Desktops" now, but accepting the inherent resolution issues.
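Here's a rough version of that calculation (assuming a 27" 16:9 4K monitor viewed from about 30 inches and a headset with roughly 100 degrees of horizontal FOV per eye; both numbers are my own assumptions):

    import math

    # Angular pixel density of a desktop monitor vs. what a headset would need
    # to match it across its field of view (ignoring lens distortion entirely).
    def monitor_ppd(diag_in, h_pixels, aspect=(16, 9), distance_in=30.0):
        """Horizontal pixels per degree of a flat monitor at a given distance."""
        width_in = diag_in * aspect[0] / math.hypot(*aspect)
        h_fov_deg = math.degrees(2 * math.atan(width_in / 2 / distance_in))
        return h_pixels / h_fov_deg

    ppd = monitor_ppd(diag_in=27, h_pixels=3840)   # ~90 pixels/degree
    hmd_fov_deg = 100                              # assumed per-eye horizontal FOV
    print(f"Headset would need ~{ppd * hmd_fov_deg:.0f} horizontal pixels per eye")

That lands around 9,000 horizontal pixels per eye, in the same ballpark as the 8K-per-eye figures mentioned elsewhere in this thread.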
I guess I misspoke about it being 1920x1080, which is a "2K" screen split in two. An 8K screen split in two would be ~4000x4000 per eye, which is still 16x as many pixels as I said, plus the 33% increase in frame rate that I didn't include, which matches your second figure. Although, with how embarrassingly parallel it is, I don't think it's as far off as it seems. Especially considering that it's the previous generation of graphics cards that handles current-day VR fine, so we're already 1-2 years into the 9 years we have to wait, and with so many pixels anti-aliasing can probably be turned off completely. You could probably build something today that could do it; it would just be very expensive, and I don't think 8K panels at cell-phone size exist yet.
Ever-increasing improvements aren't required; the eye has a (more or less) fixed resolution. In fact, HMDs with extremely high resolutions already exist, e.g. the Varjo VR-2 Pro and the XR-1; they just cost an arm and a leg. In a few years we'll get them at consumer-level prices instead of having to drop $6-12k.
8k x 8k per eye at 120Hz is 64x more pixels at a 1/3 higher refresh rate, ~= 85x more processing power. Making the (maybe faulty) assumptions that processing power doubles every 2 years and that the current setup is processor-limited, that sort of processing power is ~13 years away.
The same computation with 4k x 4k per eye predicts ~9 years of progress needed.
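Spelling out the arithmetic (the implied baseline is roughly 1k x 1k per eye at 90Hz, with processing power doubling every 2 years):

    import math

    # Years of GPU progress needed under the parent's assumptions:
    # baseline ~1k x 1k per eye at 90 Hz, processing power doubling every 2 years.
    def years_away(side_px, hz, base_side_px=1024, base_hz=90, years_per_doubling=2):
        factor = (side_px ** 2 * hz) / (base_side_px ** 2 * base_hz)
        return factor, years_per_doubling * math.log2(factor)

    for side in (8192, 4096):
        factor, years = years_away(side, 120)
        print(f"{side}x{side}@120Hz: ~{factor:.0f}x the pixel rate, ~{years:.0f} years out")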
I meant VR in general. VR at 1440p per eye actually pushes fewer pixels than a single 2160p screen (2 * QHD = 7.4 million pixels vs. UHD = 8.3 million pixels). QHD per eye is a significant step up from 1080p per eye, and it's rumored that the next headset iterations will be at 1440p per eye.
VR can be bandwidth intensive. Low-end VR immersive-gaming headsets run at 60 fps, and high-end ones at 90 fps. And I've been told 120 fps is a noticeable, "like butter" improvement over 90 fps. So an 8K@30 pixel budget might really mean 4K@120.
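A quick sanity check of that tradeoff, counting raw pixels per second (uncompressed, ignoring blanking and any compression):

    # An 8K panel at 30 fps pushes the same raw pixel rate as a 4K panel at 120 fps.
    modes = {
        "8K @ 30":  (7680, 4320, 30),
        "4K @ 120": (3840, 2160, 120),
        "4K @ 90":  (3840, 2160, 90),
    }
    for name, (w, h, fps) in modes.items():
        print(f"{name:9s}: {w * h * fps / 1e9:.2f} Gpixels/s")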
VR lenses piss away pixels. With a current 1440x1440/eye panel, you might get only a ~500-pixel circle of crisply clear pixels. So if you want screen-comparable resolution, you're all set now... if that screen is a 1980s VGA IBM PC. With this wastage, 5K/eye might let you render virtual 1080p screens.
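Extrapolating that wastage claim (big assumption on my part: the usable central fraction stays roughly constant as panel resolution goes up with the same optics):

    # If only ~500 of 1440 horizontal per-eye pixels land in the crisp zone,
    # how many per-eye pixels before a virtual 1080p screen fits inside it?
    clear_fraction = 500 / 1440        # ~0.35, from the numbers above
    target_clear_px = 1920             # horizontal pixels of a virtual 1080p screen
    print(f"~{target_clear_px / clear_fraction:.0f} horizontal pixels/eye")  # ~5530, i.e. roughly "5K/eye"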
A headset providing both wide fov and this year's "high" resolution would be about 4K/eye. And I'm thinking about modding a narrow-fov drone headset (less lens blur) with two 4K panels, because I'm tired of waiting for the game-focused VR HMD market to get around to screen-comparable resolution.
"But won't 8K require an insane graphics card?" No. The old generation of VR headsets was doing brute force. Foveated rendering greatly reduces GPU load. And frankly, even without it, I've run a VR desktop (non-immersive low-fps with camera-passthrough) on a WMR HMD using a crufty old laptop's integrated graphics. GPU isn't a blocker for higher resolution HMDs.
I definitely see a need for graphics cards like these with virtual reality around the corner.
I haven't bought a new graphics card in 5 years because even next-gen games play well enough. But I can see VR changing that, with the need to render the same scene twice (once for each eye).
I wonder how much money you'd have to drop today on a rig that can push 4K to each eye? There isn't a headset that can support that yet, but I'd imagine it's at least 5-10 years out.
My guess is we need at least 4K displays per eye for VR to be as readable as a 1080p monitor, without distortions beyond the sweet spot in the very center.
There were some Kickstarter projects a few years back for VR headsets like this, but I guess none of them got any traction.