I honestly don't think we are close enough with the tools. It's the culmination of what is currently handwavium: neural interfaces, carbon nanotechnology, who knows? As a VR user and enthusiast, I can say FOV is a problem that lenses simply can't solve - and it's an even harder problem for AR.
I'm not a big fan of Apple, but I can hand them the "polish card" (at least under Jobs). With our current level of technology, that level of polish is impossible.
No problem! I'm a VR user because I want to keep up, I want to experience the bleeding edge - I want to experience it as it happens. I'm enthusiastic but realistic - science will solve this though, not Apple.
> With our current level of technology, that level of polish is impossible.
Hence the anticipation and excitement. I can’t wait for jet packs, hoverboards, and flying cars too - one can geek out over cool tech even if it ain’t right around the corner.
> jet packs, hoverboards, and flying cars too - one can geek out over cool tech even if it ain’t right around the corner
Two of those three are entirely doable (and have been done) - the main problems with them are 1) economics, and 2) that, while cool for an individual, they'd be absolutely disastrous if ubiquitous. You don't want everyone to get themselves a jet pack or flying car, not unless 99.9% of flying is done by a foolproof AI.
Put your fingers in front of your face, in the middle of each eye. The glasses would only display the overlay between your fingers. The other half of your vision (and it is a hard cut off as you move your head) does not get the overlay. That would be roughly 90 degrees, which puts 60 and 30 in perspective.
Say we have 1,000 pixels. With current lenses we can display 500 of those pixels over 30 degrees of vision. This is the tradeoff between FOV and resolution. You could also display those 500 pixels over 60 degrees of vision at half the resolution. A better lens increases the maximum resolution, so you can show 500 pixels over 15 degrees of vision. But you're still going to bottleneck on the total number of pixels you can display. And if you're targeting FOV, higher resolution only hurts, because it spends more of your total pixel budget to fill less of the user's vision.
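To make the tradeoff concrete, here's a back-of-the-envelope sketch using the same hypothetical 1,000-pixel budget (the numbers are illustrative, not any real headset's specs):

```python
# Fixed 1-D pixel budget: angular resolution and FOV trade off directly.
PIXEL_BUDGET = 1000  # pixels available along one axis (hypothetical)

def pixels_per_degree(pixels_used, fov_degrees):
    """Angular resolution achieved by spreading pixels over a field of view."""
    return pixels_used / fov_degrees

# The same 500 pixels: 30 degrees at ~16.7 px/deg, or 60 degrees at half that.
print(pixels_per_degree(500, 30))  # ~16.7 px/deg
print(pixels_per_degree(500, 60))  # ~8.3 px/deg

# A better lens can concentrate pixels (500 px over 15 deg = ~33 px/deg),
# but total coverage is still capped by the budget:
print(pixels_per_degree(PIXEL_BUDGET, 90))  # ~11.1 px/deg if you chase FOV
```

Whatever the lens does, the panel's pixel count sets the ceiling: widening FOV dilutes angular resolution, and vice versa.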
I totally wouldn't mind that. If the technology gets there, apps would surely be integrated. I would love a goals app that syncs with your smartphone and makes life like a video game. The possibilities are unlimited with AR glasses.
There are already impressive things happening with ARKit; I have a feeling there will be plenty of developers using a relatively mature toolset already if/when they drop a real AR product outside of a phone.
They were going to replace tape storage with holographic storage before they figured out how to apply it to AR. Here's the holographic storage pitch deck:
Consider foveation as well — only a small part of that is truly in “focus”.
Some up-and-coming VR glasses take this into account and spend fewer resources on the part that's outside the important region (they don't render that part at high resolution, so it's left blurry, which is actually how our eyes work anyway).
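The idea above (foveated rendering) can be sketched roughly like this. The radius and scale factor are made-up numbers, and real implementations do this in the GPU pipeline rather than per-pixel in Python:

```python
import math

# Hypothetical foveated-rendering budget: full resolution only inside a
# small foveal circle around the tracked gaze point, coarser everywhere else.
def shading_rate(pixel, gaze, foveal_radius=100, peripheral_scale=0.25):
    """Fraction of full resolution to spend at `pixel` given `gaze` (px coords)."""
    dist = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    return 1.0 if dist <= foveal_radius else peripheral_scale

gaze = (960, 540)  # eye tracker reports gaze at screen center
print(shading_rate((970, 545), gaze))  # inside the fovea: full res (1.0)
print(shading_rate((100, 100), gaze))  # periphery: quarter res (0.25)
```

The periphery still gets *something* (light and motion), which matches how peripheral vision works - low detail, but not absent.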
It's already implemented in the StarVR headset (which has 210° FOV) using Tobii eye tracking tech. They had a working demo at Siggraph a couple weeks ago.
> It’s crazy unbelievable how low-latency Tobii’s eye tracking is
Tobii uses cameras at 60 or 120 Hz, so latency is over 8 ms. It's the mature but older, more limited tech; MEMS-based tracking is faster. During a saccade, you have enough samples to predict where the eye is going to end up fixated, 20 ms before it happens. [1]
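The latency arithmetic, considering only the sampling period (real latency also includes exposure and processing time, so this is a floor, not an estimate):

```python
# Camera-based eye tracking: best case, you wait one full sample period
# before the tracker can even see that the eye has moved.
def min_latency_ms(sample_rate_hz):
    """Lower bound on tracking latency from sampling alone."""
    return 1000.0 / sample_rate_hz

print(min_latency_ms(120))  # ~8.3 ms floor at 120 Hz
print(min_latency_ms(60))   # ~16.7 ms floor at 60 Hz
```

Saccades last tens of milliseconds, which is why faster sampling (e.g. MEMS-based) leaves enough early-velocity samples to extrapolate the landing point before the eye gets there.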
From glancing at the edges of my glasses, about 160-180 degrees, but anything close to the edges is always blurry; there's no way to avoid some level of lensing error. (Barring contacts.)
I've always favored extra-wide glasses, though. Some of the smaller ones might be 140 or even 120 degrees.
Interesting question, but I'm not sure how relevant that is. Try covering the parts of your view beyond the edges of your glasses, e.g. with your hands, and see if the experience is the same as with just your glasses. I bet there is a huge difference.
I think the issue is that the parts outside your glasses are blurry anyway (even for people with perfect vision), but even the blurry part is important for the experience.
I'm quite nearsighted as well, but being able to see light and motion (even if blurry) is important to me, even if it's only really beneficial mentally. I might be nearly blind, but I don't want to feel blinded.
Thanks for the details. I briefly looked at holographic storage when I did my undergrad thesis on holography.
Anyone care to comment as to whether this storage methodology could be moved forward?
(Apple acquisitions are always fascinating. I did a deck on PrimeSense for an old boss when we thought it was going into an Apple TV product rather than FaceID.)
That's not so easy, but you can make some predictions. What areas are going to be interesting to FANG in the coming years? What companies are doing something useful in that space? Who has something unique that FANG companies would have difficulty replicating in-house?
This is an interesting claim and I can't decide if it's true.
On the one hand, of the people working at companies that get acquired by FANG, I would bet a lot of money that fewer than 1/4 are founders. Even if the vast majority of acquisitions were acqui-hires of founders only, it just takes a few mid-sized acquisitions to tilt things back toward more non-founders.
On the other hand, P(FANG acquisition | founding a company) might be higher than P(FANG acquisition | taking a dev job ~someplace~). I.e., there are a lot of small companies out there that would like to hire you but aren't high-growth, high-tech startups with a shot at the outcome you want.
While that’s a decent starting point, the larger reason is the App Store. Apple sees the value in being the interface for your daily computer interaction.
Imagine being the interface to your world. You are the gatekeeper to everything a person sees. There is exponentially more value possible there than in the App Store - and the App Store alone made them the largest company in the world.
Everything that video says explains why they can succeed, but the driving force is selling more software.
The primary information available on their website seems to be associated with holographic data storage. They had a prototype working in 2010 [1].
Anyone familiar with this technology, and how it may compare to, say, SSDs? A 2012 article seems pretty down on them [2].
Akonia also simply acquired the assets after InPhase filed for bankruptcy in 2011 [3]. If so, being purchased by Apple has to be a big win for the Akonia investors!