Bounds for the best possible designs for optical devices: well-studied [0, 1, 2, 3, 4], yet really hard.
More specifically, whenever you give a designer a design spec, it is always worth asking: how good is the best possible design for this spec? And, of course, can the designer actually achieve it, or something close to it? That is the question here.
In this scenario, the design spec is the optimization problem (what you want to optimize), and the designer then gets to choose how best to approach it. You want to give a number that states, independent of how the problem is solved, the best that any designer (no matter how smart or sophisticated, or how much computational power they have) can hope to do. In many cases, giving such a number is actually possible! (See the references below.)
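To give the flavor of this, here is a minimal sketch on a made-up toy problem (it is not the formulation used in the papers below): treat the design as a binary material layout, relax it to a box, and the relaxed optimum becomes a certified floor under what any binary design can achieve, however it was found.

```python
# Toy illustration (hypothetical problem, not the formulation from the papers below):
# the "design" is a binary material layout x in {0,1}^n and the objective is a
# field-mismatch ||A x - b||^2.  Relaxing x to the box [0,1]^n gives a convex
# problem whose optimal value is a certified lower bound on what ANY binary
# design can achieve, no matter how clever the designer or the algorithm.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m = 40, 60
A = rng.standard_normal((m, n))   # stand-in for a linear physics/field operator
b = rng.standard_normal(m)        # stand-in for the target field

x = cp.Variable(n)
relaxation = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b)),
                        [x >= 0, x <= 1])
bound = relaxation.solve()

# Any heuristic binary design (here: naive rounding of the relaxed solution)
# can only do as well as, or worse than, the bound.
x_bin = np.round(x.value)
achieved = float(np.sum((A @ x_bin - b) ** 2))
print(f"certified lower bound: {bound:.3f}")
print(f"rounded design scores: {achieved:.3f}")   # always >= bound
```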
Does this sound like something out of Dirk Gently to anyone else?
> Now the Stanford engineers believe they've broken that bottleneck by inventing what they call an inverse design algorithm.
> It works as the name suggests: the engineers specify what they want the optical circuit to do, and the software provides the details of how to fabricate a silicon structure to perform the task.
I'm quite sure the first novel, Holistic Detective Agency, described a computer program that could do that. (In the novel it was of course sold to a governmental body, maybe the military.)
The problem with these approaches is that they are limited by the electronics needed to control the optics. It is all about the electronics. The numbers presented for the chip were worse than what can be accomplished with conventional digital electronics at an equivalent bit depth.
The optical filter is critical to the design, and a big curiosity to me. Normal dielectric filters have a large angle dependence. Of course, it is claimed to be solved by machine learning; I am hopeful but skeptical. I need to dig further into this.
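For a sense of the scale of that angle dependence, here is the standard blue-shift estimate for a thin-film interference filter; the effective index below is an assumed round number, not a value from this paper.

```python
# Ballpark of the blue-shift of a thin-film interference filter with angle:
# lambda(theta) ~ lambda_0 * sqrt(1 - (sin(theta)/n_eff)^2).  The effective
# index n_eff here is an assumed round number, not taken from the paper.
import numpy as np

lam0 = 550.0   # nm, passband center at normal incidence
n_eff = 1.8    # assumed effective index of the coating stack

for theta_deg in (0, 10, 20, 30, 45):
    theta = np.radians(theta_deg)
    lam = lam0 * np.sqrt(1 - (np.sin(theta) / n_eff) ** 2)
    print(f"{theta_deg:>2} deg -> ~{lam:6.1f} nm ({lam - lam0:+.1f} nm shift)")
```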
The fact that there is an analytical solution to the problem is a really nice achievement, but I have to wonder: solving this problem numerically has surely been within reach of computers for a long time, and should therefore have made actually manufacturing such lenses somewhat straightforward.
We have tons of numerical simulations in engineering. Simulating light modulation alone, through just 3-5 lenses with different qualities, can occupy a modern processor for a few hours.
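For a rough sense of where those hours go, here is the cheapest scalar model, angular-spectrum propagation between surfaces; the grid size, sampling, and distance are illustrative only, and an optimizer repeats this hop (plus a phase screen per lens) for every candidate design.

```python
# The cheapest scalar model: angular-spectrum propagation between surfaces.
# Two 2-D FFTs per hop, and an optimizer repeats this (plus a phase screen per
# lens) for every candidate design.  Grid size, sampling and distance below are
# illustrative only; real apertures often need far denser grids.
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a sampled complex field a distance z (all lengths in meters)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    mask = arg > 0
    kz = 2 * np.pi * np.sqrt(np.where(mask, arg, 0.0))
    H = np.where(mask, np.exp(1j * kz * z), 0.0)   # evanescent components dropped
    return np.fft.ifft2(np.fft.fft2(field) * H)

n = 2048                       # samples per side
field = np.ones((n, n), dtype=complex)
out = angular_spectrum_propagate(field, wavelength=550e-9, dx=1e-6, z=10e-3)
print(out.shape)               # one surface-to-surface hop of a multi-lens system
```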
It's always fun when somebody from another field (computer science) discovers something that is very well known in your field (E&M). One of the parts missing from this paper is that the PSFs of optical systems have a real and an imaginary part, but measuring the imaginary part is quite difficult and is mostly only done for biological samples.
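In Fourier-optics terms, the point is that the amplitude PSF is complex (the Fourier transform of the pupil function), but a detector only records its squared magnitude. A small sketch with an arbitrary circular pupil and a bit of defocus, not this paper's system:

```python
# The amplitude PSF is the (complex) Fourier transform of the pupil function;
# a camera records only |PSF|^2, so the phase/imaginary part is lost unless you
# measure it interferometrically.  Circular pupil and the defocus term are an
# arbitrary example, not this paper's system.
import numpy as np

n = 512
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x)
R2 = X**2 + Y**2
pupil = (R2 <= 0.25).astype(complex)        # circular aperture
pupil *= np.exp(1j * 4.0 * R2)              # a bit of defocus -> complex pupil

amplitude_psf = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
intensity_psf = np.abs(amplitude_psf) ** 2  # what the detector actually records

print(amplitude_psf.dtype)   # complex128: real and imaginary parts
print(intensity_psf.dtype)   # float64: the phase information is gone
```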
> Is there a technical limitation that make it impractical?
Yes. Diffractive optics (which includes meta-lenses) have significant wavelength dependence. Visible light spans 400-700 nm, wavelengths that differ by almost a factor of 2, which means blue light will focus almost twice as far away as red light does.
The neat bit is this is actually the reverse of how refractive optics behave, which means you can use both together and cancel out a significant portion of chromatic aberration. If we can scale up the manufacturing (and ideally apply them to curved surfaces) they could improve performance and reduce complexity and weight of VR/AR optics.
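A back-of-the-envelope version of both points, with illustrative numbers: a diffractive focal length scales as f(lambda) = f0 * lambda0 / lambda, and pairing it with glass via the usual thin-lens achromat condition splits the power between two positive elements.

```python
# Diffractive focal length scales as f(lambda) = f0 * lambda0 / lambda
# (design wavelength and focal length below are made up):
lam0, f0 = 550e-9, 10e-3
for lam in (400e-9, 550e-9, 700e-9):
    print(f"{lam * 1e9:.0f} nm -> f = {f0 * lam0 / lam * 1e3:.2f} mm")
# 400 nm focuses ~1.75x farther away than 700 nm.

# Thin-lens hybrid achromat: powers add, and the condition
# phi_r / V_r + phi_d / V_d = 0 with the diffractive "Abbe number" ~ -3.45
# puts most of the power in the glass, and both elements come out positive.
V_r, V_d = 64.0, -3.45            # BK7-like glass vs. diffractive surface
phi = 1.0 / 10e-3                 # total power of a 10 mm hybrid
phi_r = phi * V_r / (V_r - V_d)
phi_d = phi - phi_r
print(f"refractive share: {phi_r / phi:.1%}, diffractive share: {phi_d / phi:.1%}")
```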
A friend of mine worked for an ASML supplier. He was working on adjusting the optical path based on how the laser going through a lens heats that lens up and changes its optical qualities. There are so many challenges we don't even think about.
You'd run into diffraction limit for any reasonable pixel size long before the pinhole diameter gets smaller than the wavelength of visible light. The rest of your comment I fully agree with.
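Rough numbers behind that point (the pinhole-to-sensor distance and pixel size below are assumed): the blur of a pinhole is roughly the geometric spot plus the diffraction spot, and the best trade-off is reached while the pinhole is still far larger than the wavelength.

```python
# Pinhole blur ~ geometric spot (~d) + diffraction spot (~2.44 * lambda * f / d);
# the best trade-off sits near d ~ sqrt(2.44 * lambda * f).  Pinhole-to-sensor
# distance and pixel size below are assumed, not from any particular camera.
import numpy as np

lam = 550e-9          # green light
f = 5e-3              # 5 mm pinhole-to-sensor distance
pixel = 1.5e-6        # typical small smartphone-class pixel

d_opt = np.sqrt(2.44 * lam * f)
blur = d_opt + 2.44 * lam * f / d_opt      # ~2 * d_opt at the optimum
print(f"optimal pinhole ~{d_opt * 1e6:.0f} um  (~{d_opt / lam:.0f} wavelengths)")
print(f"blur spot       ~{blur * 1e6:.0f} um  (~{blur / pixel:.0f} pixels wide)")
```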
The actual paper is about realization of an algorithmically-designed optical device rather than optical switching. The device is "just" a computer-designed multiplexer.
Classic innovator's dilemma. I personally am biased to silicon optics after seeing a few impressive demos. Albeit in laboratory conditions, under controlled settings ;)
Thanks, but I think you may have misunderstood me. I meant actual optics.
I agree that better sensor tech is also very useful. It is the end of the line for the optical path, after all. Any way you can get photons picked up better is great!
But the actual optics, the mirrors, the lenses, the fibers, the filters, the E-O waves, etc. That's where all the jazz is. I'm not kidding when I say it's an easy Nobel. STED was just putting the right filter in the right place. DIC was just using a 1/4 plate just off the focus. PALM/STORM is just using a specific dye and a fast lamp. Blue LEDs are just a bit of chem. Optogenetics is just the right slime out of just the right pool.
Now, all those things built upon a LOT of other work, but it's not terribly difficult stuff to do. The processes are very straightforward. I mean, Hell built STED in his living room out of cardboard boxes, literally.
But, you can also toil away on these projects for decades, tweaking this, isolating that.
-----
[0] https://pubs.acs.org/doi/10.1021/acsphotonics.9b00154 (PDF: http://web.stanford.edu/~boyd/papers/pdf/comp_imposs_res.pdf)
[1] https://arxiv.org/abs/2002.00521
[2] https://arxiv.org/abs/2003.00374
[3] https://arxiv.org/abs/2002.05644
[4] https://arxiv.org/abs/2001.11531