For a long long time, science fiction movies have used transparent displays to seem more "futuristic." However, actual transparent displays have been underwhelming in practice.
This technology is being used by Xiaomi to make a "transparent TV"[1] which, in theory, you will be able to buy. (I realize that not all of these sorts of announcements survive until actual product shipment.)
It will be interesting to see one of these "in the flesh", as it were. I'm at a loss what you'd have behind it that wouldn't make it really, really annoying to watch, but the cool factor could be right up there.
It would be cool if they could be used to implement a wearable HUD-style/face mask interface, like Google Glass but where the entire lens is the screen.
You end up needing more complex optics to focus the image onto your retina at a distance that feels comfortable. An OLED display on a glasses lens with no other optics would just look like a bright blur between you and the rest of the world.
Why not combine the two? Have your traditional opaque displays, then overlay the transparent system. I imagine this could be useful in court, allowing both the prosecution and defense to scribble on video evidence presented to the jury to point things out and highlight important bits, without ever modifying the underlying evidence video. Kinda similar to how sports commentators will use video overlay to scribble on replay segments to point out the movements of the various players to highlight why those movements were important and contributed to the outcome.
Or double layered: Transparent black LED behind transparent OLED
The black LED pixels can supplement the dark/opaque colors, while simultaneously supporting transparent colors with OLED pixels (i.e. like the png format).
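To make the PNG analogy concrete, here's a minimal sketch (a hypothetical linear-light model; the function name and 0-1 value ranges are made up for illustration) of how a light-blocking layer plus an emissive layer could combine, with the blocking layer playing the role of the alpha channel:

```python
def composite_pixel(emissive, opacity, background):
    """Toy model of a dual-layer transparent display pixel.

    emissive:   light added by the OLED layer (0.0-1.0)
    opacity:    how much the blocking layer darkens what's behind it (0.0-1.0)
    background: brightness of the scene behind the panel (0.0-1.0)

    Works like PNG "over" compositing: the blocking layer acts as the
    alpha channel, the OLED layer as the color.
    """
    return min(1.0, emissive + (1.0 - opacity) * background)

# Fully opaque black pixel: blocks the background entirely.
print(round(composite_pixel(0.0, 1.0, 0.8), 3))   # 0.0
# Pure OLED pixel with no blocking: the bright background bleeds through.
print(round(composite_pixel(0.5, 0.0, 0.8), 3))   # 1.0 (saturates)
# Half-opaque layer under a dim emissive pixel.
print(round(composite_pixel(0.3, 0.5, 0.8), 3))   # 0.7
```

A real panel would of course also need the two pixel grids aligned per-pixel, which is its own manufacturing problem.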
Yes, I worked on a video game that was set in the near future and had transparent and semi-transparent screens everywhere. I spent a lot of time adding new rendering features to the engine that did proper transparency sorting and blur/darkening passes, and helping artists tweak materials to make them more readable. At multiple points in the project I wanted the art direction to switch to opaque screens, not only because of the technical effort (in many ways that was fun) but just because of how impractical they were in many scenarios.
Tangentially related, the "futuristic" displays in video games and movies and TV often look worse than what we'd see in real life.
Sometimes it's a hologram, which would be cool even if the fidelity were a bit shit, but it's interesting that art directors make them pretty bad looking even in fiction, presumably as a cue to the viewer that it's a hologram. Any fiction that portrays holograms as opaque and convincing usually plays it for surprise, because standard-issue holograms in fiction aren't convincing.
I was particularly offended by this in the recent Star Trek: Picard.
In previous shows the Federation has demonstrated the ability to create totally flawless and interactable holograms.
Yet somehow in this show everyone has switched over to glowing, flickering interfaces hovering over your desk.
In the headquarters of Starfleet, a giant hologram of a starship is in the atrium, and it glitches out for a moment as the camera looks at it, just to let the viewer know it's a hologram.
Not only is there no reason for holograms to glitch out all the time assuming that the technology could be created, but this TV series has already demonstrated their ability to do this much better!
ST:Picard and ST:Discovery are unfortunately both regressions when it comes to ST worldbuilding and style. Frankly, Lower Decks (the currently airing animated "Rick & Morty meets ST:TNG") does it better. Holograms there look and behave like they should.
I've heard (and the art director on the game followed the same philosophy) that art directors tend to make future tech look worse/glitchy on purpose so it's very clear to the audience that it's "tech". Technology that for all intents and purposes works mostly flawlessly looks a little too much like magic to audiences, I guess? Really, you'd just want a person to show up and look like they were there, but that might not read clearly as a hologram to a 20th- or early-21st-century moviegoer, so they're monochrome blue, flickering and noisy, with clear projector rays. I'd like to see a sci-fi game or movie where all of the tech mostly worked flawlessly, except maybe in some exceptional situations.
Right. Tech in Star Trek is part of the world, protagonists are living and breathing it, so it's shown as mostly working and not exceptional to people on-screen. Still, usually the viewer is told that something is a piece of tech through sounds and blinking lights.
As for Star Trek holograms, when the episode doesn't want to "play for surprise", it usually briefly shows the hologram appearing - which uses its very specific VFX, so that it's immediately recognized as a hologram (and not confused with teleportation, which uses a different style of VFX).
I thought the Hologram technology in Blade Runner 2049 was pretty "flawless". The most interesting thing for that movie was that you see different levels of hologram technology.
The hologram that follows the main characters seems to be pretty high tech, and looks flawless. And they show the hologram stuff by having her move through objects.
But when the main characters walks around the city, you see hologram commercials that are the size of buildings. Those don't look lifelike. They're transparent and have limited colors.
Yep, contrast and color accuracy are terrible problems to solve. It would be easier to fake it with a camera on the back of the phone... like Merged Reality.
Interesting. My Palm V (palm pilot) had a nice transreflective display 20 years ago. It was great in direct sunlight. I often think of that when using devices that can’t crank up the nits enough to compete with sunlight.
That is pretty cool. I hadn't thought what it would be like with two of these back to back in a gaming situation.
From the video you linked the opacity seems to be much better than I expected. I'm hoping one or more of the high end TV shops put this into their show room so I can see it in person.
Well for one it would be great to place on mirrors to "try on" clothes, or any kind of augmented reality like that. They can also be placed on your windshield for directions and other info like your speed. Gaming monitors with "designer" internals so that when your screen is off you can see the cool inside, similar to some watches with exposed gears.
My first thought was "well of course they don't," but their concept renders do indeed have opaque black.
I'm guessing these are just technically uninformed renders with no basis in actually-proposed applications, but it seems at least plausible that one could combine transparent OLED with a transparent LCD backing (sort of the inversion of a backlight) to create a variable-transparency screen.
Pardon my ignorance, but wouldn't an LCD need polarized light to perform the darkening? Maybe a transparent, pixelated electrochromic display layer would work.
The backlight of an LCD isn't itself polarized. A TN LCD display is made of two orthogonal polarization filters with the liquid crystals in between. In their relaxed (twisted) state, the crystals rotate the polarization of the light so that it can pass the second filter. When a voltage is applied, the crystals straighten and the second filter blocks the light.
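The gray levels in between follow Malus's law. Here's a toy model of an idealized TN cell between crossed polarizers (it deliberately ignores the roughly 50% the first polarizer costs unpolarized ambient light, which is also why an LCD layer would dim the see-through view even when "clear"):

```python
import math

def tn_transmission(drive_level):
    """Idealized TN cell between crossed polarizers.

    drive_level 0.0 = relaxed: the twisted crystals rotate the
    polarization 90 degrees, so light passes the second (orthogonal)
    filter. drive_level 1.0 = fully driven: no rotation, the crossed
    filter blocks everything. Intermediate voltages give gray levels
    via Malus's law: I = I0 * cos^2(angle to the filter axis).
    """
    rotation = math.radians(90 * (1.0 - drive_level))   # rotation by the crystals
    angle_to_filter = math.radians(90) - rotation       # second filter is at 90 deg
    return math.cos(angle_to_filter) ** 2

print(round(tn_transmission(0.0), 3))   # 1.0 -> pixel transparent
print(round(tn_transmission(1.0), 3))   # 0.0 -> pixel dark
print(round(tn_transmission(0.5), 3))   # 0.5 -> mid gray
```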
A basic transparent OLED screen doesn't, but is often slightly tinted, and (indoors) its pixels are bright enough that the transparent pixels are dark enough to appear black by comparison. The same way you can run a projector on a white wall during the day, and its "black" is just the normal wall color.
But while I've never seen it, I don't see why you couldn't put a transparent OLED over an LCD screen like the one in your laptop. Combined, I think they'd give you every combination of color and opacity you could want.
I don't know if there's any technology that combines the two capabilities into a single layer.
Well, this would be another method for achieving heads-up displays. Here's an application that I think would be really cool: HUD retrofits for cars. Take a transparent plastic film and stick it to the inside of your windshield, much as you would apply a tint film to the side and rear windows. Then take the connector dangling from the corner of the film and connect it to a supplied box that plugs into the OBD port, or some other standardized connector that ties it into the necessary vehicle systems.
I could also totally see Tesla or one of the older luxury car brands doing something similar, where the HUD in the front windows is integrated into the car systems.
IMO, the harder part will be not integrating the technology in this article (or something similar), but designing the HUD to be something actually useful and informative to the vehicle operator, affording them better, more informed control of their vehicle, instead of being some distracting thing that you can't disable.
That would be cool for aftermarket HUDs. As far as making a useful HUD goes, some carmakers are already doing that. I have a 2019 Mazda CX-5 and it has a HUD that is projected on to the windshield. It tells me my current speed, the current speed limit of the road I am on, if there is a stop sign ahead, and if I run the GPS system it displays the next cross street, and arrow symbols for upcoming turns.
I personally find it super useful! I don't need to look down at my instrument panel as my current speed gets sort of projected on road ahead of me. It's small and a barebones design but it works.
The thing you describe works by passing an image through lenses and reflecting it off the windscreen; that way, you can focus on it and the road at the same time. For transparent-OLED-based systems, you'd have to refocus on the windscreen itself to read them, so you couldn't see them and the road at once; they'd appear much nearer.
My 2011 Camaro also has a HUD, and I’d hate to go back to a car without it. I only use cruise control very infrequently, so being able to see what speed I’m going without taking my eyes off the road is great. I honestly think it should be a mandated feature.
While I fully expect automakers to play that game however they can, there's not a huge benefit to moving the display up into your field of view from just below it (like a Prius), and there are a number of downsides.
And a real HUD, beyond deciding what information to present and how to present it (as you rightly pointed out), is also technically far more complex than slapping a display on glass:
"I commented on the original post, but this is a great example of reason #1 why HUDs are still green after 50+ years.
"Colors are additive with the outside world. There are conditions (sunset/sunrise) where you could get misleading information. Ambers and reds mean specific things. (https://images.app.goo.gl/G5Udbponiu1M1Fnc6)
"The optics in projecting an image reliably at optical infinity (makes it look like a billboard 500 yards in front of you rather than a TV screen 6 feet from you) are actually quite complicated and tuned to the specific wavelength. Having multiple colors drastically increases the complexity, or reduces the clarity because you have to compromise on the optimal wavelength.
"Generally the goal is to be as consistent with the head-down display as possible to limit the head cycles spent context switching, but in the case of colors we have other means of calling attention to important info: boxing and flashing both do a pretty effective job."
Anything less than that just risks becoming obscured, with worse contrast and the same focal-distance issues as a good instrument cluster.
That said, I wouldn't mind something I could stick on the glass of a vintage vehicle up in the corner so I can add instrumentation without modifying the dash.
But you're still going to get a much better approach with a windshield-mounted glass combiner, Picture Generating Unit and a projector.
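The "optical infinity" point above can be illustrated with the thin-lens equation: put the display source exactly at the focal plane and the virtual image recedes to infinity; misplace it by a centimeter and the image snaps back to a few meters. A minimal sketch (the focal length and distances are made-up example numbers):

```python
def image_distance(f, d_obj):
    """Thin-lens sketch: apparent distance of a HUD source's image.

    f and d_obj in meters. Returns the image distance from
    1/f = 1/d_obj + 1/d_img, i.e. d_img = f*d_obj / (d_obj - f).
    Negative = virtual image on the viewer's side; inf = collimated,
    meaning the image sits "at optical infinity".
    """
    if abs(d_obj - f) < 1e-12:
        return float("inf")
    return (f * d_obj) / (d_obj - f)

# Display exactly at the focal plane: image at optical infinity.
print(image_distance(0.20, 0.20))        # inf
# Display 1 cm inside the focal plane: virtual image only ~3.8 m out.
print(abs(image_distance(0.20, 0.19)))   # ~3.8
```

That sensitivity (plus the wavelength tuning mentioned in the quote) is why a film stuck on glass can't replicate a real combiner-based HUD.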
I had an idea recently about creating slight-3D displays by stacking layer of LCD panels that give some impression of depth when moving your head, this might make that feasible.
Thank you, it's not what I was thinking of but looks interesting. Seems like there is some history of multi-layer displays that I wasn't aware of, not that I took the idea any further than a "wouldn't it be cool if..." scenario.
I can't tell if it's a feature of the screen or just the very dark nature of subway tunnels, but I really like the fact that the screen goes completely opaque on the bright blue screens.
Transparent screens could be useful in all sorts of scenarios, but I think tech demos too often ignore the fact that, most of the time, you want the image to be as clear as possible, which means opaque.
I'd love to see transparent screens in windows, like, real holes in buildings and furniture that are usually filled with glass. This would be a step towards ambient computing.
I wonder why no one suggested putting a very thin, transparent monochrome LCD behind one of these. It would allow darkening the background, as opposed to OLED's ability to emit light but not block it from passing through. Basically, you'd use two display technologies to complement each other.
If you put your webcam behind your video chat window, you can achieve better eye contact with your remote participants. I wonder how well a webcam can see through these screens and/or how much of a hole would be required in the rendered image to avoid obstructing the camera.
With the right level of integration the camera could subtract the image that is in front of it. That would require having a pretty good model of the point spread function, perfect synchronization, etc.
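As a toy illustration of that subtraction, here's a 1-D sketch that assumes the two hard parts are solved: a known point spread function and perfect sync (the scene, displayed content, and PSF values are all invented):

```python
def convolve1d(signal, kernel):
    """Naive 1-D convolution with zero padding (odd-length kernel)."""
    half = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, k in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < len(signal):
                acc += signal[idx] * k
        out.append(acc)
    return out

def subtract_display_bleed(captured, displayed, psf):
    """Estimate the scene behind the panel by removing the display's
    contribution: scene ~= captured - PSF * displayed."""
    bleed = convolve1d(displayed, psf)
    return [c - b for c, b in zip(captured, bleed)]

# Hypothetical camera "scanline": the scene plus blurred display content.
scene     = [0.2, 0.2, 0.8, 0.8, 0.2, 0.2]
displayed = [0.0, 1.0, 1.0, 0.0, 0.0, 0.0]
psf       = [0.25, 0.5, 0.25]          # assumed point spread function
captured  = [s + b for s, b in zip(scene, convolve1d(displayed, psf))]
recovered = subtract_display_bleed(captured, displayed, psf)
print([round(x, 3) for x in recovered])  # recovers the original scene
```

With a mis-estimated PSF or timing jitter the residual wouldn't cancel this cleanly, which is the "computational photography" part.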
OLEDs have a fast response... I wonder if it would be possible to put the camera shutter and the oled out of phase enough to substantially dim it.
Perhaps polarization could be used to get better isolation.
With all the engineering required to do it, it might be much less expensive to have three or four cameras at the edges of the display, then extract a depth map and resynthesize a view from the perspective of the centre of the screen. :)
Plus that would give you bonus features like being able to automatically blur out or heavily denoise the background. :)
> I wonder if it would be possible to put the camera shutter and the oled out of phase enough to substantially dim it.
That's a neat idea. It reminds me of the early fighter planes that fired bullets between the propeller blades by having the gun driven off the engine, timed so that the bullet passes through the plane of the propeller while the blades are not in the way.
> I wonder if it would be possible to put the camera shutter and the oled out of phase enough to substantially dim it.
That is exactly the idea, to sync the camera to capture in between OLED pulses. It's not going to be perfect, but then to use computational photography to subtract whatever bleed of pixel color is left.
In theory, it should be totally workable. In practice, I'm not sure what the tradeoffs are in terms of acceptable image quality and resolution for the camera.
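A back-of-envelope sketch of the timing budget involved, assuming (hypothetically) a PWM-dimmed OLED and a camera that only exposes during the panel's dark intervals; all of the numbers are made up:

```python
def exposure_budget(pwm_hz, oled_duty, cam_fps):
    """Dark-interval time a gated camera could harvest per video frame.

    pwm_hz:    OLED PWM frequency
    oled_duty: fraction of each PWM cycle the pixels are lit (0-1)
    cam_fps:   camera frame rate
    Returns (off_time_per_pwm_cycle_s, usable_exposure_per_frame_s).
    """
    cycle = 1.0 / pwm_hz
    off_time = cycle * (1.0 - oled_duty)
    cycles_per_frame = pwm_hz / cam_fps
    return off_time, off_time * cycles_per_frame

off_per_cycle, per_frame = exposure_budget(pwm_hz=480, oled_duty=0.6, cam_fps=30)
print(round(off_per_cycle * 1000, 3))  # ms of darkness per PWM cycle
print(round(per_frame * 1000, 3))      # ms of total exposure per 30 fps frame
```

In this example the camera keeps 40% of each frame period as exposure time, i.e. it loses a bit over one stop of light, which is where the image-quality tradeoff comes in.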
I would think a teleprompter style setup should work.
I remember ages ago someone created a reverse version, basically a periscope you hang over the camera on your laptop. I think the idea was you position the window with the face behind the mirror.
It can, I've tried gaze correction software, but it has the unfortunate side effect of making the speaker look possessed. Total uncanny valley.
We humans are sensitive to every tiny detail of the eye, because we express so much emotion through it. It's no joke that the "eye is the window to the soul". Gaze correction software literally winds up looking like it messes with your soul.
Maybe someday we'll invent gaze correction that actually looks realistic, but because it involves not just moving the eyeball but also opening/closing the eyelid (the eyelid moves as our eyes move vertically, which is exactly the axis a webcam correction has to shift), therefore moving the eyelashes, compensating with the lower eyelid, and more... it's going to be really hard.
I recall once having a conversation with someone working on computer vision at Facebook, and they mentioned that the company had spent a considerable amount of talent & resources trying to get this feature to work cleanly and they did not get past the uncanny valley problems.
Gaze redirection looks bad partly because head orientation ends up decoupling from eye direction.
With a depth camera you can move the camera's entire vantage point as if it were behind the right area on screen. That's probably a better approach, along with AI filling in the disocclusions (more cameras can also help with this).
What we have today is called "teleprompter", and not even particularly recent. With some slight pre-distortion and a tilted monitor it should also handle the focus issue a classic teleprompter setup has.
Basically, you just place a half-reflective mirror at a 45-degree angle, lay the screen flat facing up, and look at the camera that's behind the mirror while viewing the screen in the mirror. (I'd propose a slight back-tilt "so the camera can see better", but you lose contrast due to the viewing angle, which would ideally be corrected for by the LCD.)
The first time I saw an optical finger print sensor under a glass display, that was my immediate hope that it would kick off an under-glass optics explosion.
> everyone gets to make direct eye contact again!
As someone who doesn't always have "good" days with eye-contact, being forced to do that feels a bit terrifying.
Gazing at people is usually part of a power play, or makes people feel you want control or don't trust them. Some people will feel uncomfortable even looking at you if your gaze fixates on them.
Give people a chance to take an "unobserved" look at you; it sounds paradoxical, but it might make things less tense.
This is, by the way, a very common error bad/inexperienced actors make: they have more eye contact with the other person than would be normal in that situation. If you observe people in their natural habitat (e.g. two strangers meeting), many look at something else while talking and only glance at each other briefly to check that the words arrived. Sometimes there are longer periods of looking at each other, but it is almost never all the time, unless it is some sort of power play.
If you don't want to wait for eye-contact and have two monitors, you can place the webcam between the monitors and center your conversation-partner's image/window on the camera. Works great.
I think that’s a poor approach. While you end up with gaze more in the center of the display, you don’t always have a face directly centered. This alleviates some of the problem, but doesn’t remove it.
A better approach that I've always wanted is to have four cameras on the corners of the display that generate a real-time photogrammetric projection of your face. This would mean not only would your camera view be dead center, it'd be centered no matter where you were looking on the screen. Much better, especially in situations where there are multiple people in a meeting. You could even send a different stream to each attendee, so that to each of them it appeared as if you were looking directly at them only when you actually were.
This is surprisingly difficult with a device like an iPad or smartphone in landscape where the camera is off to one side. Even knowing where the camera is, the image of the person on your screen that you're talking to tends to pull your gaze back. You can do it; it's just an effort.
It would be relatively trivial to automatically keep the face of the person you're talking to centered around the webcam using face detection. As a person moves in frame, your computer will "pan" to keep them centered.
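The panning itself is just coordinate arithmetic once a face detector hands you a bounding-box center. A minimal sketch (the helper name and pixel coordinates are hypothetical; the face detection step is assumed to happen elsewhere):

```python
def pan_to_center_face(window_pos, face_center_in_window, camera_pos):
    """Translate the chat window so the detected face sits on the webcam.

    All coordinates are (x, y) in screen pixels. face_center_in_window
    is relative to the window's top-left corner, e.g. the midpoint of a
    face detector's bounding box. Returns the new window position such
    that new_pos + face_center_in_window == camera_pos.
    """
    fx, fy = face_center_in_window
    cx, cy = camera_pos
    return (cx - fx, cy - fy)

# Webcam at the top-center of a 1920x1080 screen; face detected at
# (300, 200) inside the window, wherever the window currently is.
print(pan_to_center_face((500, 400), (300, 200), (960, 0)))  # (660, -200)
```

Note the negative y: keeping the face exactly on a bezel-mounted camera can push part of the window off-screen, which is one practical limit of this trick.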
Four cameras is intriguing, but also expensive and complicated, and will probably have a very rough time with things like eyelashes, edges of hair, different patterns of reflections on the eyeballs, different glare from lighting, etc. Realistic photogrammetry is really hard in practice, and runs the risk of being really uncanny valley.
Photogrammetry doesn't yet properly handle non-diffuse surfaces for the most part. There are some experiments, but they're way behind in other aspects.
And I'm not even talking real-time. So, yeah, uncanny valley or teleprompter it is.
The thought of a webcam constantly pointing at users has already creeped them out enough to make camera covers a popular item. Imagine how well that segment of the population would react to a hidden camera...
Yes I can see the discussion at hardware manufacturers now.
Marketing: 'So, we can monitor users and collect sellable data through a camera hidden behind their monitor?'
Engineer: 'Why yes, but maybe we should allow users to disable this through some kind of hardware switch. It could lead to potential privacy violations etc.'
Marketing: 'If they turned it off could we still collect marketable data?'
Engineer: 'Well no...that's sort of the point...'
Marketing: 'I see...so invisible hidden cameras that can't be disabled you say?'
Marketing: 'Make the news, eh? Well, any publicity's good publicity, and we can outsource these kinds of things, can't we? Maybe we could save a bit of money that way. Excellent idea. Get right on it.'
They exist, don't get me wrong, but plenty of engineers don't care or bother with simpler things like recycling, keeping themselves in shape, working on their own mental health, etc.
There are plenty of fish in the ocean who do not care or just never thought about the implications.
After a decade of hardware buttons that aren't (e.g. TV power buttons), I think for most people that trust has sailed. It works if the company builds its brand on it (Purism), but I would not trust a hardware switch from Lenovo or Dell.
I do remember the Lenovo driver that installed itself from the BIOS (even on a clean Windows install) and would snoop for months and then silently uninstall itself.
I've owned a Dell XPS13, which was the crappiest product I've bought in a while. So I'd say the problem with Dell is more inability than malicious intent.
We've bought/had replaced half a dozen XPS13s in the last 18 months and all but one have had to be returned, some are on their second replacement. QA seems to be non-existent
Not the person you were replying to, but anecdotally my XPS 15 9550 has been nothing but a nightmare.
It's on battery #3 from the battery-swelling issue, there's a lot of backlight bleed on the 4K display, coil whine in the power supply (went through three of those), three DC jacks (a 4th might be needed soon), the infamous VRM/throttling issue, and to top it all off I had the main board replaced within the first two years because of a firmware problem preventing Windows from activating. Less of a problem with the system itself, but during that mobo job Dell replaced the 512GB Samsung SSD with a 1TB Toshiba, and R/W speeds dropped to ~1/3 of what they were before. Also anecdotal, but a buddy of mine with a 9560 has gone through more or less the same issues, sans the firmware problem.
They look so appealing on paper (which is why I chose it over a MBP), but there's just something about them that feels proof-of-concept, like an unfinished product that was pushed out too quickly. I wanted to give Dell another chance, the company having gone private three years before and having seemingly improved its build quality. When you spend $3800 on what is supposed to be the best of the best and it turns out to be the headache of a lifetime, it leaves a really sour aftertaste.
Wow, so it's not just the Precision models that are a lottery. In my area of the company we mostly use Dell Precisions, and I kid you not, every second one (Precision 7550) has some sort of a problem. They vary from ones that give you a blue screen of death to throttling issues in CAD/CAE software, etc.
I ordered an XPS 15 (9570) last year to replace my wife's slow Walmart special. We debated between Mac and Wintel and decided to stick with Wintel due to familiarity and the various programs we run. We got the delivery, took it out of the box, plugged it in, turned it on... and this horrible screeching noise greeted us, like someone cutting a piece of metal with an angle grinder. It was not shipping damage either; the package arrived in perfect shape.
That pretty much summed up the difference between Apple and everyone else for me.
As a counterpoint to that (and maybe it has just been a particular model series), I've had the 9343 XPS13 for 5 years now without issue. The battery life is much shorter now, for sure, but it's been a daily driver for all that time so I'm not so surprised.
I don't recall the model number anymore, but I also had an XPS13 for around three years at my last job. Had it running linux as my primary machine for web (and some mobile) development, and the only issue I ever had was that the onboard microphone sounded pretty bad.
I've seen a lot of bad reports, so maybe I just got lucky, but I hope not since I loved that machine and the XPS line would be pretty high up on my list if I was in the market for a new laptop now.
I'll just add on to this. Also have had an XPS13 as a daily driver for about a year and have nothing major to complain about. A slight coil buzz while charging, nostril cam and shitty microphone are some minor gripes, but all in all it's been great.
I had one and it was a lemon. It was so bad that I gave it away and got a second-hand laptop to work on for almost a year; it made me so much more productive.
(The XPS13 was max-specced.) The battery died, the fan died, it booted overnight on its own and, because of the failed fan, made an alarm noise louder than our fire alarm (I wonder what the developer was thinking there). I paid for premium business support with next-day replacement, but Dell didn't want to repair the thing and instead wanted me to jump through dozens of update loops. The mouse pad went bad too.
A lot of vendors (Lenovo[0], ASUS, Dell) are using the Windows Platform Binary Table, WPBT (https://news.ycombinator.com/item?id=18095848), to get the OS to execute a binary. Dell seems to use it to load Absolute's LoJack anti-theft rootkit[1]. Also, on Absolute's website there's a list[2] of manufacturers that are using Absolute Persistence[3].
Dell has an anti-theft 'solution' that works the same way, as do all the major vendors. The fault is with Microsoft for allowing that crap in the first place.
I think the OLEDs shipping in Samsung's phones are already transparent to a degree. They are able to put the light sensor, proximity sensor, and fingerprint sensor behind the screen; these can see through the screen. It's only a matter of time before they are able to put a full camera behind the display.
Someone correct me if I'm wrong, but would that work? I'm assuming (it wasn't clear) the goal there would be to display the person's whole face in a non-distracting way. But if the OLEDs are transparent and they're projecting the image of that person's face, the projection would go both forwards and backwards, so the camera would just capture the space between that person's eyes instead of you.
Can transparent OLED's have a unidirectional display?
A much more general solution would be a neural net that can alter gaze. Integrate with eye tracking and it becomes even better. Only the user you are looking at would see your direct gaze. Other users would see you looking towards the user you are looking at.
And here I am, the simple science fiction fan, who sees us closing in on the possibility of having large portions of our phones be wholly see-through: as with OLED televisions, you tuck all the electronics at the bottom, and the top half is glorious, thin, and soon-transparent screen.
It will (https://www.youtube.com/watch?v=v3XcQtoja_Y), and VR already conveys MUCH more of the micro-interactions of being with people that we are unknowingly blind to in Zoom meetings, which is one reason Zoom meetings are so draining.
How? At best it can take pictures of part of your face. And usually there's a screen inside that headset. Don't you have the exact same problem of needing a camera to be in the middle of the screen? But now it's even worse because the other person's eyes could be anywhere on the screen as you move your head.
Edit: Oh, you have a link now. So it's going to render a fake head based on eye tracking? You can do that without VR!
You can deliver a rendered avatar based on facial tracking to remove the "camera in the wrong place" eye contact problems, even with normal screens and cameras.
VR makes things immersive, but it is neither necessary nor sufficient to solve the eye contact problem. It's orthogonal.
It's not just about eye contact, but a whole lot of body language that we are totally blind to with just a video of someone who's not even looking at you. This body language is a very important part of communicating, and it can be mediated with VR.
This blindness in webcam meetings makes everyone somewhat socially "autistic", and is a big part of what can make them much more draining than in-person meetings.
They did the same thing with bezel-free phones: early adoption of Korean (LG) or Japanese (Sharp) display tech. I'm not sure what their angle is, but these early-adopter products seem like one-offs.
Assuming the transparency (article says 70-80%) can be improved for clarity and the pixels small enough then it could be great.
However you'd have to keep in mind that others looking at your face could see the same image. So don't go running that "X-RAY:make them appear naked!" app...
Fair point re: sunglasses!
I'm not sure "mirrored" would stop light being projected back through, but it may well help disguise it. Maybe a simple black LCD layer to block the outgoing light would help.
Also, I don't want the creepy Xray app.
But porn has this way of working into society/technology so I don't doubt for a second some form of that app will exist if it doesn't already.
Well, you get serious eye strain from focusing too close, so it would need optics to create a virtual image at a comfortable distance that your eyes can focus on.
In reality some lens(es) take the screen's light and bend it so that the angles of the light rays correspond to the light of a screen at that comfortable distance, but your eyes can't tell, just like they can't tell that the things are actually smaller when you look through a good pair of binoculars.
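The amount of bending needed is easy to quantify in diopters (1/distance in meters): the lens has to make up the difference between the screen's vergence and the comfortable target's. A sketch with made-up distances:

```python
def required_lens_power(screen_dist_m, target_image_dist_m):
    """Diopters the near-eye optics must supply so a screen at
    screen_dist_m appears at target_image_dist_m (simple magnifier
    model, lens assumed right at the eye).

    Vergence in diopters is 1/distance; the lens supplies the
    difference between the screen's vergence and the target's.
    """
    return 1.0 / screen_dist_m - 1.0 / target_image_dist_m

# Display 2 cm away on a glasses lens, virtual image pushed out to 2 m:
print(required_lens_power(0.02, 2.0))   # ~49.5 diopters
```

For comparison, a young eye can accommodate maybe 10-15 diopters at best, nowhere near the ~50 a bare 2 cm screen would demand, which is why the "bright blur" happens without optics.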
I really hope at some point they will have panes available by the unit at normal retailers. I foresee some really cool HUD interfaces being possible with an RPi :)
Isn’t a transparent monitor just a window with a HUD? My first thought was that this could have exciting use-cases in vehicles. However, aside from nighttime driving, the relatively poor brightness would likely quickly rule it out. Anyway, for a fun thought experiment, if I were thoroughly reviewing it for use I’d be most curious about its visual clarity when off (and perhaps durability, but to a lesser extent). How exactly does one measure contrast ratios when you have no idea what you’re contrasting against?
You could put an eInk display behind it, and have sort of a dual mode of interaction. If both can be constructed flexibly/foldable (with or without a hinge) I can see some very interesting designs. The ability of eInk to switch to black, or just display semi-permanent information with amazing readability and low power usage, while still having portions of the screen be high refresh rate and full color would be amazing.
[1] https://www.theverge.com/2020/8/11/21363861/xiaomi-oled-tv-t...