First LED cinema screen in the U.S.A. officially opened (www.hollywoodreporter.com)
39 points by adrian_mrd | 2018-04-22 15:42:02 | 47 comments




The article seems to pitch LED screens as something to upcharge for, which makes me think there aren't really any (or enough) cost savings for the theater from a "no projection booth" standpoint, but that does surprise me a little. Obviously the cost of paying a person to run the booth isn't high, but perhaps the lower likelihood of something going wrong has its benefits as well.

I could see a marketing push for the "New LED Screen!", but if cinemagoers aren't generally able to notice a substantial difference (or if it's a worse experience, even), then I don't really think people will opt in. After all, even the 3D push has basically died, and that's a substantially different experience.


The 14-screen theatre I used to work at as a projectionist doesn't employ anyone to run the projectors anymore - it's all automated.

Regarding 3D - a lot of people have slightly (or, like me, completely) different eyes. For these people, 3D movies cause headaches and aren't a pleasant experience overall - the last time I visited a 3D cinema, I wasn't even able to look at the projection without my eyes hurting; thankfully it was just a short educational sketch.

What gives me a headache is not being able to control the focus. Normally you can choose to focus on something close or far away. With a 3D screen, the focal point is always the same. Something that's not in focus on the screen can't be brought into focus.

The problem is the same for non-3D screens, but they don't bother me (probably because I've been looking at 2D screens my entire life). The added 3D effect is what seems to trigger it for me.


> which is 14 foot-lamberts (a measure of luminance in cinema). In comparison, the LED screen has a peak brightness level around 300 nits (a measurement of brightness)

in case anyone is wondering:

https://www.wolframalpha.com/input/?i=300+nits%2F14+footlamb...

more than 6 times the brightness per unit area


It's mentioned as 88 foot-lamberts in the next sentence or two.

> [...] which Samsung estimates could display roughly 88 foot-lamberts.

Without WA, and using the figure from the article itself, 88 / 14 ≈ 6.29.
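
For completeness, here's where the ~88 fL figure comes from - a quick back-of-the-envelope sketch in Python, assuming the standard conversion of 1 foot-lambert ≈ 3.426 nits (cd/m²):

    # Rough luminance comparison (assumes 1 fL ≈ 3.4263 nits)
    NITS_PER_FOOTLAMBERT = 3.4262591

    projector_fl = 14   # typical projected brightness, per the article
    led_nits = 300      # LED screen peak brightness, per the article

    led_fl = led_nits / NITS_PER_FOOTLAMBERT
    print(f"LED screen: {led_fl:.1f} fL")          # ~87.6 fL, close to the quoted 88 fL
    print(f"ratio: {led_fl / projector_fl:.2f}x")  # ~6.25x brighter per unit area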


Nice. So now, in addition to being deaf after watching movies at the cinema, I can be blind as well.

Hopefully they won't sell too many of these before they can build Rec. 2100 screens with HDR and much larger color gamut. MicroLED tech would be nice, too.

https://en.wikipedia.org/wiki/Rec._2100


According to the page you link,

"Rec. 2100 has the same color space as Rec. 2020."

Referring to Rec. 2020 directly would have been less confusing, because the Rec. 2020 space is already being targeted by the highest-end displays.


why would I want to go watch a TV screen at the cinema?

Because it's still a huge screen with high quality audio?

Why would you care how the image gets on the screen if it ends up looking better?


On the contrary, the audio is horrible (Yes, I've heard it). Movie screens are perforated to permit "screen speakers" to be positioned directly behind the screen. LED screens do not allow such positioning. The center channel (i.e. where most dialog emanates) is positioned above the top of the screen. This has the effect of disassociating the voice from the speaker and breaking the suspension of disbelief.

I doubt this will break the suspension of disbelief that easily. People have no problem listening to movies in crappy stereo or mono through their laptop speakers, or using crappy 3.5mm earbuds (sometimes with only one bud in).

Because of its size and because of the sound system, and also because movies are released to cinemas earlier than you can watch them at home?

This comment implies that the conventional cinema screen setup is preferable to a TV screen when it comes to watching movies, and I have to agree. Maybe not for other content, but staring at a much brighter emissive surface in a dark room for two hours would strain the eyes too much to be pleasurable.

This is different to “LED TVs” at home. It’s closer to an OLED TV since each pixel in this cinema screen is individually lit.

Because with the vastly greater luminance dynamic range, the display reproduces orders of magnitude more distinguishable colours.

Won't LED screens cause more stress on the eyes than a projection? I hate 3D movies; my eyes feel very tired from the constant refocusing. I think LED screens at this size and in a dark environment would have the same effect.

No, why would they?

Most LEDs have their brightness controlled via PWM, which essentially switches the LED on and off many thousands of times per second.

You can see this in most LED car taillights if you move your eyes while paying attention to the light -- it will create a 'strobe' effect, a trail of images along the path that it traveled in your field of view.

Some people are sensitive to this flicker. A similar thing happens with most DLP projectors.


Most people could barely perceive the flicker of 60Hz CRT televisions. The kHz refresh of LEDs is not perceptible.

There are plenty of people[1] who can perceive whatever PWM frequency is typically used in vehicle LED taillights. Like a few of the people in the linked forum, the switching in the vertical Cadillac tail lights is particularly visible to me.

I agree that when viewed directly, it's quite unlikely anyone can perceive a switching frequency in the thousands of hertz. But if you sit in one of the closer seats, your eyes will be darting back and forth across the screen, which would be enough for that strobe effect to bother me if the frequency is anywhere near what is used in the taillights.

[1] https://www.thenakedscientists.com/forum/index.php?topic=451...
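
To put rough numbers on the strobe effect during eye movement (an illustrative sketch - the ~300 deg/s saccade speed is an assumed ballpark, not a measurement):

    # Angular spacing between successive PWM flashes while the eye sweeps
    # across the screen. The saccade speed below is an assumed, illustrative value.
    saccade_deg_per_s = 300.0

    for pwm_hz in (100, 200, 1000, 2000, 10000):
        spacing_deg = saccade_deg_per_s / pwm_hz
        print(f"{pwm_hz:>5} Hz PWM -> flashes ~{spacing_deg:.2f} deg apart")

    # ~3 deg apart at 100 Hz (an obvious trail), ~0.15 deg at 2 kHz,
    # ~0.03 deg at 10 kHz - the trail shrinks as the frequency rises.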


I'm able to reliably detect flicker up to ~80Hz. That made for expensive monitors back in the CRT days.

CRTs don't switch on and off. The phosphor decays much more slowly than a switched diode.

I hate PWM car taillights, but I doubt that LED screens will have the same problems. PWM is not the only way to control LED brightness and even PWM can be OK if the frequency is high enough. Also, projectors already flicker.

What other methods are there to control LED brightness?

PWM is just one (very inexpensive) way to implement a DAC (digital-to-analog converter). Other methods actually set the output voltage to the target voltage, such as an R-2R ladder. I don't think R-2R ladders are used very much in practice anymore, but I can't remember any other implementations off the top of my head.
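
For the curious, a toy sketch of how an ideal N-bit R-2R ladder maps a digital code to an output voltage (idealized - it ignores buffering, loading and resistor tolerance, and the LED-driver use is hypothetical):

    # Ideal voltage-mode R-2R ladder: Vout = Vref * code / 2**bits
    def r2r_output(code: int, bits: int = 8, vref: float = 3.3) -> float:
        assert 0 <= code < 2 ** bits
        return vref * code / (2 ** bits)

    # A hypothetical driver could turn this voltage into a steady LED current
    # instead of chopping the LED on and off with PWM.
    print(r2r_output(128))  # 1.65 V (half scale)
    print(r2r_output(255))  # ~3.287 V (just under full scale)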

The other main way would be using a constant current driver[1]. By adjusting the amount of current to the LED, you can control the brightness.

[1] https://electronics.stackexchange.com/questions/256336/does-...


LEDs are current devices; any mechanism for controlling current precisely can be used to control LED brightness. There are a lot of them.

Running a smaller, continuous current through them.

PWM is used because it's cheap to implement given a fixed power rail and digital control: you only need one or a couple of power switches, which can even be integrated on-die for low-ish power targets. Continuous approaches usually need some kind of discrete filtering, which means more components, etc.
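
A minimal sketch of the trade-off being described - both approaches can hit the same average drive current (and LED output is roughly proportional to current), but PWM only ever needs the LED fully on or fully off (the numbers are illustrative):

    # Two ways to reach the same average LED current (values are illustrative).
    i_full = 0.020              # 20 mA when the LED is fully on

    # PWM dimming: average current = duty cycle * full current,
    # implemented with a single on/off switch.
    duty = 0.25
    i_avg_pwm = duty * i_full

    # Continuous dimming: drive 5 mA directly, which needs an adjustable
    # current source (or a DAC plus filtering) rather than a bare switch.
    i_dc = 0.005

    print(i_avg_pwm, i_dc)      # both 0.005 A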


Eizo has screens that use DC dimming:

http://www.eizoglobal.com/library/basics/eyestrain/index.htm...

I never tried them though.


I have some kind of "flicker sensitivity" (if that's a thing). I bought a retina MBP about 4-5 years ago and had to return it because it was killing me. After 30 minutes watching the screen I had terrible headaches, and my vision was blurry for hours afterwards.

Now I have a new (2017) MBP and I don't have any problem anymore, so I guess that they improved the technology somehow.


Plasma TVs do it to me also. It used to drive me nuts to go into a Best Buy and look at a wall of plasma TVs.

Looking up the retina displays, I see they are either IPS or OLED. The IPS likely has an LED backlight, and OLED is...LED. So a flicker is possible if they used PWM to limit the current to the LEDs.

Did you notice any change in the effect at different brightness levels?


The only Apple device with an OLED screen is the iPhone X; the built-in displays on the 2012-2015 retina MacBook Pros, as well as the ones made after the 2016 redesign, are IPS with a 60Hz refresh rate. There were issues with the graphics on the mid-2012 to early-2013 retina MacBook Pros - perhaps the parent got one of the defective models.

> The only Apple device with an OLED screen is the iPhone X

I think the iWatch also has an OLED panel.


All three of them do! I'm not sure how it slipped my mind. Thankfully, Cunningham's Law [1].

[1] https://meta.wikimedia.org/wiki/Cunningham%27s_Law


Do you enjoy watching a movie on a tablet in a dark environment? I think it's a similar experience.

I have no eye strain from watching a projection.


No, to the contrary. The image is sharper, the contrast is higher, and the display is active, not passive. This helps the eye. Think of AMOLED smartphone displays with their superior quality.

> I hate 3D movies, my eyes feel very tired from the constant refocusing.

Does anyone here actually prefer 3D over 2D movies? I go out of my way to attend 2D screenings, even though my local cinema only offers these at very inconvenient times (starting after 22:00), probably to skew the numbers in favor of the (more expensive) 3D tickets.


Surprised nobody is pointing out the possible labor savings. No more projectionists. One or two people will be able to remotely run 20,000 screens because it will be mostly automated. Once they fully automate the vending of tickets the only people needed will be those running the concession stand.

That's already done. It's called a multiplex. The projectionist controls the 6-35 digital screens in his theatre complex, and the money comes from the food stand. Since the studios demand such high fees for the movies, theatres have to stay profitable by selling food and drinks. The quality of the movies and the projections sinks, but interestingly the quality of the food is rising.

> quality of the food is rising.

Unless you're buying some gold-class dining ticket (which tends to cost upwards of 100 dollars), the food (or snacks) is terribly overpriced and the quality isn't getting better.


Perhaps one day the quality will rise to the level of 'barely adequate for human consumption'

Cinema projector rooms are basically fully automated now, so there won't be much labor savings. The only labor that goes on in there is when an intern brings in new hard drives with movies once a week.

This is already true - sort of. Not to the extent that one person can run 20,000 screens, but that also wouldn't be solved by this new LED screen.

I'm the lead projectionist for our local 8-screen cinema but there are no times at which I'm specifically 'the projectionist' sitting up in the box, because we run entirely digitally and everything is automated - rather, I'm one of many trained staff who can resolve issues when they arise.

The issues that do arise most commonly are related to the media server that plays the digital cinema files, and we rarely have issues with the actual projectors. If we were to use this LED screen, I would assume the rest of the hardware would remain the same and the new screen would just act as the projector replacement, so I don't think this would solve any of the software issues for which we currently need trained projectionists onsite.

