Chrome has done this for a long time now, and I don't consider it an issue. You see a frozen version of the last decoded keyframe until the next one is decoded. Elegant solution IMO.
I'm pretty sure it only discards the original after some number of other (new) images have been decoded. (Or perhaps it's based on memory footprint?)
I ran into a Chrome performance bug with animations years ago, because the animation had more frames than the decoded cache could hold. Everything on the machine ground to a halt when it happened. Meanwhile, older unoptimized browsers ran it just fine.
In the past you could freeze them and step through frames by right-clicking the image, but I can't remember which browser it was in. It was over a decade ago iirc.
Yeah, it works out OK with caching. Very easy to track down with the new Initiator tab in Chrome DevTools. They have _initLightBoxItems, which removes all the images/gifs from the DOM and then re-adds them; it's bound to the scroll event (with a 50ms throttle).
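For anyone curious, the pattern described is roughly the sketch below. This is a hypothetical reconstruction, not the site's actual code: _initLightBoxItems is their function name, but the selector and the throttle helper here are my own stand-ins.

    // Hypothetical reconstruction of the pattern described above:
    // on every (throttled) scroll, tear the images out of the DOM
    // and re-insert them.
    function throttle(fn, ms) {
      var last = 0;
      return function () {
        var now = Date.now();
        if (now - last >= ms) {
          last = now;
          fn();
        }
      };
    }

    function initLightBoxItems() {
      var gallery = document.querySelector('.gallery'); // made-up selector
      var imgs = gallery.querySelectorAll('img');
      for (var i = 0; i < imgs.length; i++) {
        var img = imgs[i];
        var parent = img.parentNode;
        var next = img.nextSibling;
        parent.removeChild(img);        // drop it from the DOM...
        parent.insertBefore(img, next); // ...and put it straight back
      }
    }

    window.addEventListener('scroll', throttle(initLightBoxItems, 50));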
The only problem with that is that uBO applies it via JS polling (I think?), and this results in the unstyled content flashing very briefly. Pretty annoying sometimes; hope it also gets into a standard (or just gets a native implementation for the sake of adblocking).
Per (0) I don't think "don't decode until you scroll" is done yet. This issue alone makes Firefox annoying for use on image boards and "infiniscroll"-style galleries like Facebook's.
Also, "discard images when you scroll them offscreen" was disabled in (1) because it broke sites with lots of images (like Pinterest and Tumblr Archives, according to the bug).
These two issues alone make Firefox pretty painful for a lot of common use cases on real-world machines (like my old web browsing machine with only 2GB of RAM).
Doesn't a JavaScript implementation offset most of the performance benefits? Today we have browsers that are smart about when to cache the decoded image and when not, etc.; does all of that have to be reimplemented in JavaScript?
If image decoding is asynchronous (don't know - guessing it is already?) and you decode images when they get near the viewport, not just when they become visible, then they should always be decoded by the time you scroll to them, yet never jank the page. Scrolling a long way really fast probably means there's a small delay while images decode, but surely that's worth it to save gigabytes of memory?
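Something close to this is expressible today with IntersectionObserver plus HTMLImageElement.decode(), which decodes asynchronously. A minimal sketch, assuming lazy images carry a data-src attribute; the 1000px look-ahead margin is an arbitrary choice of mine:

    // Start fetching + decoding images once they come within
    // ~1000px of the viewport, so they're ready before they're seen.
    var observer = new IntersectionObserver(function (entries) {
      entries.forEach(function (entry) {
        if (!entry.isIntersecting) return;
        var img = entry.target;
        img.src = img.getAttribute('data-src'); // kick off the fetch
        // decode() does the work off the main thread, so painting
        // the image later won't jank scrolling.
        img.decode().catch(function () {
          // decode failed; the browser paints it normally anyway
        });
        observer.unobserve(img);
      });
    }, { rootMargin: '1000px 0px' });

    document.querySelectorAll('img[data-src]').forEach(function (img) {
      observer.observe(img);
    });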
I have already found one case in Firefox where this is true. FF will load the image data in background tabs, but it won't decode it (e.g. render a JPG) until you switch to that tab. I may be nitpicking but the extra half-second or so bothers me like texture pop-in in a game. So I go to about:config and set image.mem.decodeondraw and image.mem.discardable to false.
> Every browser has an option to disable loading of images, javascript, flash or java. At least Firefox and Opera have them for sure, I suspect others do too, because why not?
For me the crucial difference is that Opera could toggle between loading images/cached images with a single keypress: 'i'. These days you have to press Shift-i. Toggling CSS on/off is Shift-g. Having this so easily accessible makes a big difference. I don't know how to do this in Chrome. Looking quickly through the menu, I can't even find the options for this.
Not necessarily: the decoding can be stopped once you have enough usable information for how the image is used (display size, ...), all from the same source image. That's neat!
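There's a real API along these lines already: createImageBitmap() accepts resize options, so you can ask for a decode at the display size rather than the full native resolution. A sketch, where 'photo.jpg' and the 200x150 target are placeholders of mine:

    // Decode only to the size we'll actually display, instead of
    // the image's full native resolution.
    fetch('photo.jpg')                       // placeholder URL
      .then(function (res) { return res.blob(); })
      .then(function (blob) {
        return createImageBitmap(blob, {
          resizeWidth: 200,                  // placeholder display size
          resizeHeight: 150,
          resizeQuality: 'high'
        });
      })
      .then(function (bitmap) {
        var canvas = document.querySelector('canvas');
        canvas.getContext('2d').drawImage(bitmap, 0, 0);
        bitmap.close();                      // free the decoded pixels
      });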
Lazy loading really sucks when you're left with a blurry image because it timed out while serving the full image (can happen even with an acceptable connection).
Unless it's really clever, I'd prefer they just let the browser do its thing with an img tag, because that worked fine back in the 56K days.
Does the javascript decoder download the images, or does the browser do that and the javascript simply decodes them? If it downloads them itself, then any parallel fetching by browsers will stop working. And if they are decoded by javascript, they will take (very slightly) longer, and both the encoded and decoded images need to be held in memory, in both javascript and the browser. Are they decoded to .bmp, .jpg, .png? Can CSS reference them without javascript changing the styles? Can they be cached locally if downloaded directly?
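For concreteness, a pure-JS decode path would look roughly like the sketch below. Everything here is hypothetical: decodeJpegToRGBA is a stand-in for whatever JS decoder library is used, not a real API, and '.hero' / 'photo.jpg' are placeholders. It does make the memory concern visible, though: the encoded bytes and the decoded pixels both sit in JS.

    fetch('photo.jpg')                          // browser still does the download
      .then(function (res) { return res.arrayBuffer(); })
      .then(function (encoded) {                // copy #1: compressed bytes
        // decodeJpegToRGBA is a hypothetical JS decoder returning
        // { data: Uint8ClampedArray, width, height }.
        var rgba = decodeJpegToRGBA(encoded);   // copy #2: raw pixels
        var canvas = document.createElement('canvas');
        canvas.width = rgba.width;
        canvas.height = rgba.height;
        canvas.getContext('2d').putImageData(
          new ImageData(rgba.data, rgba.width, rgba.height), 0, 0);
        // For CSS to reference the result, you'd have to mint a URL:
        canvas.toBlob(function (blob) {
          var url = URL.createObjectURL(blob);  // not HTTP-cacheable like a normal image
          document.querySelector('.hero').style.backgroundImage =
            'url(' + url + ')';
        });
      });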
If you needed this for any page that loaded lots of images, all of the above would significantly increase page-load time and memory usage. Especially on a mobile browser, it would use more resources (hence more battery) than otherwise. So personally I wouldn't like to see this be the future. Maybe just as a fallback, but detection for that might be a little hard.
It's a risk, but implementers can take the sting out of it. Browsers aren't currently smart enough to do things like unload decoded <img> memory for things which aren't visible, but you can avoid the worst of it if you use a CSS background-image (which browsers do unload) and a visibility test on scroll to avoid loading things which aren't visible or soon to be visible. This works as far back as IE8, so it might be worth the hassle.
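A minimal sketch of that approach, assuming elements carry a lazy-bg class and a data-src attribute (both my own names), with a 200px look-ahead I picked arbitrarily. For actual IE8 you'd swap addEventListener/window.innerHeight for attachEvent/documentElement.clientHeight:

    // Swap in CSS background-images only for elements near the
    // viewport; browsers will drop decoded background-image memory
    // for offscreen elements, unlike <img>.
    function loadVisible() {
      var els = document.querySelectorAll('.lazy-bg'); // made-up class
      for (var i = 0; i < els.length; i++) {
        var el = els[i];
        var rect = el.getBoundingClientRect();
        var near = rect.top < window.innerHeight + 200 &&
                   rect.bottom > -200;               // 200px look-ahead
        if (near && !el.style.backgroundImage) {
          el.style.backgroundImage =
            'url(' + el.getAttribute('data-src') + ')';
        }
      }
    }

    window.addEventListener('scroll', loadVisible);
    loadVisible(); // handle whatever is in the initial viewport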
The idea was that you'd see the image get progressively better as each pass was completed, thus the term "progressive". But in practice the browsers decided not to display anything until the image was finished loading anyway.