
> there's a "scan pattern" where different vertical lines have different brightness/gains. Also highlights and dark areas are more contrasted than the good version.

You get all of this “automatically” by using much simpler circuitry that does the equivalent job of your “fine-tuned” version.




Yes, but only in one direction. If you low-pass filter a raster scan signal, then the image is blurred horizontally, but retains its vertical resolution.
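The asymmetry can be seen in a few lines of NumPy (a sketch, not a model of any real signal path; the moving-average kernel just stands in for a band-limited channel acting on the raster-scanned signal):

```python
import numpy as np

def lowpass_raster(image, k=3):
    """Moving-average low-pass applied along each scan line (row),
    standing in for limited bandwidth in the video signal path."""
    kernel = np.ones(k) / k
    return np.array([np.convolve(row, kernel, mode="same") for row in image])

# Sharp vertical edge: left half dark, right half bright.
v_edge = np.zeros((8, 8))
v_edge[:, 4:] = 1.0

# Sharp horizontal edge: top half dark, bottom half bright.
h_edge = np.zeros((8, 8))
h_edge[4:, :] = 1.0

blurred_v = lowpass_raster(v_edge)  # the vertical edge smears into a ramp
blurred_h = lowpass_raster(h_edge)  # the horizontal edge survives intact
```

Away from the left/right frame borders, `blurred_h` is identical to `h_edge` (each scan line is constant, so the filter changes nothing), while `blurred_v` now has intermediate grey values around column 4: horizontal blur only.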

Vertical filtering can be implemented in analogue circuitry using line delays (this was done for standards conversion as an alternative to pointing a camera at a screen), but it's a lot harder than taking a photo of a folded-up print-out.
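The line-delay trick can be sketched the same way: mix the signal with a copy of itself delayed by exactly one scan line (a "1H" delay element in the analogue world), which averages vertically adjacent lines. This is only a digital stand-in for illustration, not how any particular converter worked:

```python
import numpy as np

def lowpass_vertical(image):
    """Average each scan line with the previous one, the digital
    analogue of mixing the signal with a one-line (1H) delayed copy.
    The first line has no predecessor, so it is averaged with itself."""
    delayed = np.vstack([image[:1], image[:-1]])  # one-line-delayed signal
    return (image + delayed) / 2.0

# Same test patterns as before: one horizontal edge, one vertical edge.
h_edge = np.zeros((8, 8))
h_edge[4:, :] = 1.0

v_edge = np.zeros((8, 8))
v_edge[:, 4:] = 1.0
```

Here the roles are reversed: the horizontal edge at row 4 softens into a 0.5 grey line, while the vertical edge passes through unchanged because every scan line is identical to the one before it.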


What I meant is that to make the “stripes” less aligned and more jagged (or to give them different quality properties), it's enough to use worse circuitry, or worse tuning, than what they probably used to align them correctly. As an example, that's what old vacuum-tube TV sets easily produced when their circuits weren't correctly tuned. The last phase, including producing the “worse suited” contrasts, was most probably done during photo processing; I agree with that. I also agree that the easiest way to make the image more “blurry” is to take a bad photo. But it could also be that the “better” images were simply run in one pass through the facsimile machines which the publishers and the news agencies were using anyway at that time.
