
Why should the perception of speed have to be based on deceit?

I bet I would perceive a progress bar that advances at a constant speed as faster than one that stalls and stutters, even when both take the same time to complete. Even more so if the alternative is an indefinite progress indicator (hourglass pointer, spinner) that just goes away when the task finishes.

On the other hand, if the result is that a user is less annoyed by a process, I do not see why it should be wrong to create that feeling artificially by setting up a situation in which the same outcome makes them feel better than a more "honest" one (as you might call it) would.




I definitely agree that showing a progress bar makes a slow operation feel faster; I've seen this work many times.

I think I disagree about the constant-speed vs. stuttering progress bar example, though. Progress bars that fill smoothly are great if they are accurate. But because some UIs show a fake progress bar that fills smoothly, empties after filling, and fills again, I've been trained to be skeptical of them. I don't think I've ever seen a stuttery progress bar that was "fake" in that way.


Showing a constant speed progress bar to hide stutter is effectively applying a low-pass filter. Since stutter carries information (that may not be of interest to all your users, but perhaps some), you're low-pass filtering that information.

It is a deceit - you're purposefully concealing information, and you're doing it for some reason. How bad is that? It depends on how useful a given piece of information is to the recipient. Smoothing out something whose high-frequency data matters to the user isn't nice (and can feel paternalistic at times). Imagine your car speedometer didn't display instantaneous speed, but a running average over a one-minute window. Would you feel comfortable with it? Or if your bank account displayed your balance as a running average over a week-long window?
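The low-pass-filtering point above can be sketched concretely. A minimal, hypothetical example: applying a simple moving average to raw progress readings smears an abrupt jump (the "stutter") across several samples, so the bar looks steadier while the timing information is lost.

```python
from collections import deque

def smooth(samples, window=3):
    """Moving-average (low-pass) filter over raw progress fractions in [0, 1].
    Hypothetical helper for illustration; not from any particular UI toolkit."""
    buf = deque(maxlen=window)
    out = []
    for s in samples:
        buf.append(s)
        out.append(sum(buf) / len(buf))
    return out

# Raw readings stall at 0.2 and then jump to 0.5 - a visible stutter.
raw = [0.1, 0.2, 0.2, 0.2, 0.5, 0.6]
# The smoothed series spreads that jump over several samples, so the
# bar advances steadily and the stall/jump pattern is muted.
print(smooth(raw))
```

The same trade-off the commenter describes is visible here: the smoothed bar is more pleasant to watch, but a user diagnosing a slow disk could no longer see exactly when the stall happened.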

Progress bars being accurate isn't that important for most users, but it may be for some. In the same way you can get a feel for your car by listening to the sounds it makes, you can get a feel for the state of your computing environment by observing a progress bar. For example: does it seem to slow down hard whenever the "storage" light on your laptop is blinking fast (or, in the old days, when your HDD made noise)? That may imply you have an issue with your storage. Smooth a progress bar out, and you're removing such environmental cues.


The progress was meant to be truthful in both cases. Imagine the case in which the backend gives enough feedback to show percent-wise changes vs. the case where updates are pretty coarse.

I think the dishonesty comes in when the progress bar uses time as an input instead of observable events. It's fine and good to modify your API to return better progress information, as long as that doesn't make the operation take longer or significantly increase load.

However, representing a coarse process with time based progress is perhaps dishonest.

For example, assuming that step 2 of some 3-stage process always takes 20 seconds and artificially filling the progress bar over that duration presents potentially false progress to the user. For operations that may take a long time with some variance, the user may be left waiting at a stopped progress bar and think an error has occurred.
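The contrast above can be sketched as two tiny estimators. This is a hypothetical illustration (the function names and the 20-second figure for step 2 are taken from the example, not from any real API): the time-driven bar claims completion the moment the clock runs out, whether or not the work finished, while the event-driven bar only advances on reported steps - coarse, but never false.

```python
def time_based_fraction(elapsed, expected_total):
    """Time-driven estimate: guesses progress from the clock alone.
    Once elapsed >= expected_total it reports 100% even if the real
    work is still running - the dishonest case described above."""
    return min(elapsed / expected_total, 1.0)

def event_based_fraction(steps_done, total_steps):
    """Event-driven progress: advances only when the backend reports a
    completed step. It can stall at a value, but it never overstates."""
    return steps_done / total_steps

# Step 2 was assumed to take 20 s but actually runs 25 s:
print(time_based_fraction(elapsed=25, expected_total=20))  # 1.0, yet work isn't done
print(event_based_fraction(steps_done=1, total_steps=3))   # ~0.33, truthful but coarse
```

The stalled-bar failure mode follows directly: the event-based bar sits honestly at one third, while the time-based bar sits at 100% looking like a hang.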

