No, this absolutely checks out with my experience - I chase 5 and 10ms improvements all the time because we've measured and know it increases conversion.
But it makes sense, too: if the metric here is average latency, that doesn't mean that some users didn't see a much more dramatic improvement. Every tiny bit of frustration removed from the experience adds up.
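To make that concrete, here's a toy example (made-up numbers) of how an average can be very different from what most individual users actually see:

    import statistics

    # Hypothetical page-load samples in ms: most users are fast, a few are very slow.
    latencies = sorted([80] * 90 + [1500] * 10)

    mean = statistics.mean(latencies)
    p50 = latencies[len(latencies) // 2]
    p95 = latencies[int(len(latencies) * 0.95)]

    print(f"mean={mean:.0f}ms  p50={p50}ms  p95={p95}ms")
    # mean=222ms  p50=80ms  p95=1500ms; the average reflects nobody's actual experience.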
That's fine, but I wonder if that carries over to most people. There have been plenty of studies suggesting that perceptible slowness has a large effect on user engagement. Amazon famously did a study on their website that showed 100ms of latency cost them 1% of sales.
There is no published research I'm aware of. However, you can find an article about every major tech company talking about how reducing their latency increased whatever their core metric was (not necessarily linearly, though; sometimes it was an even steeper curve). I know for sure this is true at eBay, Netflix, and reddit because I've seen the numbers personally, and I have friends at Google, Amazon, and Facebook who have said the same.
I work in mobile ads, and for fun, I injected 50ms, 100ms, 250ms, and 500ms of latency in an experiment. The results were not as dramatic: 500ms had less than a 1% impact, and 50ms looked almost positive (I didn't wait long enough for it to reach statistical significance). This was in 2018. Results are rather nonlinear, and different formats with different latency requirements saw different results (formats that lend themselves to preloading in advance were not impacted at all even at 500ms, but others with tighter UX requirements were).
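For the curious, the injection itself is trivial; roughly along these lines (a simplified sketch, with the handler name and bucket split made up rather than our actual setup):

    import asyncio
    import hashlib

    # Hypothetical experiment arms: injected delay in ms per treatment bucket.
    TREATMENTS_MS = {"control": 0, "t50": 50, "t100": 100, "t250": 250, "t500": 500}

    def assign_bucket(user_id: str) -> str:
        # Deterministic hash so a given user always lands in the same bucket.
        buckets = sorted(TREATMENTS_MS)
        h = int(hashlib.md5(user_id.encode()).hexdigest(), 16)
        return buckets[h % len(buckets)]

    async def serve_ad(user_id: str, request: dict) -> dict:
        bucket = assign_bucket(user_id)
        delay_ms = TREATMENTS_MS[bucket]
        if delay_ms:
            await asyncio.sleep(delay_ms / 1000)  # artificial latency for this arm
        # ... normal ad serving would happen here ...
        return {"bucket": bucket, "ad": "..."}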
I realize results are context-dependent, which is why everyone should draw their own conclusions from their own analysis. You might get a bigger bang for your buck elsewhere than from shaving off some latency.
Yes. Numerous reputable entities have published reports demonstrating that users notice quite a lot. Amazon claims that every 100ms costs them 1% of revenue. Google claims 500ms costs them 20% of traffic. 5 seconds is a fucking eternity, and anything you expose to users on the web with such horrible performance will suffer greatly because of it. One exception may be banks: users seem more forgiving of latency as their financial stake in the interaction increases.
I know it's in the spirit of the talk, but the histogram at 10:45, and the related discussion about how latency improved for most users while the average latency increased (implying a worse experience for other users), reminds me of an anecdote from a Google engineer about when YouTube started rolling out its HTML5 player: the responsiveness of the page had improved, but the average latency graphs went up. That wasn't because it was a bad update, or because some users got a worse experience (not really, anyway); the switch to the HTML5 player allowed a wider audience to start using YouTube who wouldn't have been able to previously. A change that increases average latency, even on a histogram, isn't necessarily a bad change. Look at your data indeed.
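A quick toy example (invented numbers) of how that kind of population shift drags the average up even though no individual user's experience got worse:

    import statistics

    # Before: only users on capable devices could use the player at all.
    before = [200] * 1000
    # After: the same users got faster, plus 300 previously-excluded (slower) users joined.
    after = [150] * 1000 + [900] * 300

    print(f"avg before: {statistics.mean(before):.0f}ms")  # 200ms
    print(f"avg after:  {statistics.mean(after):.0f}ms")   # ~323ms, yet nobody is worse off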
I don't mind so much if I can pay for improved latency, but if the standard service is purposefully degraded to justify it, that's when it gets iffy.
There would also need to be some serious data showing an actual benefit compared to the standard latency.
Personally: the lower the “action to result” latency, the more compelling and tactile something is.
We all know about the sub 100ms “golden zone”, but if something is ~10ms (custom hardware with optimized software) it’s significantly more real to me, and if something is <1ms it’s almost irresistible.
Like: 100ms is the barrier to entry, and how compelling it feels goes up roughly logarithmically as latency drops from there.
Sure, 10ms vs 40ms is measurable, and for the keen-eyed noticeable. But if you're only pressing the button once every 5 minutes, it doesn't matter. Similarly, if the button triggers an asynchronous call to a third-party webservice that takes seconds to respond, it doesn't matter. And so on.
Of course, for the things where users are sensitive to latency, we try to take care. But overall that's a very, very small portion of our full functionality.