
No, this absolutely checks with my experience - I chase 5 and 10ms improvements all the time because we've measured and know it increases conversion.

But it makes sense, too: even if the metric here is average latency, some users may have seen a much more dramatic improvement. Every tiny bit of frustration removed from the experience adds up.
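To make the "averages hide individual experiences" point concrete, here is a minimal sketch with hypothetical numbers, showing how a respectable-looking mean can coexist with a badly suffering tail:

```python
import math
import statistics

# Hypothetical request latencies in ms: most users are fast, but a
# small tail sees far worse latency than the average suggests.
latencies = [20] * 95 + [900] * 5

def percentile(data, q):
    """Nearest-rank percentile: value at the ceil(q/100 * n)-th position."""
    s = sorted(data)
    return s[max(0, math.ceil(q / 100 * len(s)) - 1)]

mean = statistics.mean(latencies)   # 64 ms: looks mediocre but fine
p50 = percentile(latencies, 50)     # 20 ms: the typical user is happy
p99 = percentile(latencies, 99)     # 900 ms: the tail is suffering
print(mean, p50, p99)
```

This is why tail percentiles (p95/p99) are usually a better target than the mean when chasing those 5-10ms wins.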




Yes, but personally speaking, I don't even see the latency improvement as being terribly compelling.

A study by Annett et al. makes a strong argument that reducing latency below about 50ms yields diminishing returns.

https://webdocs.cs.ualberta.ca/~wfb/publications/C-2014-GI-L...


That's fine, but I wonder if that carries over to most people. There have been plenty of studies that suggest perceivable slowness has a large effect on user engagement. Amazon famously did a study on their website that showed 100ms of latency cost them 1% of sales.

Fair point! I might file this under 'latency' which is often ignored for improvement so long as it's tolerable.

There is no published research I'm aware of. However, you can find an article about every major tech company describing how reducing latency increased whatever their core metric was (not necessarily linearly, though; sometimes it was an even steeper curve). I know for sure this is true at eBay, Netflix, and Reddit because I've seen the numbers personally, and I have friends at Google, Amazon, and Facebook who have said the same.

I work in mobile ads, and for fun, I injected 50ms, 100ms, 250ms, and 500ms of latency in an experiment. The results were not as dramatic: 500ms had less than a 1% impact, and 50ms looked almost positive (I didn't wait long enough for it to reach statistical significance). This was in 2018. Results are rather nonlinear, and formats with different latency requirements had different results (formats lending themselves to advance preloading were not impacted at all even at 500ms, while others with tighter UX requirements were).
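For readers curious how that kind of injection experiment can be wired up, here is a minimal sketch; the arm names, hash bucketing, and delay values are illustrative assumptions, not the commenter's actual setup:

```python
import hashlib
import time

# Hypothetical treatment arms: extra latency (in seconds) injected per arm.
ARMS = {"control": 0.0, "t50": 0.050, "t100": 0.100, "t250": 0.250, "t500": 0.500}

def assign_arm(user_id: str) -> str:
    """Deterministically hash a user id into one of the experiment arms,
    so the same user always lands in the same arm."""
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % len(ARMS)
    return list(ARMS)[bucket]

def serve_ad(user_id: str, render):
    """Inject the arm's artificial delay before producing the ad response."""
    arm = assign_arm(user_id)
    time.sleep(ARMS[arm])
    return arm, render()
```

Deterministic hashing (rather than random assignment per request) keeps each user's experience consistent across the experiment, which matters when the metric is engagement over time.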

I realize results are context-dependent, which is why everyone should draw their own conclusions from their own analysis. You might have bigger wins available than shaving off some latency.


Yes, the latency improvement is very real.

Yes. Numerous reputable entities have published reports demonstrating that users notice quite a lot. Amazon claims that every 100ms costs them 1% of revenue. Google claims 500ms costs them 20% of traffic. 5 seconds is a fucking eternity, and anything you expose to users on the web with such horrible performance will suffer greatly because of it. One exception may be banks. Users are more forgiving of latency as their financial connection to it increases.

I know it's in the spirit of the talk, but the histogram at 10:45, and the related discussion of how latency improved for most users while the average went up (meaning a worse experience for some), reminds me of an anecdote from a Google engineer about when YouTube started rolling out its HTML5 player: the responsiveness of the page had improved, yet the average latency graphs went up. That wasn't because it was a bad update, or because some users got a worse experience (not really, anyway), but because the switch to the HTML5 player allowed a wider audience to start using YouTube where they wouldn't have been able to previously. A change that increases average latency, even on a histogram, isn't necessarily a bad change. Look at your data indeed.
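That YouTube effect is essentially a mix shift (a Simpson's-paradox-style artifact). A toy illustration with made-up numbers shows how the average can rise even when no individual user gets slower:

```python
import statistics

# Before HTML5: only users on capable setups could load the player at all.
before = [100] * 90                 # 90 users at ~100ms

# After HTML5: every existing user got faster (100ms -> 80ms), but the
# player now also reaches 30 previously excluded users on slow setups.
after = [80] * 90 + [1500] * 30

print(statistics.mean(before))      # 100
print(statistics.mean(after))       # 435: average rose, yet nobody got worse
```

Segmenting the "after" population by cohort (existing vs. newly reachable users) is what reveals that the regression in the aggregate metric is actually a win.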

We underestimate how latency matters for user experience.

Like the famous "100ms = 1% of sales" at Amazon https://news.ycombinator.com/item?id=273900


I think, as another person pointed out:

I don't mind so much if I can pay for improved latency, but if the standard service is purposely degraded to justify the paid tier, that's when it gets iffy.

There would also need to be some serious data to support an actual benefit compared to the standard latency


Came here to say this.

Personally: the lower the “action to result” latency, the more compelling and tactile something is.

We all know about the sub 100ms “golden zone”, but if something is ~10ms (custom hardware with optimized software) it’s significantly more real to me, and if something is <1ms it’s almost irresistible.

Like: 100ms is the barrier to entry, and it goes up logarithmically from there.


You're missing the point.

Sure, 10ms vs 40ms is measurable, and for the keen-eyed noticeable. But if you're only pressing the button once every 5 minutes, it doesn't matter. Similarly, if the button triggers an asynchronous call to a third-party webservice that takes seconds to respond, it doesn't matter. And so on.

Of course, for the things where users are affected by low latency, we try to take care. But overall that's a very, very small portion out of our full functionality.


The rest of the internet doesn't care about that. Latency improvements are from your device to the cell tower only. So probably 5ms improvement.

The results of the latency benchmark say that a larger latency is better. Is that a mistake?

You would bother if you wanted reduced latency for a better user experience.

That's awesome! Do you expect significant latency improvements to come from this?

Actual latency has improved, but clock-relative latency hasn't. So what?

Latency can be better if they get it right.
