Defining "best" is a biased process. Is cost or quality more important? There is no objective criterion for such judgments; it's all based on personal bias.
So Google favoring low-latency responses is saying that latency (cost) matters, even when the people browsing may not care.
Yes, users care about latency, but lower efficiency doesn't necessarily lead to higher latency. (I have no idea what "worse servers" means.)
As long as Google can afford to provide decent answers with acceptable latency, it will have users.
I think that efficiency is like programming effort, algorithms, and programming languages in that users don't care. They only care about results and the costs that affect them.
If the respect were for the receiving device or connection speed, the metric used to determine the priority of results would be payload size / delivery speed, rather than whether the technology/hosting was Google-owned.
I mean, HN isn't an apples-to-apples comparison at all. I bet Google search (where I work) loads much faster when we serve 10 blue links to an older phone or IE6, and that page is at least an order of magnitude more complex to assemble. When we serve you a live stock ticker, video results, and a carousel of relevant tweets, though, it'll take a few more milliseconds to get that data.
My point isn't that latency is the only thing worth optimizing for; it's that you can spend your latency budget on something like feature detection, or on adding new features and better UX for newer browsers.
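To make that concrete, here's a rough sketch of what spending the latency budget on feature detection might look like (the renderer and the rich=1 parameter are made-up placeholders, not anything Google actually does): a few cheap capability checks up front decide whether the client asks for the richer payload or falls back to the plain page.

    // Hypothetical sketch: cheap capability checks decide whether to request
    // the rich results page or fall back to a plain server-rendered page.
    declare function renderRichResults(html: string): void; // made-up renderer

    function supportsRichResults(): boolean {
      return (
        typeof fetch === "function" &&
        typeof IntersectionObserver !== "undefined" &&
        typeof customElements !== "undefined"
      );
    }

    function loadResults(query: string): void {
      const url = "/search?q=" + encodeURIComponent(query);
      if (supportsRichResults()) {
        // Newer browser: spend a few extra milliseconds fetching the richer payload.
        fetch(url + "&rich=1")
          .then((r) => r.text())
          .then((html) => renderRichResults(html));
      } else {
        // Older phone / IE6-class browser: plain navigation, server sends ten blue links.
        window.location.href = url;
      }
    }

The detection itself is nearly free; the extra milliseconds are only spent for clients that can actually use the richer result.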
When you're Google, 10ms makes a difference. In my own personal experience, a decrease of 100ms in response time results in a 20% increase in traffic.
> what i took away from it is this: speed is mostly relevant when there's a competing product providing similar value and you risk losing business due to inferior user experience
Even if you don't have competition, you leave the door wide open if requests take 5 seconds.
There's also indirect competition ... like a lot of people type the name of the service they want directly into Google's search box ... a practice which would stop if it took 5 seconds, because accessing your local bookmarks would be faster.
If it means average latency is 100ms instead of 1s, or whatever the huge GPT model can currently serve to millions of concurrent users, that matters a lot for search.
I couldn't agree more. Look at internet speed tests. I used the Ookla speed test because it was the first result, until one day Google created its own, built right into the search results page. I don't know which product is better, but it's easy to see Ookla getting starved out by this regardless of product merit.
It is all about bytes and connections. Google minimises the bytes transferred to you and the number of connections from your browser to their servers. At Google's scale, that matters.
You are both technically correct and, at the same time, have no idea what you are talking about. The break-even point for these algorithms is so huge that even at, say, Google scale it does not matter. Data access latency plays an overwhelmingly more important role.
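A toy illustration of that break-even argument, with invented constants (nothing here is measured from a real system): an algorithm with a better asymptotic exponent but a large constant factor only wins past some input size, and if that size is beyond anything you'd ever run, the asymptotically "better" algorithm never pays off.

    // Toy break-even calculation; the constants are made up purely to show
    // the shape of the argument, not taken from any real algorithm.
    const costClassic = (n: number) => 2 * n ** 3;        // simple method, small constant
    const costFancy   = (n: number) => 50_000 * n ** 2.5; // better exponent, huge constant

    // Find (roughly) the smallest n where the fancy method is actually cheaper.
    let n = 1;
    while (costFancy(n) >= costClassic(n)) n *= 2;
    console.log(`fancy method only wins past roughly n = ${n}`);

With these made-up numbers the crossover sits around n ≈ 6 × 10^8, and in a real system memory and data access latency would push it out even further.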
I think with search results you're right, and with things like Gmail it seems performance is at the bottom of the priority list. Perhaps even off the priority list entirely.