
The point is that most users and use-cases of networking don't have bandwidth or latency requirements high enough to warrant a network stack design focused on performance. Let the ones who want to live on the edge do so if they want, but don't force your high-performance, one-bug-away-from-total-disaster network stack design, based on your own (probably overblown) requirements, on everyone else.

Grandma doesn't care if her tablet can't saturate a WiFi 6 link. Grandma doesn't care if her bank's web page takes an extra 75µs to traverse the user-land network stack. But she will care a whole lot if her savings are emptied while managing her bank account through her tablet. Even worse if her only fault was having her tablet powered on when the smart toaster of a neighbor compromised it because of a remotely exploitable vulnerability in her tablet's WiFi stack.

Or are you suggesting that grandma should've known better than to let her tablet outside of a Faraday cage?

> Pretending that it's trivial amounts of performance drop without evidence is the wrong approach.

Amdahl's law begs to differ. If it takes 5 s for the web site to arrive from the bank's server, spending 5 µs or 500 µs in the network stack is completely irrelevant to grandma. Upgrading her cable internet to fiber to cut those 5 s down to 500 ms will have a much greater positive impact on her user experience than optimizing the crap out of her tablet's network stack from 5 µs down to 1 µs.
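To make the Amdahl's-law point concrete, here's a quick sketch using the illustrative numbers above (not measurements):

```python
def total_time(server_s, stack_s):
    """End-to-end wait as the user sees it: server/transit time plus local stack time."""
    return server_s + stack_s

slow_stack = total_time(5.0, 5e-6)   # 5 s from the bank, 5 µs in the stack
fast_stack = total_time(5.0, 1e-6)   # same, with a heroically optimized stack
print(f"stack optimization speedup: {slow_stack / fast_stack:.7f}x")  # barely above 1x

faster_link = total_time(0.5, 5e-6)  # upgrade the link instead: 5 s -> 0.5 s
print(f"link upgrade speedup: {slow_stack / faster_link:.2f}x")       # ~10x
```

The 5x improvement in the stack is invisible next to the 10x improvement from fixing the dominant term.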




Why?

Edit: It seems to me that depending on the market one targets there's not much to be gained from designing for slow connection speeds...


But this is HN, and a lot of us are responsible for the state of web browsing! (Plus a lot are junior and have not considered these performance topics.) Absolutely everyone's connection to the internet should be fast and unfiltered, but if the supply side doesn't behave responsibly, the gains will be lost.

(edit-added)

And let's pause a moment to consider actual bandwidth needs. VoIP, video calls, and 1080p streaming all need less than 10 Mbps.

Online gaming usually benefits more from low latency than from bandwidth.

Honestly, only downloading files (app and OS updates) really benefits from bandwidth above 10 Mbps. And to be fair, now that phone, app, and OS updates are 1 GB+, there is some value in increasing bandwidth.
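A back-of-the-envelope for where that extra bandwidth actually pays off on a 1 GB update (assuming the link is the only bottleneck):

```python
def download_seconds(size_gb, link_mbps):
    """Transfer time for size_gb gigabytes over a link_mbps link (decimal units)."""
    return size_gb * 8e9 / (link_mbps * 1e6)

for mbps in (10, 100, 1000):
    print(f"1 GB at {mbps:>4} Mbps: {download_seconds(1, mbps):>5.0f} s")
# 10 Mbps: 800 s (~13 min); 100 Mbps: 80 s; 1000 Mbps: 8 s
```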

Most day-to-day problems come from unreliable networks where latency spikes, or from heavily throttled upload speeds that effectively choke the whole connection.

And finally, the fix is not forcing providers to offer reasonable performance, but undoing the last 20 years of corporate-government cozying that has been allowed to grow dramatically in the US. This has resulted in effective monopolies with high prices, low performance, and no alternatives.


I don't understand how this is the top comment here. This is a modern equivalent of "640K of memory should be enough for anyone." You're also basing your assumptions on a single device on the network; I don't know of a single home that has only one connected device at a time. It also ignores things like how speed can help an unstable connection: a high-speed link with moderate variance can beat a stable low-speed one. There are so many factors at play being ignored here that it's odd to see this comment placed so high in a forum of tech-literate people.

Edit: okay, not top comment anymore.


Not moving goal posts at all mate.

My original comment, in reply to someone saying "a lot of people already get more than 1G from their ISP" and implying that it's therefore worthwhile to have 2.5GEth on all local devices, ends with:

> In this scenario a 2.5G (or 10G) router is all that's really required to get the benefit, while using the existing 20 year old wiring.

I'm sorry if the correlation between having a 2.5G router and having greater than 1G WAN wasn't obvious to you.

Complaining that a quasi-backbone link saturates gig eth, when my entire point was that single computers are unlikely to need more, misses the whole point I was making just to have something to complain about.

I never said no one needs more than gig for anything.


I think OP's point is that many users aren't pushing machines to the limit or saturating 10g links. They just need the equivalent of a several-year-old desktop to run a webserver or whatever. Consumer-grade colo.

> The utility of an internet connection doesn't scale linearly with speed.

Sure, up to a point. But when you've got a family of 4+ sharing an internet connection, 50 Mbps can be used up easily.

A recent example:

* Younger kid running an HD Netflix stream downstairs.
* Older kid listening to music and downloading a game in his room.
* I'm watching an HD Amazon stream movie rental.
* Wife is shuffling large PSDs to/from remote storage.

This definitely doesn't apply to everyone, but a family can put some hurting on an internet connection. Particularly during the winter when it gets dark by 5PM.


It isn't about the bandwidth to your home, it's about the bandwidth further up the line. It doesn't scale to infinity.

And that's not opinion: it's a fact of life in tech that repeats endlessly. Induced demand, Jevons paradox, Parkinson's law... the same principle shows up everywhere.

Now, what effect it would have on a 1Gbit anonymity network is anyone's guess. All the streaming and web apps on my network don't really impact its normal performance because they're much slower than it. So, this concern might not affect what the other commenter proposes in practice.


There's kind of a chicken & egg dynamic here. People won't experiment with futuristic high-bandwidth applications before they have high bandwidth.

Speed is not so important when the biggest bottlenecks are the DB and the network. Safety is more important.

And why shouldn't they? That's like the whole reason to even have a high-bandwidth internet connection.

I'm not arguing against faster networks, but scenarios like "one fast download makes video streams buffer" can be solved with smarter queue management (the CAKE qdisc, for example) rather than making the pipe so wide that it'll never be close to full. One of these is a configuration flag you can flip today and costs nothing; the other means upgrading infrastructure.

> I mean what's the point of a faster network if you're still capped at 2,5,10GB.

For those (a large majority, I suspect) whose actual data usage is determined by factors that have little to do with the network, the point of a faster network is less time waiting.

For example, I'm going to take about the same number of photos on my phone if I'm on a fast network as I will if I'm on a slow network. The fast network, though, will mean less waiting to transfer the photos to my desktop for editing or for sharing.

A good way to see this is to go the other way. Suppose you have a 10 GB cap. Would you object if your network were slowed down to dial-up modem speed (56 kb/sec)? That's more than enough to transfer 10 GB/month, so what's the point of anything faster if you have a 10 GB cap?
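The dial-up arithmetic checks out (assuming the line stays saturated 24/7):

```python
# Data a saturated 56 kb/s dial-up line moves in a 30-day month
rate_bits_per_s = 56_000
seconds_per_month = 30 * 24 * 3600
gb_per_month = rate_bits_per_s * seconds_per_month / 8 / 1e9
print(f"{gb_per_month:.2f} GB/month")  # ~18.14 GB, comfortably over a 10 GB cap
```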


> The network infrastructure to do that is just about there.

There's a long way to go from having enough bandwidth to serve your personal blog or small-time web app out of your {apartment, house, office} to getting what you'd need for a large-scale web app (at a minimum, redundant 100 Mbps fiber connections with five-nines SLAs) routed there. That and getting a nominal 100 Mbps connection from your friendly neighborhood telco is a lot different from getting a 100 Mbps connection you're expecting to saturate 24/7. The telco won't put up with that for very long, because they overcommit their subscriber bandwidth in the (correct) expectation that most of their customers will not use all of it.


In many cases, like satellite Internet access or spotty mobile service, for sure. But if you have low bandwidth and fast response times, that 2 MB is murder and the big pile of requests is NBD. If you have slow response times but good throughput, the 2 MB is NBD but the requests are murder.

An extreme and outdated example, but back when cable modems first became available, online FPS players were astonished to see how much better the ping times were for many dial-up players. If you were downloading a floppy disk of information, the cable modem user would obviously blow them away, but their round-trip time sucked!

Like, if you're on a totally reliable but low-throughput LTE connection, the requests are NBD but the download is terrible. If you're on spotty 5G service, it's probably the opposite. If you're on, like, a heavily deprioritized MVNO with a slower device, they both super suck.
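A crude model of the tradeoff (serial requests, no parallelism or handshakes; the link profiles are made-up illustrations, not measurements): page load ≈ requests × RTT + size ÷ bandwidth.

```python
def page_load_seconds(size_mb, n_requests, rtt_ms, link_mbps):
    """Naive serial model: one round trip per request, then the bytes."""
    return n_requests * rtt_ms / 1000 + size_mb * 8 / link_mbps

# Low bandwidth, low latency (e.g. a reliable but slow LTE link)
slow_pipe = page_load_seconds(2, 50, rtt_ms=20, link_mbps=2)    # 1 s of RTTs + 8 s transfer
# High bandwidth, high latency (e.g. a heavily deprioritized link)
long_pipe = page_load_seconds(2, 50, rtt_ms=300, link_mbps=50)  # 15 s of RTTs + 0.32 s transfer
print(slow_pipe, long_pipe)
```

Same page, two very different failure modes: on one link the bytes dominate, on the other the round trips do.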

It's not like optimization is free though, which is why it's important to have a solid UX research phase to get data on who is going to use it, and what their use case is.


> There is very little practical use to speeds beyond 10 gigabit for home use

Give it a few years and we'll all be wondering how SPA web apps became gigabytes in size and trying to work out how to make 1 Tb/s to the home practical.


> there aren't many reasons anyone would want more--going to 2.5, 5, or 10GbE in a home network is not likely to allow you to actually do anything faster because at that volume of data you're probably reading from or writing to storage, and consumer storage devices cannot reach even gigabit speeds.

A SATA SSD is a bit under 6 Gbit/s. New fast PCIe 4.0 SSDs are a bit shy of 6 GByte/s (48 Gbit/s), and even budget models are frequently 16 Gbit/s and up.

It really makes me sad how widely accepted it is that consumers don't need good connectivity. Meanwhile, USB4 is arriving and allows direct computer-to-computer links over regular USB-C cables at 40 Gbit/s. Hopefully there's growing discontent with this old, old, old 1 Gbit (and meh 2.5 Gbit) Ethernet as the way to shuffle data around.
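The point generalizes: for bulk copies the effective rate is the slower of the disk and the link, so on gigabit Ethernet the network, not the SSD, is the bottleneck. A rough sketch (throughput figures are ballpark assumptions, not benchmarks):

```python
def transfer_minutes(size_gb, link_gbps, disk_gbps):
    """Bulk copy time, bottlenecked by the slower of link and disk (decimal units)."""
    return size_gb * 8 / min(link_gbps, disk_gbps) / 60

SATA_SSD_GBPS = 4.4   # ~550 MB/s, a typical SATA SSD
NVME_GBPS = 40.0      # ~5 GB/s, a fast PCIe 4.0 NVMe drive
for link in (1, 2.5, 10):
    print(f"100 GB over {link} GbE: "
          f"SATA {transfer_minutes(100, link, SATA_SSD_GBPS):.1f} min, "
          f"NVMe {transfer_minutes(100, link, NVME_GBPS):.1f} min")
```

At 1 GbE both drives take the same ~13 minutes; only past 4-5 Gbit/s of link does the disk start to matter.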


If you purchase consumer service and expect your 95th percentile usage level to be 100 Mbps, you're going to discover the difference between consumer and business grades of service and be eternally disappointed.

If they say 100 Mbps, they should be able to peak a connection to that level when necessary. If you're a residential user, 'when necessary' is about 1% of the month or less.

Yes, you, a capable HN community member, may have something running in your home datacenter that can continuously saturate 100 Mbps 24/7/365, but don't hold your breath expecting that level of service from residential plans.

Residential users need stable bandwidth (for remote working) in 10 Mbps increments (roughly one 4K video stream, times the number of concurrent streaming devices). If you have a family of 10, then you'll potentially need continuous 100 Mbps service when everyone is streaming at once, and you'll probably discover at that point that most cheap consumer routers aren't going to do a very good job with that (especially on the wireless side). For anyone with fewer than 10 people in their home, it scales down rapidly to a fraction of 100 Mbps peak, with a much lower 95th percentile than a business would have.

In an ideal world, bandwidth would be infinite and peak/constant capacity tradeoffs irrelevant. We definitely do not live in that world yet. Measuring this rural residential product against a business/commercial 99th-percentile 100 Mbps 24/7/365 SLA guarantee is inappropriate. It should be compared to the products it's competing with, not to the highest expectations possible.
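The per-stream sizing above is easy to sanity-check (assuming ~10 Mbps per 4K stream, as stated; real services vary):

```python
def household_peak_mbps(concurrent_streams, mbps_per_stream=10):
    """Peak bandwidth when every stream in the house runs at once."""
    return concurrent_streams * mbps_per_stream

for people in (2, 4, 10):
    print(f"{people} simultaneous 4K streams -> {household_peak_mbps(people)} Mbps")
# 10 streams -> 100 Mbps, the worst case described above
```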


(I help people get faster internet connections.) One common example I run into is people trying to stream IP security cameras off-site. Once you have more than a couple of cameras, the upload on cable and DSL isn't enough.

Once they get things set up for their cameras, they realise that they can suddenly reliably watch 4K netflix; some have started streaming to twitch/youtube because it costs them nothing, and heaps more.

On the other side of the argument, there is the 'privacy' aspect: most people would never run a home server for email/social/backup (and connect to it from their phone while out and about). However, once fast home internet connections are ubiquitous, I fully expect people to start taking back their data from the googles/facebooks/etc. of the world.

Faster internet is like wider roads: the mere presence of it increases demand.

