
IMO, 5G is about automation and command/control of roving robots, “Smart Cities”, and finally Augmented Reality (AR) in industrial and urban contexts. Applications range from benign factory automation, to Boston Dynamics police dogs patrolling your neighborhood, to you walking in downtown Tokyo and being (virtually) greeted by AR entities.

What is it that WiFi cannot do but 5G can? 5G’s much lower latencies allow for real-time (R/T) command and control (CnC), which enables all of the above.

So, for example, your factory floor bot’s camera sees something that its local AI can’t categorize, so a central node with greater computational power and storage capacity can (in R/T) make decisions and send operative commands:

Do the roundtrip latency calc below for 4G and 5G:

    node -> base station -> fast network -> central control
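The calc above can be sketched roughly. The air-interface figures are the commonly cited ~200 ms (4G) and ~1 ms (5G) quoted later in the thread; the 10 ms one-way backhaul figure for the "fast network" leg is an illustrative assumption, not a measured number:

```python
# Rough roundtrip sketch for: node -> base station -> fast network -> central control.
# Air-interface latencies: ~200 ms (4G) and ~1 ms (5G), per the figures cited below.
# BACKHAUL_ONE_WAY_MS is an assumed value for the wired leg.

BACKHAUL_ONE_WAY_MS = 10

def roundtrip_ms(air_one_way_ms, backhaul_one_way_ms=BACKHAUL_ONE_WAY_MS):
    # One-way = air link + wired backhaul; roundtrip doubles it.
    return 2 * (air_one_way_ms + backhaul_one_way_ms)

print(f"4G roundtrip: {roundtrip_ms(200)} ms")  # 420 ms
print(f"5G roundtrip: {roundtrip_ms(1)} ms")    # 22 ms
```

Even with the wired leg held constant, the air interface dominates the 4G number, which is the argument being made here.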



Is WiFi really the slow part there? How does 5G improve on it? No retransmits?

This is what the search engine pulled up when asked about 5G latency:

"5G technology offers an extremely low latency rate,the delay between the sending and receiving of information. From 200 milliseconds for 4G, we go down to 1 millisecond(1ms) with 5G. Just think about it. A millisecond is 1/1000 of a second."

--

National Security Implications of Fifth Generation (5G) Mobile Technologies

5G technologies could have a number of potential military applications, particularly for autonomous vehicles, C2, logistics, maintenance, augmented and virtual reality, and ISR systems—all of which would benefit from improved data rates and lower latency (time delay).

https://fas.org/sgp/crs/natsec/IF11251.pdf

Interestingly enough, a simple 'r/national security/civil liberty' and 'r/military/police' highlights a peculiarly undiscussed aspect of the enthusiasm of governments worldwide to implement 5G, so that "gamers can have a better gaming experience!".


That's about what you get with WiFi. It still doesn't matter because the internet will add 30ms in the best of cases.

I don't remember the source that discussed the specific applications. But RTT from edge to base station is 400 ms for WiFi and 2 ms for 5G, which is almost like something vs nothing. So we're talking applications that are OK with the hardwired network latencies (here let's say 100 ms RTT, including the time for computing). That 400 ms RTT for WiFi must push the total RTT over some viability threshold, would be my guess.
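The threshold argument can be made concrete using only this comment's own (disputed) figures, i.e. 100 ms for the wired network plus compute, and 400 ms vs 2 ms for the last hop:

```python
# Total RTT = last-hop RTT + wired network + compute time.
# All figures are taken from the comment above, not measurements.
WIRED_PLUS_COMPUTE_MS = 100
EDGE_RTT_MS = {"wifi": 400, "5g": 2}

totals = {link: edge + WIRED_PLUS_COMPUTE_MS for link, edge in EDGE_RTT_MS.items()}
print(totals)  # {'wifi': 500, '5g': 102}
```

Under these numbers an application viable at roughly 100 ms is killed by the WiFi last hop but survives over 5G, which is the shape of the claim; the next reply disputes the 400 ms WiFi figure itself.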

I'm pretty sure I get closer to 2 ms on my local 2.4 GHz WiFi network without doing anything special. If those numbers were real then no one could play games over WiFi.
