I ran a quick `while true {}` in the background. It used 7B cycles per second (twice the clock speed for some reason?), while Discord uses about 700M. Discord does drop lower if I minimize the window. But still, thousands of context switches per second, while completely logged out. I have no idea what it could be doing.
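If anyone wants to reproduce the comparison, here's a sketch of the kind of loop I mean (hypothetical code; any busy loop the compiler can't delete behaves the same). On Linux you can then compare it against Discord with `perf stat -e cycles,context-switches -p <pid>`:

```rust
// Busy loop that pins a core; black_box keeps the optimizer
// from deleting the loop body.
fn main() {
    let mut n: u64 = 0;
    loop {
        n = n.wrapping_add(1);
        std::hint::black_box(&n);
    }
}
```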
You're right that I'm probably not benchmarking rigorously enough. But people don't care about any of this stuff. They care about "my FPS gets lower when Discord is open, pls fix". And they don't want to micromanage CPU usage either. Make sure you minimize Discord after starting the call; make sure you close that Gmail tab during the game, then open it back up when you're done so you don't miss notifications? No thanks. I'd rather just pay for a few more cores.
I guess what triggered me is you saying you cringe at the end users. They don't deserve that. They just want the stuff they buy to work reliably. And if they need a bunch of extra headroom for that to happen, well, that's just the sorry state of the industry. If you must cringe at someone, direct your cringe at developers.
Discord uses 0% CPU (4770k) and 85 MB of RAM on my PC currently, while I'm in an active call. What would it take for it to no longer be "too bloated" to leave idling in the background?
Background services (Discord, etc.) use a negligible amount of processor power. It should be sub-5% on a modern processor even clocked down to its lowest frequency. With the processor at turbo, you should easily be at 1-3%, even with multiple tasks actively running.
These days even Chrome is very aggressive about throttling background tabs, much to the consternation of some webdevs here.
I haven't read through the forum thread, as it runs to ten pages and much of it seems off-topic. But the requests-per-second figure doesn't even matter here; the fact that you were able to max out the CPU at all tells me you did something wrong. I easily get 12k requests per second on an i7 box, and even then the CPU wasn't stressed. Eventually it was IO that limited it.
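For reference, a typical way to get a comparable number is a load generator like `wrk`, e.g. `wrk -t4 -c128 -d30s http://localhost:8080/` (placeholder URL; -t/-c/-d are threads, connections, and duration, per wrk's README), while watching CPU usage in a second terminal.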
Interesting, I've actually been oblivious to the mechanics while using it (too much business logic to do).
I'm also using it and burning a whole core on a busy wait. I was going to try to fix this, but I'm not sure how much I care. If we aren't paying for CPU time, then it doesn't matter.
It's an obscenity that somebody pasting one high-frame-rate GIF into a Slack channel can cause a quad-core, tenth-generation Core i7 laptop CPU to peg itself at 85% usage and start wasting battery, heating up, spinning its fans, etc.
If somebody had told me twenty years ago how powerful this CPU is, relative to what I was using at the time for a desktop PC, and that a chat application could pretty much max it out and use multiple gigabytes of RAM, I would have laughed...
Just ask me to test. I have a real crap PC and constantly scream about devs who assume everyone has multiple cores and multiple GHz. Every day this machine gets slower from website and browser bloat. And I'm not the only one; everyone else just assumes something is wrong on their end.
I do it out of respect for my hardware. On Windows, it wasn't unusual to idle at ~45°C with Spotify and Discord open. That same workload stays well under 35°C on Linux, even on my rather paltry 6-year-old CPU.
Yeah, that stood out to me too. Since when do people boast about how many resources they consume?
That's 2k customers (not concurrent users) per core, which is a terribly low rate. Even if they've only been running for a month, that's roughly 1,300 seconds (over 20 minutes!) of CPU time per customer, and it gets rapidly worse the longer you assume they've existed...
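To spell out the arithmetic: a month is about 30 × 86,400 ≈ 2.6 million seconds of wall time per core, and 2.6M s ÷ 2,000 customers ≈ 1,300 seconds of CPU time each.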
> "the stressgrid benchmarks look bad because they forgot to turn off the "spin your cpus" setting"
From the Stressgrid blog post:
> "When running HTTP workloads with Cowboy on dedicated hardware, it would make sense to leave the default—busy waiting enabled—in place. When running BEAM on an OS kernel shared with other software, it makes sense to turn off busy waiting, to avoid stealing time from non-BEAM processes."
Stressgrid seems to recommend the opposite of what you mention; the quote above is taken from the blog post you shared [0].
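For anyone who wants to flip the setting themselves: as far as I know, busy waiting on the BEAM is controlled by scheduler flags, i.e. passing `+sbwt none` to `erl` (plus `+sbwtdcpu none` and `+sbwtdio none` for the dirty schedulers on newer OTP releases), or putting those flags in a release's vm.args. Worth double-checking against the docs for your OTP version.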
Why, starting about 30 or so days ago I think, does Discord start using insane amounts of CPU time when it is launched on a system that has no network connectivity? I have seen this happen on a Surface Pro 3, a newer laptop, and a desktop system. It will just use 20-40% of the available CPU until networking is restored.
I've run a web app with 30K+ users. The only time I've been 'harassed' was when I was warned that I was running the CPU at 350% of the max rate and should be concerned. That's OK in my book.