
It's probably a very old decision, from times when most desktop computers had one CPU and threads were mostly used to keep GUI responsive while the application was churning data in the background, so there was little interaction between them.



20-odd years ago, my team solved some performance issues in our desktop app by splitting a couple of tasks into their own threads on a computer with a single CPU. Being able to write threaded code is really useful.

I don't follow you. Are you saying that the only reason people would want threading capabilities is to exploit multicore/multiprocessor architectures?

That isn't true. Having multiple threads for GUI applications makes sense for responsiveness even on a single-core machine.
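To make that concrete, here's a minimal sketch (my own, in Python, not anything from this thread): a background thread does the slow work while the "UI loop" keeps servicing events, and none of it depends on having more than one core.

    # A background thread does the slow work; the "UI loop" stays responsive.
    import queue
    import threading
    import time

    results = queue.Queue()

    def long_task():
        # Stands in for churning data in the background (parsing, IO, etc.).
        time.sleep(2)
        results.put("done")

    threading.Thread(target=long_task, daemon=True).start()

    # The "UI" keeps handling events instead of freezing for 2 seconds.
    while True:
        try:
            print("result:", results.get_nowait())
            break
        except queue.Empty:
            print("UI still responsive...")   # one event-loop iteration
            time.sleep(0.2)                   # pretend to process UI events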


But relatively few desktop computing applications are single-threaded.

That's called cooperative multitasking, and it was roughly how things worked in the Windows 3.1 era. Nowadays it's much better to use real threads, as they protect much better against latency (responsiveness) issues.
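For anyone who hasn't seen it, here's a toy sketch of cooperative multitasking (Python generators standing in for Win 3.1-style tasks; purely illustrative): every task has to yield voluntarily, and one task that forgets to yield freezes everything else, which is exactly the latency problem preemptive threads avoid.

    # Tasks are generators that must yield control voluntarily.
    # A task that never yields starves every other task.
    def ui_task():
        while True:
            print("UI: handle events, repaint")
            yield                     # hand control back to the scheduler

    def worker_task():
        for step in range(3):
            print("worker: step", step)
            yield                     # a well-behaved task yields often
        # If this were `while True: pass` with no yield, the UI task
        # above would never run again: the Win 3.1-style lock-up.

    def run(tasks, rounds=4):
        # Round-robin scheduler: no preemption, no locks needed.
        for _ in range(rounds):
            for t in tasks:
                next(t, None)         # returns None once a task is finished

    run([ui_task(), worker_task()])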

The problem with this is you only get 1 thread for the UI and other stuff to run on.

Forgive my ignorance, but why would doing all of the application work on a single thread make more sense 35 years ago?

Anyone who has seen the responsiveness of e.g. AmigaOS under heavy load next to many modern systems might be inclined to want (more) multi-threaded GUIs.

Heavy use of multi-threading to disconnect GUI updates from the actual work was essential to making that happen.

AmigaOS sacrificed throughput for responsiveness all over the place (e.g. something as trivial as cut and paste from a terminal would easily involve half a dozen threads with message passing).

You don't need separate threads for every little component, though.


That's significantly less true than it was in the past. I think it's pretty easy to argue that the browser is the most important desktop app now and all modern browsers are multiprocess / multithreaded.

Beyond that, almost everything that does signal processing of one form or another (audio, video, whatever) is multithreaded these days, so that one's out too. Compiling code is multi-process too.

I actually struggle to think of important workloads that are single-threaded these days. Obviously most simple apps are, but mainly because they don't use enough CPU to be worth optimizing. We're more than a decade into multi-core systems being the norm, and most CPU-heavy apps have taken advantage of that.


From my freshman years around 2005, I vaguely remember that multi-thread GUI apps were:

- a way to avoid blocking the UI,

- a tool to gain some performance on the fancy new multi-core processors,

- something that made the code much harder to read and debug, which often made multi-threading the loser in a cost-benefit analysis.


I've always really liked the idea of cooperative multitasking: fewer gotchas, possibly safer memory access, and it's easier to control where/when a thread switches. Of course, in those days we had to deal with timing issues; you couldn't let one CPU hog block another CPU hog :)

There are valid reasons for programming with threads on only a single CPU (indeed, when I learned thread programming in the 90s, pretty much only single-core CPUs were available). For instance, IO blocking.
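A quick illustration of the IO-blocking point (my own sketch, with time.sleep standing in for a blocking read): even with a single CPU, the waits overlap when each blocking call runs in its own thread.

    # Even on one CPU, threads help when calls block on IO,
    # because the waits overlap instead of queueing up.
    import threading
    import time

    def fetch(name, seconds):
        time.sleep(seconds)           # stand-in for a blocking socket/disk read
        print(name, "finished after", seconds, "s of waiting")

    start = time.time()
    threads = [threading.Thread(target=fetch, args=("request-%d" % i, 1))
               for i in range(3)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Roughly 1 second total instead of 3, with no parallel CPU work at all.
    print("elapsed: %.1fs" % (time.time() - start))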

OK, but surely there are cases where it would be useful (I would argue necessary) to have a single process with multiple threads - when dealing with GUIs for example.

Yeah, s/everyone/webdevs/.

Typical audio software, for example, has been multithreaded since the 90s (simply because the audio callback and the GUI must run in different threads). And in the mid 2000s these threads started to run on different CPUs.
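Roughly the split being described, as a sketch (the names and the "driver" loop here are made up, this is not any real audio API): the callback runs on its own thread, never blocks for long, and only pulls blocks that a worker thread rendered ahead of time, while the GUI thread never touches the stream.

    import queue
    import threading
    import time

    blocks = queue.Queue(maxsize=8)   # pre-rendered audio blocks

    def render_thread():
        # Worker that mixes/synthesizes audio ahead of the callback.
        while True:
            blocks.put([0.0] * 256)   # one block of silence as a placeholder

    def audio_callback():
        # Called by the (hypothetical) driver on its own thread at a fixed
        # rate; it must never block for long, so it only does a quick get.
        try:
            return blocks.get_nowait()
        except queue.Empty:
            return [0.0] * 256        # underrun: output silence, don't block

    threading.Thread(target=render_thread, daemon=True).start()
    for _ in range(5):                # pretend the driver calls us periodically
        audio_callback()
        time.sleep(0.005)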


People have been warning developers to 'get ready for multithreading' for at least a decade, and somehow everything is mostly the same. Mostly because of abstractions (e.g. on GPUs), and because the USERS of our software are ALSO getting parallelised! So we're back to one thread per user, since most of the time you really don't want to do loads of work for a single user request. Cases where parallelism matters (graphics, data stores, query engines) are already pretty solid on multithreading anyway.

Meh.


That's not really a good thing, though. You don't need "a thread per window" especially w/ modern async computation, you just need the UI to be in a separate thread from anything that might keep the CPU busy to begin with.

Why would it be difficult to turn off web sockets or turn off multi-threading? Multiple threads already run on single cores all the time, there is no reason a web browser couldn't just put them all on one core.

It's a design choice to be single-threaded. Using threads requires locking or sophisticated concurrent data structures, whose costs can sometimes outweigh the benefits in both code correctness/maintainability and performance.
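For example (a trivial sketch of my own, not anyone's production code): the moment two threads touch the same counter, you need a lock just to stay correct, and that lock is exactly the maintenance and performance cost being weighed here.

    import threading

    counter = 0
    lock = threading.Lock()

    def bump(n):
        global counter
        for _ in range(n):
            with lock:                # drop this and the total becomes unpredictable
                counter += 1          # read-modify-write: not atomic on its own

    threads = [threading.Thread(target=bump, args=(100_000,)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(counter)                    # 400000, but only because of the lock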

In the late 80s and early 90s we wrote things with threads but they were primarily a kind of convenience to get multitasking behaviour and not any kind of performance boost.

Multicore / multiprocessor systems were not a mainstream thing in consumer hardware until the 21st century.


Well, multithreading got more important with time, so it makes sense to me.
