
Thermals are never a solved problem. System designers simply use up what is available (and usually won't stop there).

It's like saying processing speed is a solved problem. Engineers/scientists don't have any problem finding productive ways to use more processing speed.




IMHO the real reason is that CPUs and manufacturing technologies have become extremely complex. So although there are known, simple ways to address heat dissipation, it takes a lot of time to implement them.

I keep seeing thermal issues come back to bite some of the smartest companies with products designed by the world's best engineers. Everything from batteries to laptops to phones is severely affected by heat, and improper thermal design can lead to some huge issues.

For example, the biggest thing affecting battery life is thermal load. The biggest thing affecting the maximum sustained power of a laptop or phone is the thermal design. Anyone can slap a cheap, powerful chip in; good thermal design is much harder.
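A rough back-of-the-envelope way to see that limit (all numbers below are assumed for illustration, not measurements): sustained power is whatever temperature rise the cooling path allows, divided by the thermal resistance of that path.

    # Sustained power is capped by the thermal path, not by the chip itself.
    # All values here are illustrative assumptions.
    t_junction_max = 100.0   # degC, rough silicon limit
    t_ambient = 25.0         # degC
    r_junction_to_air = 5.0  # degC per watt, plausible for a thin laptop

    p_sustained = (t_junction_max - t_ambient) / r_junction_to_air
    print(f"Sustained power budget: {p_sustained:.1f} W")  # -> 15.0 W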

I think people are going to start realizing this more and more, especially as powerful chips get extremely cheap.


And? Semiconductor thermal engineers are paid salaries to solve exactly that.

This is already a thing in lower-power applications. Thermals become an issue in higher-power applications.

The issue is more that transistors will always generate some heat. Raising the threshold voltage or being more efficient doesn't change the fact that there's still some waste thermal energy that needs to be removed.
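A minimal sketch of why, with assumed numbers: CMOS power is dynamic switching plus static leakage, so a higher threshold voltage or a more efficient design shrinks both terms but never removes them.

    # CMOS power = dynamic switching + static leakage.
    # A better process shrinks both terms; neither goes to zero.
    # All values are illustrative assumptions.
    activity = 0.5        # fraction of capacitance switching per cycle
    c_switched = 30e-9    # F, effective switched capacitance of the chip
    v_dd = 0.9            # V, supply voltage
    freq = 3.0e9          # Hz, clock frequency
    p_leakage = 5.0       # W, static leakage

    p_dynamic = activity * c_switched * v_dd**2 * freq
    print(f"Dynamic: {p_dynamic:.0f} W, total: {p_dynamic + p_leakage:.0f} W")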

I feel like there must be some clever solution to this using heat sinks / heat pipes.

True, but usually the problem is removing the heat from the chip, not the total amount of heat produced. If the heat is mostly produced by the cooling system then that problem all but goes away.
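A quick power-density comparison (numbers assumed for illustration) shows why getting heat out of the die is the hard part even when the total wattage is modest:

    # The difficulty is heat flux at the die, not total wattage.
    # All figures are illustrative assumptions.
    cpu_power_w = 150.0
    cpu_die_cm2 = 2.0      # ~200 mm^2 die
    hotplate_w = 1500.0
    hotplate_cm2 = 300.0   # ~19 cm burner

    print(f"CPU die:   {cpu_power_w / cpu_die_cm2:.0f} W/cm^2")   # ~75 W/cm^2
    print(f"Hot plate: {hotplate_w / hotplate_cm2:.0f} W/cm^2")   # ~5 W/cm^2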

> energy efficiency (...) many different implementations

Yep, thermal throttling is a thing, and sometimes all you need is either useless silicon padding or some specialized, mostly dark silicon to make the chip both feasible to cool and keep it from melting.
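For the throttling part, a minimal sketch of the control loop (the temperature sensor is a simulated stand-in for a real on-die reading, and all setpoints and frequencies are assumed for illustration):

    import random

    # Minimal thermal-throttling loop with hysteresis between two setpoints.
    # read_die_temp() is a simulated stand-in for an on-die sensor; in real
    # hardware the frequency change would go through the DVFS interface.
    T_THROTTLE = 95.0                    # degC, back off above this
    T_RESUME = 85.0                      # degC, ramp back up below this
    F_MIN, F_MAX, STEP = 0.8, 4.5, 0.1   # GHz

    def read_die_temp(freq):             # simulated: hotter at higher clocks
        return 60.0 + 9.0 * freq + random.uniform(-2.0, 2.0)

    freq = F_MAX
    for _ in range(50):
        temp = read_die_temp(freq)
        if temp > T_THROTTLE:
            freq = max(F_MIN, freq - STEP)
        elif temp < T_RESUME:
            freq = min(F_MAX, freq + STEP)
        print(f"{temp:5.1f} degC -> {freq:.1f} GHz")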


>temperature

Using a proper thermal interface material in their CPUs would be a start...

When you can decrease the temperatures of Intel CPUs by 20°C with delidding, the heat argument seems quite contrived.
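A rough sketch of why that one interface matters so much (all resistance and power values are assumed for illustration, not Intel specs): junction temperature is a series stack of thermal resistances, and delidding swaps out just one term.

    # Junction temperature as a series stack of thermal resistances.
    # Resistance and power values are illustrative assumptions.
    power = 150.0          # W
    t_ambient = 25.0       # degC
    r_tim_paste = 0.20     # degC/W, stock polymer TIM between die and IHS
    r_tim_solder = 0.07    # degC/W, liquid metal after delidding
    r_rest = 0.25          # degC/W, IHS + cooler + cooler-to-air

    t_stock = t_ambient + power * (r_tim_paste + r_rest)
    t_delid = t_ambient + power * (r_tim_solder + r_rest)
    print(f"Stock: {t_stock:.0f} degC, delidded: {t_delid:.0f} degC")
    # Swapping that single layer moves the die roughly 20 degC.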


I suspect heat is a bigger problem than you're assuming, but also, you would want a similar architecture to simplify the OS code.

Then it's clear what must be done in the CPU world -- we need to insulate our CPUs better and eliminate all heat transfer.

From what I understood many years ago, heat dissipation is the real problem blocking higher transistor density. Is that still the case?

The reason for it is thermals.

> A few overheated transistors may not greatly affect reliability, but the heat generated from a few billion transistors does. This is particularly true for AI/ML/DL designs, where high utilization increases thermal dissipation, but thermal density affects every advanced node chip and package, which are used in smart phones, server chips, AR/VR, and a number of other high-performance devices. For all of them, DRAM placement and performance is now a top design consideration.

I know this may not be a cheap solution, but why not start selling pre-built computers with active cooling systems? Refrigerant liquids like those used in refrigerators or water cooling could be an option. The article addresses this:

> Although it sounds like a near-perfect solution in theory, and has been shown to work in labs, John Parry, industry lead, electronics and semiconductor at Siemens Digital Industries Software, noted that it’s unlikely to work in commercial production. “You’ve got everything from erosion by the fluid to issues with, of course, leaks because you’re dealing with extremely small, very fine physical geometry. And they are pumped. One of the features that we typically find has the lowest reliability associated with it are electromechanical devices like fans and pumps, so you end up with complexity in a number of different directions.”

So instead of integrating fluids within the computer, build powerful mini-freezers for computers and store the computer inside. Or split the hot components from the rest of the build and store only those inside the mini-freezer, with cables connecting them to the rest of the computer outside.
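A rough energy budget for that idea, with an assumed coefficient of performance: the compressor draws extra power, and the room ends up absorbing even more heat than before.

    # Rough energy budget for refrigerating a computer enclosure.
    # The COP and power figures are illustrative assumptions.
    heat_from_pc_w = 400.0   # W dumped into the enclosure by the components
    cop = 2.5                # heat moved per watt of compressor input

    compressor_w = heat_from_pc_w / cop
    heat_into_room_w = heat_from_pc_w + compressor_w
    print(f"Compressor draw: {compressor_w:.0f} W")     # ~160 W
    print(f"Heat into room:  {heat_into_room_w:.0f} W") # ~560 W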


Presumably the big challenge is cooling?

Interconnects and I/O are probably a challenge too.


Maybe design electronics that are happy at higher temperatures?

Thermal properties of the chip and any issues arising from those are pieces that the engineers should have sussed out in the beginning though. If I don't put a heatsink and fan on my desktop CPU, then is that Intel's fault? Of course not.

Hopefully the firmware can take care of the issue for those impacted.


Most of that heat is generated in the silicon, not the wiring. Unless we discover a semiconducting superconductor with which to replace the silicon junctions, we'll still have plenty of heat dissipation in the CPU.

It was never a solution; Moore's law has more than one dimension, not just density but also heat dissipation. You can't cool down a transistor that's surrounded by transistors on all sides.
