I think you're misunderstanding where most of that energy comes from. Say I have some refrigerant in liquid form at a low pressure. That refrigerant will have a boiling point of, say, -20 degrees F. I use the ambient air, which is, say, 0 degrees F, to boil the refrigerant into a gas. The internal energy of the refrigerant goes up significantly, but its temperature does not change. Then I compress the gas so that it goes up in temperature. This change in temperature does not correspond to a large change in the internal energy of the gas; the same energy is pushed closer together, resulting in a higher temperature[1]. Then I condense the gas at its new boiling point, say 100 F, into a liquid. Finally, I decompress the resulting liquid and the cycle repeats.
Most of the energy is coming from the ambient air, only a small amount of electricity is used to change what temperature that energy is available at.
[1] A very simple mental model you can use to understand the difference between temperature and energy for a gas is that temperature is the number of times the gas molecules bump into each other in a second, while energy is how fast the gas molecules are moving. So if you have a very low pressure gas, even if all of those molecules have a ton of energy in the form of motion, the temperature of the gas will be quite low because the gas molecules won't interact often. But compress that gas down to a small volume, and the molecules will bump into each other all the time even though they don't have much more energy.
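The energy bookkeeping in the cycle described above can be sketched with the first law. The numbers here are illustrative assumptions, not measurements; the point is only that the heat delivered indoors is the sum of the heat scavenged from outdoor air plus the (smaller) compressor work:

```python
# Illustrative (assumed) numbers for one heat-pump cycle: the refrigerant
# absorbs latent heat from 0 F ambient air while boiling, the compressor
# adds a comparatively small amount of electrical work, and the condenser
# rejects the sum indoors.

q_evaporator = 3.0  # kW of heat absorbed from outdoor air (assumed)
w_compressor = 1.0  # kW of electrical work raising the pressure (assumed)

# First law: everything absorbed plus the compressor work comes out
# at the condenser.
q_condenser = q_evaporator + w_compressor

# Coefficient of performance for heating: heat delivered per unit of
# electricity consumed.
cop_heating = q_condenser / w_compressor

print(f"Heat delivered indoors: {q_condenser:.1f} kW")  # 4.0 kW
print(f"COP (heating): {cop_heating:.1f}")              # 4.0
```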
You probably already know this, but for anyone else reading - it's more than just compressing and decompressing - it's changing phase between liquid and gas!
It takes a lot of extra energy to go from a liquid to a gas - think about how when boiling water, it doesn't all immediately turn into steam when it hits 100 degrees. It takes a long time to boil it down to nothing, even on a really hot stove. That's because switching from liquid to gas takes more energy than just getting hotter.
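The boiling-water comparison above can be made quantitative with standard values for water (specific heat ~4.19 kJ/(kg·K), latent heat of vaporization ~2257 kJ/kg):

```python
# Energy to heat 1 kg of water to its boiling point versus the energy
# to actually boil it all away, using standard handbook values.

mass_kg = 1.0
c_p = 4.19          # kJ/(kg*K), liquid water
latent_heat = 2257  # kJ/kg at 100 C

# Heating from room temperature (20 C) up to boiling (100 C)...
q_heating = mass_kg * c_p * (100 - 20)   # ~335 kJ

# ...versus turning that boiling water entirely into steam.
q_boiling = mass_kg * latent_heat        # 2257 kJ

print(f"Heat to reach 100 C: {q_heating:.0f} kJ")   # 335 kJ
print(f"Heat to vaporize:    {q_boiling:.0f} kJ")   # 2257 kJ
print(f"Ratio: {q_boiling / q_heating:.1f}x")       # roughly 6.7x
```

So vaporizing the water takes roughly seven times the energy of heating it to the boiling point in the first place, which is exactly why phase-change refrigerants move so much more heat than a gas-only loop would.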
The refrigerants used in ACs and heat pumps take advantage of this behavior to maximize how much heat they can move - much more than if they stayed as a gas
When you compress a gas, it releases energy as heat. When the gas expands, it absorbs energy by getting cold.
This gas is the refrigerant that the article refers to. Basically, what air conditioners, refrigerators, heat pumps, etc., do is compress and expand the gas in a closed loop.
The whole theory behind a heat pump is that you get more heat, per unit of input energy, than if you used the input energy directly for heat. The consequence is that you end up blowing cold air outside.
Important to note, which the article does not, is that substances which are gaseous at room temperature are chilled to liquid using excess energy (i.e., energy produced at night). These liquids are stored chilled in vacuum-insulated canisters. They are then brought up to room temperature when needed, and their expansion into gas drives a turbine. Presumably, no external power sources (or only waste heat) are used to warm the gases.
Releasing pressurized gas is cold when there is a phase change, i.e. from liquid to gas. Going from high pressure gas to low does involve a temperature drop but it's not going to be anything like a real air conditioner.
The room temperature compressed air still has some internal energy, and that energy can be converted to work by expanding the air through a turbine. The air at the outlet of the turbine will be really cold. Expanding compressed air through expanders like this is part of how air liquefaction plants operate.
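The temperature drop at a turbine outlet can be estimated with the isentropic expansion relation T2 = T1 · (P2/P1)^((γ−1)/γ). The pressure ratio below is an assumed example, not from the comment:

```python
# Ideal (isentropic) expansion of room-temperature compressed air
# through a turbine: T2 = T1 * (P2/P1) ** ((gamma - 1) / gamma).
# The 10:1 pressure ratio is an illustrative assumption.

gamma = 1.4        # heat-capacity ratio for air
t1 = 293.15        # K, room temperature at the inlet
p_ratio = 1 / 10   # expanding from 10 bar down to 1 bar

t2 = t1 * p_ratio ** ((gamma - 1) / gamma)
print(f"Outlet temperature: {t2:.0f} K ({t2 - 273.15:.0f} C)")  # ~152 K (-121 C)
```

Even a modest 10:1 expansion takes the air well below -100 C, which is why cascaded expanders are a workhorse of air liquefaction plants.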
Normal refrigeration works by turning a liquid into gas, and gas back into liquid. Doing so heats or cools the medium (one can imagine the molecules inside being pressed closer together when in liquid form, thus having more energy in a smaller volume of physical space, which translates to being hotter). When turned into a gas, the opposite happens, with fewer moving molecules in the same volume of space.
Going from liquid to solid does not generally cause any change in volume, but it does release some energy as part of the conversion process, which, if captured, could help make both the solid and liquid cooler (less energy among the molecules in the same volume).
With the use of a compressor and a vapor chamber in a fridge, you can control when and where the liquid turns to a gas or back into a liquid. The hot place will be near a radiator, and the cold place in a vapor chamber next to the cooling area. With this system the salt would be near the radiator, turning the solid into a liquid and heating the radiator. Then you would "remove the salt" in what would be similar to a vapor chamber, turning the medium cold.
The biggest issue I have with this concept is the efficiency. With a compressor you can make a gas really, really hot by compressing it toward a liquid, and that heat will then have a large temperature gradient when run through a radiator (which cools it down using ambient air). I am unsure how one would get a similar effect going from solid to liquid.
New refrigerants will make a big difference here. The main theoretical restriction is the pressure delta required to go from a low enough pressure to transfer heat into the system at a low ambient temperature, then to a high enough pressure to transfer heat out of the system at a high ambient temperature. That’s what causes the loss in COP at low temperatures—the increased work the compressor has to do.
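The COP loss at low ambient temperature mentioned above can be sketched with the Carnot bound for heating, COP = T_hot / (T_hot − T_cold). The fixed 20 C indoor temperature is an assumption; real machines reach only a fraction of these ideal figures, but the downward trend is the point:

```python
# Ideal (Carnot) heating COP at a fixed 293.15 K (20 C) indoor
# temperature, for a range of outdoor temperatures. Real heat pumps
# achieve a fraction of these values, but degrade with the same trend.

t_indoor = 293.15  # K, assumed indoor setpoint

for t_out_c in (10, 0, -10, -20, -30):
    t_out = t_out_c + 273.15
    cop = t_indoor / (t_indoor - t_out)
    print(f"{t_out_c:>4} C outdoors -> ideal COP {cop:.1f}")
```

The ideal COP falls from about 29 at 10 C outdoors to under 6 at -30 C, and the widening pressure delta the compressor must span is what eats the difference in practice.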
It’s definitely theoretically possible. There are refrigerants that can do this now with a relatively low delta, but generally they are flammable, toxic, or have high global warming potential.
There are other ways around this like booster systems, but it becomes a $$$ issue at that point.
Refrigerants have to be compressible to increase their temperature above ambient. Or equivalently, to be expandable to cool them. Salt water wouldn't work at all.
> then the boiling point of the refrigerant must be below -20F
This isn't the limiting factor for choice of refrigerant... There is always a low enough pressure that anything boils.
The problem is that at very low pressures (think a few millibars), gases need huge diameter pipes and huge pumps to move even a small number of kilowatts of heat.
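A rough sizing of the low-pressure problem, with assumed numbers (water vapor as the working fluid, latent heat ~2500 kJ/kg, 3 millibar evaporator pressure):

```python
# How much volume flow does it take to move 1 kW of heat when the
# evaporator runs at a few millibars? Working fluid and conditions are
# illustrative assumptions (water vapor just above freezing).

R = 8.314            # J/(mol*K)
molar_mass = 0.018   # kg/mol (water)
latent_heat = 2.5e6  # J/kg (approximate)

p = 300.0            # Pa, i.e. 3 millibar
t = 275.0            # K, just above freezing

# Ideal-gas density at these conditions.
density = p * molar_mass / (R * t)   # kg/m^3, a few grams per cubic meter

# Mass flow needed for 1 kW of latent-heat transport, then volume flow.
mass_flow = 1000.0 / latent_heat     # kg/s
volume_flow = mass_flow / density    # m^3/s

print(f"Vapor density: {density * 1000:.1f} g/m^3")      # ~2.4 g/m^3
print(f"Volume flow for 1 kW: {volume_flow:.2f} m^3/s")  # ~0.17 m^3/s
```

Roughly 600 cubic meters of vapor per hour just to move a single kilowatt, which is why near-vacuum refrigerant loops need such enormous pipes and pumps.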
Also the refrigerants matter - unfortunately, the MOST efficient ones also happen to be really bad for the ozone layer so they've been "banned" (there are still plenty of units out there using them - just can't use on new systems).
So there's always a trade-off. You pay for this or you pay for that. And if it's energy, you have to consider the material and energy supply chains or you are probably bullshitting people and yourself.
Ultimately an air con is a heat engine so Carnot's law applies. No possible magic beyond what thermodynamics allows.
ALL TOO MANY "inventions" fail on exactly this point. You even have UC Berkeley getting into the con game with the Water Seer (whose claims violate thermodynamics). You'd THINK UCB wouldn't ever push physics-violating ideas, but NOPE. A stain on the engineering school's reputation!
>Compressing air necessarily raises its temperature. And then if you want to you can transfer that heat elsewhere, go ahead. That's how air conditioners and refrigerators work, after all.
The oh-so-clever trick is to transfer heat elsewhere while you're compressing the gas, so you actually reduce back-loading on the piston instead of having it 'fight' the temperature rise. This can be accomplished via water spray, or by compressing a gas bubble that's surrounded by water (e.g. in a trompe).
By continuously removing heat as you compress the gas, it effectively acts like a train of compressors and intercoolers with an infinite number of stages, i.e. true isothermal compression. No magic, just physics.
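The isothermal-versus-adiabatic savings can be sketched with the standard ideal-gas work formulas. The 10:1 pressure ratio and 300 K inlet are assumed for illustration:

```python
import math

# Ideal isothermal compression (heat continuously removed, as with a
# water spray or trompe) versus adiabatic steady-flow compression
# (no heat removed), per mole of air, compressing 1 -> 10 bar at 300 K.

R = 8.314     # J/(mol*K)
t1 = 300.0    # K, inlet temperature
ratio = 10.0  # pressure ratio (assumed)
gamma = 1.4   # heat-capacity ratio for air

# Isothermal work: W = n R T ln(P2/P1)
w_iso = R * t1 * math.log(ratio)

# Adiabatic steady-flow compressor work:
# W = (gamma / (gamma - 1)) n R T1 ((P2/P1)^((gamma-1)/gamma) - 1)
w_adia = gamma / (gamma - 1) * R * t1 * (ratio ** ((gamma - 1) / gamma) - 1)

print(f"Isothermal: {w_iso / 1000:.2f} kJ/mol")  # ~5.74 kJ/mol
print(f"Adiabatic:  {w_adia / 1000:.2f} kJ/mol") # ~8.12 kJ/mol
```

Pulling the heat out during compression saves about 30% of the work at this pressure ratio, which is the whole appeal of the intercooled (quasi-isothermal) train described above.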
You'd definitely still need a compressor - otherwise there'd be no change in pressure driving a change in temperature. I guess you're suggesting that the photomolecular effect could be used to boil the refrigerant at the heat pump's evaporator. Boiling is a bulk process caused when the temperature of the fluid rises so much that all its intermolecular bonds start to get broken. Evaporation is a surface process where a few molecules randomly break free of their intermolecular bonds and mix with surrounding air. Boiling, not evaporation, occurs in a heat pump's evaporator (despite the name). The photomolecular effect doesn't cause boiling, only evaporation. You wouldn't really want evaporation to happen in an evaporator, because you'd need to have air around and then you'd be compressing a bunch of air along with your refrigerant, which would waste a lot of energy.
My first thought was some sort of cooling tower application. Cooling towers use evaporating water to cool various process fluids. But, when building a cooling tower, you want to pull down the temperature of the water by having evaporation absorb the heat in the water. This evaporation process actually reduces the amount of heat that gets absorbed from the water because it uses energy from light and heat from the air instead.
If this has engineering applications, it will likely be in places where the end goal is the evaporation of the water itself, such as a drying process or passive desalination.
Can you explain to me how an entirely gas-phase refrigeration cycle would work? The phase change is the whole purpose of the system in a traditional refrigeration cycle.
The ideal gas law ties compression and temperature together: PV = nRT
That is, at fixed volume, increasing temperature is equivalent to increasing pressure, and vice versa; decreasing either is likewise equivalent.
The problem with liquifying -- cooling -- a gas, is that:
1. You're removing thermal energy, which itself cannot be usefully stored. So you're losing that unless it can be applied to some local low-quality heat process.
2. Re-gassifying the liquified air requires energy. If you've managed to store (some of) the removed heat, you can apply that. Otherwise, whatever you're using to introduce heat to the liquified air will itself get *very* cold, very quickly, and eventually reach thermal equilibrium. Alternatively, you could apply a fuel-based heat source sufficient to boil off the liquid, but that's going to cost you energy.
Depending on the temperature of the freshly-generated gas, you're also going to be chilling whatever generating process you've got (probably gas turbine), which means both metal embrittlement and potential for frosting if there's any degree of water vapour in the air.
The more usual form of air-based energy storage, compressed air energy storage (CAES), likewise has problems with both heat loss and chilling on expansion. Compressing a gas heats it, and that heat will tend to escape to the environment, similarly to the case for chilling. On the energy-recovery side, expanding the gas to run a turbine will cool it (and the turbine) rapidly. Many CAES designs incorporate natural gas simply as a heating function to warm the freshly-expanded gas, meaning the storage system is not a no-fuel system, though it requires far less fuel than a conventional natural-gas generating plant.
The biggest issue I have with the system as described is that the re-expansion of liquified nitrogen isn't free, and requires a source of external heat. Given the phenomenally cold temperature of liquid nitrogen, any passive heating design will rapidly approach thermal equilibrium with the stored medium, limiting the rate of net energy release.
The long and short of it is that if the heat pump works below -20F, then the boiling point of the refrigerant must be below -20F. This, in turn, implies a higher pressurization (as per the Clausius-Clapeyron equation) required in order to achieve a T_hot of 80F (or whatever output temperature you want). Higher pressurizations require more expensive components and compressors.
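The Clausius-Clapeyron argument can be sketched numerically. The latent heat below is an assumed, roughly R-134a-like value treated as constant over the range, so the result is an order-of-magnitude estimate, not a design number:

```python
import math

# Estimate the saturation-pressure ratio a refrigerant needs between an
# evaporator boiling below -20 F and a condenser rejecting heat at 100 F.
# Latent heat is an assumed, roughly R-134a-like value, held constant.

R = 8.314            # J/(mol*K)
latent_heat = 20e3   # J/mol, assumed constant over the range

t_cold = (-20 - 32) * 5 / 9 + 273.15   # -20 F in kelvin (~244 K)
t_hot = (100 - 32) * 5 / 9 + 273.15    # 100 F in kelvin (~311 K)

# Integrated Clausius-Clapeyron: ln(P2/P1) = -(L/R) * (1/T2 - 1/T1)
pressure_ratio = math.exp(-latent_heat / R * (1 / t_hot - 1 / t_cold))
print(f"Required pressure ratio: {pressure_ratio:.1f}")  # ~8.3
```

A compressor spanning a roughly 8:1 pressure ratio is well within reach, but every extra degree of lift at either end pushes that ratio (and the hardware cost) up.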
As I understand it, they're refrigerating air, not compressing it. If so, your calculations are entirely inapplicable. Carnot efficiency approaches 100% as your cold reservoir approaches 0 K, so it's plausible for such a cryogenic system to get very high efficiencies from an air motor, much higher than normal.
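The Carnot-efficiency point is easy to check with η = 1 − T_cold/T_hot. The temperatures below are illustrative assumptions for a liquid-air system:

```python
# Carnot efficiency eta = 1 - T_cold / T_hot: as the cold reservoir
# approaches absolute zero, the ideal efficiency approaches 100%.
# Temperatures are illustrative assumptions for a liquid-air system.

t_hot = 293.0   # K, ambient heat source
t_cold = 78.0   # K, roughly the boiling point of liquid nitrogen/air

eta = 1 - t_cold / t_hot
print(f"Ideal Carnot efficiency: {eta:.0%}")  # ~73%
```

Compare that with the ~35-45% ceiling a heat engine faces when its cold reservoir is ordinary ambient air; the cryogenic cold sink is doing a lot of thermodynamic work here.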
The problem only happens when no one bothers to learn how something works. Look at all of those big iron systems out there that few people know how to program; there is a reason COBOL and Fortran programmers still make good money.
Oh, and refrigeration is simple: it is just an application of the ideal gas law, PV = nRT, plus a pump.
Refrigerant is compressed, then cooled through a heat sink, then pumped into the refrigerator and allowed to expand, where it absorbs thermal energy; it is then pumped out and the cycle repeats.
One very interesting refrigeration cycle I heard about recently is using a proton exchange fuel cell in reverse as a compressor for hydrocarbons or ammonia in a closed loop.
Protons jump across the membrane, creating a small pressure differential that is allegedly big enough to do heat pumping.