Rice University researchers propose a way to boost solar efficiency (polyarch.co)
283 points by shry4ns on 2019-08-09 | 127 comments




This is what we need before we're all fucked.

probably too late but we should never give up.

Best that we prepare a smooth transition for our robot overlords.

a day late and a dollar short I'm afraid, bud.

original article: https://news.rice.edu/2019/07/12/rice-device-channels-heat-i...

Sounds like 80% is a theoretical prediction for this method, not an experimental result:

> Naik said adding the emitters to standard solar cells could boost their efficiency from the current peak of about 22%. “By squeezing all the wasted thermal energy into a small spectral region, we can turn it into electricity very efficiently,” he said. “The theoretical prediction is that we can get 80% efficiency.”


That really needs some editing, it reads like gibberish.

Just go to the source: https://news.rice.edu/2019/07/12/rice-device-channels-heat-i...


I am especially infuriated by how it repeats a common misconception of infrared light as the epitome of "heat", as if it were some kind of separate particle.

I'm not infuriated, but having teachers and other adults conflate IR with heat certainly added to my confusion about the physical world when I was a child.

Sorry for a potentially stupid question: why do they need to convert heat into visible light? Why can't we convert heat directly into electricity at the same time as we convert visible light into electricity?

"Light" here is a misnomer. What they're doing is taking the wide array of infrared wavelengths and channeling them into a single wavelength. It's easier to build a solar cell that's tuned to that wavelength.

I don't know why they're calling that "light". It's not clear to me if it means visible light. The paper seems to imply that it peaks in the near-infrared. They appear to be using "light" in the general sense of "photons"... but it's all photons.

They're talking about very high temperatures, the kind that emit significant visible light as well.


You're confused – heat isn't being converted into anything. The article conflates the words "heat" and "infrared radiation", which are not at all the same thing.

As a rule, everything around you glows – in other words, it emits photons. Some infrared ones, some visible ones, some ultraviolet (or even higher-frequency) ones. How much of each depends on the temperature of the material. Most things (e.g. you) are too cold to be visibly glowing, but they still emit plenty of infrared radiation.

Now, standard solar cells convert visible photons into electricity. Any incoming infrared photons, however, aren't strong enough to kick any electrons loose, so they go to waste. Here, the researchers used carbon nanotubes to absorb the wasted infrared photons. This heats up the nanotubes, which then emit visible photons, which then increase electricity generation.

There is no way to convert heat to electricity directly, unfortunately. However, you can get small amounts of electricity out of temperature differences: https://en.wikipedia.org/wiki/Thermoelectric_effect


> This heats up the nanotubes, which then emit visible photons, which then increase electricity generation

Once a solar cell is built with this technology, do you think there will be enough light emitted by the nanotubes to make the solar panels appear to glow, themselves? Because that would be pretty interesting.


If the solar panels appear to glow, it's because visible light is being dispersed away from the solar panel itself. The consequence would be wasted photons that could otherwise be converted into electricity.

So, from a pragmatic perspective, I would say that no, they wouldn't glow - proper engineering would ensure those photons are collected.


The idea is to direct all of the nanotube-emitted light onto the PV cells, so hopefully you won't see anything. Citing the Rice article:

> Because electrons in nanotubes can only travel in one direction, the aligned films are metallic in that direction while insulating in the perpendicular direction, an effect Naik called hyperbolic dispersion. Thermal photons can strike the film from any direction, but can only leave via one.


I see, thanks! Why are visible light photons "stronger" than infrared photons?

Visible light photons have more energy each. The shorter the wavelength, the higher the energy of the photon.

Light with a short enough wavelength (UV-B, for example) has enough energy to damage the DNA in our skin and increase our risk of skin cancer.
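
To put rough numbers on it, here's a quick sketch in Python (standard physical constants only, nothing specific to this paper):

    # Photon energy E = h*c / wavelength, expressed in electronvolts
    h = 6.626e-34   # Planck constant, J*s
    c = 2.998e8     # speed of light, m/s
    eV = 1.602e-19  # joules per electronvolt

    for name, nm in [("UV-B", 300), ("green (visible)", 550),
                     ("near-IR", 1000), ("mid-IR", 10000)]:
        energy_ev = h * c / (nm * 1e-9) / eV
        print(f"{name:16s} {nm:6d} nm -> {energy_ev:5.2f} eV")

    # UV-B               300 nm ->  4.13 eV  (enough to damage DNA)
    # green (visible)    550 nm ->  2.25 eV
    # near-IR           1000 nm ->  1.24 eV  (about silicon's ~1.1 eV bandgap)
    # mid-IR           10000 nm ->  0.12 eV  (far below the bandgap: wasted)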


If that's the case, how is it possible that low energy photons absorbed by carbon nanotubes can generate high energy photons?

Because it's not a one-to-one conversion: in second-harmonic generation, for example, two identical photons combine into a single photon with twice the energy (half the wavelength). See e.g. https://en.wikipedia.org/wiki/Second-harmonic_generation

>This heats up the nanotubes, which then emit visible photons

I'm a bit skeptical that that would work. Otherwise you could build a perpetual motion machine by having the nanotubes in a hot room and running your perpetual motor off a solar cell powered by the light.

My guess is that when the article says "emit light" they mean emit infrared light/radiation rather than visible. The article doesn't really make sense to me - I imagine someone got confused by the physics rather than it being an actual breakthrough in energy generation. I note the article says "propose a way to build" rather than having actually built something that works.


The article doesn't provide very much detail, but it's absolutely possible to absorb multiple low-energy photons and emit fewer higher-energy ones, see e.g. https://en.wikipedia.org/wiki/Second-harmonic_generation – and apparently carbon nanotubes can do this as well.

> This heats up the nanotubes, which then emit visible photons

Critically, these photons are not heating up the material. It’s like saying photons reflecting off a mirror heat it. No, the photons that are not reflected do heat the mirror, but not the reflected ones. (At least mostly.)


You're correct, that was sloppy wording on my part.

"There is no way to convert heat to electricity directly, unfortunately."

Well, if you have enough heat, you can boil water and then use the steam to turn a turbine... I suppose in some sense this is dependent on a temperature difference, but I wouldn't call the amount of electricity we get by this method 'small.'


> I suppose in some sense this is dependent on a temperature difference

It is absolutely dependent on a temperature difference, because it relies on cooler water that isn't already flashing into steam.


What is the accurate model of infrared light vs. heat?

Hot things glow. Medium hot things (like you) emit lots of infrared light. Really hot things (like lightbulbs) emit infrared and visible light. Ridiculously hot things (like the sun) emit infrared, visible light, and higher-energy radiation (UV etc.).

The exact details depend on the chemical composition of the material, but for a simple model, check out https://en.wikipedia.org/wiki/Black-body_radiation
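
For a rough sense of where those peaks land, Wien's displacement law is enough; the temperatures here are typical figures, not from the article:

    # Wien's displacement law: peak emission wavelength = b / T
    b = 2.898e-3  # Wien's constant, m*K

    for name, temp_k in [("human body", 310),
                         ("incandescent filament", 2800),
                         ("sun's surface", 5800)]:
        peak_nm = b / temp_k * 1e9
        print(f"{name:22s} {temp_k:5d} K -> peak near {peak_nm:5.0f} nm")

    # human body              310 K -> peak near  9348 nm (deep infrared)
    # incandescent filament  2800 K -> peak near  1035 nm (near-IR, tail into visible)
    # sun's surface          5800 K -> peak near   500 nm (middle of the visible range)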


It's a bit complicated, but taking a stab at it:

All physical objects emit electromagnetic radiation corresponding to their temperature (as opposed to reflected, reaction-induced, or stimulated radiation, more below), with a corresponding blackbody radiation curve. Note that blackbody emissions don't have a specific frequency (say, unlike laser light), though the curve has a peak emission.

Traditional incandescent light bulbs are blackbody emitters, as is a hot coal, the Sun and other stars, or the glowing elements of an electric radiant heater. Human eyes see blackbody radiation, in the visible range, as having a characteristic colour, which paradoxically gets bluer as the temperature gets higher: the ruddy tones of low-voltage incandescent lamps are a low colour temperature, and the intense white of halogen lamps is a high colour temperature. Blackbody radiation and colour temperatures are given in Kelvin, with typical visible light corresponding roughly to ~3,000 - 8,000K. You might also recognise these values from gamma or colour correction values on monitors and video equipment.

(Colour temperatures were used to judge processing for ceramics and metal processing / firing / smelting processes as well. They're also used to classify stellar temperatures, and in conjunction with brightness and/or distance estimates can be used to find stellar sizes.)

Infrared light is characteristic of peak blackbody emissions of bodies at or near "room temperature". So IR is not heat, but is associated with hot objects. Heat itself is ... thermal energy, which is its own complex thing.

Since all EMR carries energy, all EMR heats objects in which it is absorbed. A sufficiently intense emission above the IR range will also heat an object. However, hot objects in near proximity transfer a great deal of thermal energy directly as IR emissions. The illuminating capability of "white" light (typical solar emissions) allows a small amount of EMR energy to provide a high degree of visual information without imparting much heat on the illuminated objects (though it does impart some). Consider that even an (inefficient) 100W incandescent bulb, whilst hot to the touch, does not significantly heat the objects it is illuminating, and a far more efficient equivalent LED lamp, drawing about 15W of electrical power, accomplishes the same illuminating power with vastly less heat. We don't have to toast things to be able to see them, just bounce a small amount of (high-energy) visible light off of them.

Not all light (or more accurately, EMR) is blackbody radiation. There can be chemically-emitted light, directly from chemical reactions. The blue glow of methane (natural gas) combustion is a chemical emission, in contrast with the yellow-white glow of a candle, which is actually blackbody emission from suspended soot particles in the smoke plume. (There's a blue region of chemical emission at the base of the flame.)

Chemical emissions are based on specific frequencies of EMR emitted as electrons transition between energy states; if I understand correctly, this is a quantum phenomenon. Fluorescent and LED lamps also work based on chemical / valence emissions, and operate in far narrower bands than blackbody emitters -- one of the reasons these lamps can appear harsher than incandescents. The same goes for various chemical lamps, especially high- and low-pressure sodium vapour, formerly popular as street lighting, which emit in narrow bands (low-pressure especially).

Some forms of ionizing radiation are also EMR emissions, at far higher energy levels, triggered by nuclear rather than electron transitions: typically gamma rays.

And stimulated emissions such as lasers and masers are ... another phenomenon I understand only poorly, but are also tuned to very tight frequencies. Radio and microwave emitters are somewhat similar.

https://en.wikipedia.org/wiki/Black-body_radiation

https://en.wikipedia.org/wiki/Infrared

https://en.wikipedia.org/wiki/Heat


Thanks, very informative!

While this is a very interesting and ingenious development, what's needed for greater deployment are lower costs per unit of energy. Increasing efficiency will reduce the cost of land, but land isn't typically a huge factor in solar deployments. In this NREL analysis, land acquisition gets put into "Other Soft Costs" along with permitting, inspection, interconnection, sales tax, engineering, procurement, construction, developer overhead, and net profit:

https://www.nrel.gov/docs/fy17osti/68925.pdf

And that broad category is ~25% of costs.


We shouldn't discourage progress just because it doesn't fit exactly into today's model of the world.

I certainly didn't mean to discourage this at all!

But I think it's important to point out that even though people often disparage solar photovoltaic for having low "efficiency," that metric is not an impediment to its broad deployment or great utility to us.


Sure it is. Lower efficiency means you need more panels, larger panels, and more support infrastructure. It limits where it can be deployed and makes it less competitive.

And the need for more/larger panels and support structures will only be a win if, overall, the costs decrease.

The cost is what makes the decision for deployment, not the efficiency. Perhaps this efficiency makes 2-axis tracking economical enough to justify, and then it gets deployed, but in the end the efficiency wasn't as important as the improved costs.


I'm no expert, but if the panels now generate additional energy/money that offsets the cost of adding these nano-tubes, doesn't that mean you can meet your power needs with less hardware, or generate extra money to sell back to the grid? It seems like what matters is whether these nanotubes generate more value than they cost.

To use a computer science paradigm: performance (efficiency) is not the end goal, but it acts like a currency you can use to 'buy' other things.

Yes, and since the panels themselves are only a small fraction of the total installation cost, reducing the overall size of the project could save money even if the panels are a lot more expensive. https://earthtechling.com/solar-panel-cost/ estimates they are only 15% of the total cost of a home solar power system.
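
As a toy model (the 15% share is from that link; every other number here is made up purely for illustration):

    # Toy system-cost model: do pricier, more efficient panels pay off?
    system_cost = 20_000                    # hypothetical baseline system, $
    panel_cost = 0.15 * system_cost         # panels: $3,000 (the ~15% share)
    other_cost = system_cost - panel_cost   # $17,000: labor, racking, wiring...

    # Hypothetical upgrade: panels 2x as efficient but 3x the price each.
    # Same output from half as many panels; optimistically assume the
    # area-dependent balance-of-system costs halve along with them.
    new_panel_cost = (panel_cost / 2) * 3   # $4,500
    new_other_cost = other_cost / 2         # $8,500 (a generous assumption)
    print(f"${new_panel_cost + new_other_cost:,.0f} vs. ${system_cost:,} baseline")
    # $13,000 vs. $20,000 baseline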

Not really. The difference in the land cost is effectively negligible. At this point (and it has been this way for many years), the limit in solar deployments is cost of panels and storage limitations.

There are vast chunks of empty land in AZ, NV, NM, TX, CA that could easily fit enough solar panels (each) to power all of the US. The issue is the cost of panels and the lack of baseload supply.


Isn't this pretty different for utility-style solar installations versus the rest? For the former, land cost is clearly a variable. But especially for home PV, it seems like land cost is fixed.

There are multiple models (markets) even in today's world, and they have different needs and constraints. When you have lots of space but not a lot of money, you probably optimize for least money per unit of energy. When you have money but limited space (skyscraper rooftop, spacecraft), you might want the most energy per unit of area. And there are times when you need a secondary non-grid solution for redundant backup, so reliability matters more than other factors.

Different technologies with different pros/cons could be best for different markets.


Not all applications involve land.

Fixed size applications like rooftop would see a multiplier.


And before that, we'll see this used for satellites - if there's not too much of a weight penalty.

If you could multiply the efficiency of a given solar cell by 4, all else being equal, you would also reduce the manufacturing cost by 75%.

From other comments, I am guessing this result doesn't give anything like that, but one should keep in mind that greater efficiency certainly could give great payoffs.
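
In cost-per-watt terms the arithmetic is direct (illustrative numbers; 1000 W/m^2 is the standard test irradiance):

    # Cost per watt = cost per area / (efficiency * irradiance)
    cost_per_m2 = 100.0   # hypothetical module cost, $/m^2
    irradiance = 1000.0   # W/m^2 at standard test conditions

    for efficiency in (0.20, 0.80):  # a 4x improvement
        print(f"{efficiency:.0%} -> ${cost_per_m2 / (efficiency * irradiance):.3f}/W")

    # 20% -> $0.500/W
    # 80% -> $0.125/W  (the 75% reduction, at the same cost per unit area)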


And installation costs, which are enormous.

I'd expect most of the costs (land, installation, maintenance, interconnect) to scale linearly with the number of panels, so this should translate pretty directly into cost savings.

Welp, seems like a 4x improvement in efficiency would result in a lower cost per unit of energy.

Nanotubes are super expensive.

They have no use at scale yet. Expect prices to drop exponentially once a use at scale is found.

This sounds great, but for years we've been hearing of various discoveries that will greatly increase solar panel efficiency. Why are we still at 22%-ish?

Usually the materials to get higher efficiency are prohibitively expensive or dangerous to use at scale.

The primary issue with everything in this category is that plenty of things are possible in the lab or in theory that just aren't possible when applied in reality and at scale.

Blame journalists for hyping this because of the climate change focus.

I am actively involved in looking for better materials and better sources of energy. The reality is that there have been no fundamental breakthroughs since oil and nuclear.

No matter what you hear we just haven't had anything that fundamentally changes the game.

So instead of actual breakthroughs in the fundamentals, we get marketing, branding, and communication. But this is not something that can be fixed that way; it's physics, not product innovation.

We will most likely get fusion (far out) or fuel cells (even further out) before we get any fundamental breakthroughs here.


Why do you consider cheap solar not a breakthrough? If I 30-year mortgaged some solar cells, I could put them on my house right now and save $20 a month net because of reduced electricity costs.
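
The arithmetic works with made-up but plausible figures (none of these are my actual numbers):

    # Standard amortized-loan payment vs. monthly electricity savings
    principal = 15_000    # hypothetical financed system cost, $
    annual_rate = 0.04    # hypothetical interest rate
    n = 30 * 12           # months in a 30-year mortgage

    r = annual_rate / 12
    payment = principal * r / (1 - (1 + r) ** -n)
    savings = 90          # hypothetical avoided utility bill, $/month
    print(f"loan: ${payment:.0f}/mo, net: ${savings - payment:.0f}/mo")
    # loan: $72/mo, net: $18/mo -- in the ballpark of the $20 above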

Well, in the grand scheme of things, photovoltaic cells were developed back when nuclear reactors began to appear as well. They are not a new technology, just something that is getting optimized diligently now. In that regard, the parent post is technically correct. But that view is focused too much on big shifts in the market, if you ask me. Smaller, more incremental steps have added up to amazing things around wind and solar power.

Oh, it's definitely amazing relative to where it was, just not to where it needs to be to compete.

Solar and wind are less than 1% of world energy consumption and are not expected to be more than 3-4% in 2040.

Energy is not a product/market problem, it's a physics problem, and we haven't had any major breakthroughs in that space for a long, long time.

https://www.iea.org/weo/?fbclid=IwAR3eH1AFcRSPSEit8JINLCMvE_...


This... doesn't have to be the case. Some markets already have way higher renewable penetration. Solar and wind represented about 20% of all generation in California in 2018 (https://ww2.energy.ca.gov/almanac/electricity_data/total_sys...). In the UK this year, solar and wind are almost 25% of generated electricity (https://assets.publishing.service.gov.uk/government/uploads/...)

Incremental improvement and economic forces do matter. Solar is now cheaper than coal in many climates, and we are seeing the effects.

Of course, there are still many problems to be solved. But we're also working incrementally on solutions to those. For example, battery cost is falling precipitously. "Grid scale battery" was not a thing a decade ago; now you see installations popping up regularly.

I'd like to see cold fusion as much as the next guy, but dismissing incremental progress is foolish.


Wind's about 20% in Texas, too, and growing.

http://www.ercot.com/content/wcm/lists/181766/IntGenbyFuel20...

The state produces by far the most electricity of any state, too. It's almost twice as much as Florida.

https://www.eia.gov/state/?sid=TX

Unfortunately it's also a huge petroleum and natural gas producer. About half of the energy used in the state is for industrial use, much of it for the oil and gas industries. On the bright side, natural gas is displacing coal in the meantime while solar and wind are not replacing it fast enough. It needs to get better, but things could be much worse.


You are talking about electricity, not energy.

Electricity is only ~20% of the entire energy usage as far as I remember.


That depends on where you are. I asked around a bit, and solar is a net negative for me (cost-wise). The only option one company gave me required converting the roof of my garage to a flat surface (from a standard peak), which is utterly impractical around here.

And you can use them to power your watch perfectly fine too.

That doesn't mean it will work at scale for society, which is the primary issue.

The key thing to look for is energy density, as that is more likely to give society a bang for the buck.

Distributed solar, with solar's capacity factor, isn't economically feasible for society, nor is it possible as a main source of energy.

Nuclear, hydro, and to some extent thermal: none of them are in vogue.


These are arbitrary metrics. Why does energy density matter anywhere that has the space? Even in northern latitudes and high-density countries you can generate a lot of electricity with a nominal amount of land, all the more if panels can be fitted on rooftops. For sunny places near deserts it can absolutely provide a big percentage, even a dominant percentage, of power. The main issue is matching demand to supply; production density is a distant concern.

Energy density isn't arbitrary. It normally means something can be used in compact form (batteries, oil, gasoline, coal, uranium) and deliver energy at will.

Look at the capacity factor of wind and solar, add to that the huge areas they need and the fact that they are intermittent and you start to get a glimpse of the problem.

This is neither economically nor technically feasible. You can't generate as much as you think, and you can't do it at a cost that makes it generally feasible for society, not even with lowered costs.

Currently, we are talking about 1% of world energy consumption, not expected to be much more than 3-4% in 2040, and that's despite huge investments and all the political goodwill you could ask for. [1] Keep in mind that the numbers you normally see displayed are for electricity, not for energy. Electricity is only a subset of energy.

Furthermore, investment in solar and wind is decreasing, especially when you take China out of the equation. [2]

And again: lab results or theoretically possible advantages most of the time aren't feasible in reality and at scale.

[1] https://www.iea.org/weo/?fbclid=IwAR3eH1AFcRSPSEit8JINLCMvE_...

[2] https://www.globalresearch.ca/growth-renewables-stalled-inve...


Those IEA renewable projections are comically wrong every year:

https://pbs.twimg.com/media/DsX2rpPW0AIVORG?format=jpg&name=...


Are they wrong about the current situation?

Why is it not feasible? I run 32 panels on my roof in the Pacific Northwest, which is known for its rain, and it covers 100% of my residential and electric car usage when considered on an annual basis. If roofs required solar panels, we would have a large part of power generation done at a much lower installation cost and could focus on large-scale energy storage projects instead. My payoff period is 11-12 years; the system is warranted to 20. No breakthrough required.

> I run 32 panels on my roof in the Pacific Northwest, which is known for its rain

Aren't you still connected to the traditional grid anyway?


That's right. All my neighbors use the excess solar power during the summer, and during the winter I use more grid power than I create. I get credits in the summer and use them up during the winter.

A giant leap in solar efficiency doesn't solve the storage problem, and the storage problem doesn't need a magic physics solution, just large-scale, heads-down engineering. My point is: solar is efficient enough now, and a sound economic choice. I make enough power in a year, just with my roof, for my house and car. I do need a storage solution to do power leveling, which is what I use the grid for now.


It does need a magical physics solution, unless you are fine with keeping coal, oil, gas and nuclear as backup.

When you factor in the infrastructure to support you when you don't have power yourself then it suddenly looks less economically sound.

That's the point.


That's where we disagree. You think power storage will take a magic physics solution, and I think it is normal engineering (pumped storage, rail/gravity potential generators, utility scale battery banks for leveling).

5000 classified patents per year in the US alone. No one else publishes a number.

A modest estimate would be 200,000 over the last 30-40 years.

Some of them must relate to energy.


Those patents aren't about fundamentals in physics. This is a physics problem not a product/market problem.

There are plenty of patents on energy just not many fundamental breakthroughs.


Fuel cells have been commercially available since the 1960s, though they were mostly confined to space program applications then.

Automobile manufacturers have produced small numbers of fuel cell vehicles since the turn of the millennium:

https://en.wikipedia.org/wiki/Fuel_cell_vehicle#List_of_mode...

I'm not making claims about the actual merits of fuel cells, mind you. But claiming they're further out than fusion is bizarre. Some people are driving them around today. Nobody is generating net power from fusion today.


There are many types of fuel cells.

I am talking about the kind of fuel cells that can be used by wind and solar to store energy and be distributed in an economically feasible way.

We are not even close to that. If you know of anything, by all means, please let me know, as the group I am part of would invest in it in a heartbeat.


> I am talking about the kind of fuel cells that can be used by wind and solar to store energy and be distributed in an economically feasible way.

I don't know of anything like that either. Small fuel cells rely on expensive precious metal catalysts and large ones have too much capacity to be a good match for the distributed generation sector.

A technology that is commercially available but too expensive at present for large scale adoption, like fuel cells, is more mature than a technology that hasn't been demonstrated even in a cost-is-no-object context. That's why it's strange to hear fuel cells described as even further out than fusion power.


Again, no. Fuel cells that can be used by wind and solar to store their energy aren't close at all; it's not just the cost, it's also the physics.

Could have issues with longevity, too. Who knows.

Because everyone reports relative gains in these articles and not absolute efficiency.

In the past decade (2010 -> 2019), the area efficiency of commercial crystalline solar modules has improved from 14% to 20%. This is modules, not cells. That's an increase of more than 40%. More detail here: https://www.nrel.gov/pv/cell-efficiency.html

Cost. We can make single-junction GaAs solar cells with efficiency close to 30% (damn close to the ~33% theoretical limit for a single-junction cell) and we can make GaAs-based multijunction cells that have efficiencies above 40%. It's just that those cells are so expensive that they really are only useful for niche applications (e.g. space).

Meanwhile, silicon and CdTe solar cells continue to decrease in price and increase in efficiency year after year. At this point the current technology is cheap enough to be profitably used to supply power in many regions of the world. Anyone who says we need a breakthrough in solar to make it economic is wrong. At this point the solar panels themselves are less than half the cost of a solar installation.

Nearly all press releases talking about efficiencies over 50% are just talking about impractical theoretical limits that would require cooling the solar cells to freezing temperatures or other such schemes (i.e. talking about the Carnot limit instead of the Shockley-Queisser limit).


Interesting work - seems like this could also be used to create a thin film that could yield thermal vision, if indeed the nanotubes upshift IR frequencies to visible light. But I’m not a physicist.

This is essentially the same principle behind current night vision tech.

I wonder if the process of changing heat into light can be used for transferring heat more efficiently than current AC units.

I doubt it would work similarly, since generating heat is always a byproduct.

This has been discussed in several previous threads on HN. From what I’ve been able to gather, creative ways to reduce the energy consumption of A/C will require changes to building designs and/or increase the cost and complexity of new buildings, whereas the cost of running the A/C is up to the future tenant(s).

This is why you see some companies applying eco friendly tech to some new buildings (Apple) but not in general commercial development.

Again this is just what I’ve surmised from reading threads like this. YMMV


Interesting to contemplate what could be done to shift those incentives around. Obviously there are certifications (LEED), and there's straight-up regulation, but would there be a way to mandate that a builder or landlord is responsible for a portion of future HVAC expenses such that they are motivated to get this right upfront?

I’m sure if some energy saving tech was commercialized and reasonably cost effective local municipalities could update building codes. LEED standards could be updated as well. What goes into a building isn’t entirely up to the developer.

But right now most of this stuff is just in a lab. Frankly the tall cooling towers you see on some buildings weren’t in widespread use until at least the 50’s even though they were invented at the turn of the century.


The market should figure this out (and has in the past). Energy costs money, and tenants want to spend less money. Tenants will pay more money up front to save themselves money down the line. Therefore landlords should have an incentive to build energy-efficient structures even if they aren’t paying for the energy.

The problem is when an energy source with a high externality (climate-related or otherwise) isn’t priced in. This isn’t a hard problem from the government’s perspective, though: just price in the externality through some free-market system. Carbon credits are a good example of this.


Part of the reason is that it is hard to gauge what the energy bill of an apartment will be before you start renting. That is where certification should come in.

> changing heat into light

It's not actually doing that though. It's changing one set of light-frequencies into another.


> turning heat into light

> channel mid-infrared radiation (heat energy) into light energy

> absorbs thermal photons and emits light.

Goddamnit, no! None of this is "heat", it's just one range of light being converted into another! This is shitty "science"-journalism parroting a common misconception.

After all, the only reason we associate infrared with "heat" in the first place is that it's useful for detecting things which happen to be at a range of temperatures, temperatures which just happen to be slightly warmer than the operating state of self-reproducing bags-of-mostly-water on a small rocky planet.


Yeah, it seems more like it.. fluoresces (?) around the infrared spectrum. Fluorescence might actually be the right term for what is happening.

As far as I can tell, this is functionally just a coating with low emittance in the IR outside a narrow band. A matched PV device that is kept cool can, in principle, approach the Carnot efficiency for the temperature difference between the emitter and the PV junction. If the emitter is the sun, then the source temperature is very high and the Carnot efficiency isn’t a major limit. If the source is a solar panel, I’m having a hard time seeing how this is useful.

I can see this being somewhat useful as a no-moving-parts heat engine for something like a solar concentrator, but there’s another relevant thermodynamic limit: even if this magic material has emissivity 1, it won’t radiate at a greater power per unit area than the blackbody spectrum predicts. At non-crazy temperatures, this is not very high, which will limit output for small things like solar concentrator targets.
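
For a sense of scale, that ceiling is just the Stefan-Boltzmann law (nothing specific to this device):

    # Blackbody power ceiling: P/A = sigma * T^4, for emissivity <= 1 (far field)
    sigma = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

    for temp_k in (300, 600, 1000, 1500):
        print(f"{temp_k:5d} K -> {sigma * temp_k**4 / 1000:6.1f} kW/m^2")

    #   300 K ->    0.5 kW/m^2  (room temperature: about half of full sunlight)
    #   600 K ->    7.3 kW/m^2
    #  1000 K ->   56.7 kW/m^2
    #  1500 K ->  287.0 kW/m^2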

So I can see this being useful to convert waste industrial heat, or maybe as a bottoming engine for a combined cycle plant, but I am having trouble understanding how it could be useful for solar.


For solar, it somehow has to integrate with the solar panel, so that the same square meter is used both for direct photovoltaic generation of electricity, and for this IR capture, so energy is obtained from a wider band.

I understand your point that it can't just be driven by a secondary IR emission from a warmed-up solar panel; that's not hot enough to be that useful.

In a more detailed article, there is talk about the carbon nanotube device being useful because it can withstand high temperatures:

https://news.rice.edu/2019/07/12/rice-device-channels-heat-i...

They of course understand that they need a big temperature differential.


One sketch of such a system is to physically mate an effective high-T solar absorber to an effective narrow-spectrum photon emitter (as described in this work). That can then be coupled with a typical solar cell whose bandgap is matched precisely to the emitter wavelength, so your solar cell is near-ideally efficient for the photons it receives. As I understand it, the emissivity of these materials can exceed blackbody radiation in the near field but not the far field (or something like that? Search "Superplanckian emission").

The devices, as far as I'm familiar, are often called thermophotovoltaic cells: https://en.wikipedia.org/wiki/Thermophotovoltaic

See eg http://xlab.me.berkeley.edu/pdf/259.pdf for a great overview, esp section 3.3.

I saw this video recently, which I found accessible with an undergrad physics background and which got me interested: https://www.youtube.com/watch?v=XnVVyTD7CzM


> Then that can be coupled with a typical solar cell with bandgap matched precisely to the emitter wavelength. So your solar cell is near ideally efficient for the photons it receives.

One way or another, once you've converted sunlight to heat, you are limited by the Carnot efficiency. For the 80% efficiency they claim, if all of it comes from thermophotovoltaics, they need a hot-side temperature at least 5x ambient, which is over 1000 C. I wish them luck getting anything resembling a solar panel up to 1000 C. (I'm not, in any respect, saying it's impossible -- I'm saying it's very hard. You'd need excellent spectrally or directionally specific absorption to avoid re-radiating all that heat out the top of your panel, and you'd need conventional transparent insulation to stop conduction.)
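
The arithmetic behind that figure, assuming roughly 300 K ambient:

    # Carnot limit: efficiency = 1 - T_cold / T_hot
    # => T_hot = T_cold / (1 - efficiency)
    t_cold = 300.0     # K, roughly room temperature
    efficiency = 0.80

    t_hot = t_cold / (1 - efficiency)
    print(f"{t_hot:.0f} K = {t_hot - 273.15:.0f} C")  # 1500 K = 1227 C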

On top of that, super-Planckian emission or no, if it's limited to the near field, then the PV cell is very, very close to the hot surface. That PV cell needs to be kept near room temperature to get that efficiency.

This whole thing seems extraordinarily complex for something that wants to be cost-effective.


The absorber/emitter assembly doesn't really resemble a solar cell. Demonstrated tungsten emitters have exceeded 1500K; it's not really that wild, it's what's in incandescent light bulbs. You typically concentrate sunlight.

Very cool! Let's see if it reproduces experimentally, and then let's see if it scales.

I've always wondered why leaves generally evolved to be green instead of black. Wouldn't black be more efficient?

The short answer is that absorbing a narrow band of energies is more efficient than a wide band.

See: http://scienceline.ucsb.edu/getkey.php?key=4979


Because other wavelengths were probably already taken. https://en.m.wikipedia.org/wiki/Purple_Earth_hypothesis (Black is not a wavelength but rather the absence of visible light, i.e. it has all been absorbed.)

I'm really curious as to what this will do for waste heat from other processes.

For anyone curious: carbon nanotubes, so this will likely not see the outside of a lab any time soon.

Why not? Nawa's ultracapacitors are based on carbon nanotubes and they're going into production:

https://newatlas.com/nawa-nanotube-ultracapacitor-production...

Vantablack is based on carbon nanotubes and you can buy it now:

https://www.surreynanosystems.com/vantablack/science-of-vant...


Is Rice University located in Asia?

The title is misleading. The full title is:

> Researchers at Rice University developed a method to convert heat into light that could boost solar efficiency from 22% to 80%

The conditional "could" signals that the researchers didn't actually do so, but that the study might enable it. The article reiterates that this is speculation:

> The implications of their discovery are significant. Research from Chloe Doiron, a Rice graduate student, revealed that 20% of industrial energy consumption is wasted through heat. It could also mean an increase in the efficiency of solar cells, which are currently only 22% efficient at their peak. Recycling the thermal energy from solar cells using carbon nanotube technology could increase the efficiency to 80% according to the researchers. ...


Maybe speculation should not be presented as a statement of fact in the title.

"invent" is the key misleading word

perhaps "hypothesize" would fit better.

Ok, we swapped invent for propose above.

I hope this means that my computer fans will be replaced by a laser show

"This light can then be used as electricity."

No. One cannot use light as electricity. One can use light to make electricity. This is not just a minor typo; it's flat-out wrong.


If you want to get pedantic, you can't "make electricity" either because 'electricity' isn't a defined concept. Usually when people use words such as "make electricity", they actually mean to refer to energy or power transferred using electromagnetic fields interacting with a conductor. But a purely chemical reaction that charges a battery could be said to "make electricity" without even involving that.

More info: http://amasci.com/miscon/elect.html


I was going for a simpler idea: Light is photons, electricity is moving electrons. And yeah, if you're talking about electromagnetic fields we're back to photons again. It just struck me as a sentence that was very misleading to lay people.

It's only simpler because "electricity is moving electrons" is not a meaningful claim.

What is the physical unit of 'electricity'? There's not one, because it's not a thing.


On the topic of the HN title: there is research being done into carbon nanotube rectennas. Theoretical efficiency is in the upper 90s, percent-wise. The problem is that they have very narrow bandwidth. Also, as far as I'm aware, no one has even built lab samples that perform decently.

> there is research being done into carbon nanotube rectennas. Theoretical efficiency is in the upper 90s

I googled optical rectenna.

"An optical rectenna—a device that directly converts free-propagating electromagnetic waves at optical frequencies to direct current" [1]

[1] https://www.nature.com/articles/nnano.2015.220


These guys are trying: http://NovaSolix.Com

I get the maybes, the caveats, and the misleading title, but the direction seems clear: solar efficiency is a tractable materials science problem. "We" should see this as a penicillin moment and invest hugely in the research and development until we find the right processes and approach to make cheap, ubiquitous solar. Manhattan Project levels of funding is what I mean, because the payoff is humans cutting their carbon output.

Some projects have really big ROI


The penicillin moment happened because penicillin actually came into existence and became widely available pretty rapidly.

Research and speculation is awesome stuff, but the hard part isn't discovering new concepts. It is actually making those concepts into a marketable and affordable good.

The evidence for this is that even though there is essentially a new 'breakthrough' discovery every other week for solar panels, or electric motors, or batteries.. We are still using what amounts to cutting-edge tech from the late 90's.

In the modern era this means we generally have to wait till the patents expire and market competition kicks in in order to get the price low enough and the product perfected enough to see widespread usage. If it goes anywhere at all.

It's also worth noting that Florey, the man who is largely responsible for making penicillin a practical drug, refused to patent his early innovations, to make it as widespread as possible.


> wait till the patents expire

If this is the case, then wouldn't "buy a bunch of patents and (with great fanfare) make them open-source" be a relatively low-complexity way for a billionaire who feels like making a name for himself to accelerate progress on fighting climate change?


Penicillin had to undergo decade-long R&D to go from Fleming's petri dish to a cheap, practical drug. I cannot find the article now, but I believe the investment levels from the US military were compared to the Manhattan Project (obviously a poor comparison). The point is that this was not the "gosh, what luck" story it is in mass media.

Having made that first breakthrough, world-class teams across the globe fought to bring the efficiency up from "froth on the top of a brew" to "gallons of the stuff".

Florey was a big part of the story, but so were teams in the US and Europe, and then the US military scaled it up beyond belief.

We spent money, targeted money, on the best teams globally and then put serious industrial might to it once they found the answers.

That exact approach is what I am calling for again.

And as for patents: if enough global effort is put in, with enough government funds, the pressure to put the results "in public hands" rather than hold out for patents is really strong.

https://en.m.wikipedia.org/wiki/History_of_penicillin


This is not quite correct. Silicon solar cells have a maximum efficiency of 32%. Various approaches, e.g. using carbon nanotubes as antennas to directly rectify visible light (http://NovaSolix.Com), may eventually yield much higher overall efficiencies.

Is solar efficiency the new "battery breakthrough" that will transform our lives "in 10 years", for 25 years straight now?
