
OK, so perhaps it's not the de-carbonization future we all wanted, but this could be an absolute game changer for planetary exploration, where RTGs (radioisotope thermoelectric generators, driven by nuclear decay heat) are common. The existing efficiency of those is abysmal, which is actually OK, since heat is such a useful product in itself.

I'm looking at Dragonfly, specifically, where an RTG provides the electricity and heat to keep everything alive. Imagine what 10x longer flights would do for that mission.



The advantage does appear to be higher efficiency compared to thermocouples, and the TPV should in theory require less cooling relative to the input, due to that higher efficiency. However, the disadvantage is that TPVs generally require a far higher operating temperature for effective power production, which may actually require more sophisticated cooling.

This particular one requires around 2000C, which appears to be above the critical temperature of most RTGs (though not all!):

https://www.researchgate.net/figure/Critical-temperatures-to...

I wonder whether those higher-temperature RTGs have other disadvantages, or are simply more substantially built.

[edit]

Corrected cooling requirements as pointed out by the8472 in another comment.


The temperature will rise until either the rate of loss of heat balances the rate of its generation, or the container loses structural integrity.

I wondered whether 'critical temperature' meant a temperature above which the junction fails to generate an EMF. Even if that were the case, it would not be an issue here, since we know this material works at 2000C. But the article seems to use the term for temperatures that have engineering relevance to the design of the device, as opposed to fundamental physical limits on feasibility.
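That heat-in-equals-heat-out equilibrium can be sketched with the Stefan-Boltzmann law. A minimal back-of-envelope version, assuming an ideal grey-body radiating to deep space (the power, area, and emissivity below are made-up illustration values, not figures for any real RTG):

```python
# Equilibrium temperature of a heat source radiating to deep space:
# generated power P balances radiated power eps * sigma * A * T^4,
# so T_eq = (P / (eps * sigma * A)) ** 0.25.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def equilibrium_temp_K(power_W, area_m2, emissivity=0.9):
    """Steady-state temperature where radiated loss balances generation."""
    return (power_W / (emissivity * SIGMA * area_m2)) ** 0.25

# Illustrative numbers only: a 2 kW heat source with 1 m^2 of radiator.
print(round(equilibrium_temp_K(2000.0, 1.0)))  # ~445 K
```

More radiator area (or a hotter allowable surface) shifts the balance point; shrink the area and the equilibrium temperature climbs until something gives.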


To be clear, wouldn't, say, the atmosphere of Titan provide ample cooling opportunity? Heat distribution is, itself, a feature of RTGs, where planet-side things are usually very, very cold.


"OK, so perhaps it's not the de-carbonization future we all wanted, but this could be an absolute game changer"

I hate to throw cold water on this, but we constantly read articles about tech that will be a game changer, only for it never to be seen again, mainly because it can't be scaled to the size needed while still providing the advantages we need.

Yes, it sounds good. But what we need now is a proof of concept rather than theories on how much of a miracle the tech is. My question is, "How can we help to move it forward to a point where we can see actual advantages?"


Do RTGs reach the >2170K necessary for this?


Apparently not. This 1991 document [0] says the GPHS [1] heat source's iridium cladding has an upper operating temperature limit of 1,300 °C (1,573 K). There's a suggestion this could be safely raised to 1,500 °C. (This is the structural part that contains the Pu-238 fuel; the plutonium itself could be hotter than this, and the thermoelectric junction is much colder).

(Note the reason given for this limit is to guarantee that, if this capsule accidentally falls to Earth, the cladding has the mechanical performance to stay intact on impact. If iridium gets too hot, its strength is permanently degraded (so it says). I suspect that if you voided this requirement -- if you created a separate class of spacecraft "no longer capable of impacting Earth" -- the limit could be raised much higher).

[0, pdf] https://ntrs.nasa.gov/api/citations/19910015359/downloads/19... (2.2 "Temperature Constraints")

[1] https://en.wikipedia.org/wiki/GPHS-RTG


I also suspect that thermal ranges are limited due to the challenges of radiating away heat.

Cooling things in space turns out to be hard. There's no convection or conduction, so all cooling occurs via radiation. I believe the ISS radiators are rated at about 1 watt per m^2, which means that you can calculate the radiator area (and mass) given your power budget.

To a first approximation, the ISS radiators are about the same size as its solar panels.

This is a factor missing from virtually all sci-fi representations of space ships. I had this realisation a few years back talking about 2001 --- the Discovery should have had a huge set of radiator fins given its nuclear propulsion (ion drive I believe, in the story universe). Wikipedia states that there were radiators in the book, though I don't recall that element.

https://en.wikipedia.org/wiki/Discovery_One


That's a constraint on thermal power, not on temperature (?). Hotter objects radiate faster (T^4 scaling even), so it's *easier* to cool them in a vacuum, all else being equal.

MMRTG [0] dissipates 2,000 W, in a package that only has several m^2 of radiator fins. That's >100x more efficient than the ISS (1 W/m^2). ISS' radiators have to operate below room temperature; MMRTG fins [1] are in the 100 °C - 200 °C range. Higher temperature -> higher heat dissipation per fin area.
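That T^4 scaling comes straight from the Stefan-Boltzmann law. A quick sketch, assuming ideal blackbody fins and ignoring the background sink temperature (the two temperatures are rough stand-ins for the figures above: ISS radiators a bit below room temperature, MMRTG fins in the 100-200 °C band):

```python
# Radiated flux per unit area scales as T^4 (Stefan-Boltzmann law),
# so a hotter fin rejects far more heat per square metre.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiated_flux_W_per_m2(temp_K):
    return SIGMA * temp_K ** 4

cool = radiated_flux_W_per_m2(280.0)  # ~ISS radiator temperature: ~349 W/m^2
hot = radiated_flux_W_per_m2(450.0)   # ~MMRTG fin temperature:   ~2325 W/m^2
print(round(hot / cool, 1))           # ~6.7x more flux per unit area
```

Real fins fall short of these ideal-blackbody numbers (emissivity, view factors, a non-zero sink), but the fourth-power advantage of running hot is the point.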

(By the way, the comment we're replying to asks about nuclear quadcopters on Titan [2], which is in dense atmosphere not a vacuum!)

[0] https://en.wikipedia.org/wiki/Multi-mission_radioisotope_the...

[1] https://sci-hub.se/10.1063/1.2169255

[2] https://en.wikipedia.org/wiki/Dragonfly_(spacecraft)


That's somewhat the conclusion I was coming to as I was thinking through the question.

And yes, Titan / cold atmosphere would make for highly-efficient thermal gradients.


Apparently not, as someone below pointed out.

Oh well, I was briefly excited, but there's a big enough gap that it probably won't work as-is.


Bear in mind that, according to TFA, this work is noteworthy specifically because of the high temperature - past work has achieved the same results, albeit with lower efficiency, at lower temperatures.


I think the idea is that you basically have a giant pit of lava under your neighborhood or whatever, over which you then put a bunch of these things to generate electricity from the heat. Other energy sources (wind, solar, coal, shakeweight power, whatever) are used to keep the lava molten and replace whatever heat was recently converted to electricity.

If you had a pit huge enough and hot enough, seems plausible to me.
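For a sense of scale, the sensible heat in such a pit is easy to estimate. A rough sketch, with order-of-magnitude property values for molten basalt (the pit dimensions and temperature swing are made-up illustration numbers):

```python
# Rough sensible-heat capacity of a "lava pit" thermal battery:
# energy = density * volume * specific heat * temperature swing.
DENSITY = 2800.0        # kg/m^3, ballpark for basalt
SPECIFIC_HEAT = 1200.0  # J/(kg K), ballpark for molten rock

def stored_energy_MWh(volume_m3, delta_T_K):
    """Heat released as the pit cools by delta_T_K, in MWh."""
    joules = DENSITY * volume_m3 * SPECIFIC_HEAT * delta_T_K
    return joules / 3.6e9  # J -> MWh

# A 100 m x 100 m x 10 m pit cooling by 500 K:
print(round(stored_energy_MWh(100 * 100 * 10, 500)))  # tens of GWh
```

Even before counting latent heat of fusion, that's a grid-scale amount of storage, which is why the pit has to be huge to be interesting.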


>The existing efficiency for those is abysmal, which is actually OK since heat is such a useful product in itself.

How does one lose this heat in space?


Radiation, if you're in a vacuum. It causes no end of problems getting this right. But if you're on a planet you've probably got an atmosphere to help.



