
My house originally had a recirculating pump for the hot water, but it burned out. Somehow, though, the loop still (mostly) worked and gave instant _warm_ water.

I think it was through natural convection/circulation (thermosiphoning) - the hot water expanded and rose in the tank, pushing flow through the recirculating loop?

So maybe there's a good-enough solution that doesn't require a pump, just a return loop.

Now I have an on-demand water heater with a built-in recirculating pump, so it's instantly hot :)


This is how rooftop solar water heaters circulate water as well. Though these days you'd be better off with PV plus a small electric heater.


Growing up (born in late 70s), all I heard was “OMG OVER POPULATION” and how the planet can’t support the projected N billion people who will be living on it.

Now that the birth rate is actually slowing down to correct itself and we're not all breeding like rabbits, that's a bad thing?

This feels like a capitalist concern, “we won’t have enough workers to produce goods and then consume them!”


The system at large hasn't been great at forward planning so the whole pyramid shape might collapse.

Elderly care is basically going to wipe generational savings from the 20th century off the map and all that wealth will be reallocated to PE.


Is AI going to take all the jobs, or isn't it?


> Elderly care is basically going to wipe generational savings from the 20th century off the map

Probably for the best.

Currently most of that wealth is being hoarded by the top 0.1%, at the expense of 8 billion people having to deal with global warming for the foreseeable future (i.e., centuries).

If that's the best humanity can do with wealth, then burn it all down. As long as we keep some advances from medicine (vaccines, dentistry) and technology which aren't as energy intensive, it should all work itself out in the end.


I mean, elder care is unlikely to wipe out billionaires as much as low-single-digit millionaires.


What's going to wipe out billionaires is lack of a highly-educated workforce, because no one is having babies.

And no, you can't completely solve this by immigration (because the demographic crisis is global).

They might still stay billionaires in absolute terms, but a lot of their wealth will be wiped out as companies struggle to sell their goods to a population with reduced purchasing power (since we're too busy taking care of elderly folks)


Overpopulation is still a concern when considering biodiversity, groundwater loss, etc.

The latest UNEP report includes it - see page 37 from https://www.unep.org/resources/global-environment-outlook-7 -> https://wedocs.unep.org/rest/api/core/bitstreams/902187bf-ea...

"Among the major global environmental crises – climate change, biodiversity loss and land degradation, and pollution and waste – population growth is most evidently a key factor in biodiversity decline. This is largely due to increased demand for food production, which leads to agricultural expansion and land degradation (Cafaro, Hansson and Götmark 2022). As the population grows and consumption rises, fewer resources and less habitat are available for non-human species (Crist 2019). Overpopulation occurs when the total human population multiplied by per capita consumption surpasses the capacity of sustainable ecosystems and resources. Although the global human population continues to grow, per capita consumption is increasing at a faster rate. To the extent that people are disrupting natural habitats and degrading ecosystem services for future generations, despite regional heterogeneity, some research suggests that most of the world’s nations may be considered overpopulated (Lianos and Pseiridis 2016; Tucker 2019)"

Specifically going back to the 70s overpopulation concerns, things shifted with the Green Revolution / Norman Borlaug, but that came at the cost of reduced groundwater supply and reduced agricultural diversity. See 'The Globalization of Wheat' and https://climatewaterproject.substack.com/p/groundwater-and-c...


I see slowing birth rates as a net positive.

People in these comments are proposing enslaving women, like in The Handmaid's Tale, before even asking if it's a problem.


It's possible to have both overpopulation (too large a population for a given metric like water, energy, pollution, etc.) and demographic collapse (too many old people, not enough young workers). It's not intuitive, but they are separate phenomena.

The reaction to overpopulation concerns probably discouraged people from having kids but it's unlikely to be the main cause.


Without enough children, who will be taking care of you when you are older?

All societal and industrial functions require young people.


Less consumer demand means fewer jobs. When people can't find good well-paying jobs, they become pretty unhappy, and they won't be magically enlightened and out of misery by being told it's the capitalist wheel turning.

Capitalist concern is human concern.


One is a problem of humanity, the other of capitalism.

Capitalism needs constant growth


Not even capitalism. Every economic system has pensions and healthcare costs rising with age, coupled with decreased productivity.


You mean decreased productivity of the elderly, but as a society we are getting more and more productive. We create more and more billionaires.

Productivity isn’t our problem, distribution is.


I see these as separate issues.

On a macro scale you want to see country-wide economic statistics go up, regardless of who the money ends up with. When your population's age distribution isn't even, you get spikes in productivity and in costs associated with the elderly, which makes the metrics go down. Combined with short-term politics that isn't incentivized to prepare for this, but rather to play hot potato with it, it makes for interesting situations. In the worst case, if the country is functioning paycheck to paycheck, you have every member of the workforce supporting multiple elderly people and children via taxes, since their taxes were already spent on X or stolen long ago during the productivity boom.


Capitalism needs private property and free markets. Everything beyond that is cultural


Capitalism does not need and has never had free markets, though some arguments for capitalism being ideal rest on the assumption of free markets, along with a stack of other idealized assumptions, like human behavior conforming to rational choice theory.


Then explain why capitalism made China its workbench.


China encourages exports and has no recent history of confiscating property owned by foreigners. Combined with cheap labor, this makes it a great place to set up sweatshops. If you're selling a good that can be made with low-cost labor but you use high-cost labor, you'll be outcompeted - the market won't buy your expensive products - so over time all the successful firms make their low-skill products in sweatshop zones.


OTECs are amazing, and step 1 of "The Millennial Project: Colonizing the galaxy in eight easy steps"[0]

[0]: https://en.wikipedia.org/wiki/The_Millennial_Project

There's a shore-based research OTEC in Hawaii, but the best is a floating, closed-loop OTEC in the ocean.


Interesting link. I would think step 7 would come before step 6 though. I thought about this for a few minutes and can't come up with a reason otherwise.


The timelines are increasing powers of 2. It’ll take much longer to colonize all asteroids than to settle Mars.


wiki article states "Up to 10,000 TWh/yr of power could be generated from OTEC without affecting the ocean's thermal structure". which converts to about 500GW which... isn't that much


10,000 TWh/yr is one third of the current total electricity generation of the whole planet - that's not a small amount.

Source, page 39 of the full report:

https://www.iea.org/reports/global-energy-review-2025/electr...


This can't be correct.

10,000 TWh/y = 1e+7 GWh/y; divide by 365.25 days/y to get a daily output of 27,379 GWh/day, then by 24 h/day to get an average power of 1,141 GW. It's still more than a terawatt, three orders of magnitude larger than the largest nuclear reactors.
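
For anyone who wants to double-check, the same arithmetic in Python (nothing beyond the unit conversion above):

    # Convert 10,000 TWh/yr of annual energy into an average power.
    twh_per_year = 10_000
    gwh_per_year = twh_per_year * 1_000         # 1 TWh = 1,000 GWh
    hours_per_year = 365.25 * 24                # ~8,766 hours
    average_gw = gwh_per_year / hours_per_year
    print(round(average_gw))                    # -> 1141, i.e. ~1.1 TW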


oops. yes. still not that much though. i mean it's a lot but it's "one more large industrialized country" a lot not "kardashev 2" a lot


Those goalposts of yours are on an FTL ship...


Kardashev 2 has a Dyson sphere. Of course anything on a single planet can never have that much.


This is neat, but it's not zettelkasten - it's building a browse-able knowledge DB from content.

Zettelkasten is about writing down your ideas in response to content, with a link to that content, and then linking to other ideas that you've already logged. It's not an extraction of ideas from that content. This is a common misunderstanding of zettelkasten.


> rampant monkeypatching that made the standard library hard to count on

That was very frustrating when doing plain Ruby development after using Rails - all the "built-ins" were actually patched into the stdlib by Rails and weren't available without it.


> What I want is a dispassionate discussion of how different language features impact code quality

This can be difficult because code quality, productivity, and safety are hard to objectively define and measure, so we always fall back on differences in interpretation and experience.


I would be interested in serious attempts to argue for this, even if they can't reasonably be backed up by data.

For example, I think there's a pretty strong argument that immutability makes it easier to write multithreaded code (perhaps at some performance cost), because it entirely prevents common types of bugs.
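
A minimal sketch of that immutability point, in Python with made-up names (just an illustration of the argument, not a benchmark or a real API):

    import threading
    from dataclasses import dataclass

    # A frozen (immutable) value shared across threads: no locks needed,
    # because no thread can modify it after construction.
    @dataclass(frozen=True)
    class Config:
        retries: int
        timeout_s: float

    config = Config(retries=3, timeout_s=2.5)

    def worker(name: str) -> None:
        # Every reader sees the same consistent snapshot; there's nothing to race on.
        print(name, config.retries, config.timeout_s)

    threads = [threading.Thread(target=worker, args=(f"t{i}",)) for i in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

To change the config you build a new Config and swap the reference, which is where the possible performance cost comes from: copies instead of in-place mutation.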

Similarly there's a good argument that making methods open to extension (like Kotlin or Julia) makes it easier for an ecosystem to adopt unified APIs without explicit coordination.

There's obviously a very strong argument that Garbage Collection prevents a lot of memory safety bugs, at costs to interoperability and performance.


Perforce’s binary support is basically equivalent to Git LFS; it does the same thing.

What does Perforce binary support have that Git LFS doesn’t?

AFAIK, the base issue is that Perforce is already in use and it has enterprise support.


I am the last person to ever promote Perforce, but as of last year-ish it has the option for binary delta transfer using FastCDC.

Even without that, it is just straight up a lot faster than Git LFS. I know this because I benchmark it against Git pretty frequently as I am creating my own large-file-capable VCS.


What do you mean by this? It's hardly equivalent to LFS. The binary files aren't replaced with a text pointer with actual content stored on a server elsewhere. Binary files are stored in the same place as text files.


From the user's perspective, when set up correctly Git LFS is transparent and they don't see the text pointers - the binary files are replaced on push and pull to the server.

It's the same user experience as Perforce?

Yes, Git is more low-level and it's possible to see those text pointers if you want to.
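
For anyone who hasn't seen it, this is roughly what's involved (an illustrative sketch, not a full setup guide - the hash and size below are placeholders). Running `git lfs track "*.psd"` adds a rule like this to .gitattributes:

    *.psd filter=lfs diff=lfs merge=lfs -text

and what actually gets committed in place of each tracked binary is a small pointer file along these lines:

    version https://git-lfs.github.com/spec/v1
    oid sha256:<64-character-hex-digest>
    size 12345

The real content lives on the LFS server, and Git's clean/smudge filters swap it in and out on commit and checkout, which is why it feels transparent once it's configured.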


This is what you want to believe, but it's not true.

I’m really sorry, but git lfs is an ugly hack, and it's always painful when you discover that some gamedev team has been forced into it by “better-knowing” software developers.

It reminds me a lot of “features” of software that are clearly a box-ticking exercise, like how technically MS Teams has a whiteboard feature. Yet it lacks any depth: it's not persistent so it's gone after the call, and it's clunky to use and to save.

… but technically the feature exists, so it's harder to argue for better software that's fit for purpose, like Miro or Mural.


Not a belief, but my experience. Maybe I've had a blessed experience with LFS? It's always "just worked" for me.


But I’d make a guess that the majority of the files you’re working on are text based.

If the primary filetype you use is binary, you’ll start to feel the jank.


1. Perforce checkout requests always do the round trip to the server.

2. Artists can actually understand Perforce.


> Perforce checkout requests always do the round trip to the server

That's literally the antithesis of Git. If that's a requirement, then yeah - Git's the wrong thing.

It's like complaining that bicycles don't have motors like motorcycles. If it had a motor, it wouldn't be a bicycle.


The question as I recall was what Perforce does that Git LFS doesn't, so I'm sorry to disappoint but my hands were tied.

Anyway, I dunno, man. If you want binary files to work, some form of per-file mutex is indeed a requirement. And for this to work well, without being a lot of hassle (and regarding that, see point 2, which I note has been accepted without comment - not that I expected anything else; the argument that Git is the artist-friendly choice would be a difficult one to make), any modification of the mutex's state has to involve a round trip to ensure the info is up to date. You can't rely on something local that only gets updated sometimes, because then the info can be out of date! Worst case, N people find out too late that they've all been making changes simultaneously, and now N-1 of them will almost certainly lose work.

(You might be inclined to moan at people for not going through the full process, but: we have computers now! They can do the full process for us!)


"If productivity can be measured by throughput then it shouldn't be done by humans."

I forget the author, or the exact quote, but basically this. Brainless jobs should be automated; nobody should be an automaton.

This doesn't mean we give up on craftsmanship, but mass production and busy work should be eliminated from human roles.


I agree but the problem is there aren't enough other jobs/ways for people to survive in our current system.


There's also the simpler/smaller git-worktree-switcher (`wt`):

https://github.com/yankeexe/git-worktree-switcher

It does what it says on the tin.


I mean, the same thing would happen if Bash stopped writing to `~/.bash_history` and its last item was `rm`, right?


when was the last time this happened to you?

