
Some random team or engineer does it to get a promo.

I think you can only run it on Google Cloud, not AWS, bare metal, Azure, etc.


It’s an insurance company so basically pensions.


Right, insurance companies are the new "financial dark matter". The next financial crisis will probably be triggered when a few large life and property insurers fail because they purchased debt assets which were highly rated but turn out to be junk. (Medical and auto insurers aren't exposed here because they operate on much shorter timeframes.)


It's more disturbing than you think:

America's retirement savings are in Bermuda entities that lose US protections while making opaque, complex bets: https://www.bloomberg.com/graphics/2025-america-insurance-pa... https://archive.ph/lhZv9


> It’s an insurance company

What is?


Would that not be a slippery slope? Tax foreigners more if on an H-1B, but not foreigners on other work visas.


I’m fine with taxing foreigners more if they’re on other visas too.


I know people who've taken money through these routes. The biggest surprise is the paperwork; since it's public money, everything must be fully transparent, and the government needs to justify why funds went to a specific person / entity.

In contrast, private investors have more discretion and fewer stakeholders to answer to.


Better off getting a pro contract with them I guess


Mostly projects / products that start off with very low usage, so Python is perfectly fine (why over-optimize?). And then they become useful - and a rewrite isn't worth it.


How is that Python’s problem though? If you paint yourself into a corner by choosing the wrong language then eat the rewrite or eat the hardware costs.

There was a consensus after the Python 3 debacle of “No major breaking changes”. We seem to have lost that because of moneyed interests and that’s sad.


That's such a strangely distorted worldview. If a car company released, idk, a trunk extension in response to customer feedback, would you go "How is this Ford's problem? If you didn't think about the trunk size, eat the loss and buy a new car"? Python developers want Python to remain useful to the developers who want to keep using the language. It's not an incomprehensible motivation.


If I attach something to my car it doesn’t affect your car.

Who does no-GIL benefit? For the majority who use Python for single threaded code, no-GIL will make their code slower because a thread-safe interpreter is always going to be slower than one that can assume ST operation.

For the minority who want parallelism, there are two other options: OS processes and subinterpreters. If you can use either of these then you will get better performance with a GIL for the same reason.
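
For anyone unfamiliar, the OS-process route is only a few lines with the stdlib. A minimal sketch (the workload function and numbers are made up):

    # Each worker is a separate OS process with its own interpreter
    # and its own GIL, so CPU-bound work runs in parallel.
    from multiprocessing import Pool

    def cpu_bound(n):
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        with Pool() as pool:  # defaults to one worker per CPU core
            results = pool.map(cpu_bound, [10_000_000] * 8)
        print(sum(results))

The cost is that arguments and results get pickled across process boundaries, which is exactly where the "can't use OS processes" cases live: workloads that need cheap shared mutable state.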

So no-GIL will only be faster for a minority of a minority who want parallelism but can’t use OS processes or subinterpreters.

Meanwhile, everybody else writing libraries has to make sure their code is no-GIL safe, to support this tiny minority, and if no-GIL ever becomes the default then everybody else has to do something to turn it off.

It’s such a stupid idea.


> If I attach something to my car it doesn’t affect your car.

Yes, that was my point.

> For the majority who use Python for single threaded code, no-GIL will make their code slower because a thread-safe interpreter is always going to be slower than one that can assume ST operation.

I'm almost sure the Python developers said that they will compensate for the slowdown with other optimizations, so that you'd never have single-threaded performance degradation version-to-version.

> So no-GIL will only be faster for a minority of a minority who want parallelism but can’t use OS processes or subinterpreters.

One would hope that it 1. opens new use cases for Python, thus attracting developers who would otherwise not have given the language consideration, and 2. lets other users benefit from new optimizations that could be implemented further down the dependency stack.

Of course there's no guarantee that that will materialize, but the idea that adding support for an established, lightweight, and well-supported concurrency primitive is so obviously a "stupid idea" shows me that your (rudely expressed) opinion is entirely self-centered and nearsighted.

I might add that the move from Python 2 to 3 was incredibly painful, but I assume most agree (with the benefit of hindsight) that it was entirely correct.


> I'm almost sure the Python developers said that they will compensate for the slowdown with other optimizations

Those optimisations are not there to compensate for anything; they will improve the performance of single-threaded code with or without the GIL.


That's what I meant to express, yes: that those non-GIL-related optimizations would soften the blow of any slowdown from the GIL-removal project.


Maybe, but making the GIL optional (rather than removing it completely) solves both problems.


It however creates another, arguably bigger, problem: it fragments the ecosystem.


You can then rewrite performance-critical bits in a fast language. Much Python usage is a result of people making use of this.
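
A toy sketch of the pattern, using only the stdlib's ctypes to call into compiled C (library lookup varies by platform; in practice you'd more likely reach for numpy, Cython, or a C/Rust extension):

    import ctypes
    import ctypes.util

    # The hot code lives in compiled C (here the C math library);
    # Python just orchestrates.
    libm = ctypes.CDLL(ctypes.util.find_library("m"))
    libm.cos.restype = ctypes.c_double
    libm.cos.argtypes = [ctypes.c_double]

    print(libm.cos(0.0))  # 1.0, computed outside the interpreter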


When you say parse - do you mean for prior art or to generate ideas?


I think by parse they mean more like document understanding.


Pin this?


The claim that AI products will never work might be valid. But NVIDIA don't need to care about that at the moment. What NVIDIA care about is the likes of AWS who are investing heavily into GPUs so that AWS customers can determine that maybe they don't need some model with tens of billions of parameters to make a search engine for an e-commerce product that has 100 listings.


> But NVIDIA don't need to care about that at the moment

But NVIDIA investors do. NVIDIA's current valuation relies on GPUs continuing to sell at premium prices, with ever-increasing demand, for decades to come; otherwise they won't come close to making the trillions in total future profits that their current market cap demands.


Couldn't agree more with you. I can't invest at Nvidia's current pricing because it relies on exceptional execution and macro conditions that continue to drive demand, with either no competitors eating up market share OR technology not changing on them.

It just seems like the current valuation is upheld by excellent performance and demand outstripping supply at the current moment. Throw in everyone being blown away by LLM performance, plus lots of hype, and people choosing the one company to put their bet on.


Premium prices are the real sticking point for me. $100 billion is not a lot to spend to capture, say, 30% of that market - which is $1 trillion of market cap. And this can happen in 3, 5, or 10 years... So there will be competition.

And on the other hand, the question can also be total cost of ownership. If you have a slower and more energy-inefficient competing product, but the cost over its lifetime is only, say, 75%, that is not that bad a deal...


If gen AI companies and gen AI initiatives within major players start shutting down, there might be a glut of graphics cards flooding the market, though the supply might fuel another crypto bubble, so NVIDIA may not be in trouble after all thanks to market irrationality.

And I squarely blame business majors for their blind faith in assets whose price goes up with no explanation; they are the ones who always push the hardest for adoption of these fads with no understanding of the actual value behind them.


The blame may not lie in businesses having blind faith in the potential value of LLMs; it may be an issue with incentives. It seems like every major tech player leaned fully into LLMs, and valuations ballooned because of the hype. There would need to be concerns over an even worse crash if/when LLMs don't match the hype; otherwise they're incentivized to take the cash up front and handle a smaller hit later.

Worse, those tech companies have been the only thing propping up the stock market, and arguably the economy as a whole. Collectively they may very well fall into the too-big-to-fail category, whether because the feds can't allow them to lose all that value or because investors can't afford to properly value the companies when the hype dies.


I think the blame is pretty clearly on the big tech founders, who have so much money that their mentality is "a billion dollars doesn't matter; all that matters is that my company isn't left behind." The founders' incentives aren't aligned with Wall Street's, so they have their companies waste all this money on GPUs for the <10% chance that they'll be able to create a product, since failing doesn't cost them anything.


I'm actually not sold that big tech founders' problem is their views on money or being left behind. The underlying concern for both is a fear of losing power and control.

I do agree that part of the massive investment in GPUs and LLMs at the biggest tech companies is fear of missing out. I also think a decent part of it is a lack of concern over real downsides. What will really happen when all the tech companies have to write off all those investments? They either get what they deserve, something like the dotcom bubble, or they all see a short-lived decline in stocks that disappears quickly when they find a new hype train to jump on.


I hope the glut will spur a bunch of research that is currently not possible outside large labs.


A glut still costs $$$$ to run; electricity isn't cheap at that scale.


It's a different order of magnitude, though. Buying an H200 GPU costs upward of $30,000. The electricity cost to run it at full power for a year would only be about $675 for me (700 W TDP, 11 cents per kilowatt-hour at my house).
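
Back-of-the-envelope, if anyone wants to plug in their own rate (a quick sketch with the numbers above):

    # Yearly electricity cost at full power, 24/7
    tdp_kw = 0.7                  # 700 W TDP
    hours = 24 * 365              # 8,760 hours per year
    rate = 0.11                   # $/kWh
    print(tdp_kw * hours * rate)  # ~674.5 -> about $675/year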


Everyone having their own private multimodal assistant on the cheap will be pretty futuristic if we can make it happen


I feel they are selling to the same businesses who were convinced they needed BigData to manage their 100-row database and CloudComputing to manage their 100-page static web site.


True - and the 'enterprise AI' vendors are the Clouderas of the modern day.


Eventually the Ouroboros reaches the end.

https://en.wikipedia.org/wiki/Ouroboros


> But NVIDIA don't need to care about that at the moment.

What? How does a company not need to care whether its customers are sitting on a financial bubble and might not exist anymore in a few years? Why would Amazon continue to massively invest in GPUs if they don't have customers for them?

This argument makes literally no sense.


> The claim that AI products will never work might be valid.

Doesn't look very valid from my perspective, having used ChatGPT almost every day for more than a year.


ChatGPT isn't a multi-trillion-dollar product, though. What they mean is that AI products today don't work well enough to warrant the trillions of dollars in market cap that you see in NVIDIA etc.


ChatGPT is a tech demo for the purpose of pitching a product market. The chat UI is not the cash cow.


It comes down to how many of these offerings can deliver value worth $X/month to fund the multi-billion-dollar ventures behind them - whether companies find they are willing to pay that $X per seat for it.

There must be at least 6 major players among OpenAI, Microsoft, Google, Meta, Apple, Stability, and more that I haven't kept track of. NVIDIA has no way of knowing how viable all these businesses are in the medium term; they just know how many GPUs those businesses are buying while investment capital is flowing to them.


You are correct, ChatGPT is a great product. But if that's the only 'AI' product, or there's only a handful, those companies will focus on replacing NVIDIA ASAP. NVIDIA benefits from having a broad set of customers.

