The fact everyone thinks we are in an AI bubble is practically proof we are not in an AI bubble.
The crowd is always wrong on these things. Just like everyone "knew" we were going into a deep recession sometime in late 2022, early 2023. The crowd has an incredibly short memory too.
What it means is that people are really cautious about AI. That is not the self-reinforcing, fear-of-missing-out, explosive process of a bubble. That is a classic bull market climbing a wall of worry.
Technical ICs actually trialing the AI tools think we're in a bubble. Executives, boards, directors, and managers are still tumbling head over heels down the mountain in a race to shovel more money into the fire, because their engineering orgs are not delivering results and they are desperate to find a solution.
This. Highly competent technical ICs in my circles continue to (metaphorically) scream at juniors who submit AI slop and can't describe what it's doing, why it's doing it that way, or how they could optimize it further, since all management cares about is "that it works".
Current models excel because of the corpus of the open internet they were built on (stolen, arguably). New languages aren't likely to see results as consistent as old ones, simply because these pattern matchers are trained on past history and not new information (see Rust vs C). I think the fact that nobody's minting billions turning LLMs into trading bots should be pretty telling in that regard, since finance is a blend of relying on old data for models and intuiting new patterns from fresh data - in other words, it directly targets the weak point of LLMs specifically: their inability to adapt to real-time data streams over the long haul.
AI's not going away, and I don't think even the doomiest of AI doomers is claiming or hoping for that. Rather, we're at a crossroads like you say: stakeholders want more money and higher returns (which AI promises), while the people doing the actual work are trying to highlight that internal strife and politics are the holdups, not a lack of brute-force AI. Meanwhile both sides are trying to rattle the proverbial prison bars over the threats to employment real AI will pose (and the threats current LLMs pose to society writ large), but the booster side's actions (e.g., donating to far-right candidates that oppose the very social reforms AI CEOs claim are needed) betray their real motives: more money, fewer workers, more power.
> AI's not going away, and I don't think even the doomiest of AI doomers is claiming or hoping for that.
Is this the consensus on nomenclature? I thought "AI doomers" meant people who think some dystopia will come out of it. In that case I've been reading a lot of text wrong.
At this point, my perspective is that the bubble talk has effectively boiled viewpoints into booster or doomer camps based solely on one's buy-in to the argument that these companies have created actual intelligence that can wholesale replace humans. There doesn't seem to be much room for nuance at the moment, as the proverbial battle lines have been drawn by the loudest voices on either side.
Not really. Worked through the dotcom bubble. It was obvious to some people on the ground doing the work. It was obvious to some execs who took advantage of it. Feels similar. Especially if you are burning through tokens on Gemini CLI and Claude Code where the spend doesn’t match the outcomes.
I saw someone earnestly say that a business model with potential to generate actual revenue was no longer relevant, and companies need only generate enough excitement to draw investors to be successful because “the rules have changed.” At that moment, I saw that telltale soapy iridescent sheen. I’ve heard that before.
I'm worried that the US knowledge industries jumped the shark in the teens and have been living off hopeful investors assuming the next equivalent of the SaaS revolution is right around the corner. AI, for whatever reason, just won't change things that much; or if it does, the US tech industry will fumble it, assuming their resources and reputations will insulate them from the competition, just like the tech giants of the 90s versus Internet startups. If that's true, some industries like biotech will still do fine, but the trajectory of the tech sector, generally, will start looking like that of the manufacturing sector in the 90s.
There is absolutely FOMO. It's even being deliberately stoked. "AI won't take your job. People using AI will." This is this hype cycle's "have fun being poor."
We must live in different realities because most design has almost no creativity or originality at all. "Good design" means the website/design looks exactly the same as everything else.
We have the tools to do anything imaginable with film and video but the top box office films right now in the US are all completely derivative, non-creative human slop.
"Good design" is so trivial to do with generative AI.
We hardly live in 1910 Paris with all the cool people drinking absinthe in between cranking out all these artistic masterpieces.