I fear the title of this article is going to drive most of the conversation.
I haven’t read through the whole thing yet, but so far the parts of the argument I can pull out are about how institutions actually work, as in collections of humans. AI, as it currently stands, interacts with those humans in ways that hollow out the kind of behavior we want from institutions.
> Perhaps if human nature were a little less vulnerable to the siren’s call of shortcuts, then AI could achieve the potential its creators envisioned for it. But that is not the world we live in. Short-term political and financial incentives amplify the worst aspects of AI systems, including domination of human will, abrogation of accountability, delegation of responsibility, and obfuscation of knowledge and control.
An analogy that I find increasingly useful is that of someone using a forklift to lift weights at the gym. There is an observable tendency, when using LLMs, to cede agency entirely to the machine.
> Perhaps if human nature were a little less vulnerable to the siren’s call of shortcuts, then AI could achieve the potential its creators envisioned for it.
I can't see how, given that the potential is 99% shortcuts.
Come now. You mean the highly regulated, more competitive world of law? That too, as it is practiced in America? The once capital of economic competition?
That “cartel”?
Vs the leaders of an industry that built their tools through insane amounts of copyright infringement, and who forced the coining of “enshittification” to describe their all-pervasive business strategies?
The same industry which employs acqui-hires to cull competition?
The industry where you can be a paralegal for 20 years but aren't allowed to even attempt the bar exam, because you haven't paid your $250K and 3 years of lost earnings to get your degree from the lawyers' cartel? That "competitive" industry?
The first weakness of your claim is that it is inherently an elite one.
You read the works of the cognitive elite when they support AI. When most people sing its praises, it’s from the highest echelons of the white-collar work priesthood.
AI is fundamentally a tool of the cognitively trained, and shows its greatest capability in the hands of those able to assess its output as accurate at a glance. The more complex the realm, the deeper the expertise needed to find value in it.
Secondly, linguists are not the sole group voicing concerns about these tools. I’ve seen rando streamers and normal folk in WhatsApp groups, completely disconnected from the AI elite, hating what is being wrought. Students and young adults outright wonder whether they will have any worthwhile economic future.
Perhaps it is not a “movement”, but there is an all-pervasive fear and concern in the population when it comes to AI.
Finally, your position is eerily similar to the dismissal of concerns from mid-level and factory-floor workers in the 80s and 90s. That was forgivable, given the then-prevalent belief that people would be retrained and reabsorbed into equivalently sustaining roles in other new industries.
The marketing hype is economy-defining at this point, so calling it overblown is an understatement.
Simplifying the hype into two threads: the first is that AI is an existential risk, and the second is the promise of “reliable intelligence”.
The second is the bugbear, and the analogy I use is factories and assembly lines vs power tools.
LLMs are power tools. They are being hyped as factories of thoughts.
String the right tool calls, agents, and code together and you have an assembly line that manufactures research reports, gives advice, or does whatever white-collar work you need. No holidays, HR, work hours, overhead, etc.
I personally want everyone who can see why this second analogy does not work, to do their part in disabusing people of this notion.
LLMs are power tools, and impressive ones at that. In the right hands, they can do much. Power tools are wildly useful.
But power tools do not automatically make someone a carpenter. They don’t ensure you’ve built a house to spec. Nor is a planer going to evolve into a robot.
The hype needs to be taken to task, preferably clinically, so that we know what we are working with, and can use them effectively.
You are conflating two things here - business models and sustainable operations.
Even NGOs can be said to have "business models" in the sense the term was being used here. The operation doesn't have to be profitable, but it has to at least cover operational costs.
Reporters have to eat, and bills have to be paid; it's not free. That money has to come from somewhere.
And we are only talking about the production of news copy.
The production of good quality local journalism is itself in the service of a more informed polity and information economy. An information economy that is currently using every trick in the book to suck attention out of the polity.
So you will need even more money to ensure you can compete effectively at scale.
Someone needs to pay for this, ideally in a self-sustaining manner that allows local news agencies to remain independent.
> You are conflating two things here - business models and sustainable operations.
> Even NGOs can be said to have "business models"
The narrative force is strong here. Let me set you free: a public service doesn't need a business model. It doesn't do business. Anyone dealing with a budget isn't automatically a business.
The principle of a public service is that it focuses on its service, given its budget constraints. That is completely different from a business; they don't have a model in common.
> Someone needs to pay for this, and ideally it would be a self sustaining manner
Yeah, you end up with a niche, too small to function as the Fourth Estate. These things exist already. Your average citizen isn't going to pay for it. You are basically proposing Fox News; that is the consequence. The whole of society needs to be informed.
Government funding allows public services to be independent; this is a matter of judicial oversight. As for "but government bad, market good": it will take a generation of detoxing from the cultural memes and sponsored narratives to reverse decades of cultural programming.
I'm not trying to beat you over the head with a dictionary, and I can understand when "business models" is used in places where it is not strictly accurate. I wouldn't personally say the Army or the Government (providers of public goods) have a business model, but in the discussion of news agencies there is enough overlap and history for the term to still hold its meaning.
Even then, using your definition does not help us escape the point: there needs to be a source of funds for local news. I am perfectly fine with government money being used to pay for it.
Ok! I want to steer people away from the historic model, because that has been a problem and weak point since its inception. I am happy you are open to that.
This position is suitable for the 1990s. Even then, the BBC showed that public journalism != propaganda.
In fact, the evidence is that if you build institutions, you can actually have very effective public options.
However, in the current era, news is simply being outcompeted for revenue. Even the NYT is dependent on games for relevance.
And the attack vectors to mould and muzzle public understanding have changed. Instead of a steady drip of controlled information, it is private production of overwhelming amounts of content.
Most good people are fighting yesterday's war with yesterday's weapons, tactics, and ideas when it comes to speech.
The real reporting now comes from individual creators, often with a GoPro or cellphone camera and a YouTube/TikTok channel.
It's cheap to make and doesn't require state/institutional funding. It's also quite hard to buy out all the creators, and thus at least slightly resilient against the usual attack vectors.
Currently the most successful method of assaulting the "marketplace of ideas" is overwhelming channels with content. Most of our guardrails and fears were built around government overreach, not the attrition of attention via the production of overwhelming amounts of content.
As a result, more competition (more speech) has been defanged as a solution.
Producing local news is never going to be more interesting and attention-grabbing, and thus revenue-generating, than pure dopamine stimulation.
To keep local news alive, it needs money.
A public news option may seem sub-ideal, but it is on the table because the other avenues have been destroyed. Hell, even news itself is losing: the NYT is now dependent on video game revenue to keep itself afloat.
The common ground of the earlier information ecosystem was a result of chance. New factors are at play, and if we want a common ground to survive, then we need to address the revenue issue somehow.
Yep. There is some network-effect nonsense that comes into play with news. Only stuff which plays at the largest, broadest, most simplified level survives.