This may be seen as satire, but webdev as a phenomenon really is a collection of all sorts of poor reinventions and crutches, for which there are naturally billions of lines of code, exactly the kind of corpus AI loves. It will replace webdevs as-is. The downside is that we'll never take a step forward from there (not that webdevs planned to anyway).
In fact, web developers are the first that AI will completely replace, since it is best trained on the JavaScript and TypeScript ecosystem(s). They did this to themselves, through the stack's sheer popularity and the lowest barrier to entry in the industry.
It will take time for the other sectors, but web developers are the first to be completely displaced.
For some, mccoffee is a solution. So it does depend on expectations.
When there's no budget for hiring a human professional, then any AI output seemingly resembling what is needed will pass for a solution. After all, how would they properly verify it?
Followed by an AI model executing through JavaScript, followed by one executing through WASM, followed by an AI model running natively in the browser, which will be sold as the 'lean' version that no longer needs JavaScript. You'll just have an AI model in the browser interpret data sent by the AI-driven server: 'content' tailor-generated for the user.
Also imagine the compression factor made possible by such a system. Instead of sending a few dozen megabytes over the line, the AI server will tell the AI renderer to:
X-PAGE-PROMPT = generate a page showing a number of cats engaged in play with a fluffy toy, make sure to have product placement for products $A, $B and $C, have a catalogue for the furniture used in the scene lie open on the table and make sure that the brand and vendor names are clearly visible
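A minimal sketch of what that "prompt instead of payload" exchange could look like. Everything here is hypothetical: the `X-PAGE-PROMPT` header comes from the comment above, while `build_response` and `render_locally` are invented stand-ins for the AI server and the in-browser AI renderer.

```python
# Hypothetical sketch: the server ships ~200 bytes of prompt instead of
# megabytes of markup; a client-side "AI renderer" expands it locally.

PROMPT_HEADER = "X-PAGE-PROMPT"

def build_response(user_id: str) -> dict:
    """Server side: emit a generation prompt rather than a rendered page."""
    prompt = (
        "generate a page showing cats playing with a fluffy toy; "
        f"include product placement tailored to user {user_id}"
    )
    return {"status": 200, "headers": {PROMPT_HEADER: prompt}, "body": ""}

def render_locally(response: dict) -> str:
    """Client side: stand-in for the in-browser model expanding the prompt.
    A real AI renderer would generate a full page; this stub just echoes it."""
    prompt = response["headers"].get(PROMPT_HEADER, "")
    return f"<html><body><!-- rendered from prompt: {prompt} --></body></html>"

resp = build_response("user-42")
page = render_locally(resp)
```

The "compression" is real in the sense that only the prompt crosses the wire; the cost is that every client needs an expensive model just to see a page.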
GPT-4o-mini: I asked it to refactor a field on my class into a property with a backing field that trims input. It did that, but it also lost my Email property in the process. When I asked about it, it apologized and gave the property back. Same with GPT-4o.
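For context, the refactor being described is roughly the following. This is a Python sketch (the original was presumably C#, given the property/backing-field phrasing), and the `Contact` class, its field names, and values are all invented for illustration:

```python
class Contact:
    """Sketch of the requested refactor: name becomes a property backed by
    a private field that trims whitespace on write. The point of the
    anecdote is that the email attribute must survive the refactor."""

    def __init__(self, name: str, email: str):
        self.name = name    # goes through the trimming setter below
        self.email = email  # the attribute the model dropped

    @property
    def name(self) -> str:
        return self._name

    @name.setter
    def name(self, value: str) -> None:
        self._name = value.strip()  # backing field, trimmed on every write

c = Contact("  Ada  ", "ada@example.com")
```

The refactor itself is mechanical; the failure mode was silently deleting an unrelated member, which is exactly the kind of regression a quick diff review catches.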
Losing a property seems like a small price to pay for the work it saved you. Yes, you need to do code review and that takes time, but in my experience the scales tip firmly in favor of using AI. Also worth remembering: we are still in the early stages of AI, and even three years from now things don't look good for devs.
Yes! Although not really funny (if you want to use ChatGPT to do some real work), it's kinda cute how it always apologizes and says sorry if you remark that it lost something you were sure you included in your prompt. :)
Personally, I would love all my personal interactions with people to go like that. Life would be so much easier!
They're obnoxiously obsequious, though. They remind me of a customer service worker dialed up to 11. I'd rather be talked to straight, without all the "You're absolutely right!" and "Oh, I'm sorry!" nonsense.
Have you tried o3-mini? I'm finding it to be significantly better than GPT-4o for code - I'm even finding myself using it in place of Claude 3.5 Sonnet a lot of the time.
What kind of code are you writing, if I may ask; and how long have you been doing it?
I sometimes ask whatever AI open-ended questions on subjects I know very little about, and sometimes I find some kind of starting point for less random research using better tools.
Beyond that, and whatever fancy autocomplete my IDE forces down my throat, nope.
I've been leaning hard on LLMs to help me write code for just over two years now. I mainly work in Python and JavaScript, which are both languages that LLMs are extremely good at, but I also use them for languages I hardly know at all like jq and Bash and AppleScript and Go.
What kind of code is the LLM writing for you that you see as such a big win?
Sorry about the interrogation, but I'm curious to understand this disconnect between us. To me, for my way of working, LLMs are obviously mostly useless, even destructive to the process.
A more interesting view: will the web change again? Will everything become just a thin veneer in front of AI? Why develop anything but a frontend for AI, and then do everything with AI...
Maybe the idea of using AI to develop anything will die, and instead it will just be about setting up the AI that everyone then has to use...
Code has externally verifiable boundaries - tests pass, features work, requirements are met. These are measurable, binary states.
But an artist's boundaries are internal and subjective. They're set by:
- Emotional satisfaction with the work
- Achievement of their vision
- Cultural/personal context
- Intuitive sense of "rightness"
This is why AI can more readily determine when code is "done" - it matches against explicit criteria. But AI struggles with artistic completion because it requires an internal, subjective experience of satisfaction that AI can't access.
The artist knows to stop when the work resonates with their intended emotional impact. Code is done when it works as specified. One is bounded by feeling, the other by function.
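The "externally verifiable boundary" point can be made concrete: done-ness for code can literally be a predicate you evaluate, while there is no equivalent assertion for "the work resonates." A toy illustration, with an invented function and spec:

```python
def slugify(title: str) -> str:
    """Toy feature with an explicit spec: trim, lowercase, spaces to hyphens."""
    return title.strip().lower().replace(" ", "-")

def is_done() -> bool:
    """The 'done' boundary for code is a binary, externally checkable state:
    either every check against the spec passes, or it doesn't."""
    checks = [
        slugify("Hello World") == "hello-world",
        slugify("  AI and Art ") == "ai-and-art",
    ]
    return all(checks)
```

There is no analogous `is_emotionally_resonant()` one could write for a painting; that boundary lives inside the artist.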
It’s hard to know how this will affect jobs but the argument that current day capabilities will be relevant in the near future doesn’t make much sense to me.
I think it would be unexpected if within 2 years we don’t have AI systems that have excellent taste and judgement.
It doesn't need to completely replace software devs. Rather, AI will make a $25k/year software dev as good as a $250k/year software dev, and the $25k one won't even need to speak English. The real question is: will AI enable the now-jobless $250k software dev to do something so valuable that US companies will need his services at that price?
IMO, if the company knows how to leverage them, yes. They’ll be the ones bringing online fleets of “junior dev” software agents, connecting them to telemetry and evaluation systems, maintaining subsystems in various states of modernity, interfacing between stakeholders and systems, and being a responsible party for the bosses. A lot of that isn’t too different from the role today, except they’ll be more impactful, produce more value, and have more time for the deep reflection that generates insight and invention.
While I wouldn't go as far as $25K vs. $250K, the main difference between the $125K developer and the $250K developer is that the latter was willing to memorize enough LeetCode to get into BigTech.
AI will not take anyone's jobs. Executives looking to juice the value of their stock portfolio and cut expenses to create a better revenue story for the companies they run will fire people and attempt to smear "AI" all over their operations.
It's going to look similar to the trend of off-shoring developer jobs to the lowest bidder overseas. I made my career cleaning up the webs of technical debt that many companies accrued.
LLM code assistants in the hands of a skilled developer are a productivity multiplier. In the hands of a novice, they bloom technical debt like a cancer. Expect rapid cost cutting, followed in a year or two by panicked recruiting of expensive contractors to fix the unmaintainable mess, making the whole endeavor a loss (which looked good enough on the books, for long enough, to earn raises and bonuses for managers).
I've been following developments since OpenAI's 2020 announcement with exactly this in mind.
My intent is to understand
- how these tools are being hyped for programming
- the kinds of things inexperienced or poor programmers are using the tools for
- the errors and failure modes typically resulting from naive usage or misuse
- mitigation strategies derived from well-known techniques of dealing with legacy systems, applied to the particulars of AI-generated code and systems.
AI needs the web to survive and thrive: it is a leech on the world's knowledge, which is where it gained the majority of what it knows, and it will continue to feed there.
The next big platform winner, I think, will be an AI phone: one that embraces the web over app stores. Its web browser will be where all the games and everything else get built, even if we no longer open our browsers the way we do now.
AI isn't taking American jobs. Foreign developer agencies utilizing AI and being paid a fraction of what American employees are being paid are taking American jobs.
Why do you assume that this is about outsourcing? So far, every international company I've worked for in the last decade has insourced dev, but moved it out of the USA, then laid off the American devs. That is not about outsourcing at all, it is about responding to cost discrepancies in the global market.
But by the time it can replace (not merely assist) senior engineers, anyone who has a job in front of a computer, about 80% of workers, will have their job automated by AI, and we'll need an alternative to capitalism (or big changes to it). Otherwise, this particular economic system will experience massive collapse.
It's missing one critical point: if AI makes a good developer 10% more productive, the industry needs roughly 10% fewer developers to achieve the same output.
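Strictly, the arithmetic is slightly gentler than 10%: if each developer does 1.1x the work, holding output constant requires 1/1.1 of the headcount, about 9% fewer developers. The worked numbers (assuming total demand for software stays fixed, which is exactly what the replies dispute):

```python
# Headcount needed for constant output when per-dev productivity rises 10%.
old_devs = 100
productivity_gain = 0.10  # each dev now does 1.1x the work

new_devs = old_devs / (1 + productivity_gain)  # ~90.9 devs for the same output
reduction = 1 - new_devs / old_devs            # ~9.1% fewer, not a full 10%
```

The distinction hardly matters for the argument, but the fixed-demand assumption is doing all the work.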
You're not getting replaced by AI. You're getting replaced by a coworker using AI. It doesn't matter how poorly AI performs at any remotely complicated task, what matters is how many developer-hours it saves by not having to manually write as much boilerplate.
My company has had a backlog thousands of developer-hours long for years. Maybe eventually somebody will get fired because we're not needed, but the LLM would have to actually 10x our productivity for us to get through the backlog in a year.
No one knows what will bring in revenue. It’s all just a guess and at most bigcos there are far more people who can create tickets than there are people who can close them.
Every PM I’ve ever worked with had 10x more things they wanted to add than they had devs to add them.
Alternatively: if AI makes a good developer 2x more productive (which I do think is possible based on my own experience), developers are now 2x more valuable to employers. The cost of building software drops in half, which means employers that previously wouldn't have developed their own software (too expensive) are now in the market for custom solutions. Demand for developers goes up.
If you believe that making developers 10% more productive results in a need for ~10% fewer developers, why didn't open source software over the past ~20 years harm our profession?
Working with open source packages from npm and PyPI has given me WAY more than a 10% boost - so much stuff I don't have to write from scratch now!
Dunno man, the times I’ve used AI to help me code, it got the solution wrong. I spent more time backtracking and fixing things than if I’d done it by hand in the first place. So it actually made me 20% less productive.
So far, more efficient development tools have resulted in more tech companies and products being created, resulting in an increase in the net total number of developers needed. The idea that 10% more efficient devs would mean fewer devs needed assumes we're living on a flat line of productivity and invention, which is simply not true.
> You're not getting replaced by AI. You're getting replaced by a coworker using AI.
How long until people realise AI cannot take responsibility? Sure, that coworker can do 10% more coding. But can they handle 10% increase in mental workload to juggle all the hard problems? What about 20%? At what point do people start cracking?
But for web developers? Most certainly yes. They will be the first, especially those who love redoing their web apps with the hundredth web framework, releasing web-app clones, and debating the sea of competing JavaScript libraries.
It is best trained on the entire JavaScript and TypeScript ecosystem, and those specializing in web development, the low-hanging fruit, will be easily replaced.
It's perfect for replacing web developers.
It can make mobile apps, too.