I think a lot of people have a fear of AI coding because they're worried that we will move from a world where nobody understands how the whole system works, to a world where nobody knows how any of it works.
This comment on the article sums it up for me, at least in part:
“Nobody knows how the whole system works, but at least everybody should know the part of the system they are contributing to.
Being an engineer, I am used to being an expert in the very layer of the stack I work on, knowing something of the adjacent layers, and mostly ignoring how the rest works.
Now that LLMs write my very code, what is the part that I’m supposed to master? I think the table is still shifting and everybody is failing to grasp where it will stabilize. Analogies with past shifts aren’t helping either.”
While that may very well be true, it's a valid reply to the GP who made this claim, not to my comment explaining to the parent why their argument was logically flawed.
Just because you disagree with me doesn’t mean my argument is “logically flawed.” And as the other commenter said, I never said AI had no value. I have used various AI tools for probably 4 years now.
If you’re going to talk to and about people in such a condescending way, then you should at least ask clarifying questions before jumping to the starkest, least charitable interpretation of their point.
I've built a SaaS (with paying customers) in a month that would easily have taken me 6 months to build at this level of quality and features. AI wrote, I'd say, 99.9% of the code. Without AI I wouldn't even have attempted it, because it would have been too large a task.
In addition, for my older product, which is 5+ years old, AI now writes 95%+ of the code for me. The programming itself now takes a small percentage of my time, freeing me up for other tasks.
Quality is better both from a user and a code perspective.
From a user perspective, I often implement a feature and then just throw it away without worry, because I can reimplement it in an hour based on my findings. No sunk cost. I can also implement very small details that I'd otherwise have to backlog. This leads to a higher quality product for the user.
From a code standpoint, I frequently do large refactors that would never have been worth it by hand. I have a level of test coverage that would be infeasible for a one-man show.
It's boring glorified CRUD for SMBs of a certain industry focused on compliance and workflows specific to my country. Think your typical inventory, ticketing, CRM + industry specific features.
Boring stuff from a programming standpoint but stuff that helps businesses so they pay for it.
Not to mention how hard it is to actually get what you want out of it. The image might be pretty, and kinda sorta what you asked for. But if you need something specific, trying to get AI to generate it is like pulling teeth.
> The reason I'm not in favor of natural language programming has nothing to do with its probabilistic nature and everything to do with its lack of precision
Yeah, even if they're made to be 100% deterministic, you've now got a programming language whose rules are deterministic, but hard to understand. You've effectively pinned the meaning of the natural language in some way, but not a way that anyone can effectively learn, and one that doesn't necessarily match their understanding of the actual natural language.
And it's weird that this even needs to be argued, given that long explanations are needed just to convey fairly simple concepts. Not to mention that it still relies upon correct interpretation.
The result of natural language programming is either an extremely limited programming language or an extremely verbose one (again, look at law). Presumably it'll result in both.
It's a nice idea, but it ignores the reason we invented symbolic languages in the first place. They were invented after natural language; it's not like code is some vestigial language raiment. We're trying to replace it because it's hard and annoying, but I'm certain that's mainly due to the level of abstraction we're trying to work at, more than the language we're using.
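To make the precision point concrete, here's a toy sketch (my own hypothetical example, in Python only because it's short) of how much a three-word spec like "sort the users by name" leaves undecided:

    # Hypothetical example: "sort the users by name" sounds unambiguous in English,
    # but code has to commit to one of several defensible readings.
    users = ["Ärna", "alice", "Bob", "alice "]

    # Reading 1: plain lexicographic sort. Uppercase sorts before lowercase,
    # accented letters sort after 'z', trailing whitespace matters.
    print(sorted(users))                  # ['Bob', 'alice', 'alice ', 'Ärna']

    # Reading 2: case-insensitive. A different, equally defensible answer.
    print(sorted(users, key=str.lower))   # ['alice', 'alice ', 'Bob', 'Ärna']

    # Reading 3 (locale-aware collation), reading 4 (strip whitespace first), ...
    # A compiler makes you pick one; a model reading the English picks one for you.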
This is the best description of value from AI that I've seen so far. It allows people who don't like writing code to build things without doing so.
I don't think it's nearly as valuable to people who do enjoy writing code, because I don't think prompting an agent (at least in their current state) is actually more productive than just writing the code. So I don't see any reason to mourn on either side.
Because a junior developer doesn't stay a junior developer forever. The value of junior developers has never been the code they write. In fact, in my experience they're initially a net negative, as more senior developers take time to help them learn. But it's an investment, because they will grow into more senior developers.
The question really is what you think the long-term direction of SWE as a profession is. If we need juniors later and seniors become expensive, that's mostly a nice problem to have, and it can be fixed via training and knowledge transfer. Conversely, hiring and training people, especially young people, into a sinking industry isn't doing anyone any favors.
While I think both sides have an argument on the eventual viability of the SWE career, there is a problem. The downsides of hiring now (costs, uncertainty of work velocity, dry backlogs, etc.) are certain; the risk of paying more later is not guaranteed, and may not be as big of an issue. Also, training juniors doesn't always benefit the person paying.
* If you think that long term we will need seniors again (the industry stays the same size or starts growing again), then given the usual high ROI on software, most can afford to defer that decision until later. It goes back to the pre-AI calculus: SWEs were expensive then, and people still paid for them.
* If you think the industry shrinks, then it's better to hold off so you get more out of your current staff and you don't "hire to fire". Hopefully the industry on average shrinks in proportion to the natural retirement of staff. I've seen this happen, for example, in local manufacturing, where the plant lives on but slowly winds down over time, and as people retire they aren't replaced.
> The question really is what you think the long-term direction of SWE as a profession is. If we need juniors later and seniors become expensive, that's mostly a nice problem to have, and it can be fixed via training and knowledge transfer. Conversely, hiring and training people, especially young people, into a sinking industry isn't doing anyone any favors.
Yes exactly!
What will SWE look like in 1 year? 5 years? 10?
Hiring juniors implies you're building something that's going to last long enough that the cost of training them will pay off. And hiring now implies that there's some useful knowledge/skill you can impart upon them to prepare them.
I think two things are true: there will be way fewer developer type jobs, full stop. And I also think whatever "developers" are / do day to day will be completely alien from what we do now.
If I "zoom out" and put my capitalist had on, this is the time to stop hiring and figure out who you already have who is capable of adapting. People who don't adapt will not have a role.
> If you think the industry shrinks, then it's better to hold off so you get more out of your current staff and you don't "hire to fire". Hopefully the industry on average shrinks in proportion to the natural retirement of staff. I've seen this happen, for example, in local manufacturing, where the plant lives on but slowly winds down over time, and as people retire they aren't replaced.
You can look even closer than that - look at some legacy techs like mainframe / COBOL / etc. Stuff that basically wound down but lasted long enough to keep seniors gainfully employed as they turned off the lights on the way out.