
Ah. I think it still requires you to understand what you're doing. I tried to use both Copilot and ChatGPT 3.5 and 4. They don't get a lot of things right in my interest areas. I think it's great as a crazy-good autocompletion or refactoring tool. It also helps with boilerplate. But you always need to know what you want; it just saves you typing the code.

Sometimes it's just amusingly wrong, too. Sometimes annoyingly so, especially when it repeats mistakes.

I've always said our jobs aren't hard because we have to write code for a computer, but because we have to capture intent for other humans in code. Sometimes we don't even know what the intent is, because half the time we don't even know what we're building exactly.

So, this all slightly shifts the level of abstraction: developers think more about intent and less about code. You still have to make sure the code is correct, so you need to understand both the code and the intent. Not much has changed. It's just gotten less mechanical.

Anyway, I don't think people need to be scared of AI just yet.

Obvs there are programmers who enjoy the nitty-gritty keyboard slapping, and they can still do that for problems they find interesting. But for things we want to get out of the way, why not unburden ourselves?

Typing on phone so pardon the typos/autocorrect



I think LLMs will make a software engineer's job harder and more in demand. By removing boilerplate, engineers will be expected to produce more output per person, and the job moves up the stack from junior code-monkey work to architect / senior debugging-guru work. Not everyone can do such a role. It will be a very tough time to be a junior.



