We need efficiency when we want to maximize comfort and minimize labor.
But nothing forbids people from pursuing less efficient endeavors in their free time. There are people maintaining old cars and locomotives. There are people gardening or woodworking "inefficiently" for their own pleasure.
What we remove is the need to force people to work in these fields. Whether we abandon them altogether depends solely on our culture.
What has gone wrong is that AI researchers and futurologists have warned for decades that AI replacing jobs was coming (though everybody was taken by surprise that artists would be first) and that we needed to change our paradigms about labor quickly.
We should be embracing the weight of labor lifted from our shoulders and organizing society so that it becomes desirable. Replaced professionals should be allowed to go into retirement directly and access some form of basic income. There should be incentives for automating your own work.
Right now the incentives point the other way: everyone tries (understandably) to make it as hard as possible to be replaced, but this just leads to a world of bullshit jobs pretending to be useful while the AIs do all the heavy lifting.
Completely disagree with this view, especially when it comes to art.
This is not advancing society, or art, or anything; it's a freaking knock-off factory that's been developed and used for commercial gain. It's not the same as manual loom weavers being replaced; this is a way to steal people's stuff with little trace or accountability. This is legal trash, and MS has plenty of lawyers to make you feel otherwise.
I was playing with it today and I couldn't believe the crap I was seeing. Nearly every single known artist is known because they developed a unique style, and worked to develop that style. They didn't just copy other people's stuff and sell it. To make a machine that just rips off people's styles and then charge money for it is freaking lame, and I really hope they get sued hard for it. Actually, they should be taxed to pay for all the fantasy retirement money you speak of.
Do you know that throughout history there have been many, many artists who made counterfeit artworks for money? There have been many, many print shops ripping off work, etc. This is absolutely no different from automating that same level of theft. In my opinion, what's going on there isn't even really that original.
This isn't just about research or advancing anything, because it's not. It takes much and gives back very little. If it were for research, do it in a university, generate some new cool things with it, and leave it at that. Having a machine which can just "draw this and do a half-assed job of ripping off someone who worked hard to create something cool" is a low blow.
Disclaimer: I'm not an artist, so it's not personal; I just know bullshit when I see it, and that thing is bullshit, regardless of the technology behind it.
> "Nearly every single known artist is known because they developed a unique style, and worked to develop that style. They didn't just copy other people's stuff and sell it. To make a machine that just rips off people's styles and then charge money for it is freaking lame, and I really hope they get sued hard for it."
Because when you drink Cherry Cola or eat a Cherry Bakewell tart, is your first thought of the chemist who worked hard to make the synthetic flavour, followed by criticism of the company that just "copies other people's stuff and sells it", hoping they get sued hard for it?
I also think it's a matter of scale. I guess you're saying a company has stolen another company's flavouring formula and used it for its own purposes, which does seem like ripping something off. Anyway, I shouldn't and don't care in that case, so should I not care when artists' work is used for profit without paying them royalties?
I remember a pretty old interview with Linus Torvalds where they were talking about object-oriented programming. The interviewer asked him if he expected a similar paradigm change in the coming years, and I remember being surprised by his answer (quoting from memory): No, I don't see anything big coming. Probably the next change will be caused by AI.
Yes, differentiable code is already a new paradigm (write a function with millions of parameters, write a loss function that requires more craft than people realize, and train). It has a property that used to be the grail of IT project management: it is a field where, when you want to improve your code's performance, you can just throw more compute at it.
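To make the "differentiable code" idea concrete, here is a minimal sketch in plain Python, with a single parameter instead of millions and a hand-derived gradient instead of autodiff. The model, loss, and learning rate here are toy choices of mine, not anyone's real system:

```python
# Toy "differentiable program": a parameterized function, a loss, a training loop.
# Real systems have millions of parameters; this one has exactly one (w).

def model(w, x):
    return w * x  # the "program" is defined entirely by its parameter w

def loss(w, data):
    # mean squared error - choosing this function is where the craft lives
    return sum((model(w, x) - y) ** 2 for x, y in data) / len(data)

def train(data, w=0.0, lr=0.1, steps=100):
    # "throw more compute at it" = more steps, more parameters
    for _ in range(steps):
        grad = sum(2 * (model(w, x) - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]  # hides the rule y = 3x
w = train(data)
print(round(w, 3))  # → 3.0
```

The point of the sketch: nobody wrote the logic `y = 3x` anywhere; it was recovered by optimization, and running more steps buys more accuracy.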
And I think that the clumsy but still impressive attempts at code generation hints at the possibility that yet another AI-caused paradigm change is on the horizon: coding through prompt, adding another huge step on the abstraction ladder we have been climbing.
Forget ChatGPT's coding mistakes; down the road some team will manage to produce a highly abstract yet predictable code generator fueled by language models. It will change our work totally.
We might get into another efficiency slump as an outcome of this, again stopping us from making the most of the hardware and computational resources we have, due to prompts being too unspecific. Didn't specify the OS your code will run on? Well, we'd better use this general cross-OS library here instead of the one optimized for the actual OS the thing will run on.
The same mentality that causes today's "everything must be a web app" will cause terrible inefficiency in AI-generated (and human-prompted) code. In the end our systems might not be more performant than anything we already have, because dozens of useless abstraction layers have been inserted.
At the same time, other people might complain that the AI does not generate code that can run everywhere, and that they have to be too specific. People might work on that, producing code generators whose output carries even more overhead.
At least some of that overhead will slip through the cracks into production systems, as companies won't be willing to invest in proofreading software engineers and long prompt-generate-review-feedback cycles.
I don't feel a comparison to DSLs works here at all. If you are just using plain human language, how is a comparison to DSLs apt?
The point of a DSL is to provide a deliberately limited-scope language optimised for a specific problem or problem domain. An LLM that takes general human language is about the furthest opposite of a DSL: it's the broadest-scope language for describing any problem, and it tries to solve them all.
Also, few popular DSLs are truly black-box in the sense ChatGPT is; many of them have exposed source or even line-by-line debuggers available. There are a ton of other reasons this comparison doesn't make sense.
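For contrast, consider a classic embedded DSL everyone knows: regular expressions. I picked this example myself to illustrate the point; the language is deliberately tiny, scoped to one problem (text patterns), and its behavior is deterministic and inspectable, unlike a free-form LLM prompt:

```python
# Regular expressions as an example of a limited-scope DSL:
# a small grammar for one problem domain, with predictable semantics.
import re

# The pattern language can express only text patterns - nothing else.
pattern = re.compile(r"(\d{4})-(\d{2})-(\d{2})")  # ISO-style dates

match = pattern.search("released on 2022-11-30, updated later")
print(match.groups())  # → ('2022', '11', '30')
```

The same request phrased in English ("find the date in this sentence") is broader, ambiguous, and gives no guarantee about what comes back, which is the scope difference the comment is pointing at.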
You can even get ChatGPT to write programs to query different APIs or to do calculations.
Had ChatGPT been an open model, like OpenAI was supposed to produce, this kind of application would have seemed obvious and would have appeared in the first two weeks after release.
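The kind of program meant here is plain glue code, the sort of thing ChatGPT produces readily. A small sketch using only the standard library; the JSON-GET helper is generic (no real endpoint is assumed here), and the calculation is an arbitrary example of mine:

```python
# Sketch of a ChatGPT-style generated program: query an API, do a calculation.
import json
import urllib.request

def query_api(url):
    # generic JSON GET - typical LLM-generated glue code
    # (url is a placeholder; no specific service is assumed)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def compound_interest(principal, rate, years):
    # the "do calculations" side of the same request
    return principal * (1 + rate) ** years

print(round(compound_interest(1000, 0.05, 10), 2))  # → 1628.89
```

Neither function is sophisticated, which is exactly the point: the value is in how cheaply such programs can now be produced on demand.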