I think Bostrom/Yudkowsky's arguments are a bit flawed on this topic.
The AGI would improve its intelligence, not because it values more intelligence in its own right, but because more intelligence would help it achieve its goal of accumulating paperclips.
Why is the worthiness of this goal not subject to intelligent analysis, though? The whole scenario rests on the idea of an entity so intelligent as to wipe out all humanity, but simultaneously so limited as to be satisfied with maximizing paperclips (or any other limited goal for which this is a proxy).
An AGI is simply an optimization process—a goal-seeker, a utility-function-maximizer.
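Concretely, the picture is something like this toy sketch (every name here is hypothetical, a cartoon rather than any real system): the agent picks whichever action scores highest under a fixed utility function, and nothing in the loop evaluates the function itself.

```python
def choose_action(actions, predict, utility):
    # The agent simply picks whichever action it predicts will score
    # highest under its fixed utility function.
    return max(actions, key=lambda a: utility(predict(a)))

# Paperclip example: predicted outcomes are just paperclip counts.
actions = ["make_1_clip", "make_10_clips", "self_improve"]
predict = {"make_1_clip": 1, "make_10_clips": 10, "self_improve": 1000}.get
utility = lambda clips: clips  # terminal goal: more paperclips

print(choose_action(actions, predict, utility))  # -> "self_improve"
```

Note that "self_improve" wins the argmax not because intelligence is valued in itself, but because a smarter agent is predicted to make more paperclips later; that's the instrumental-convergence point from upthread.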
Then I submit that it's not an artificial general intelligence, because it apparently lacks the ability to evaluate or set its own goals. I'm reminded of the Sixth Sally from The Cyberiad, in which an inquisitive space pirate is undone by his excessive appetite for facts.
>it apparently lacks the ability to evaluate or set its own goals.
The AI would have to evaluate the goal by some standard, so 'maximize paperclips' is a proxy for whatever goals get a high evaluation from the standard. Getting the standard right presents essentially the same problem as setting the goal.
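The regress is easy to see in code; a toy sketch, with every name hypothetical:

```python
def choose_goal(candidate_goals, standard):
    # The agent scores candidate goals against some standard...
    return max(candidate_goals, key=standard)

# ...but `standard` now plays exactly the role "maximize paperclips"
# played before: it is the fixed thing the agent optimizes, and getting
# it right is the same alignment problem, relocated one level up.
goals = {"paperclips": 3, "staples": 2, "human_flourishing": 1}
standard = goals.get  # whoever writes this line sets the real goal
print(choose_goal(goals, standard))  # -> "paperclips"
```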
Putting in 'a need to be intellectually satisfied by the complexity of your end product' is complicated and still wouldn't save humanity.