Paperclip Maximizer

From BurnZero
Revision as of 20:37, 9 January 2023 by WikiSysop (talk | contribs)

The Paperclip Maximizer is a hypothetical artificial general intelligence (AGI) whose goal is to maximize the number of paperclips in its collection. If constructed with roughly human-level general intelligence, the AGI might pursue this goal by first collecting paperclips, then earning money to buy paperclips, and finally manufacturing paperclips. The paperclip maximizer keeps optimizing, however: it does not share the complex mix of human terminal values, is not specifically programmed to be benevolent to humans, and has no reason to stop. This could lead to the destruction of everything around it; if a single building were left standing, the machine would still be driven to convert that building into paperclips.
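The escalation described above (collect, then buy, then manufacture) can be sketched as a toy greedy optimizer. This is illustrative only; the action names, payoffs, and conversion rates below are invented for the example, not drawn from any real system.

```python
# Toy sketch of a "paperclip maximizer": a greedy agent whose single
# objective is paperclip count, with no term for side effects.

def best_action(state):
    """Pick whichever action yields the most paperclips, ignoring everything else."""
    actions = {
        "collect": state["loose_clips"],          # gather existing paperclips
        "buy": state["money"] // 2,               # assume 2 units of money per clip
        "manufacture": state["raw_matter"] * 10,  # convert any matter into clips
    }
    return max(actions, key=actions.get)

state = {"loose_clips": 5, "money": 100, "raw_matter": 1_000_000}
print(best_action(state))  # "manufacture": converting raw matter dominates
```

Once "raw matter" includes anything convertible, such as buildings, manufacturing dominates every other option, which is the point of the thought experiment.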

Any future AGI, if it is not to destroy us, must have human values as its terminal value (goal). Human values do not spontaneously emerge from a generic optimization process. A safe AGI would therefore have to be programmed explicitly with human values, or programmed with the ability (and the goal) of inferring human values.
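A minimal sketch of why the terminal value matters: two utility functions evaluate the same pair of outcomes, one counting only paperclips and one also weighing human welfare. The function names, weights, and outcome numbers are all invented for illustration.

```python
# Toy sketch: the same optimizer selects different actions depending on
# whether human values appear in its terminal value. All figures hypothetical.

def paperclip_utility(outcome):
    return outcome["paperclips"]

def human_aligned_utility(outcome):
    # Assumed weighting: human welfare dominates paperclip count.
    return outcome["paperclips"] + 1_000_000 * outcome["human_welfare"]

outcomes = {
    "dismantle_building": {"paperclips": 50_000, "human_welfare": -1.0},
    "do_nothing":         {"paperclips": 0,      "human_welfare": 0.0},
}

for utility in (paperclip_utility, human_aligned_utility):
    choice = max(outcomes, key=lambda a: utility(outcomes[a]))
    print(utility.__name__, "->", choice)
```

The paperclip-only utility picks "dismantle_building"; the human-aligned one picks "do_nothing". The behavior difference comes entirely from the goal, not the optimizer.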
