

== '''Part 6''': The Good Machine ==
When the concept of robotics was first conceived, Isaac Asimov imagined the creation of autonomous intelligence in the form of androids, and a conundrum arose: if a machine is developed which has autonomy and is sufficiently capable, how can we ensure that it does no harm to humans? Without any protective laws, a machine whose sole purpose is to make money will destroy everything in its path to achieve that goal. As such, Asimov developed the ''Four''<ref>'''Isaac Asimov’s Laws of Robotics Are Wrong''' - Peter W. Singer, published May 18, 2009, accessed 8 July 2022 via: https://www.brookings.edu/opinions/isaac-asimovs-laws-of-robotics-are-wrong/</ref> laws of robotics, distinct ethical rules to protect humans from the ruthlessness of machines:


* '''Zeroth Law''' - A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
* '''First Law''' - A robot may not injure a human being or, through inaction, allow a human being to come to harm.
* '''Second Law''' - A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
* '''Third Law''' - A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.


If a corporation is a type of machine, perhaps these laws can serve as the basis for a series of [[tenet]]<nowiki/>s, ethical rules which have led to the development of an [[Transparent incorporation statement|incorporation statement of ''The Transparent Company'']].


<hr/>
'''References'''
<references />
