A Moral Paradox in the Creation of Artificial Intelligence

Mark Walker

For moral reasons, we should not (now or in the future) create robots to replace humans in every undesirable job. At least some of the labour we might hope to avoid will require human-equivalent intelligence. If we make machines with human-equivalent intelligence, then we must start thinking of them as our moral equivalents. And if they are our moral equivalents, then it is prima facie wrong to own them, or to design them for the express purpose of doing our labour; for this would be to treat them as slaves, and it is wrong to treat our moral equivalents as slaves.

Subjects: 6. Computer-Human Interaction; 9.4 Philosophical Foundations

Submitted: May 16, 2006

Copyright © AAAI. All rights reserved.