Robotics is a rapidly expanding field. Robots are being built and used in classrooms, medical labs, security systems, and manufacturing plants. Unfortunately, they are also being used to commit crimes, launch lethal attacks, and impede airport operations.
These rapid advances pose ethical dilemmas. Robots that act autonomously could inflict damage their designers never intended, and they can be weaponized by terrorists and political extremists.
In January 2017, the U.S. Department of Defense released a video showing an autonomous swarm of 103 drones flying over California. Nobody was piloting the drones; their flight paths were choreographed in real time by an algorithm. The drones “are a collective organism, sharing one distributed brain for decision-making and adapting to each other like swarms in nature,” a spokesman said. The drones in the video were not weaponized, but the technology to weaponize them is evolving rapidly.
The article "Robotics researchers have a duty to prevent autonomous weapons" explores this topic in greater depth.
Related reading: "A Global Arms Race for Killer Robots Is Transforming the Battlefield" and "What Happens When Your Bomb-Defusing Robot Becomes a Weapon"