
Let’s talk about killer robots

Looking for Thanksgiving dinner table conversation that doesn’t involve politics or professional sports? Okay, let’s talk about killer robots. It’s a concept that has long since moved from the pages of science fiction to reality, depending on how loosely you use the term “robot.” Military drones abandoned Asimov’s First Law of Robotics – “A robot may not harm a human being or, through inaction, allow a human being to come to harm” – decades ago.

Recently, the topic has heated up again because of the growing prospect of killer robots being used by domestic law enforcement agencies. Boston Dynamics, one of the most prominent robot makers of the moment, raised some public policy red flags when it showed footage of its Spot robot deployed as part of a Massachusetts State Police exercise on our stage back in 2019.

The robots were not armed; they were instead part of an exercise designed to determine how they could help keep officers out of harm’s way during a hostage or terrorist situation. But the prospect of using robots in scenarios where human lives are in immediate danger was enough to prompt a response from the ACLU, which told TechCrunch:

We urgently need more transparency on the part of government agencies, which should openly communicate to the public about their plans to test and deploy new technologies. We also need nationwide rules to protect civil liberties, civil rights, and racial justice in the age of artificial intelligence.

Meanwhile, last year the NYPD ended its deal with Boston Dynamics after a strong public backlash, following images that emerged of Spot being deployed in response to a home invasion in the Bronx.

For its part, Boston Dynamics has been vocal in its opposition to the weaponization of its robots. Last month it signed an open letter, along with fellow leading firms Agility, ANYbotics, Clearpath Robotics and Open Robotics, condemning the practice. The letter notes:

We believe that adding weapons to robots that are remotely or autonomously operated, widely available to the public, and able to navigate to previously inaccessible places where people live and work, creates new risks of harm and serious ethical issues. The weaponization of these new robots will also damage public trust in the technology in ways that undermine the enormous benefits they will bring to society.

The letter was thought to be, in part, a response to Ghost Robotics’ work with the U.S. military. When images of one of its own rifle-toting robot dogs surfaced on Twitter, the Philadelphia firm told TechCrunch that it remains agnostic about how its military partners use the systems:

We don’t make the payloads. Are we going to promote and advertise any of these weapon systems? Probably not. That’s a tough one to answer. Because we sell to the military, we don’t know what they do with them. We’re not going to dictate to our government customers how they use the robots.

We do draw the line on where they’re sold. We only sell to U.S. and allied governments. We don’t even sell our robots to enterprise customers in adversarial markets. We get lots of inquiries about our robots in Russia and China. We don’t ship there, even for our enterprise customers.

Boston Dynamics and Ghost Robotics are currently embroiled in litigation involving several patents.

This week, concerns about killer robots resurfaced, this time in San Francisco, by way of the local news site Mission Local. The site notes that a policy proposal to be considered by the city’s Board of Supervisors next week includes language about killer robots. The proposed “Law Enforcement Equipment Policy” begins with an inventory of the robots currently in the possession of the San Francisco Police Department.

There are 17 in total, 12 of which are active. They are primarily designed for detecting and disarming bombs, meaning none of them are designed specifically for killing.

“The robots listed in this section must not be used outside of training and simulations, apprehending criminals, critical incidents, exigent circumstances, executing a warrant, or during the evaluation of suspicious devices,” the policy states. It then adds, more troublingly, “Robots will only be used as a lethal force option when the risk of death to members of the public or officers is imminent and outweighs any other force option available to the SFPD.”

Essentially, per that language, the robots can be used to kill in order to potentially save the lives of officers or the public. Perhaps that seems innocuous enough in this context. At the very least, it appears to fall within the legal definition of “justified” deadly force. But new concerns arise over what amounts to a profound policy change.

First, the use of a bomb disposal robot to kill a suspect is not unprecedented. In July 2016, Dallas police did just that, in what was believed to be a first in U.S. history. “We saw no other option but to use our bomb robot and place a device on its extension for it to detonate where the suspect was,” police chief David Brown said at the time.

Second, it is easy to see how this new use case could set a precedent for broader use, whether the robots are deployed in this manner intentionally or accidentally. Third, and perhaps most disturbingly, one can imagine the language applying to the acquisition of a future robotic system that is not designed solely for explosive detection and disposal.

Mission Local adds that San Francisco Board of Supervisors Rules Committee chair Aaron Peskin attempted to insert a more Asimov-friendly line: “Robots should not be used as a use of force against anyone.” The SFPD apparently struck Peskin’s change and replaced it with the current language.

The renewed talk of killer robots in California is due, in part, to Assembly Bill 481. The legislation, signed into law by Gov. Gavin Newsom last September, is designed to make police operations more transparent, including requiring an inventory of the military equipment used by law enforcement.

The 17 robots included in the San Francisco document are part of a longer list that also includes a Lenco BearCat armored vehicle, stun grenades and 15 assault rifles.

Last month, Oakland police said they would not seek approval for armed remote-controlled robots. Per the department’s statement:

The Oakland Police Department (OPD) is not adding armed remote-controlled vehicles to the department. The OPD did take part in committee discussions with the Oakland Police Commission and community members to explore all possible uses of the vehicle. However, after further discussions with the chief and the executive team, the department decided it no longer wanted to explore that particular option.

The statement followed a public outcry.

The toothpaste is already out of the tube as far as Asimov’s First Law is concerned. Killer robots are already here. As for the Second Law – “A robot must obey the orders given to it by human beings” – that one still seems mostly within our grasp. It is up to society to determine how its robots behave.

By Peter Kavinsky

Peter Kavinsky is the Executive Editor at cablefreetv.org