June 28, 2020
by Steve Stofka
Last week Senate Democrats blocked a Republican-sponsored policing bill that they said did not go far enough. This week House Democrats proposed a bill that Republicans said went too far. In a divided Congress, little gets accomplished but many fine speeches are made. Giving a microphone to a politician is like giving a lollipop to a young child.
Libertarians prefer a divided Congress. Aren’t there enough rules already? Each year, the Supreme Court decides on competing interpretations of more than a hundred laws that are already on the books. It turns down thousands more cases (U.S. Courts, n.d.). Voters send their elected representatives to Washington to write laws. Even in a divided Congress, a few hundred bills pass both houses and become law (U.S. Congress, n.d.). Media attention often focuses on the bills that are blocked by one chamber or the other. Policing laws…hold onto that thread for a minute.
Apple TV+ has announced the 2021 release of a series based on Isaac Asimov’s groundbreaking Foundation novels (Apple, 2020). Mr. Asimov is also known for his imaginative stories about robots. He invented the Three Laws of Robotics, and his stories explore the contradictions and complexities of writing rules, or algorithms, for robot behavior (Anderson, 2019).
Pick up the policing laws thread again. What rule for supervising police behavior might Asimov suggest? In a situation under review, ask this question: would a highly sophisticated robot cop behave in such a manner? A quick refresher on the Three Laws: 1) a robot may not cause harm to a human, or allow a human to come to harm; 2) a robot must obey humans unless that conflicts with the first law; 3) a robot must protect its own existence unless that conflicts with the first two laws.
Let’s look at the George Floyd case (Hill, 2020), and begin with a consideration of possible violations of Law #1. Did the officer cause harm to George Floyd, a human being? Yes. But wait, there’s possible rule conflict here and this is the subject of some of Asimov’s stories. A robot might have to cause harm to a human being to stop that human being from causing even greater harm to another human being. So let’s ask. Did George Floyd cause harm to another human being at this time? No. Was he likely to cause harm, given that he was handcuffed and several officers were surrounding him? No.
On to Law #2: When George Floyd repeatedly said “I can’t breathe,” did the officer respond by adjusting his position so that Mr. Floyd could breathe? No. A violation of Law #2.
Law #3: Self-preservation. Was the officer in imminent danger of destruction? No.
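The three checks above read like a decision procedure, and they can be sketched as one. This is my own toy illustration, not Asimov's formulation: the `Situation` fields and the `violated_laws` function are hypothetical names invented for this sketch, and real ethical judgment resists this kind of reduction, which is rather the point of Asimov's stories.

```python
from dataclasses import dataclass

@dataclass
class Situation:
    harms_human: bool               # did the actor cause harm to a human?
    prevents_greater_harm: bool     # would inaction have caused greater harm?
    obeys_human_request: bool       # did the actor comply with a human's request?
    request_conflicts_law1: bool    # would complying have harmed a human?
    actor_in_imminent_danger: bool  # was self-preservation genuinely at stake?

def violated_laws(s: Situation) -> list:
    """Return the numbers of the laws the behavior violates."""
    violations = []
    # Law 1: causing harm is excusable only to prevent greater harm.
    if s.harms_human and not s.prevents_greater_harm:
        violations.append(1)
    # Law 2: a human's request must be obeyed unless obeying violates Law 1.
    if not s.obeys_human_request and not s.request_conflicts_law1:
        violations.append(2)
    # Law 3 never excuses a Law 1 or 2 violation, so self-preservation
    # (actor_in_imminent_danger) can mitigate only when no violation exists.
    return violations

# The scenario walked through above: harm caused, no greater harm prevented,
# the request to breathe ignored, no imminent danger to the officer.
case = Situation(True, False, False, False, False)
print(violated_laws(case))  # [1, 2]
```

Even this cartoon version shows the rule-conflict problem: the answer turns entirely on contested judgment calls like `prevents_greater_harm`, which is exactly where a robot's programming, and a court's reasoning, gets hard.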
If the officer were a robot, his behavior would have been in violation of the laws. His positronic brain would have been replaced and later analyzed to understand the circuitry malfunction.
The laws and many of Asimov’s works explore the tensions and interpretations of several foundational philosophers: the universal rule-making of Immanuel Kant; the utilitarian and consequentialist principles of Jeremy Bentham; and the virtue ethics of Aristotle. Could R. Daneel Olivaw, the robot detective found in many of Asimov’s novels, practice virtue (Fandom, n.d.)? Yes, if a robot’s behavior is indistinguishable from that of a human being who acts with virtue.
Like the behavior of Asimov’s robots, most of our laws are guided by the principles stated by Aristotle, Bentham and Kant. Our courts and juries judge human beings based on those laws. Police officers are not expected to act rationally like a robot, but like a reasonable person whose actions can be justified in the circumstances (Gardner, 2019). The reasonable person is a legal fiction, just as Daneel Olivaw is a fictional robot. Our legal institutions have difficulty defining and applying a consistent reasonable person standard.
Programmers would have just as much difficulty coding a mostly-rational-but-sometimes-erratic-but-understandably-so algorithm. Our cells behave like those algorithms – rational most of the time, cancerous when they turn erratic.
In the far distant future, if we have robots policing our communities, we will have problems similar to our current concerns. Supervising the legal use of force has troubled many human societies and technology will not solve that persistent problem. Some robots will have defective positronic brains and commit acts of violence in violation of their programming. We’ll argue over the rules for robots and how to write them – at least I hope so. I hope that there is a Congress or some other deliberative body that argues over policing tactics as the House and Senate did these past two weeks. I worry when we stop arguing. That’s when the guns start arguing.
Anderson, M. R. (2019, November 11). After 75 years, Isaac Asimov’s Three Laws of Robotics need updating. Retrieved June 25, 2020, from https://theconversation.com/after-75-years-isaac-asimovs-three-laws-of-robotics-need-updating-74501
Apple. (2020). Foundation on Apple TV+. Retrieved June 26, 2020, from https://tv.apple.com/us/show/foundation/umc.cmc.5983fipzqbicvrve6jdfep4x3?at=1000lDR
Fandom. (n.d.). R. Daneel Olivaw. Retrieved June 26, 2020, from https://asimov.fandom.com/wiki/R._Daneel_Olivaw
Gardner, J. (2019). The Many Faces of the Reasonable Person. Torts and Other Wrongs, 271-303. doi:10.1093/oso/9780198852940.003.0009
Hill, E., et al. (2020, June 01). 8 Minutes and 46 Seconds: How George Floyd Was Killed in Police Custody. NY Times. Retrieved June 26, 2020, from https://www.nytimes.com/2020/05/31/us/george-floyd-investigation.html
U.S. Congress. (n.d.). Public Laws. Retrieved June 26, 2020, from https://www.congress.gov/public-laws/116th-congress
U.S. Courts. (n.d.). Supreme Court Procedures. Retrieved June 26, 2020, from https://www.uscourts.gov/about-federal-courts/educational-resources/about-educational-outreach/activity-resources/supreme-1