Friday, September 13, 2013

On the Morality of Things



"All air travel in the continental United States has been suspended following the arrest of the distributed computer system, ENTERACK. The system's reactive AI permitted two fully loaded Boeing 797s to collide midair over sparsely populated farmland in Colorado after it received notice that two known terrorists were aboard the separate planes. While not much is known at this time, system logs cite that the decision was made after calculating that the collision would result in less harm than if the terrorists were allowed to reach their destinations...."
Ridiculous, right? How would you even arrest a computer, let alone try it in a court of law? Would you take a single blade from the server and put it on the stand? Maybe create an avatar for it, like IBM's Watson. Either way, once you could finally question it, the system would simply state facts, devoid of opinion and emotional response. It wouldn't show remorse or confusion, or attempt to explain its actions with anything more than facts. The jury would be unable to relate to it or see why the system did what it did, all because it lacks emotion. It's simple, really: morals and ethics come from emotion, from things that "feel" right. While we use reason and logic to turn these feelings into laws and social norms, they still originate from a faculty that computers do not have. It's odd, if you think about it. Computer systems are under more scrutiny now that Snowden has come forward. Questions are being lofted about the morality of this computer system, that type of communication, public versus private, and so on. Computers are always at the center of these questions, but everyone likes to gloss over who the real culprit is: people.

It's always been people. Some try to give their tools a certain sense of morality, but I believe there is no such thing. Take a knife, for example. In its most basic form, it is a thin piece of metal sandwiched between some type of plastic or wood. Moral, immoral, amoral, who's to say? It's not the object that earns these labels, but the actions performed with the object. In a chef's hand a knife can help create a world-famous dish; in a soldier's, it can take a life. That soldier can then turn around and use the same knife to cut strips for bandages, adding more gray to the canvas. A baseball bat in a Major Leaguer's hands is used to play a game; in the hands of a father at home, it's a defensive weapon. The same object, different perspective.

I guess what I am trying to say is that objects do not and should not have any morality attached to them. That label belongs to the action the object was used for and, by extension, to the person who initiated the action. When people say that chemical weapons are immoral, they are really saying that using chemical weapons is immoral. Without a person to deploy the weapon, it will just sit there, collecting dust until it is forgotten. This may be mere semantics, but the inclusion of the word "using" makes a difference in how objects are perceived. I know of no one who would consider cars immoral, but most would consider using a car to hit a person quite immoral. And that's the point: the action holds the morality, the object doesn't. So the next time someone gets on TV and questions the morality of the NSA programs, stop and ask yourself: is the program immoral, or just what the program is used for and, by extension, the person who wanted the program in the first place?

1 comment:

  1. I think the point of our readings was that some objects and technologies lend themselves to a certain type of use, and that use has a morality associated with it, and thus you could ascribe morality to the technology based on the uses it facilitates. Chemical weapons do not HAVE to be used for mass destruction, but it is very likely that they WILL be, and so it is fair to imagine them as immoral objects.
