Systems capable of detecting information related to emotion have many beneficial uses, but also many frightening misuses. Such misuses often work to the benefit of one individual while causing harm to others; many of them are therefore adversarial in nature.
When individuals oppose one another, their relationship can be described as adversarial. For instance, if an individual's goal is to hinder another person by speaking against them, acting against them, or behaving in a hostile manner, then this individual may be described as the other person's adversary.
Individuals may behave adversarially not just through actions or spoken words, but through artifacts such as letters, legislation, weapons, or technologies. Information technology is already used in some adversarial ways:
The TALON robo-soldier, a tracked robot equipped with a machine gun [fostermiller2005]. Current research efforts are developing fully autonomous robots that will "be equipped with a pump-action shotgun system able to recycle itself and fire remotely" [stamant2004].
Uncontrollable haptic devices used to explore the theme of "human-machine conflict" [schiessl2003].
The "fruit machine," a device developed in Canada for the purpose of identifying homosexuals. "The fruit machine was employed in Canada in the 1950s and 1960s during a campaign to eliminate all homosexuals from the civil service, the RCMP, and the military." It worked by measuring pupil dilation, perspiration, and pulse as indicators of arousal [sawatsky1980].
The polygraph or lie detector is a device designed to detect deception. Polygraphs have been used in criminal investigations and by intelligence agencies to screen employees [ota1983].
"Integrated System for Emotional State Recognition for the Enhancement of Human Performance and Detection of Criminal Intent," is the subject of a recent DARPA SBIR [darpa2004]. This initiative emphasizes technologies that can be used without the consent or knowledge of users.
Existing work has observed and analyzed adversarial relationships. Cohen et al. performed ethnography investigating the "phenomena of adversarial collaboration" in work-flow systems used in a law firm [cohen2000]. Applbaum provides an ethical analysis of adversarial roles [applbaum2000].
Affective computing is "computing that relates to, arises from, or deliberately influences emotion" [picard1997]. This thesis examines use (or misuse) of affective computing in adversarial contexts.
Individuals often view their emotions as especially sensitive and private matters. As such, adversarial uses of systems that sense and communicate affect are especially interesting as a domain of inquiry. The problem this work addresses is the lack of information concerning user responses to affective communication systems in an adversarial use context. The approach taken to this problem is to repeatedly induce adversarial situations and then collect performance and survey data in an experimental context. The idea is to use this information to inform the design of future systems that sense and transmit information related to emotion in ways that are ethical.
As an example of a system that uses affective computing in an adversarial manner let us first consider Bentham's Panopticon and then a hypothetical system called "Panemoticon" that seeks to observe information related to emotion from a large number of people.
Bentham took a utilitarian perspective on ethics and also sought to improve prison conditions [bentham1787]. His conception of an ideal prison was a "surveillance machine" allowing an inspector to watch prisoners without being seen. It was his contention that the mere possibility of surveillance would induce prisoners to mend their ways. The Panopticon places an inspector in a privileged, centralized position in the architecture of the prison, as a sort of all-seeing eye. The Panopticon thus enforces a power relationship between the inspector, who is in a dominant position, and the prisoner, who is in a submissive position [foucault1975].
Foucault argued that "power somehow inheres in institutions themselves rather than in the individuals that make those institutions function" [felluga2002]. He cited the Panopticon as a way to illustrate his point about how power can become invested in artifacts: "it automatizes and disindividualizes power."
Affective computing systems used adversarially also have the capability of automatizing and disindividualizing power. A pervasive network of sensors that observes emotional states and uses them to watch for criminal intent takes a process that was once limited (interrogation and observation of affective cues by individuals) and automatically reproduces the interrogative process on a larger scale and a more frequent basis. While such a state of affairs seems rather far-fetched, consider the title of a recent DARPA research solicitation: "Integrated System for Emotional State Recognition for the Enhancement of Human Performance and Detection of Criminal Intent" [darpa2004].
The nightmarish world of Orwell's 1984 portrays a dystopia where pervasive communication of affect to a dominant party is realized [orwell1949]. Orwell's fiction elaborated on Bentham's Panopticon with the introduction of pervasive telescreens. Orwell's narrator comments, "There was no place where you could be more certain that the telescreens were watched continuously." In doing so he echoes the goal of the Panopticon: constant, pervasive, and internalized surveillance.
In the excerpt below we see how the telescreens in Orwell's 1984 capture many kinds of information related to emotion, including some that have already been used in the development of affective computing systems.
He took his scribbling pad on his knee and pushed back his chair so as to get as far away from the telescreen as possible. To keep your face expressionless was not difficult, and even your breathing could be controlled, with an effort: but you could not control the beating of your heart, and the telescreen was quite delicate enough to pick it up. He let what he judged to be ten minutes go by, tormented all the while by the fear that some accident -- a sudden draught blowing across his desk, for instance -- would betray him.
Later the protagonist Winston comments on how subtle and unconscious facial movements are dangerous to exhibit in a society with pervasive monitoring:
Your worst enemy, he reflected, was your own nervous system. At any moment the tension inside you was liable to translate itself into some visible symptom. He thought of a man whom he had passed in the street a few weeks back; a quite ordinary-looking man, a Party member, aged thirty-five to forty, tallish and thin, carrying a brief-case. They were a few meters apart when the left side of the man's face was suddenly contorted by a sort of spasm. It happened again just as they were passing one another: it was only a twitch, a quiver, rapid as the clicking of a camera shutter, but obviously habitual. He remembered thinking at the time: That poor devil is done for. And what was frightening was that the action was quite possibly unconscious.
Many systems allow Internet users to observe the online activity of other individuals. "AIM sniffers," for instance, allow an individual to monitor and archive Internet chat activity [aimsniff2003]. IMWatching additionally allows users to track the presence information of chat users [harfst2004].
Such systems are a special variety of more general network sniffing and logging software. The UNIX program tcpdump allows systems to programmatically filter and store TCP/IP network traffic [richardson2004]. A Windows version of this software, WinDump, also exists [windump2004]. Additionally, the library libpcap can be used to develop applications that capture TCP/IP traffic [richardson2004]. While there are many legitimate and ethically acceptable uses for such libraries (e.g. debugging, firewalls), there are also many malicious and potentially ethically unacceptable uses (e.g. stealing passwords or credit card information).
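The kind of dissection such capture libraries enable can be sketched in a few lines. The function below is a simplified, hypothetical illustration (not libpcap's actual API): given a raw IPv4 packet containing a TCP segment, it walks the header-length fields to recover the application payload that a sniffer would log. It assumes no link-layer framing and performs none of the validation real tools require.

```python
def tcp_payload(packet: bytes) -> bytes:
    """Extract the TCP payload from a raw IPv4 packet.

    A minimal sketch of the parsing that sniffing tools perform;
    assumes an IPv4 packet carrying TCP, with no link-layer header.
    """
    # IPv4 header length is the low nibble of the first byte,
    # counted in 32-bit words.
    ip_header_len = (packet[0] & 0x0F) * 4
    # The TCP data offset is the high nibble of byte 12 of the
    # TCP header, also counted in 32-bit words.
    tcp_header_len = (packet[ip_header_len + 12] >> 4) * 4
    return packet[ip_header_len + tcp_header_len:]
```

Everything after the two headers is the payload: the passwords, credit card numbers, or chat messages mentioned above travel in exactly this portion of the packet unless encrypted.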
Imagine an Internet application that uses such technologies to detect information related to emotions; one might call it "Panemoticon." Panemoticon could employ sniffing techniques and feed the captured data to systems that classify the emotional orientation and valence of observed words. Panemoticon could then construct graphical displays of the affective content of networked users' communication over time. Panemoticon does not exist; however, it could be built. As a thought experiment it provides a context for discussing the problems of unchecked affect sensing.
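The classification step Panemoticon would need can be sketched with a toy valence lexicon. Both the lexicon and the scoring scheme below are hypothetical illustrations invented for this thought experiment; a real system would draw on an empirically derived affective word list.

```python
# Hypothetical valence scores in [-1, 1]; a real system would use an
# empirically derived lexicon rather than these invented values.
VALENCE = {
    "happy": 1.0, "great": 0.8, "fine": 0.3,
    "tired": -0.4, "angry": -0.8, "awful": -1.0,
}

def message_valence(message: str) -> float:
    """Score a sniffed chat message as the mean valence of its
    lexicon words; messages with no known words score 0.0 (neutral)."""
    words = message.lower().split()
    scores = [VALENCE[w] for w in words if w in VALENCE]
    return sum(scores) / len(scores) if scores else 0.0
```

Plotting message_valence per user over time would yield exactly the kind of affective display, compiled without the subjects' knowledge or consent, that makes such a system ethically troubling.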
Such a system might be desirable to mid-level managers concerned with employee morale and alertness. It might, however, be seen as adversarial by employees. It is these types of uses for which we would like to begin ethical inquiry.