Metadata, and How You Feel

February 12

By: Paula Kift

In “We Know How You Feel,” an article published in The New Yorker on January 19, 2015, Raffi Khatchadourian describes the work of a startup company called Affectiva, which develops emotion-sensing software. Affectiva was founded in 2009 by Rana el Kaliouby, an Egyptian scientist, and Rosalind Picard, a professor at the MIT Media Lab. The company’s signature software, Affdex, calculates the proportions between non-deformable facial features such as the mouth, nose, eyes, and eyebrows. Affdex then “scans for the shifting texture of skin – the distribution of wrinkles around an eye, or the furrow of a brow – and combines that information with the deformable points to build detailed models of the face as it reacts. The algorithm identifies an emotional expression by comparing it with countless others that it has previously analyzed.”

The software was initially developed to help autistic children classify human emotions, but the business world was quick to identify more lucrative applications. For instance, “CBS uses the software at its Las Vegas laboratory, Television City, where it tests new shows. During the 2012 Presidential elections, Kaliouby’s team used Affdex to track more than two hundred people watching clips of the Obama-Romney debates, and concluded that the software was able to predict voting preference with seventy-three-per-cent accuracy.” Perhaps more problematically, Affectiva could also be used in videoconferencing “to determine what the person on the other end of the call is not telling you. ‘The technology will say, ‘O.K., Mr. Whatever is showing signs of engagement – or he just smirked, and that means he was not persuaded.’”
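The pipeline the article describes (geometric measurements of facial landmarks plus texture cues, classified by comparison against previously analyzed expressions) can be sketched as a toy nearest-neighbor classifier. This is purely illustrative, not Affectiva’s code: the feature layout, the numeric values, and the labels are all invented for the example.

```python
import math

# Hypothetical feature vector layout (invented for illustration):
# [eye spacing ratio, brow height ratio, mouth width ratio,
#  wrinkle density around the eyes, brow furrow score]
LABELED_EXAMPLES = {
    "smile":  [0.42, 0.31, 0.88, 0.15, 0.05],
    "smirk":  [0.42, 0.30, 0.61, 0.12, 0.22],
    "furrow": [0.41, 0.22, 0.45, 0.08, 0.70],
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(features):
    """Label a face by its nearest previously analyzed expression."""
    return min(LABELED_EXAMPLES,
               key=lambda label: distance(features, LABELED_EXAMPLES[label]))

# A new frame whose measurements sit closest to the "smirk" example:
print(classify([0.42, 0.29, 0.60, 0.12, 0.20]))  # smirk
```

A production system like Affdex reportedly compares each expression against millions of previously analyzed examples rather than three hand-picked vectors, but the basic shape of the comparison is the same.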


Picard admits that some of the requests Affectiva received from corporations seemed unethical: “We had people come and say, ‘Can you spy on our employees without them knowing?’ or ‘Can you tell me how my customers are feeling?’ and I was like, ‘Well, here is why that is a bad idea.’ I can remember one wanted to put our stuff in these terminals and measure people, and we just went back to Affectiva and shook our heads. We told them, ‘We will not be a part of that – we have respect for the participant.’ But it’s tough when you are a little startup, and someone is willing to pay you, and you have to tell them to go away.” Picard eventually left Affectiva as the company’s focus shifted from the medical to the corporate space.


Kaliouby and her team demonstrated that, in the age of big data, “even emotions could be quantified, aggregated, leveraged.” To date the company has “analyzed more than two million videos, of respondents in eighty countries.” Given this wealth of data, Affdex is now sophisticated enough to “read nuances of smiles better than most people can.” Kaliouby can even imagine cookies that, once installed on a computer, turn on the laptop camera as soon as the user watches a YouTube video and analyze his or her emotional response in real time.


Regulation is lagging. “In 2013, Representative Mike Capuano of Massachusetts drafted the We Are Watching You Act, to compel companies to indicate when sensing begins, and to give consumers the right to disable it.” However, Capuano was unable to garner enough support for the bill once industry began lobbying against it. Meanwhile, more and more companies are recognizing the financial potential of the Emotion Economy.


The technology described in the article raises intriguing questions about the nature of electronically transmitted information and the third-party doctrine. What category of information does emotional communication fit into? At the beginning of the article, the author suggests that “by some estimates we transmit more data with our expressions than with what we say.” Could emotional communication be classified as metadata? If so, this would have troubling consequences for the privacy of our emotions, since metadata is the kind of information that is least protected under current law. Even though Kaliouby and her colleagues assert that they turned away government inquiries about the technology, it seems likely that national security agencies are already in the process of developing their own. What if emotion-sensing technology were added to CCTV cameras?


Moreover, if customers voluntarily allow third parties to collect information about their emotional communication, the government could easily gain access to that information by means of a subpoena. One could even imagine a time in which national security agencies collect emotional information on a grand scale and use it for predictive policing. For instance, an intelligence agency could determine that, based on an analysis of millions of emotional responses, a certain group of people is more likely to respond to certain information in a certain way. Everyone who reacts in a similar way would then be considered part of that group, and potentially threatening. In the age of big data, correlation trumps causation. Perhaps this scenario seems farfetched. But as Representative Capuano points out, “The most difficult part is getting people to realize that this is real. People were saying, ‘Come on. What are you, crazy, Capuano? What, do you have tinfoil wrapped around your head?’ And I was like, ‘Well, no. But if I did, it’s still real.’”