Computers will detect our emotions and share that information

As early as this summer, there may be movies that adjust their plot depending on what emotions are detected on your face. The Wired article that reports this also suggests there will be issues of trust. Its example was that the software may notice you are sad most of the time and sell that data to a pharmaceutical company.

The Affectiva software is based on MIT research into how computers can be made more emotionally intelligent, especially how they can respond to a person’s frustration in a way that reduces negative feelings. The research dates back to the late 1990s, and the software now uses data from 3.4 million faces in 75 countries. Part of the research also involves studies examining ethical issues in affective computing.

The software is being piloted to gauge reactions to products, politics and entertainment. Affectiva is also looking at adding emoticons to chat software and at reminders in cars (or elsewhere) when you start to become inattentive. The worst example I saw was that your smart fridge might lock away the ice cream if you look upset (I like my ice cream). The best example was that the software may notice you are sad most of the time and could call your Mom for you.

But that is also the example that could lead to someone selling that “sad” data to a pharmaceutical company. One publication worries that by the time we understand the implications of this technology, it will be too late to do anything about it. It reports:

The Ethical Issues of Emerging ICT Applications (ETICA), a research project funded by the European Commission which ran from 2009 to 2011, found a variety of ethical considerations linked to affective computing, spanning from interpretation errors (e.g. a person being stopped in an airport on suspicion of planning an attack, when in fact they are just a very nervous flyer) to privacy concerns, to “a range of new dangers concerning personal integrity” as a result of affective computing’s ability to persuade and manipulate.

It is not necessarily wrong to persuade people or even to manipulate them. If the computer can see I am frustrated and can do something about it so that I am happy, I don’t mind being manipulated. The unethical part would be if I did not know it was being done. I must always be told that the software is scanning my face for emotions. We deal with salespeople every day who watch our emotions as they try to persuade us, so we can deal with a computer doing the same. If we know about it. At least for a while this will be new technology that people may not even be aware of, so when we implement software like this we must be very careful that everyone knows what is going on.

And the ETICA example of a nervous person being stopped at the airport highlights another area where we must be very careful: the integrity of this data will always be in question. People cannot always tell the emotions of other people, and a computer will not always be able to tell what people are feeling. No action should be taken on a facial expression without confirming with the person involved.

Isn’t that what you would do in a conversation? You might see a smile, but you would not report to someone else “Julie is happy about…” without asking Julie why she is smiling. Good ethics from situations we already know.
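
As a sketch of what those two principles (always disclose the scanning, always confirm before acting) might look like in practice, here is a minimal Python example. It is hypothetical: detect_emotion and the yes/no ask callback stand in for whatever camera-analysis call and consent prompt a real application would use, and nothing here is Affectiva’s actual API.

```python
# Hypothetical sketch of the two rules argued for above: tell people their face
# is being scanned, and confirm a guessed emotion with them before acting on it.
# detect_emotion() is a placeholder, not Affectiva's (or any vendor's) real API.

from typing import Callable, Optional


def detect_emotion(frame) -> str:
    """Placeholder classifier: a real system would analyse a camera frame here."""
    return "sad"  # pretend the software thinks the viewer looks sad


def respond_to_viewer(frame, ask: Callable[[str], bool]) -> Optional[str]:
    """ask(question) -> bool lets the person answer yes or no, e.g. via a dialog box."""
    # Rule 1: the person must always be told the software is scanning their face.
    if not ask("This app reads your facial expressions to guess your mood. Allow it?"):
        return None

    guess = detect_emotion(frame)

    # Rule 2: take no action on a facial expression without confirming with the person.
    if ask(f"You seem {guess}. Did we read that right?"):
        return guess  # only now is it reasonable to adjust the plot, call Mom, etc.
    return None       # a wrong guess is discarded, not stored or shared


if __name__ == "__main__":
    def answer_yes_no(question: str) -> bool:
        return input(question + " [y/n] ").strip().lower() == "y"

    print(respond_to_viewer(frame=None, ask=answer_yes_no))
```

Passing the yes/no prompt in as a callback keeps the disclosure and confirmation steps explicit, whatever the real user interface looks like.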

Donna Lindskog, http://www.cips.ca
Donna Lindskog is an Information Systems Professional (retired) and has her Master’s degree in Computer Science from the University of Regina. She has worked in the IT industry since 1978, most of those years at SaskTel, where she progressed from Programmer to Business Analyst to Manager. At one point she had over 48 IT positions reporting to her, and she has experience outside of IT managing engineers. As a Relationship Manager, Donna worked with executives to define IT principles so that departmental roles were clear. As the Resource Manager in the Corporate Program/Project Management Office, she introduced processes to secure resources for corporate priorities. In 2003 she was given the YWCA Woman of Distinction Award in Technology.
