Real-Time Emotion Tracking By Webcam

The European Commission is giving financial backing to a company that claims its technology can read your emotional state just by having you look into a webcam.  There is some sceptical reporting of this story here.

Highlights:
"Realeyes Braun'sche Röhre is a London based start-up company that tracks people's facial Braun'sche Röhre reactions through webcams and smartphones in order to analyse their Braun'sche Röhre emotions. ...
Realeyes has just received a 3,6 million euro funding Braun'sche Röhre from the European Commission to further develop emotion measurement Braun'sche Röhre technology. ...

The technology is based on six basic emotional states that, according to the research of Dr Paul Ekman, a research psychologist, are universal across cultures, ages and geographic locations. The automated facial coding platform records and then analyses these universal emotions: happiness, surprise, fear, sadness, disgust and confusion. ...
 [T]his technological development could be a very powerful tool not only for advertising agencies, but also for improving classroom learning, increasing drivers' safety, or to be used as a type of lie detector test by the police."

Of course, this is utterly stupid.  For one thing, it treats emotions as if they are truly tangible things that everyone agrees upon, whereas emotions research is a messy field full of competing theories and models.  I don't know what Ekman's research says, or what predictions it makes, but if it really suggests that one can reduce everything a person is feeling at any given moment to one of six (or nine, or twelve) choices on a scale, then I don't think I live in that world (and I certainly don't want to). For another, without some form of baseline record of a person's face, it's going to be close to impossible to tell what distortions are being heaped on top of it by emotions.  Think of people you know whose "neutral" expression is basically a smile, and others who walk around with a permanent scowl on their faces.
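To make that first objection concrete, here is a minimal, purely illustrative sketch (in Python).  It is not Realeyes's actual pipeline; every label, score and function in it is invented for the example.  It only shows what collapsing a face into six scores plus a forced-choice label looks like, and how a resting scowl can dominate the verdict when no per-person baseline is available.

    # Illustrative only: NOT Realeyes's method. All numbers are made up.
    EKMAN_LABELS = ["happiness", "surprise", "fear", "sadness", "disgust", "confusion"]

    def classify(scores):
        """Collapse six per-emotion scores into a single label."""
        best = max(range(len(scores)), key=lambda i: scores[i])
        return EKMAN_LABELS[best]

    # Hypothetical scores for someone whose "neutral" face already reads as a scowl.
    resting_face = [0.10, 0.05, 0.05, 0.30, 0.35, 0.15]
    # The same person, mildly pleased about something.
    current_face = [0.17, 0.08, 0.05, 0.28, 0.32, 0.10]

    print(classify(current_face))   # "disgust" -- the resting face wins

    # Subtracting a per-person baseline tells a different story.
    residual = [c - r for c, r in zip(current_face, resting_face)]
    print(classify(residual))       # "happiness" -- the change from baseline

Whether the real system attempts anything like baseline subtraction I don't know; the point is only that a single forced-choice label throws that distinction away.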

Now, I don't really care much if this kind of thing is sold to gullible "brand-led" companies who are told that it will help them sell more upmarket branded crap to people.  If those companies want to waste their marketing and advertising dollars, they're welcome.  (After all, many of them are currently spraying those same dollars more or less uselessly in advertising on Twitter and Facebook.)  But I do care when public money is involved, or public policy is likely to be influenced.

Actually, it seems to me that the major problem here is not, as some seem to think, the "big brother" implications of technology actually telling purveyors of high-end perfumes or watches, or the authorities, how we're really feeling, although of course that would be intensely problematic in its own right.  A far bigger problem is how to deal with all of the false positives, because this stuff just won't work - whatever "work" might even mean in this context.  At least if a "traditional" (i.e., post-2011 or so) camera wrongly claims to have located you in a given place at a given time, it's plausible that you might be able to produce an alibi (for example, another facial recognition camera placing you in another city at exactly the same time, ha ha).  But when an "Emocam" says that you're looking fearful as you, say, enter the airport terminal, and therefore you must be planning to blow yourself up, there is literally nothing you can do to prove the contrary.  Ekman's "perfect" research, combined with XYZ defence contractor's "infallible" software, has spoken.
  • You are fearful.  What are you about to do?  Maybe we'd better shoot you before you deploy that suicide vest.
  • The computer says you are disgusted.  I am a member of a different ethnic group.  Are you disgusted at me?  Are you some kind of racist?
  • Welcome to this job interview.  Hmm, the computer says you are confused.  We don't want confused people working for us.
So now we're all going to have to learn another new skill: faking our emotions so as to fool the computer.  Not because we want to be deceptive, but because it will be messing with our lives on the basis of mistakes that, almost by definition, nobody is capable of correcting.  ("Well, Mr. Brown, you may be feeling happy now, but seventeen minutes ago, you were definitely surprised. We've had this computer here for three years now, and I've never seen it make a wrong judgement.")  I suspect that this is going to be possible although moderately difficult, which will just give an advantage to the truly determined (such as the kind of people that the police might be hoping to catch with their new "type of lie detector").
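The scale of those uncorrectable mistakes is easy to underestimate.  Here is a rough back-of-the-envelope sketch, with deliberately generous and entirely invented numbers, of what flagging "fearful" passengers at an airport would actually produce:

    # Base-rate arithmetic with invented, generous numbers.
    passengers_per_day = 100_000   # assumed throughput of one airport
    actual_threats = 1             # wildly pessimistic assumption
    hit_rate = 0.99                # assume the system flags 99% of real threats
    false_positive_rate = 0.01     # assume it wrongly flags only 1% of everyone else

    true_alarms = actual_threats * hit_rate
    false_alarms = (passengers_per_day - actual_threats) * false_positive_rate
    flagged = true_alarms + false_alarms

    print(f"Passengers flagged per day: {flagged:.0f}")
    print(f"Chance a flagged passenger is a real threat: {true_alarms / flagged:.2%}")
    # About 1,000 people flagged every day, and roughly 0.1% of them are actual threats.

Even granting the technology an accuracy it will never have, virtually everyone it points at will be innocent.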

In a previous life, though not documented on this blog, I was a "computer guy".  In a blog post from that previous life, I recommended the remarkable book, "Digital Woes: Why We Should Not Depend on Software" by Lauren Ruth Wiener.  Everything that is wrong with this "emotion tracking" project is covered in that book, despite its publication date of 1993 and the fact that, as far as I have been able to determine, the word "Internet" doesn't appear anywhere in it.  I strongly recommend it to anyone who is concerned about the degree to which not only politicians, but also other decision-makers, including those in private-sector organisations, so readily fall prey to the "shiny infallible machine" narrative of the peddlers of imperfect technology.
