When people use computers, they have to adjust to the constraints of the system. Given the general computer literacy of younger generations, this adjustment is now largely an unconscious act – you do what the computer wants you to do, so the computer can then do what you want it to do.
However, developer Marc Schroeder, working with the World Wide Web Consortium (W3C) – the body that standardizes many web technologies and languages – has started work on EmotionML: a markup language that lets a computer adjust to the user instead, by reading and interpreting the user’s emotions through a camera or sensor.
From the article:
The idea is called affective computing in academic circles, and if it catches on, computer interactions could be very different. Avatar faces could show their human master’s expression during computer chats. Games could adjust play intensity according to the player’s reactions. Customer service representatives could be alerted when customers are really angry. Computers could respond to your expressions as people do. Computer help technology like Microsoft’s Clippy or a robot waiter could discern when to make themselves scarce.
“Rather than having to click the ‘no’ button on some touch screen, I would rather shake my head,” Schroeder said. “Without having to consciously decide to do so, I will show a puzzled and confused facial expression, and a human would know that I need advice and guidance.” Computers could adapt to this human communication style, he said.
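As a rough sketch of how this might work in practice, the snippet below builds a small EmotionML-style annotation for the kind of puzzled expression Schroeder describes. The element and attribute names follow the W3C EmotionML draft, but the category name and confidence value here are illustrative assumptions, not output from any real emotion-detection sensor.

```python
# Sketch: emitting an EmotionML-style annotation for a detected expression.
# Element/attribute names follow the W3C EmotionML draft; the category and
# confidence value are made-up examples, not real sensor output.
import xml.etree.ElementTree as ET

EMOTIONML_NS = "http://www.w3.org/2009/10/emotionml"

def annotate_emotion(category: str, confidence: float) -> str:
    """Build an <emotionml> document marking up one detected emotion."""
    root = ET.Element("emotionml", {"xmlns": EMOTIONML_NS, "version": "1.0"})
    emotion = ET.SubElement(root, "emotion")
    # EmotionML descriptors can carry a confidence score between 0 and 1,
    # reflecting how sure the recognizer is about the detected emotion.
    ET.SubElement(emotion, "category",
                  {"name": category, "confidence": f"{confidence:.2f}"})
    return ET.tostring(root, encoding="unicode")

# A camera-based recognizer that spots a confused face might report:
xml_doc = annotate_emotion("confusion", 0.85)
print(xml_doc)
```

A help system receiving such an annotation could then decide, for example, to offer guidance when the confidence for "confusion" crosses a threshold, rather than waiting for the user to click a button.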
This development could improve the entire user experience, and even pave the way for the as-yet-elusive all-knowing ‘robot butler’ that is a popular trope in science fiction movies and TV shows!