Scientists develop computer that can distinguish human emotions

An "emotionally aware" computer system designed to read people's minds by analysing expressions will be featured at a major London exhibition.

Visitors to the Royal Society Summer Science Exhibition are being invited to help "train" the computer to recognise joy, anger and other expressions.

Its designers say there are potential commercial uses, such as picking the right time to sell someone something. But it may also help improve driver safety and help people with autism. The computer, which is connected to a camera, locates and tracks 24 facial "feature points" such as the edge of the nose, the eyebrows and the corners of the mouth.

A total of 20 key facial movements - including a nod or shake of the head, a raise of the eyebrow or a pull on the corner of the mouth - have been identified, reports BBC News.
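The article does not publish the Cambridge system's code, but the idea of tracking feature points between video frames and mapping their motion to named facial movements can be illustrated with a minimal sketch. The point names, thresholds and rules below are hypothetical stand-ins, not the real tracker:

```python
# Illustrative sketch only: point names, thresholds and rules are
# hypothetical, not the actual Cambridge/MIT system.

def detect_movements(prev, curr, threshold=2.0):
    """Compare two frames of named facial feature points, given as
    (x, y) pixel coordinates, and report simple movements."""
    movements = []
    # Eyebrow raise: both eyebrow points move up, i.e. their y
    # coordinate decreases (image y grows downwards).
    if (prev["left_eyebrow"][1] - curr["left_eyebrow"][1] > threshold and
            prev["right_eyebrow"][1] - curr["right_eyebrow"][1] > threshold):
        movements.append("eyebrow_raise")
    # Mouth-corner pull: the corners move apart horizontally,
    # widening the mouth (a smile cue).
    prev_width = prev["mouth_right"][0] - prev["mouth_left"][0]
    curr_width = curr["mouth_right"][0] - curr["mouth_left"][0]
    if curr_width - prev_width > threshold:
        movements.append("mouth_corner_pull")
    return movements

# Two consecutive frames of tracked points (pixel coordinates).
frame1 = {"left_eyebrow": (40, 50), "right_eyebrow": (80, 50),
          "mouth_left": (45, 90), "mouth_right": (75, 90)}
frame2 = {"left_eyebrow": (40, 45), "right_eyebrow": (80, 45),
          "mouth_left": (42, 90), "mouth_right": (78, 90)}
print(detect_movements(frame1, frame2))
# -> ['eyebrow_raise', 'mouth_corner_pull']
```

A real system would track two dozen points per frame and feed sequences of such movements into a statistical classifier of mental states rather than hand-written rules.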

According to Reuters, the scientists, who are developing the technology in collaboration with researchers at the Massachusetts Institute of Technology (MIT) in the United States, also hope to get it to accept other inputs such as posture and gesture.

"The system we have developed allows a wide range of mental states to be identified just by pointing a video camera at someone," said Professor Peter Robinson, of the University of Cambridge in England.

He and his collaborators believe the mind-reading computer's applications could range from improving people's driving skills to helping companies tailor advertising to people's moods.

"Imagine a computer that could pick the right emotional moment to try to sell you something, a future where mobile phones, cars and Web sites could read our mind and react to our moods," he added.

Prof Robinson said: "The system can cope with the variation in people's facial composition, for example if you have a round or thin face or if you wear glasses or have a beard.

"However, there are small variations in the way people express the same emotion. My colleagues working at the Massachusetts Institute of Technology are fine-tuning the system by testing it with real people's reactions to everyday life using cameras attached to neck-braces."

Other applications include improving drivers' safety and comfort. The developers are recording the moods and facial expressions of test subjects and monitoring them to identify more complex expressions linked to confusion, boredom or tiredness.

"We are working with a major car company and it is possible that this technology could feature in cars within five years," Prof Robinson said.

He added that the computer could also be useful in online teaching, assessing how well a student has understood what has been explained by analysing his or her facial expressions, reports Playfuls.



Author: Editorial Team