Thursday

Now, a computer to lip-read and decode emotions


A computer is being taught to interpret human emotions based on lip-reading, a development that could improve our interaction with machines and perhaps allow disabled people to use voice synthesizers more effectively and efficiently.

Karthigayan Muthukaruppan of Manipal International University in Selangor, Malaysia, and co-workers have developed a system that uses a genetic algorithm, which improves with each iteration, to fit irregular-ellipse equations to the shape of the human mouth as it displays different emotions.

They used photos of individuals from South-East Asia and Japan to train a computer to recognize the six commonly accepted human emotions - happiness, sadness, fear, anger, disgust and surprise - as well as a neutral expression.
Machines are being trained to read lips and emotions
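The article does not give the researchers' actual equations, but the idea of a genetic algorithm fitting an ellipse to a lip outline can be illustrated with a minimal sketch. The contour points, parameter ranges and GA settings below are assumptions for illustration only, not the published method.

```python
# A minimal sketch (not the authors' actual method) of fitting an ellipse to
# lip-contour points with a simple genetic algorithm. The contour points,
# parameter ranges and GA settings are illustrative assumptions.
import random
import math

def ellipse_error(params, points):
    """Sum of |((x-cx)/a)^2 + ((y-cy)/b)^2 - 1| over the contour points."""
    cx, cy, a, b = params
    return sum(abs(((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 - 1) for x, y in points)

def fit_ellipse_ga(points, generations=200, pop_size=60, mutation=0.1):
    # Random initial population of (cx, cy, a, b) candidates.
    pop = [[random.uniform(-1, 1), random.uniform(-1, 1),
            random.uniform(0.1, 2.0), random.uniform(0.1, 2.0)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: ellipse_error(p, points))
        survivors = pop[:pop_size // 2]                               # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop_size:
            mom, dad = random.sample(survivors, 2)
            child = [random.choice(pair) for pair in zip(mom, dad)]   # crossover
            child = [g + random.gauss(0, mutation) for g in child]    # mutation
            child[2] = max(child[2], 1e-3)                            # keep axes positive
            child[3] = max(child[3], 1e-3)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda p: ellipse_error(p, points))

if __name__ == "__main__":
    # Synthetic "upper lip" contour: half of an ellipse with a=0.8, b=0.3.
    target = [(0.8 * math.cos(t), 0.3 * math.sin(t))
              for t in [i * math.pi / 20 for i in range(21)]]
    print(fit_ellipse_ga(target))
```

Each generation keeps the best-fitting half of the candidate ellipses and breeds the rest by crossover and mutation, so the fit "gets better and better with each use", which is the behaviour the article attributes to the system.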


The algorithm analyzes the upper and lower lips as two separate ellipses, the International Journal of Artificial Intelligence and Soft Computing reports.

"In recent years, there has been a growing interest in improving all aspects of interaction between humans and computers, especially in the area of human emotion recognition by observing facial expression," the team explained, according to a statement from Manipal University.

Earlier researchers had developed an understanding that allows emotion to be recreated by manipulating a representation of the human face on a computer screen. However, the lips remain a crucial part of the outward expression of emotion, and the team's algorithm can successfully classify the six emotions and the neutral expression described above.

The researchers suggest that an initial application of such an emotion detector could be to help patients who lack speech interact more effectively with computer-based communication devices.
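How the fitted lip-ellipse parameters are turned into one of the seven classes is not described in the article; a heavily simplified stand-in is a nearest-centroid rule over the two lips' axis lengths. The feature layout and centroid values below are invented purely for illustration.

```python
# A hedged sketch of the classification stage: assumed features are the
# upper- and lower-lip ellipse axis lengths, and each emotion is represented
# by a hypothetical centroid. The numbers are invented, not from the paper.
from math import dist

# Hypothetical per-class centroids: (upper_a, upper_b, lower_a, lower_b).
CENTROIDS = {
    "happiness": (0.9, 0.2, 0.9, 0.3),
    "sadness":   (0.7, 0.1, 0.7, 0.2),
    "fear":      (0.6, 0.3, 0.6, 0.4),
    "anger":     (0.8, 0.1, 0.8, 0.1),
    "disgust":   (0.7, 0.2, 0.7, 0.3),
    "surprise":  (0.6, 0.4, 0.6, 0.5),
    "neutral":   (0.8, 0.2, 0.8, 0.2),
}

def classify(features):
    """Return the class whose centroid is closest to the feature vector."""
    return min(CENTROIDS, key=lambda label: dist(CENTROIDS[label], features))

print(classify((0.88, 0.21, 0.91, 0.29)))  # -> "happiness" for this toy input
```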
