Google’s smart glasses can do many things, but at the moment they can’t tell you how the people around you are feeling. That could be set to change: a Google Glass emotion reader has entered private beta testing, powered by an app from Emotient.
This partnership between Google and Emotient will see facial recognition technology implemented in Google Glass. The feature will scan the target’s face for traces of emotion and relay the findings back to the wearer. The reader will go beyond basic happy-or-sad readings, instead covering the primary emotions in more depth: surprise, disgust, anger, fear, contempt, sadness and joy.
Emotient will even attempt to pick up more advanced emotions such as frustration and confusion. All of the readings that Google Glass picks up will be sent back to the app, which compiles the data for users to access via smartphone or tablet. Given that Google Glass itself is still in its trial period and has not been fully released, it is not certain that this emotion-reading feature will make it into the final product.
You can find plenty more details on this beta project by visiting The Next Web, which saw a first-hand demo of how the process works. Whilst this technology is clever, is it really needed, and can a camera and computer software truly read a person’s facial expressions accurately? Let us know your thoughts in the comments section below.
Also See: Amazon (Fire) Google Glass rival likely