The Emotion API takes an image as input and returns, for each face in the image, the confidence across a set of emotions, as well as a bounding box for the face supplied by the Face API.
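A minimal sketch of what calling that API and reading its response might look like. The endpoint URL, the `Ocp-Apim-Subscription-Key` header, and the `faceRectangle`/`scores` response shape below follow the public Cognitive Services documentation, but treat them as assumptions; the service has since been folded into the Face API.

```python
import json
import urllib.request

# Assumed endpoint from the Cognitive Services docs (region may vary).
ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"

def recognize_emotions(image_url: str, api_key: str) -> list:
    """POST an image URL to the Emotion API and return the parsed face list."""
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps({"url": image_url}).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Ocp-Apim-Subscription-Key": api_key,  # your subscription key
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def top_emotion(face: dict) -> str:
    """Return the highest-confidence emotion label for one face entry."""
    return max(face["scores"], key=face["scores"].get)

# Documented response shape: one entry per detected face, with a bounding
# box and a confidence score per emotion (values here are illustrative).
sample_response = [
    {
        "faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 64},
        "scores": {
            "anger": 0.01, "contempt": 0.01, "disgust": 0.01, "fear": 0.01,
            "happiness": 0.90, "neutral": 0.04, "sadness": 0.01, "surprise": 0.01,
        },
    }
]

for face in sample_response:
    box = face["faceRectangle"]
    print(f"face at ({box['left']}, {box['top']}): {top_emotion(face)}")
```

The network call is shown for completeness; the parsing helper works on any response with the documented shape.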
Science professor Wei Xiaoyong developed the new “face reader” to identify emotions that suggest whether students are bored or stimulated. His technique produces a “curve” for each student showing how “happy” or “neutral” they are, and that data can indicate whether they are bored, he said. “When we correlate that kind of information to the way we teach, and we use a timeline, then you will know where you are actually attracting the students’ attention,” Professor Wei told The Telegraph.
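The per-student "curve" idea above could be sketched as follows. This is a hypothetical illustration, not Professor Wei's actual method: it assumes frame-level happy/neutral confidences as input and uses a simple moving average (window size chosen arbitrarily) so that low regions of the curve flag possible boredom.

```python
from statistics import mean

def engagement_curve(frames, window=3):
    """frames: list of (happy, neutral) confidence pairs, one per time step.
    Returns a moving average of (happy - neutral); lower values suggest
    the student has drifted toward a neutral, possibly bored, expression."""
    signal = [happy - neutral for happy, neutral in frames]
    return [mean(signal[i:i + window]) for i in range(len(signal) - window + 1)]

# Illustrative timeline: the student starts attentive, then drifts neutral.
frames = [(0.8, 0.1), (0.7, 0.2), (0.6, 0.3), (0.3, 0.6), (0.2, 0.7)]
curve = engagement_curve(frames)
print(curve)  # values trend downward as "neutral" overtakes "happy"
```

Plotted against the lesson timeline, such a curve would show where attention was held and where it was lost, which is the correlation Wei describes.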
Hoping to raise concerns about the potential misuses of FindFace, Tsvetkov seems to have inspired a particularly nasty effort to identify and harass Russian women who appear in pornography.
This is the faceshift GDC 2015 reel, featuring our performance-recording session at OMUK, our latest range-of-motion demos, and generally a bunch of people having loads of fun with real-time mocap.
An experimental algorithm out of Facebook’s artificial intelligence lab can recognise people in photographs even when it can’t see their faces. Instead, it looks for other distinctive characteristics such as hairdo, clothing, body shape and pose.
At a base minimum, people should be able to walk down a public street without fear that companies they’ve never heard of are tracking their every movement, and identifying them by name, using facial recognition technology.