
Designing Frameworks for Automatic Affect Prediction and Classification in Dimensional


A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. To realise this prediction, next-generation computing should develop anticipatory user interfaces that are human-centred, built for humans, and based on naturally occurring multimodal human behaviour such as affective and social signalling. Facial behaviour is our preeminent means of communicating affective and social signals. This talk discusses a number of components of human facial behaviour, how they can be automatically sensed and analysed by computer, the past research in the field conducted by the iBUG group at Imperial College London, and how far we are from enabling computers to understand human facial behaviour.
Attribution: The Open Education Consortium
http://www.ocwconsortium.org/courses/view/26a1cb36d466533eeae534bae80b12d7/
Course Home http://videolectures.net/gesturerecognition2011_pantic_designing/