This was a presentation by researchers at the University of Memphis who mined a variety of data about user behavior to detect emotions while individuals are engaged in online learning or assessments. What I found really interesting about their approach is that it is multimodal: it takes into account facial features, body posture, and even physiological signals (e.g., heart rate). They combine these data streams and use machine learning techniques to predict different emotions more accurately than any single data source would allow.
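To make the idea concrete, here is a minimal sketch of feature-level fusion, one common way to combine modalities like these. It is purely illustrative: the feature dimensions, the emotion labels, and the random-forest classifier are my assumptions, not the authors' actual pipeline, and the data here is random stand-in noise rather than real sensor features.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Assume per-session feature vectors have already been extracted from
# each modality (hypothetical dimensions, not the authors' pipeline).
n_samples = 200
rng = np.random.default_rng(0)
facial = rng.normal(size=(n_samples, 10))   # e.g., facial-expression features
posture = rng.normal(size=(n_samples, 6))   # e.g., body-posture features
physio = rng.normal(size=(n_samples, 4))    # e.g., heart-rate statistics

# Feature-level fusion: concatenate all modalities into one vector per sample.
X = np.hstack([facial, posture, physio])

# Illustrative labels: four affective states often studied in learning contexts.
y = rng.integers(0, 4, size=n_samples)

# Train and evaluate a single classifier on the fused feature vectors.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean CV accuracy: {scores.mean():.2f}")
```

An alternative design is late fusion, where each modality gets its own classifier and the predictions are merged afterward; which works better depends on how correlated the modalities are.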
Here is a link to a related paper.
And here is a link to their project page.