Human Development Colloquium on Learning Analytics
I attended a colloquium today at the Human Development Department of TC, where Prof. Alyssa Friend Wise from NYU presented her research on using learning analytics for online discussions in MOOCs. Her research is highly relevant to Vialogues because we have been researching how to evaluate the productivity of video discussions in Vialogues. Here I'm sharing some notes and thoughts from her colloquium.
- She looked at five types of data: demographic, performance, activities, artifacts, and contexts.
- She cited a study by Sinha et al. (2014) on "fuzzy pattern matching" of MOOC video-watching clickstream data. It is very interesting and informative for analyzing the video-watching activity data we collect in Vialogues.
- Sinha, T., Jermann, P., Li, N., & Dillenbourg, P. (2014). Your click decides your fate: Inferring information processing and attrition behavior from MOOC video clickstream interactions.
- What's especially interesting is that this paper differentiates video events of the same type by relating each event to its context. For instance, rather than a simple "seek" event, the paper uses "seek ahead" and "seek back" events. The benefit of making this differentiation is that we can better infer the intention behind the learner's behavior: "seek back" is more likely to happen when the learner wants to understand a segment more thoroughly, whereas "seek ahead" is more likely a search for interesting segments. She clustered these differentiated, higher-order events rather than the raw events. I believe this approach is more accurate.
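The differentiation idea can be sketched in a few lines. This is my own minimal sketch, not the paper's implementation, and it assumes a hypothetical event format where each raw event is a dict with a `type` plus `from`/`to` playback positions in seconds (not Vialogues' real schema):

```python
def differentiate_seeks(events):
    """Relabel raw "seek" events by direction so the learner's likely
    intent (re-watching vs. skimming ahead) can be inferred downstream."""
    refined = []
    for e in events:
        if e["type"] == "seek":
            # Jumping backward suggests re-watching; forward suggests skimming.
            label = "seek_back" if e["to"] < e["from"] else "seek_ahead"
            refined.append({**e, "type": label})
        else:
            refined.append(e)
    return refined

raw = [
    {"type": "play", "from": 0, "to": 0},
    {"type": "seek", "from": 120, "to": 45},   # jump back to an earlier segment
    {"type": "seek", "from": 45, "to": 300},   # jump ahead past unseen content
]
print([e["type"] for e in differentiate_seeks(raw)])
# -> ['play', 'seek_back', 'seek_ahead']
```

The clustering step would then run over these higher-order labels instead of the undifferentiated "seek" events.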
- She cited a series of her own papers on clustering user activity events in online discussions. Her clustering pays attention to four dimensions of the discussions: breadth, depth, temporal continuity (whether discussants engage regularly or intermittently), and revisitation.
- One of the papers is Wise, A. F., & Chiu, M. M. (2014). The impact of rotating summarizing roles in online discussions: Effects on learners’ listening behaviors during and subsequent to role assignment. Computers in Human Behavior, 38, 261-271.
- I can relate the four dimensions to our attempts to establish a "good model" of video discussion. Productive discussions are likely to stand out in all four dimensions rather than just a few. If some dimensions are lacking, the instructor or the software should prompt the students to pay attention to them.
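To make the four dimensions concrete, here is a toy sketch of how one learner's discussion activity could be profiled along them. The log format (a list of `(post_id, day)` tuples, one per post view) and the specific operationalizations are my own assumptions for illustration, not Prof. Wise's actual measures:

```python
from collections import Counter

def discussion_profile(views):
    """Summarize one learner's activity along four dimensions:
    breadth       - how many distinct posts were opened
    depth         - average opens per post (proxy for thorough reading)
    continuity    - how many distinct days the learner was active
    revisitation  - share of opens that were re-reads of earlier posts
    `views` is a list of (post_id, day) tuples, one per post view."""
    counts = Counter(post for post, _ in views)
    breadth = len(counts)
    depth = len(views) / breadth if breadth else 0.0
    continuity = len({day for _, day in views})
    revisits = sum(c - 1 for c in counts.values())
    revisitation = revisits / len(views) if views else 0.0
    return {"breadth": breadth, "depth": depth,
            "continuity": continuity, "revisitation": revisitation}

log = [("p1", 1), ("p2", 1), ("p1", 3), ("p3", 5), ("p1", 5)]
print(discussion_profile(log))
```

A "productive" profile in the sense above would score reasonably on all four keys; a learner strong on breadth but with near-zero revisitation and continuity would be a candidate for a prompt.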
- When she exposed the learners to the analytics data, the learners became more aware of their learning performance and increased their participation in the discussions.
- She also conducted social network analysis and found that instructor participation in the discussions is highly correlated with student activity. For example, she found that the social network has a bigger size and higher connectedness if the instructor responds at all levels of a thread instead of only to the thread starter, supports the students' thinking process instead of giving straightforward answers to their questions, and demonstrates higher social presence in the course.
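The two network properties she mentioned, size and connectedness, are easy to compute from a reply log. Below is an illustrative sketch (her analysis surely used richer SNA metrics), where each edge is a hypothetical `(replier, repliee)` pair and connectedness is measured as density of the undirected reply network:

```python
def network_stats(edges):
    """Return (size, density) of an undirected reply network.
    size    - number of distinct participants
    density - fraction of possible participant pairs connected by a reply."""
    nodes = {n for e in edges for n in e}
    pairs = {frozenset(e) for e in edges if e[0] != e[1]}
    n = len(nodes)
    density = len(pairs) / (n * (n - 1) / 2) if n > 1 else 0.0
    return n, density

# Instructor "T" replying only to the thread starter vs. at all levels:
starter_only = [("s1", "T"), ("s2", "T"), ("s3", "s1")]
all_levels = starter_only + [("T", "s3"), ("T", "s2")]
print(network_stats(starter_only))
print(network_stats(all_levels))
```

With the made-up logs above, the all-levels instructor yields a denser network than the starter-only one, which matches the direction of her finding.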
- Prof. Wise also did something similar to my doctoral dissertation, i.e., using natural language processing to understand the quality of a discussion. Unfortunately, she only talked briefly about her NLP experiment, so I don't know the details. One interesting thing, though: unlike typical NLP practice, where stop words like "and", "to", and "the" are removed before analysis, she kept those stop words and said they were important for analyzing online discussions. I am particularly curious how she differentiated which stop words are useful in the corpus and which are not.
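One plausible reason stop words matter here is that function words can signal discourse moves that content words miss. The toy sketch below is my own illustration (not her method), using a hypothetical hand-picked set of reasoning markers that a stop-word filter would normally discard:

```python
import re

# Hypothetical marker set: connectives that often signal reasoning or
# disagreement in discussion posts, most of which are standard stop words.
REASONING_MARKERS = {"because", "so", "therefore", "but", "however", "if"}

def marker_rate(post):
    """Fraction of tokens in a post that are reasoning-marker stop words."""
    tokens = re.findall(r"[a-z']+", post.lower())
    hits = sum(1 for t in tokens if t in REASONING_MARKERS)
    return hits / len(tokens) if tokens else 0.0

shallow = "Great point, I totally agree with you."
elaborated = ("I agree, but only if we assume the data is complete, "
              "because otherwise the model fails.")
print(marker_rate(shallow))
print(marker_rate(elaborated))
```

The elaborated post scores higher than the shallow agreement, even though a bag-of-content-words view might rate them similarly, which is one way keeping stop words could help distinguish post quality.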