
Jul 06 2017 - 11:18am
Educational Data Mining Topics: Educational Resources and Affect and Engagement

Related topics:

Educational Resources

Effective framework for automatically generating and ranking topics in MOOC videos

Inferring Frequently Asked Questions from Student Question Answering Forums

Affect and Engagement

Learner Affect Through the Looking Glass: Characterization and Detection of Confusion in Online Courses

Generalizability of Face-Based Mind Wandering Detection Across Task Contexts (Best Paper Award)

Behavior-Based Latent Variable Model for Learner Engagement

Predicting Student Retention from Behavior in an Online Orientation Course

Affect and engagement was a topic I was not familiar with at all, and it was interesting to see the variety of research it covers. Similarly, the talks I attended under the educational resources topic focused on addressing moments of stress and confusion. Addressing a learner's affect is a challenge for online platforms given the lack of direct contact with real educators who could address issues of stress and confusion. In the world of MOOCs this issue is particularly important, since the lack of interaction or guidance may contribute to the overall high dropout rates. Many studies have shown that learner affect, engagement, and learning outcomes are all interconnected.

The talk on Effective framework for automatically generating and ranking topics in MOOC videos had the goal of providing guidance at an early stage of the learning process by highlighting the most important topics within a course's videos. The study used transcriptions of the video dialogue to generate the topics within each video using topic modelling techniques such as LDA. Given the topics, the researchers then applied ranking algorithms to assign an importance score to each one, so the generation of suggestions and guidance was completely automated. To measure the accuracy of the importance ranking, they compared it against the number of problems assigned per topic. Their combination of LDA and the PageRank algorithm allowed them to successfully surface the important topics of a course, providing students a sense of direction and a vision of the course.
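The paper's exact pipeline isn't reproduced here, but its overall shape, topic modelling followed by graph-based ranking, can be sketched. Everything below is an illustrative assumption on my part: the toy transcript segments, the two-topic setting, and the choice of topic co-occurrence as the edge weight for PageRank.

```python
# Hedged sketch (not the authors' code): LDA over video-transcript
# segments, then PageRank over a topic co-occurrence graph to rank
# topic importance.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy segments standing in for MOOC video dialogue transcripts.
segments = [
    "gradient descent minimizes the loss function",
    "the loss function measures prediction error",
    "matrix multiplication underlies neural networks",
    "neural networks learn by gradient descent",
]

X = CountVectorizer().fit_transform(segments)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)          # segment-by-topic weights

# Topics that co-occur in the same segments are linked; edge weight is
# the summed joint weight across segments (an assumed heuristic).
A = doc_topics.T @ doc_topics
np.fill_diagonal(A, 0.0)

def pagerank(A, d=0.85, iters=50):
    """Plain power-iteration PageRank on a weighted adjacency matrix."""
    n = A.shape[0]
    M = A / A.sum(axis=0, keepdims=True)   # column-stochastic transitions
    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r = (1 - d) / n + d * M @ r
    return r

scores = pagerank(A)
ranking = np.argsort(scores)[::-1]         # most important topic first
print(ranking, scores.sum())
```

The ranking could then be shown to learners as a suggested path through the course's material.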

In the EDM keynote, Jie Tang described how AI systems could benefit online educational platforms. Some MOOC websites track users' interactions with video lessons, such as the number of times a learner rewinds a video, the size of each jump, and the number of jumps. Based on this data, an intelligent system on the MOOC platform can recommend a "smart jump", an automated suggestion for video navigation. Data at the micro level of clicks and video interactions can provide far more insight into the user. Would such a system reduce frustration for you? Could we use similar methods to partition long videos in Vialogues into different topics and provide even more discrete recommendations to sections of video?
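The keynote did not share code, so this is purely a toy version of the idea: suggest a jump target by finding the video second that learners most often rewind to. Both the log data and the most-common-target heuristic are my own assumptions.

```python
# Hypothetical "smart jump" sketch: the suggested navigation target is
# the most frequently rewound-to second in a (made-up) interaction log.
from collections import Counter

rewind_targets = [42, 43, 42, 41, 42, 120, 43]  # seconds, illustrative
suggested = Counter(rewind_targets).most_common(1)[0][0]
print(suggested)
```

A real system would presumably aggregate across many learners and smooth over nearby timestamps, but the principle, mining micro-level interaction data for navigation hints, is the same.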

Many of the presenters repeated the claim that participation in MOOC forums is a good predictor of success in a course. In Inferring Frequently Asked Questions from Student Question Answering Forums, the researchers proposed that observing the interactions on question answering forums in MOOCs, or in other online learning environments like Piazza, provides an opportunity to better understand the students. They were able to identify similar questions with a hierarchical clustering method that looks at the content and semantic similarity of questions. If we were to solicit questions or statements about problems users encounter, such a method might help the EdLab platforms similarly identify common issues users run into with our tools.
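As a rough sketch of that kind of approach (not the authors' implementation), near-duplicate questions can be grouped by TF-IDF similarity and agglomerative clustering. The sample questions and the distance threshold below are illustrative assumptions:

```python
# Hedged sketch: grouping similar student questions with TF-IDF cosine
# distance and hierarchical (agglomerative) clustering.
from sklearn.feature_extraction.text import TfidfVectorizer
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

questions = [
    "how do I reset my password",
    "password reset is not working for me",
    "when is the final project due",
    "what is the deadline for the final project",
]

X = TfidfVectorizer().fit_transform(questions).toarray()
dist = pdist(X, metric="cosine")            # pairwise cosine distances
Z = linkage(dist, method="average")         # build the cluster hierarchy
labels = fcluster(Z, t=0.8, criterion="distance")  # cut at a threshold
print(labels)
```

Each resulting cluster is a candidate FAQ; the cluster's most representative question could be surfaced as its canonical form.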

In the same vein, Learner Affect Through the Looking Glass: Characterization and Detection of Confusion in Online Courses analyzed the text of posts and interactions on forums in order to detect moments of user confusion. The text analysis tells the researchers which aspects of a course may be confusing or frustrating, and also provides the information needed to give timely, accurate support to struggling learners; both aspects must be addressed to create a successful course. They were also able to characterize the nature of the confusion, such as factual or technical confusion, or instances of frustration or advice seeking. As in many other studies, feature selection, the identification of the relevant data, was the key innovation of this work. The features used to detect confusion included the Automated Readability Index (ARI), post length, the topic from LDA analysis, and the presence of a question mark. The readability index measures how well communicated a post is; the researchers hypothesized that a rambling or incoherent post signals a higher level of confusion than a well-communicated one. Additional features were based on the community's interactions with a post, such as its number of views or "up-votes". The study found that all of the features except the LDA analysis increased the accuracy of confusion detection.
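A minimal sketch of such a feature vector might look like the following. The ARI formula is the standard published one, but the word and sentence tokenization here is my simplification, not the paper's:

```python
# Hedged sketch of per-post confusion features: ARI readability,
# post length, and question-mark presence.
import re

def post_features(post: str) -> dict:
    words = re.findall(r"[A-Za-z']+", post)
    sentences = [s for s in re.split(r"[.!?]+", post) if s.strip()]
    chars = sum(len(w) for w in words)
    # Automated Readability Index: higher values mean harder-to-read text.
    ari = (4.71 * (chars / len(words))
           + 0.5 * (len(words) / len(sentences))
           - 21.43)
    return {
        "ari": ari,
        "length": len(words),
        "has_question_mark": "?" in post,
    }

f = post_features("Why does my gradient explode? I have tried everything.")
print(f)
```

Vectors like this, plus community signals such as view and up-vote counts, would then feed a standard classifier.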

Could such information be gathered for a course here at TC that uses one of EdLab's online platforms, such as Rhizr or Pressible? It could be useful to instructors who teach both MOOCs and combined online and offline courses.

With all the data that can be collected on a MOOC, down to the click-stream of a user during a session, some researchers sought to create detailed models of user engagement, as in Behavior-Based Latent Variable Model for Learner Engagement. Some current methods of measuring engagement, such as self-reporting, may be inaccurate and fail to correlate with learning outcomes. Other methods, such as the facial recordings or eye tracking used in Generalizability of Face-Based Mind Wandering Detection Across Task Contexts, may be considered invasive, and the behaviors they rely on may not be universal.

As such, the researchers behind Behavior-Based Latent Variable Model for Learner Engagement looked at a wide variety of data: the click-stream data of learners in an online course, assessment scores, and results on specific questions. Combining this with data from users' interactions with videos, forums, and other parts of the platform hosting the MOOC, the researchers identified which features correlated most heavily with engagement and success. One finding was that learners who viewed videos at faster than normal playback speed were more likely to be engaged. Their method also allows engagement to be visualized over time. Would educators who host discussions on Vialogues benefit from a similar type of engagement analysis during real-time discussions? What are the limits of invasiveness?
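As a toy illustration of that kind of correlation analysis, with entirely made-up numbers and an assumed engagement score rather than the paper's latent variable:

```python
# Illustrative only: checking which behavior feature tracks an
# engagement score most closely, using Pearson correlation.
import numpy as np

playback_speed = np.array([1.0, 1.5, 2.0, 1.0, 1.25, 1.75])
forum_posts    = np.array([0,   3,   5,   1,   2,    4   ])
engagement     = np.array([0.2, 0.6, 0.9, 0.3, 0.5,  0.8 ])

r_speed = np.corrcoef(playback_speed, engagement)[0, 1]
r_posts = np.corrcoef(forum_posts, engagement)[0, 1]
print(round(r_speed, 2), round(r_posts, 2))
```

The actual paper infers engagement as a latent variable rather than observing it directly, so this is only the flavor of the analysis, not the model itself.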

Beyond engagement, many researchers at the conference looked to predict student success in MOOCs and in offline courses. Predicting Student Retention from Behavior in an Online Orientation Course similarly drew on the large scope of MOOC data, including the number of times learners used resources, the number of days between uses, and the average number of days inactive. The study was likewise able to select features that correlated heavily with course outcomes, finding that students who participated in or viewed the forums or discussion boards of the orientation course were likely to sign up for further program-specific courses and therefore had a higher likelihood of retention.
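A hedged toy illustration of that setup (not the study's data, features, or model): fit a simple classifier on behavior counts like those the paper describes and predict retention for unseen learners.

```python
# Illustrative retention prediction on invented behavior features.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: resource uses, forum views, average days inactive (made up).
X = np.array([
    [12, 8, 1.0],
    [10, 6, 2.0],
    [ 2, 0, 9.0],
    [ 1, 1, 8.0],
])
y = np.array([1, 1, 0, 0])  # 1 = retained (signed up for more courses)

model = LogisticRegression().fit(X, y)
pred = model.predict([[11, 7, 1.5], [1, 0, 10.0]])
print(pred)
```

The real study's value lay in its feature selection over far richer logs; the classifier itself is the easy part.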

The key points I took away from these talks were:

1. MOOCs and online learning platforms can contain an enormous amount of data on users.

2. This data can reveal a lot about the affect and engagement of users.

3. Such information can be vital to understand improvements that need to be made on online learning platforms.

Would it be helpful to look at the affect and engagement of our users beyond those who use the EdLab tools for class?


Posted in: Research | By: Alvaro Ortiz-Vazquez