
Jul 06 2017 - 02:45pm
Educational Data Mining Topics: Learner Behaviors


The Yangtze River


Related Talks

Modeling MOOC Student Behavior With Two-Layer Hidden Markov Models


Grade Prediction with Temporal Course-wise Influence


Can Typical Behaviors Identified in MOOCs be Discovered in Other Courses?


On the Prevalence of Multiple-Account Cheating in Massive Open Online Learning


On the Influence on Learning of Student Compliance with Prompts Fostering Self-Regulated Learning


Efficient Feature Embeddings for Student Classification with Variational Auto-encoders


When and who at risk? Call back at these critical points


Sequence Modelling For Analysing Student Interaction with Educational Systems


Clustering Student Sequential Trajectories Using Dynamic Time Warping


Student Learning Strategies to Predict Success in an Online Adaptive Mathematics Tutoring System


The topic of learner behavior is very broad and overlaps considerably with research on affect and engagement. Still, user behavior models can make the large amounts of data on online learning platforms comprehensible, and they represent one of the most important contributions of the educational data mining field. The topic is perhaps the most relevant to our work at EdLab as we seek to establish the Institute for Self-Directed Learning and to understand more about how learners use our suite of tools. In these presentations I encountered promising methods for clustering users and behaviors, along with a better sense of which features we may want to observe and analyze from our users.


One of the promising methods was presented in Modeling MOOC Student Behavior With Two-Layer Hidden Markov Models, which analyzed user browsing sessions on Coursera MOOCs. A user session was modeled as a Hidden Markov Model, representable as a directed graph showing transitions between components of the site such as the forums, wiki, quizzes, and video lectures.
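As a rough sketch of this idea, the observed transitions between site components within a session can be summarized as an empirical transition matrix. The sessions below are hypothetical, and this captures only the observable layer; the paper's actual model also infers hidden states.

```python
from collections import defaultdict

def transition_matrix(sessions):
    """Estimate transition probabilities between site components
    from a list of sessions (each a sequence of component names)."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for src, dst in zip(session, session[1:]):
            counts[src][dst] += 1
    probs = {}
    for src, dsts in counts.items():
        total = sum(dsts.values())
        probs[src] = {dst: n / total for dst, n in dsts.items()}
    return probs

# Hypothetical sessions over Coursera-style components
sessions = [
    ["lecture", "quiz", "forum"],
    ["lecture", "quiz", "quiz"],
    ["forum", "lecture", "quiz"],
]
P = transition_matrix(sessions)
```

Each row of `P` is the outgoing transition distribution for one component, which is exactly the edge-weight information in the directed session graph the paper describes.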




To make sense of the myriad possible session graphs, the authors fit a model with four states representing the dominant behavior patterns during a session. By adding a second layer to the model, they could then represent the transitions between these dominant behavior patterns across sessions.
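A minimal sketch of the second layer, assuming each session has already been labeled with its dominant behavior state (the labels below are invented for illustration): count how a learner moves between dominant states across consecutive sessions.

```python
from collections import Counter

def second_layer_transitions(session_labels):
    """Count transitions between the dominant behavior states of a
    learner's consecutive sessions (the upper layer of the model)."""
    pairs = Counter()
    for a, b in zip(session_labels, session_labels[1:]):
        pairs[(a, b)] += 1
    return pairs

# Hypothetical dominant-state labels for one learner's sessions
labels = ["video-watching", "quiz-taking", "quiz-taking", "forum-browsing"]
upper = second_layer_transitions(labels)
```

Normalizing these counts per source state would give the upper-layer transition probabilities, analogous to the within-session matrix of the first layer.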



They were then able to compare the most frequent behaviors of very high-performing students (a) with those of low-performing students (b, c) to see which behavior patterns were most common in each group.


Both Sequence Modelling For Analysing Student Interaction with Educational Systems and Clustering Student Sequential Trajectories Using Dynamic Time Warping describe similar methods for clustering users or sequences of behaviors on online learning platforms.
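Dynamic time warping compares two sequences that may be locally stretched or shifted in time, which is why it suits activity trajectories. A minimal sketch of the classic dynamic-programming distance, on made-up weekly activity counts:

```python
def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two
    numeric sequences (e.g. weekly activity counts)."""
    INF = float("inf")
    n, m = len(a), len(b)
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of: insertion, deletion, match
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# Two hypothetical activity trajectories: same shape, shifted by a week
s1 = [0, 1, 3, 3, 1, 0]
s2 = [0, 0, 1, 3, 3, 1]
dist = dtw_distance(s1, s2)
```

Because the warping path can absorb the one-week shift, the distance stays small even though a point-by-point (Euclidean) comparison would not. These pairwise distances can then feed any standard clustering method.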


Such methods could be very interesting if applied to our EdLab suite. As users embark on self-directed learning missions, we might similarly model their learning behaviors with a Hidden Markov Model and compare individuals who successfully completed their self-directed learning with those who struggled. Which tools, or combinations of tools, were most associated with success?


The paper of this study can be found here.


Graphing methods were similarly used in Grade Prediction with Temporal Course-wise Influence to build graphs of the prerequisite courses most associated with success in a course. By looking at grades from previously taken courses found to be related, the researchers' model was able to predict students' grades in future courses, and the resulting graphs show which courses and results had the most influence on later ones. At TC, a course recommendation system and degree planner based on similar models would be very beneficial to students.
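The core intuition can be sketched as a linear model: a target-course grade predicted as a weighted combination of prerequisite grades, where the fitted weights play the role of influence edges. The data below is fabricated for illustration, and this least-squares sketch omits the temporal structure of the actual paper.

```python
import numpy as np

# Hypothetical data: rows are students, columns are grades in two
# prerequisite courses; y is the grade earned in the target course.
X = np.array([[3.0, 3.5],
              [2.0, 3.0],
              [4.0, 4.0],
              [3.5, 2.5]])
y = np.array([3.25, 2.5, 4.0, 3.0])

# Fit influence weights by least squares (with an intercept column)
A = np.hstack([X, np.ones((len(X), 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict a new student's grade from their prerequisite grades
pred = np.array([3.0, 3.0, 1.0]) @ w
```

The weights `w[:2]` quantify how strongly each prerequisite grade pulls on the target grade, which is the kind of edge weight an influence graph would display.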


With similar goals, Can Typical Behaviors Identified in MOOCs be Discovered in Other Courses? compared learner behavior with learning outcomes, this time clustering users based on any data associated with a timestamp, such as interactions with elements on a page of the MOOC. The high dimensionality of the features made clustering challenging to implement, but the authors nonetheless succeeded using k-means and X-means, and found that users clustered by similar behaviors had similar outcomes in the MOOCs. In our own work we have also sought to cluster our users; since our platforms are not MOOCs, however, we must select appropriate features for analysis and find indicators of success in self-directed learning.
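For reference, k-means itself is simple. A minimal Lloyd's-algorithm sketch on invented per-learner feature vectors (the features and values are assumptions, not the paper's):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's algorithm: assign each point to its nearest
    centroid, then move each centroid to its cluster mean."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Hypothetical per-learner vectors (e.g. counts of timestamped events
# per component), forming two obvious groups
X = np.array([[0.0, 1.0], [0.2, 0.9], [5.0, 5.0], [5.1, 4.8]])
labels, _ = kmeans(X, k=2)
```

X-means extends this by also searching over k, splitting clusters when an information criterion improves, which is what makes it attractive when the number of behavior types is unknown.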


As we think about self-directed learning, and the recommendation systems we are looking to implement across our EdLab suite, we can consider how much scaffolding or direction we would want to provide to our users. On the Influence on Learning of Student Compliance with Prompts Fostering Self-Regulated Learning makes some interesting observations, including that self-regulated learning comprises a set of cognitive and metacognitive processes known to impact learning. Self-regulated learning differs from self-directed learning in that the subjects of the former are predetermined and structured; the two are similar in that both require self-driven motivation to work toward learning goals. The study suggests that scaffolding self-regulated learning environments with directing prompts can support students' learning. The prompts were small: reviewing notes, writing a summary of the chapter or topic, or suggesting a sub-goal or different topic for the learner to focus on. Learners who followed the suggestions showed significant improvements. What kinds of suggestions could our platforms provide for self-directed learners?


Similar feedback and suggestions were provided to students predicted to drop out of MOOCs in When and who at risk? Call back at these critical points, based on data collected within the first two weeks of a course. The model accurately predicted potential drop-outs using various features capturing the amount of activity a user had in the course. When our users slow down their activity, is it right to step in with encouragement?
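A toy version of such an early-warning model, assuming (hypothetically) just two first-two-week activity features, is a logistic regression fit by gradient descent. The data and feature choice below are invented; the paper's actual feature set is richer.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=5000):
    """Minimal logistic regression by gradient descent:
    P(dropout) = sigmoid(X @ w + b)."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        grad = p - y          # derivative of the log loss w.r.t. logits
        w -= lr * X.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

# Hypothetical first-two-week features: [videos watched, forum posts]
X = np.array([[1.0, 0.0], [2.0, 0.0], [8.0, 3.0], [9.0, 2.0]])
y = np.array([1.0, 1.0, 0.0, 0.0])  # 1 = dropped out

w, b = fit_logistic(X, y)
risk = 1.0 / (1.0 + np.exp(-(np.array([1.5, 0.0]) @ w + b)))
```

Learners whose predicted `risk` crosses a chosen threshold would be the ones to "call back" at the critical points the paper identifies.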


Some interesting work on predicting failure or success also appeared in Student Learning Strategies to Predict Success in an Online Adaptive Mathematics Tutoring System, which showed a method for quickly predicting success on upcoming steps. A similar predictive capacity was developed by Learnta, an online MOOC platform. In their workshop session, they claimed they could predict success in a math course with over 90% accuracy from a small assessment of 12 questions. Both platforms used artificial intelligence techniques to develop the models and assessments that would provide the most accurate results in the shortest time. Perhaps the takeaway from these studies is the importance of observing self-directed learners' early behaviors, as those behaviors may have a substantial impact on outcomes.


How do you envision our EdLab tools helping self-directed learners achieve their goals?







Posted in: Research | By: Alvaro Ortiz-Vazquez | 469 Reads