MOOCs have transformed education by offering free access to courses and their materials, virtually anytime, anywhere. Complementary discussion forums typically let learners and educators interact in an academic environment, further enriching the learning experience. An unfortunate phenomenon particular to MOOCs, however, is an extremely low course completion rate. For instance, statistics show that courses on the Chinese University MOOC platform have a completion rate of just 1.5%. The researchers sought a method to automatically gauge learners' progress throughout a course. Armed with this information, a professor could adjust their teaching approach to support a student who might be on the verge of dropping out.
The research problem boiled down to a time series prediction task, since the researchers were attempting to forecast a learner's future actions from past behavior. Matching the weekly cadence of the courses, they used one week as the prediction time step. Dropout prediction was treated as a sequence labeling task, where both the input features and the output labels form sequences. Features included factors such as participation in class discussion and lecture video views. To model these time sequences, the researchers implemented a Long Short-Term Memory (LSTM) recurrent neural network, an architecture known for its success in capturing long-range dependencies. Using datasets provided by the Chinese University MOOC platform, the researchers trained and tested their predictive model. On average, the model achieved 90% accuracy in predicting learner retention for each of five courses.
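To make the setup concrete, here is a minimal NumPy sketch of the general technique: an LSTM cell that consumes one feature vector per week and emits a per-week dropout probability through a sigmoid head. The feature names, dimensions, and weights below are illustrative assumptions, not the researchers' actual model or data.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SimpleLSTM:
    """Minimal LSTM cell plus sigmoid output head.

    Maps a sequence of weekly engagement features (e.g. forum posts,
    video views) to one dropout probability per week.
    """
    def __init__(self, n_features, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        # One stacked weight matrix for the four gates (input, forget, output, candidate).
        self.W = rng.normal(0.0, 0.1, (4 * n_hidden, n_features + n_hidden))
        self.b = np.zeros(4 * n_hidden)
        # Output head: hidden state -> dropout probability.
        self.w_out = rng.normal(0.0, 0.1, n_hidden)
        self.b_out = 0.0
        self.n_hidden = n_hidden

    def forward(self, weekly_features):
        """weekly_features: array of shape (n_weeks, n_features)."""
        h = np.zeros(self.n_hidden)
        c = np.zeros(self.n_hidden)
        probs = []
        for x in weekly_features:
            z = self.W @ np.concatenate([x, h]) + self.b
            i, f, o, g = np.split(z, 4)
            i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
            c = f * c + i * np.tanh(g)   # cell state carries long-range information
            h = o * np.tanh(c)
            probs.append(sigmoid(self.w_out @ h + self.b_out))
        return np.array(probs)           # one dropout probability per week

# Toy usage: five weeks of three engagement features for one learner,
# with activity trailing off toward zero.
model = SimpleLSTM(n_features=3, n_hidden=8)
weeks = np.array([[4, 2, 1], [3, 1, 1], [1, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
p = model.forward(weeks)
print(p.shape)
```

In a real pipeline these weights would of course be trained (e.g. by backpropagation through time against the observed retention labels); the sketch only shows how the sequence-labeling framing turns weekly behavior into weekly predictions.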
Given this accuracy, LSTM is a promising tool for devising educational strategies that respond to a student's needs, even when teacher and student are remote. One future goal for this research is to introduce optimization techniques that refine the parameters of the current prediction model. The researchers also hope to improve the model's performance when predicting results for the same course across different semesters.
Find out more here.
Kaylen Sanders, ODSC
I currently study Computational Linguistics as an M.S. candidate at Brandeis University. I received my Bachelor's degree from the University of Pittsburgh where I explored linguistics, computer science, and nonfiction writing. I'm interested in the crossroads where language and technology meet.