MOOCs have transformed education by offering free access to courses and materials, virtually anytime, anywhere. Complementary discussion forums typically let learners and educators engage with one another in an interactive academic environment, further enriching the experience. An unfortunate phenomenon particular to MOOCs, however, is an extremely low course completion rate. For instance, courses on the Chinese University MOOC platform have a completion rate of just 1.5%. The researchers sought a method to automatically gauge learners' progress throughout a course. Armed with that information, a professor could adjust their teaching approach to support a student who might be on the verge of dropping out.
The research problem boiled down to a time series prediction task, since the researchers were attempting to forecast a learner's future actions from past behavior. Matching the weekly cadence of the courses, they used one week as the prediction time step. Dropout prediction was treated as a sequence labeling task, where both the input features and the output labels form sequences. Features included factors such as participation in class discussions and lecture video views. To model these time sequences, the researchers implemented a Long Short-Term Memory (LSTM) recurrent neural network, an architecture known for capturing long-range dependencies. Using datasets provided by the Chinese University MOOC platform, they trained and tested their predictive model, which achieved an average accuracy of 90% in predicting learner retention across five courses.
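To make the setup concrete, here is a minimal sketch of how an LSTM can label each week of a learner's activity with a dropout probability. This is not the authors' actual model: the feature set, dimensions, and randomly initialized weights (standing in for trained parameters) are all illustrative assumptions, implemented in plain NumPy for readability.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical dimensions: 3 behavioral features per week
# (e.g. forum posts, video views, quiz attempts), hidden size 8.
n_features, n_hidden = 3, 8

# Randomly initialized weights stand in for trained parameters.
W = rng.standard_normal((4 * n_hidden, n_features + n_hidden)) * 0.1
b = np.zeros(4 * n_hidden)
w_out = rng.standard_normal(n_hidden) * 0.1
b_out = 0.0

def predict_dropout(weekly_features):
    """Run an LSTM over one learner's weekly feature vectors and
    emit a dropout probability for each week (sequence labeling)."""
    h = np.zeros(n_hidden)   # hidden state
    c = np.zeros(n_hidden)   # cell state
    probs = []
    for x in weekly_features:
        z = W @ np.concatenate([x, h]) + b
        i, f, o, g = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # input/forget/output gates
        c = f * c + i * np.tanh(g)                    # update cell state
        h = o * np.tanh(c)                            # new hidden state
        probs.append(sigmoid(w_out @ h + b_out))      # per-week dropout prob.
    return probs

# Four weeks of made-up activity counts for one learner,
# tapering off the way a disengaging student's might.
weeks = [np.array([5.0, 3.0, 1.0]), np.array([2.0, 1.0, 1.0]),
         np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 0.0])]
print(predict_dropout(weeks))
```

Because the hidden and cell states carry forward week to week, each prediction conditions on the learner's entire history, which is exactly what makes an LSTM a natural fit for this sequence labeling task.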
Given that accuracy, LSTM is a promising tool for devising educational strategies that respond to a student's needs, even when teacher and student are remote. One future goal is to introduce optimization techniques that refine the parameters of the current prediction model. The researchers also hope to improve the model's performance when predicting results for the same course across different semesters.
Find out more here.
Kaylen Sanders, ODSC
I currently study Computational Linguistics as an M.S. candidate at Brandeis University. I received my Bachelor's degree from the University of Pittsburgh where I explored linguistics, computer science, and nonfiction writing. I'm interested in the crossroads where language and technology meet.
ODSC’s Accelerate AI focuses on three key areas: Innovation, Expertise, and Management. Learn what the latest advances in AI and applied data science are, how they can affect your company, and how to build an effective team around their potential. Learn more here.