Typically, a project (regardless of which industry it originates in) is deemed “successful” if it ends well. Putting that in an eLearning context, the popular consensus among course developers and moderators is that a course is a success if:
- More learners sign up for it than for the previous iteration
- More learners complete the course successfully
- Revenue earned per learner rises steadily over time
While these metrics point to some form of eLearning success, they are usually measured “after the fact” – i.e., once the course is over. Today, this traditional model of measuring eLearning is considered obsolete. With eLearning analytics, course providers are measuring courses in real time – from the time a learner signs up, until they finish the final evaluation, and at every stage in-between.
ELearning Analytics 101
ELearning analytics is a data-driven approach to evaluating and assessing the success of a course. These data-dependent models track, measure, analyze, and report on multiple dimensions of the course. In a learning analytics context, “success” is not viewed through the traditional lens; instead, learning success is measured using multiple metrics, such as:
1) How much time do learners spend on your learning platform: If your learners (as well as prospective learners) aren’t captivated by the content they find on the site, they will likely not spend much time there. Being aware of time spent on the platform can provide you with valuable inputs on whether you need to revamp the site/page, or whether to add or remove content from it.
2) Lessons abandoned or sessions aborted: Another key metric on the eLearning analytics card is how often, and which, sessions or lessons are abandoned by learners. Course outcome isn’t only about how many lessons a learner completes, but also about which ones he/she abandons. Course developers can learn a lot by assessing which courses/lessons are abandoned, and why.
3) Learner proficiency: ELearning analytics can continually have a finger on the pulse of your learners, in real-time, offering you vital ongoing data about their proficiency. Knowing as soon as possible whether a course is too “easy” for a learner, or if it is extremely difficult, can help trainers and moderators take remedial action to ensure a better outcome for learners.
4) Learner engagement: While a learner might have finished the course in its entirety, he/she might not have fully engaged with the course environment, other learners, or the trainer/evaluator. ELearning analytics can quickly flag a lack of engagement (the learner doesn’t post comments on the forum, doesn’t ask questions of the trainer, or doesn’t respond promptly to comments directed at him/her).
5) Progress tracking: One very key eLearning analytics metric is learner progress: How far has he/she progressed in the course; and is that in line with, behind or ahead of the benchmark; how does his/her progress compare with other learners? A learner who is too far ahead of the rest of the class might not be sufficiently challenged. A learner who is far behind the rest might be finding the content excessively challenging. Progress tracking in real-time can help trainers take immediate corrective action to ensure appropriate learning outcomes are achieved.
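As a rough illustration of how the first two metrics above might be computed, here is a minimal sketch assuming session events are available as simple records (all field names and values here are hypothetical, standing in for whatever your platform actually logs):

```python
from datetime import datetime

# Hypothetical session records; field names are illustrative only.
events = [
    {"learner": "A", "lesson": "intro",    "start": "2023-01-05T10:00", "end": "2023-01-05T10:25", "completed": True},
    {"learner": "A", "lesson": "module-1", "start": "2023-01-06T09:00", "end": "2023-01-06T09:10", "completed": False},
    {"learner": "B", "lesson": "intro",    "start": "2023-01-05T11:00", "end": "2023-01-05T11:40", "completed": True},
]

def minutes(rec):
    """Duration of one session in minutes."""
    fmt = "%Y-%m-%dT%H:%M"
    start = datetime.strptime(rec["start"], fmt)
    end = datetime.strptime(rec["end"], fmt)
    return (end - start).total_seconds() / 60

# Metric 1: total time spent on the platform, per learner
per_learner = {}
for rec in events:
    per_learner[rec["learner"]] = per_learner.get(rec["learner"], 0.0) + minutes(rec)

# Metric 2: lesson abandonment rate (sessions started but not completed)
abandoned = sum(1 for rec in events if not rec["completed"])
abandonment_rate = abandoned / len(events)

print(per_learner)
print(abandonment_rate)
```

In practice these aggregations would run continuously against the platform's event stream rather than a static list, which is what makes the real-time monitoring described above possible.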
These five eLearning metrics can help course developers and sponsors monitor a course in real time, and make decisions that positively impact course outcomes for learners. Traditional course assessment models would not allow such decisions to be made before the course is complete. In that respect, eLearning analytics is an active approach to measuring learning success.
Building an eLearning Analytics Framework
Typically, there are five pillars involved in creating an effective eLearning analytics framework within your organization:
1) Data Capture: Learning analytics is a data-driven process, and by definition needs data to provide trainers and content developers meaningful, actionable metrics:
- Data from across the continuum – from initial contact through signup and course feedback – must be available as part of the analytics model
- The data should be available as close to the decision-makers as possible. For instance, if course creators want to learn why a learner signed up for a particular course, so they can re-work certain content, they shouldn’t have to go back to sales personnel to get that information. It must be available on-demand
- Original data: Preferably, data should be available in its original form – for instance, original emails or source data-input forms – so decisions can be made using unfiltered data
2) Reporting: The data capture process should then pass that data to the reporting module, which is shaped by the reporting requirements:
- Who are the various stakeholders?
- What do they want to see in the reports?
- What analytics metrics will provide them the information they need?
A good reporting process takes the data captured in the first step and turns it into usable information.
3) Prediction: The next pillar in the eLearning analytics framework is the predictive module. Here is where historic data on learner behavior, performance and feedback is used, to make predictions about a learner’s future behavior:
- Will the learner complete the course successfully?
- What is the likelihood that he/she will abandon the course/lesson in the next six days?
- Which test questions/assessments are likely to challenge the learner the most?
The predictive requirements of an eLearning analytics model will largely depend on the course sponsors’ objectives. The predictive capability of the model is only as effective as the data gathered. So, if you wish to learn where, when, and how your learners are logging in to access the course materials, your data collection model must capture those metrics.
4) Action: Next, make sure your learning analytics framework supports you in making decisions about any action – remedial or proactive – deemed necessary to support course learning outcomes:
- Are tests and assessments too difficult? Too easy? Not challenging enough? Maybe the questions need to be rewritten to address the difficulty level.
- Do students gloss over some of the content, not spending much time on those pages? Perhaps the content is redundant and needs to be improved or removed. What can be done to address the bounce rate?
5) Review/Refine: Understand that learning analytics is an ongoing process – it doesn’t end once corrective/remedial action is taken. To make the model effective, you need to make sure the loop is closed through continuous review, learning and refinement.
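To make the Prediction and Action pillars concrete, here is a minimal sketch assuming a trained model has been reduced to a handful of weights. All feature names and coefficients below are hypothetical, standing in for what a real model would learn from historical data; the point is the shape of the pipeline: score a learner, then flag him/her for intervention.

```python
import math

# Hypothetical weights a real model would learn from historical learner data.
WEIGHTS = {"progress_pct": 0.04, "days_inactive": -0.35, "forum_posts": 0.20}
BIAS = -1.0

def completion_probability(learner):
    """Logistic score: estimated probability the learner completes the course."""
    z = BIAS + sum(WEIGHTS[k] * learner[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

# Prediction: score two (made-up) learner profiles
at_risk = completion_probability({"progress_pct": 20, "days_inactive": 6, "forum_posts": 0})
on_track = completion_probability({"progress_pct": 80, "days_inactive": 1, "forum_posts": 5})

# Action: flag learners below a chosen threshold so trainers can intervene
THRESHOLD = 0.5
needs_intervention = at_risk < THRESHOLD

print(round(at_risk, 2), round(on_track, 2), needs_intervention)
```

The threshold, features, and weights would all be candidates for the Review/Refine pillar: as outcomes come in, the model is re-fit and the intervention rules are tuned.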
Tools of the Trade
Depending on the level of sophistication you are looking for in your eLearning analytics, and the budget you have allocated to an analytics solution, you can choose from a number of approaches.
The most common approach is to use a commercial off-the-shelf Learning Management System (LMS). These tools have built-in analytical modules to provide you with all of the metrics discussed earlier…and much more. For instance, Blackboard Analytics offers such a solution to its users.
If you deliver your courses through popular social media networks, you could probably tap into native tools offered by the platform you are using. For instance, Facebook Analytics offers a rich blend of metrics about how visitors interact with your Facebook Pages.
You could also invest in proprietary data warehousing solutions, such as Learning Locker, to store and mine important data about your learning environment.
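Learning Locker is a Learning Record Store (LRS) built around the Experience API (xAPI), which stores learning events as "actor–verb–object" statements. As a sketch, a statement recording that a learner completed a lesson might be constructed like this (the learner identity and activity URLs are placeholders; "completed" is a standard ADL verb):

```python
import json

# Minimal xAPI statement: who (actor) did what (verb) to what (object).
# An LRS such as Learning Locker accepts statements like this via its
# /statements endpoint; identities and URLs below are placeholders.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.com/courses/analytics-101/lesson-3",
        "definition": {"name": {"en-US": "Lesson 3: Reporting"}},
    },
}

payload = json.dumps(statement)
print(payload)
```

Once statements like this accumulate in the LRS, they become the raw material for the Data Capture and Reporting pillars described earlier.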
Regardless of which option you choose to create and implement your eLearning analytics framework, do a proper assessment first to ensure you address all of the prerequisites for a successful implementation.