We’re never that curious about how well our car is operating until the check engine light goes on. But had we been a little more inquisitive, maybe even proactive, we could have noticed we were running low on oil, observed a frayed fan belt, or added a bit of air to a tire—all before the warning light flashed.
That same curiosity should apply to monitoring learning in your organization. Data logs from your intranet, learning management system, and social media—even brief classroom surveys—hold many secrets about your learners. You can determine what training they prefer, whether existing demand is being served, and how much learning they can realistically consume.
In addition to helping confirm that training is developing people and improving corporate performance, analytics can provide perspective on how your organization's learning programs compare within the industry.
Learning analytics is the examination of learner-produced data to discover how employees engage with training and, ultimately, what outcomes it produces. After any training experience—classroom, online, or via knowledge databases or performance support tools—data is generated that represents what happened during the activity: a learner completed a course, scored an 87 on a test, clicked six times on the same link and left the website, or posted a comment on an expert forum. All of these captured metrics represent learning analytics.
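Events like the ones above are easiest to analyze when captured as structured records. Here is a minimal sketch; the field and identifier names are illustrative (loosely inspired by activity-stream formats), not a real standard:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class LearningEvent:
    """One learner interaction captured for later analysis.
    All field names here are hypothetical."""
    learner_id: str
    activity: str        # e.g. "course", "assessment", "forum"
    verb: str            # e.g. "completed", "scored", "posted"
    object_id: str       # course, test, or forum identifier
    score: Optional[float] = None
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# The examples from the text, expressed as events:
events = [
    LearningEvent("a-101", "course", "completed", "course-03"),
    LearningEvent("a-101", "assessment", "scored", "course-03-test", score=87.0),
    LearningEvent("a-102", "forum", "posted", "expert-forum"),
]
```

Once interactions share a common shape, the same records can feed a dashboard, a compliance report, or a benchmarking submission without re-instrumenting each system.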
Depending on the size and budget of your training organization, data might be represented in a sophisticated dashboard that is used to drive program change. Analytics also might be used in less sophisticated but still telling ways, such as monitoring student volumes and program satisfaction and managing continuing education licensing or other compliance requirements. The point is, data can identify challenges and opportunities and inspire change in the learning organization and beyond.
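Even the less sophisticated uses—tracking student volumes and program satisfaction—are a few lines of aggregation over survey logs. A sketch, using made-up course IDs and 1-to-5 satisfaction scores:

```python
from collections import defaultdict

# Hypothetical survey rows: (course_id, satisfaction_score_1_to_5)
survey_rows = [
    ("claims-101", 4), ("claims-101", 5), ("ethics-201", 3),
    ("claims-101", 4), ("ethics-201", 2),
]

volume = defaultdict(int)   # responses per course (a proxy for student volume)
total = defaultdict(int)    # summed satisfaction per course

for course, score in survey_rows:
    volume[course] += 1
    total[course] += score

for course in sorted(volume):
    avg = total[course] / volume[course]
    print(f"{course}: {volume[course]} responses, avg satisfaction {avg:.1f}")
```

A low average on a high-volume course is exactly the kind of challenge this data can surface before anyone files a complaint.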
Aligning Analytics by Audience
One size does not fit all, nor does one metric mean the same thing to all learners. Rather, analytics and results need to be aligned with the learner and the training. The following table segments common learner types and training within an organization; gives an example of an appropriate metric; and suggests a predictive outcome—a potential result that can be inferred from the data.
To illustrate this point, one can look to the new insurance industry Property Technical Certification (PTC). After adjusters complete the 12-course PTC I program, their average improvement over pre-test scores has been substantial. One likely interpretation is that the program's instructional design is thorough enough for learners to absorb and retain the curriculum. Perhaps more interestingly, Crawford reports that initial data among certificate holders also shows improvement in the speed of claims handling and a reduction in the days that claims remain open. In this way, learning analytics provide predictive value to the business far removed from the training department's task of transferring knowledge.
When considering the variety of data trainers work with each day, you can see the power of information and its potential impact on the organization. This highlights the importance of learning analytics and the value a learning organization can deliver to the business by correlating and managing this data. In addition, linking test results to actual improvements in job performance validates training and helps organizations develop the ROI analyses necessary to support their educational budgets, which is always a key concern among training managers.
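Linking test results to job performance is, at its simplest, a correlation exercise. A sketch with entirely hypothetical per-adjuster figures (score gain on the post-test versus reduction in average days a claim stays open), using a hand-rolled Pearson coefficient:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: post-test score gain vs. days saved per claim.
score_gain = [12, 25, 8, 30, 18]
days_saved = [1.0, 2.8, 0.5, 3.1, 2.0]

r = pearson(score_gain, days_saved)  # close to 1.0 means strong linear link
```

Correlation is not causation, of course, but a strong, stable coefficient across cohorts is the kind of evidence an ROI analysis can be built on.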
How Are We Doing?
Data analysis can indicate whether and how training contributes to the performance of an organization. By understanding the high and low ranges for each piece of data, an organization can determine whether it is operating within acceptable limits. However, with learning analytics, a company is often challenged by a lack of parameters that help interpret how its results map to those in the rest of the industry. By knowing industry standards, an insurer can understand its performance within a larger context. That is where benchmarking comes into play.
Benchmarking within the insurance or any other industry is not always simple, particularly in highly competitive or regulated environments. This is where industry associations play an important role as clearinghouses for data by protecting individual inputs to present the collective view. Associations can harness analytic wisdom, provide a perspective on best practices, and often recommend vendors to help an organization move in a new direction.
If you are not interested in sharing data, then using a research company specializing in learning analytics is an option. Such firms can conduct primary research, comb through secondary research, and conduct qualitative research by interviewing business contacts or conducting focus groups to identify industry standards and best practices. They can also develop reports, dashboards, and analytic datasets out of your raw data. Either approach will yield an external perspective on internal performance. Benchmarking will also provide access to networking opportunities and potential collaborative partnerships.
Now that we have discussed learning analytics and the importance of benchmarking, it is time to put this knowledge to work. Following are a few broad recommendations for learning organizations and training departments, as well as for those operating a learning management system (an online system for learning that can include courses, knowledge databases and forums, collaboration, and other computer-supported learning tools).
Best Practices for Analytics: Learning Organization
- Whatever you choose to measure, make sure it is available on a continuous basis. A snapshot only gives you information about a point in time, while a trend provides direction.
- Keep the power of benchmarking in mind. Make sure a metric is consistent with how others measure for the benefit of comparison with peers and the industry.
- Make sure what you measure is meaningful and, when analyzed, provides value to the organization. Meaningful analytics correlate to business impact.
- Optimize existing systems and processes to gather analytics. Technology must be your partner in gathering, assessing, and managing learning analytics.
- Realize that trended analytics will result in change. Manage the change process with the goal of improving the learning function.
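The first practice above—trend over snapshot—can be made concrete with a simple moving average over periodic counts. A sketch with invented monthly completion figures:

```python
# Hypothetical monthly course-completion counts, oldest first.
monthly_completions = [140, 150, 145, 160, 172, 181]

def moving_average(series, window=3):
    """Simple moving average to smooth month-to-month noise."""
    return [sum(series[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(series))]

smoothed = moving_average(monthly_completions)
# A snapshot shows only the latest number; the smoothed series shows direction.
trend = "up" if smoothed[-1] > smoothed[0] else "down or flat"
```

Any single month here could mislead (145 looks like a dip), but the smoothed series makes the overall direction unambiguous.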
Best Practices for Analytics: Learning Management System
- Know where your learners access online training—intranet, learning management system, the Internet, and so on—and ensure that multiple points of entry are available where your audience browses. You can then track page accesses and dwell times to ascertain whether learners are interacting with the intended training.
- Treat the system like a website and track activity on top-level pages, click streams, and drop-offs. Don’t let the system drive away learners.
- Index the top courses selected and consider additional content in high-demand subject areas. In addition, monitor key search terms to determine whether unmet content needs exist.
- If your organization uses social media for collaborative learning, incorporate basic measures within your learning analytics. Informal learning will become the majority of your learning in the future. Don’t wait to start measuring it.
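Two of the LMS practices above—indexing top courses and monitoring search terms for unmet needs—reduce to straightforward log mining. A sketch with hypothetical log entries:

```python
from collections import Counter

# Hypothetical LMS logs.
course_launches = ["ethics-101", "claims-201", "ethics-101",
                   "claims-201", "claims-201", "ladder-safety"]
searches = [("hail damage", 12), ("subrogation", 0),
            ("hail damage", 9), ("flood endorsements", 0)]

# Most-launched courses point to high-demand subject areas.
top_courses = Counter(course_launches).most_common(2)

# Search terms that return zero results flag unmet content needs.
unmet = sorted({term for term, hits in searches if hits == 0})
```

The `top_courses` list tells you where to invest in additional content; the `unmet` list tells you what learners are looking for and not finding.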
Putting this information into action starts with assembling data, understanding the audience it represents, and correlating it to your organization’s performance requirements. By tying these concepts together, a learning organization can offer important insights about what the business must do to train employees for better performance, ultimately bringing far greater value to the enterprise than data alone can provide.