Determining Return-on-Investment for Learning, Performance and Talent Management Systems

Introduction

There are many reasons for implementing a new learning or talent management system. Some are related to saving money, others to improving the productivity of employees or administrative staff, and still others to meeting compliance requirements.

But after you have implemented your system, how can you be sure that you have accomplished your goal, whatever that goal may have been? One way is to use return-on-investment (ROI) metrics. These metrics measure differences in costs, productivity and other factors before and after the new system was implemented. The purpose of ROI is to prove that the time, effort and money invested in the new system are providing some return (ultimately a financial one), whether that is direct cost savings or indirect profitability from increased productivity or efficiency.
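As a minimal illustration of the calculation itself, the classic ROI formula compares net benefit to cost. The dollar figures below are invented placeholders, not figures from any real analysis:

```python
def roi_percent(total_benefit, total_cost):
    """Classic ROI formula: net benefit expressed as a percentage of cost."""
    return (total_benefit - total_cost) / total_cost * 100

# Hypothetical example: a system costing $50,000 that returns $80,000
# in measurable savings yields a 60% return on the investment.
print(roi_percent(80_000, 50_000))  # 60.0
```

The hard part, as the sections below discuss, is not the arithmetic but deciding which costs and benefits can fairly be attributed to the new system.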

However, implementing new learning or talent management software is often part of a larger initiative, so it can be difficult to separate the two when measuring outcomes. Software supports processes, and if those processes were changed at the time the software was implemented, improvements can result from either change or both.

Some measurements can be directly attributed to the use of software (such as time to generate a report from existing data), but others are related to new opportunities that are created by the addition of the new systems (such as the ability to offer Web-based training or virtual classroom instruction instead of conventional classroom-based training).

Often, the implementation of a learning or talent initiative can have widespread results, such as reducing employee turnover. This may result from employees having access to critical learning and thus being more satisfied with their jobs. Or a talent initiative may offer them more say in the direction of their work, resulting in feelings of greater responsibility and fulfillment. A new LMS may allow employees to self-assess skills and create learning plans that help them accomplish personal career goals and thus feel more productive or valuable.

Separating software results from process results can be tricky, unless the project involved the simple replacement of one technology with another, which is seldom the case. The implementation of a new learning or talent management system can be a large endeavor and has probably been initiated because of other changes at the organizational or at least the department level. Therefore, the following suggestions are related to the implementation of a new learning and/or talent management initiative and not to software alone. But each situation is different, and yours will determine how and what you measure.

Acquiring Data

Some measurement data should already be available in financial systems, for example, the expenses related to employee travel for training. Other data will be more subjective and require staff to estimate or track activities. Many of these metrics can be gathered through survey forms that workers complete based on current tasks. Often these surveys are done during the planning stages for an initiative and are used as part of a larger needs analysis. These same surveys can be re-administered at key milestones during the project and then again at regular intervals (yearly, for example) after the implementation is complete.

Initially, some of these results can be predicted. For example, if the new system will allow the use of virtual training not previously available, the metrics can be estimated based on time and/or expenses related to delivering traditional classroom training. Figures can be acquired for instructor and student travel to and from a classroom venue. Then estimated results would be based on no travel time/expenses being incurred if the instructor stays in his/her office and the students stay at their workstations.
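An estimate like the one just described might be tallied in a few lines of arithmetic. Every figure below (session counts, travel costs, labor rates) is a hypothetical placeholder standing in for your own organization's data:

```python
# Hypothetical estimate of travel costs eliminated by virtual delivery.
# All figures are illustrative assumptions, not real data.
sessions_per_year = 12
instructor_travel_cost = 900      # flights, lodging, per diem per session
students_per_session = 15
student_travel_hours = 4          # round-trip travel time per student
hourly_labor_cost = 35            # loaded hourly rate per student

# Direct expenses plus the labor cost of time spent traveling.
travel_expense = sessions_per_year * instructor_travel_cost
travel_time_cost = (sessions_per_year * students_per_session
                    * student_travel_hours * hourly_labor_cost)

estimated_savings = travel_expense + travel_time_cost
print(f"Estimated annual savings: ${estimated_savings:,}")  # $36,000
```

Note that the second term values employees' travel time at their loaded labor rate; whether that counts as a "hard" saving is a judgment call your finance department should weigh in on.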

Utilize whatever information you have available to determine what you should and can measure. Strategic plans, organizational performance data and needs analysis reports can all be mined for measurement ideas. And you can make your measurements as large or as detailed as you can accomplish with the data you are able to acquire.

Ideally, a new system should automate many of the tasks previously performed manually. For those that were already automated, the new software should improve the process or make the data easier to access or more relevant. Measuring these areas should be fairly straightforward. Others will be more difficult and will require developing surveys or subjective ratings. But all data, whether qualitative or quantitative, is relevant and can be used to determine return-on-investment of new technology or of new processes.

What to Measure

Following are some samples of metrics related to learning and talent management initiatives. These are just samples and are by no means the only metrics that you can consider. What you measure should be customized to your specific organization and to the implementation. You may want to look at your organization’s strategic plan to see what areas are targeted for improvement, and then extrapolate existing data based on these goals to use as your baseline or starting point.

1. Company/Organization Level

2. Administrator/Department Level

3. Instructor/Supervisor Level

4. Student/Employee Level

Sample Metrics

Following are samples showing what measurements and results might look like for different types of metrics. These are all for tangible results that can be easily measured. Performing an audit of information before the project starts can provide baseline data. But even if that wasn’t done, you can still measure based on previously existing data, estimates and anecdotal information.

Direct Cost Savings Sample

The following analysis could be made of a mandatory course that each employee must take once a year. A contracted trainer travels to several different locations to present the information to the staff in person. Employees, in this case, do not need to travel as the trainer has traditionally come to them. The analysis compares the same course offered as elearning, where, of course, no instructor or classroom is required.

The savings are obvious at a glance. While elearning courses typically cost more to develop, the savings usually outweigh the initial costs. Also, the course can be reused each year with only minor updates and relatively small costs.
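A comparison of this kind might be tallied as follows. All cost categories and dollar amounts are invented placeholders for an audit's real figures:

```python
# Hypothetical annual costs for one mandatory course (placeholders only).
classroom = {
    "trainer_fees": 24_000,
    "trainer_travel": 9_000,
    "facilities": 6_000,
    "materials": 3_000,
}
elearning = {
    "course_development": 30_000,   # largely a one-time, year-one cost
    "hosting_and_updates": 2_000,
}

classroom_total = sum(classroom.values())   # 42,000
elearning_total = sum(elearning.values())   # 32,000
print(f"Year-one savings: ${classroom_total - elearning_total:,}")
```

In this sketch the elearning option already wins in year one; in later years, with development costs behind you, the gap widens further.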

Time Saved Sample

In this sample, we will look at the time it takes employees to prepare monthly expense reports related to training. Before the initiative, expense figures were tracked by each department using spreadsheets. These spreadsheets were sent to training department staff at the end of each month. The staff compiled the data from all departments into master spreadsheets. The figures were then merged into report documents.

In our sample system, many expenses related to training are automatically tracked. For example, the costs of facilities, equipment and supplies are all tracked in the new software as the training occurs and therefore do not need to be compiled at the end of the month. Departments are only responsible for compiling expenses submitted by staff for travel and accommodation related to training — and this data is entered into the new system rather than into spreadsheets. Since the data is centralized, no submission is required. Training department staff members can simply call up the figures and generate the reports they need. They then export or merge the data into the report document and prepare narratives as required. Visually, the time savings are impressive.
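The before-and-after effort might be estimated like this. The department count and hour figures are hypothetical placeholders, not measured data:

```python
# Hypothetical monthly reporting effort, before and after (placeholders).
departments = 8

# Before: each department compiles a spreadsheet, then training staff
# merge everything and build the report documents by hand.
before_hours = (departments * 3    # per-department spreadsheet compilation
                + 16               # training staff merge all spreadsheets
                + 8)               # assemble the report documents

# After: departments enter travel/accommodation expenses directly into
# the system; staff generate reports and write narratives.
after_hours = (departments * 0.5
               + 4)

print(f"Hours saved per month: {before_hours - after_hours}")  # 40.0
```

Multiplying the hours saved by a loaded hourly labor rate converts this into a dollar figure that can feed the ROI calculation.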

Compliance Rate Sample

The following analysis is of compliance rates before and after a new learning management initiative. In this case, all employees must take training and complete an evaluation to meet government health and safety regulations. Before the initiative, administration staff did not have the ability to track each employee's progress through the compliance process. Administrators sent out notifications that compliance training was available, and later departments reported the number of employees who had signed up, attended, completed and passed.

With new learning management software, administrators can enroll employees directly into the mandatory training, track progress and instantly report on which employees have completed and passed and which have not. Administrators can now enroll employees who have not completed the course into another class until they eventually pass or other action is taken. At the time of the analysis, the following percentages might be obtained.
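A before-and-after comparison of this kind reduces to converting head counts into compliance percentages. All head counts below are invented placeholders, not figures from the analysis:

```python
# Hypothetical head counts (placeholders) for a 500-employee organization.
employees = 500

def pct(count):
    """Express a head count as a whole-number percentage of all employees."""
    return round(count / employees * 100)

# Before: self-enrollment, departments reporting numbers after the fact.
before = {"signed_up": 410, "attended": 340, "completed": 300, "passed": 280}
# After: direct enrollment and tracking in the new system.
after = {"enrolled": 500, "completed": 480, "passed": 465}

print("Before:", {k: f"{pct(v)}%" for k, v in before.items()})
print("After: ", {k: f"{pct(v)}%" for k, v in after.items()})
```

The key metric for a regulator is the final pass rate (56% versus 93% in this invented example); the intermediate stages mainly show where employees drop out of the process.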

In the old system, administrators had little control over the process. They had to rely on employees to enroll themselves or on supervisors to ensure that the training and evaluation completion happened. All the administrators could do was send out notices and hope for the best. With the new system, they have much more control. Visually depicted, the results look even more dramatic.

Performance Appraisal Sample

A similar comparison could be used to measure performance appraisal completions before and after a new performance or talent management system is implemented. An automated system should make it easier for both supervisors and employees to complete appraisal forms, and thus the results should show an increase in numbers or percentages of finalized (closed) appraisal processes.

But the new system should also provide HR with the ability to track the process and see if and when breakdowns occur, and if those breakdowns happen at the supervisor level or at the employee level. HR can use the data to prompt for the process to be restarted until it is completed or until other action is taken. Again, this involves a change in process, but the software can provide HR with the information needed to keep the ball rolling.

Following might be the analysis with the before data from a paper-based system, the after data from the first year of an automated system, and finally the results after a new HR push to get both supervisors and employees on board with the performance appraisal process.

The initial data from the first year shows that the new system was breaking down significantly at the supervisor level, something that may not have been apparent previously since this data was unavailable from the paper-based system. HR only knew that 20 percent of appraisals were closed and filed and the rest were not. For all HR knew, employees may not have been completing the forms and thus slowing or stopping the process. In this instance, employees seemed to be on board, but supervisors were not. Possibly supervisors did not feel they had time to perform appraisals or did not understand the value. Follow-ups would be needed to determine the cause, but once the cause is known, the problem can be rectified. Results show that appraisal filing increased to 80 percent.
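A three-snapshot comparison like this one could be laid out stage by stage. The 20 percent and 80 percent closure figures come from the discussion above; the per-stage splits are invented placeholders added to illustrate where the process stalls:

```python
# Percent of all appraisals completing each stage (hypothetical except
# for the 20% and 80% closure rates mentioned in the text).
stages = ["employee_form_done", "supervisor_review_done", "closed_and_filed"]

paper_based = {"closed_and_filed": 20}           # only closure was visible
first_year = dict(zip(stages, [85, 35, 30]))     # stalls at supervisors
after_hr_push = dict(zip(stages, [95, 85, 80]))  # after HR follow-up

for stage in stages:
    print(f"{stage}: year one {first_year[stage]}% "
          f"-> after push {after_hr_push[stage]}%")
```

The value of the automated system here is less the final percentage than the stage-level visibility: the paper-based row has only one number because that was all HR could ever see.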

Conclusion

The above suggestions and samples are for tangible outcomes that new software or processes can provide. But, as noted, there are many intangible and trickle-down results that are difficult to measure when implementing a new initiative. For example, meeting compliance can prevent costly mistakes, which can mitigate risks and liability. Improved access to data can reduce response times for time-sensitive issues, which can in turn reduce stress. Eliminating the need to travel can provide more time for other tasks that staff would like to tackle but did not have the time for before. Reducing staff turnover eliminates production delays that occur when mission-critical employees leave. Since hiring new staff is expensive (both in recruitment and onboarding time/costs), improving this process can save money and increase profitability.

Obtaining precise measurements of all the effects may not be possible, and measuring the bottom line alone may be deceptive to an overall success strategy. A classic example of this is with business initiatives that involve laying off employees to save money. There may be immediate cost savings; however, service levels may suffer as a result, which may cause customers to leave, reducing profits in the long term. In the same way, a lack of instant cost savings at the beginning of a new project may not indicate that the project is unsuccessful. All other areas should be considered, and time should be allowed for the trickle-down effects to become apparent.

Be aware that attempts to measure the intangibles related to learning and performance have not necessarily been successful nor widely accepted as accurate. This difficulty goes hand-in-hand with issues related to valuing the human capital possessed by an organization. Much has been written on the subject of “human-resource accounting” in the past couple of decades both in academia and in the business world — mostly about its impracticality.

And don’t spend so much time, effort and money measuring that you offset the improvements your new system is providing. If the measurements you seek are subjective or not easily available, assigning staff hours to the acquisition and measurement of data can reduce productivity. Be sure to measure only what is relevant to the implementation of the new initiative, and then use that information for future planning and even more improvement. But don’t expect your new system to eliminate problems that were the result of poor planning, inefficient processes or organizational issues. Software is only as good as the processes and people it supports.
