What L&D Professionals need to know about data analysis


1 July 2019
Written by Speak First


 

These days, almost all areas of modern business are results- and data-driven. A sales team needs to know exactly how much they’ve sold and whether or not they’re making the company a profit; the marketing team needs to know how many new clients their expensive new campaign brings in; and even I need to think about the level of engagement on this blog post. So why would anyone expect Learning and Development teams to be any different?

Data analysis has lots of uses within L&D: unless you want to rely on guesswork, training courses should be based on several sets of data, starting with understanding what training is needed (by assessing the organisation’s skill gaps), and ending with measuring results and reporting against key performance metrics. We previously wrote about how important it is to be able to show your training delivers a financial benefit to your organisation. Being able to analyse quantitative results has a big part to play in this.

Given the importance of data analysis to the L&D field, perhaps it’s unsurprising that it’s been predicted that by 2020, 80% of organisations will run training in data literacy.1 This is why we’re looking at the key points L&D professionals should know about analysing data.

 

Don’t underestimate the importance of data analysis

It can be tempting to think that an L&D professional should spend all their time designing or organising learning solutions for their organisation. While this is, obviously, an important part of the job, it’s just as important to analyse information before and after training occurs.

L&D is effectively irrelevant to an organisation if there’s no plan or structure in place. Use KPIs and metrics to see where the skill gaps are and what the company most needs. If the sales team struggles to reach their targets, run sales-focused training; if a team’s performance has dipped since a new manager took over, arrange some coaching for the manager. Without this data-driven approach to learning, L&D professionals end up organising training based on little more than educated guesses and gut feelings, and it should be a lot more scientific than that.
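As a purely illustrative sketch of what that can look like in practice (the team names, targets and figures below are invented, not taken from any real organisation), a quick pass over KPI data is often enough to show where the biggest gaps sit:

```python
# Hypothetical example: quarterly performance against target, by team.
# All team names and figures are invented for illustration.
import pandas as pd

kpis = pd.DataFrame({
    "team":   ["Sales North", "Sales South", "Customer Service", "Operations"],
    "target": [100_000, 120_000, 80_000, 60_000],
    "actual": [82_000, 125_000, 71_000, 61_500],
})

# Shortfall as a percentage of target: positive values mean the target was missed.
kpis["shortfall_pct"] = (kpis["target"] - kpis["actual"]) / kpis["target"] * 100

# The teams with the largest shortfalls are the strongest candidates for targeted training.
print(kpis.sort_values("shortfall_pct", ascending=False))
```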

74% of talent developers say they perform internal skills gap assessments and 66% monitor business KPIs and key metrics to find the most important skills to develop within their organisation.2 However, when it comes to measuring the results of formal learning sessions, only 12% say their methods are effective.3

This disconnect between the comparatively high use of quantitative data to decide which skills to train and the ineffectual post-course measurement can be explained by L&D professionals using the wrong data to review their courses. While key metrics might be analysed before training, the success of learning is often measured by feedback forms handed out after the session, rather than by actual performance results and the use of newly learnt skills. Useful results can only come from the right data sources; conversely, the wrong data gives useless and ineffective results.

 

Data vs Analysis

Although it might seem obvious, it’s important to understand the difference between data and analysis.

Data are the raw facts of a situation – anything from a spreadsheet of sales figures to a database of employee details or stock levels. Data are unchanged, unbiased and usually quantitative.

Analysis, on the other hand, is an explanation of what the data means. For example, an analysis of sales figures might tell you that the sales team are selling above their targets, but have been struggling to sell one particular product.

Be careful when analysing data. It’s easy to jump to conclusions or project a personal bias onto the information. The raw data simply records what happened, but once you start assigning a narrative or meaning to it, it becomes far too easy to spin it to say whatever you want. As the saying popularised by Mark Twain goes, “There are three kinds of lies: lies, damned lies, and statistics.”

As an example: the sales team’s data show they have been selling above their quotas, but less of one particular product. When analysing this in a report, you could suggest several different explanations: 1) the product is particularly unpopular, so it isn’t selling; 2) the other products are particularly popular, so they’re being sold instead; 3) the sales team don’t know how to promote its features properly, so clients don’t buy it.

Just because the data show one set of numbers doesn’t mean they only tell one story. Even if it’s tempting to analyse and report the data in a way that shows your training in a more favourable light, in the long term it’s better to be honest with yourself and others. If something hasn’t been working, people will notice anyway, so take it as an opportunity to learn and improve next time.

Consider A/B testing your training, changing just one thing at a time to see what difference it makes. If a training session usually runs with twenty people, is it more or less impactful when there are just ten participants? This method lets you fine-tune your learning solutions to gradually optimise them, but it takes patience and honesty when analysing your results.
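As a minimal sketch of how such a comparison might be checked (the scores, group sizes and the 0.05 threshold below are assumptions for illustration, not figures from any real course), you could collect a post-training performance score per participant and test whether the difference between the two formats looks real:

```python
# Hypothetical example: post-training assessment scores from two versions of the
# same course, one run with twenty participants and one with ten. All numbers are invented.
from statistics import mean
from scipy import stats

scores_20_person_sessions = [62, 70, 68, 74, 65, 71, 69, 66, 72, 67]
scores_10_person_sessions = [75, 81, 78, 72, 79, 83, 76, 80, 74, 77]

print(f"Mean score, 20-person sessions: {mean(scores_20_person_sessions):.1f}")
print(f"Mean score, 10-person sessions: {mean(scores_10_person_sessions):.1f}")

# Welch's t-test: is the difference likely to be real, or could it just be noise?
t_stat, p_value = stats.ttest_ind(
    scores_20_person_sessions, scores_10_person_sessions, equal_var=False
)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# A small p-value (conventionally below 0.05) suggests the gap is unlikely to be chance alone,
# but repeat the comparison before drawing firm conclusions.
```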

Remember, a single piece of data doesn’t necessarily mean everything follows the same pattern. Scientists always repeat their tests to check their validity, so although you might have found that one course is far more effective online than in person, you shouldn’t start shifting everything online until you’ve seen the same pattern several times. There’s also a big difference between correlation and causation: just because sessions held on a Monday have been the most effective doesn’t necessarily mean it’s because they were on that day of the week.

 

Know what success looks like

When planning your training, always start with the end goal in mind, linked to metrics and KPIs that reflect the organisation’s objectives. It’s obviously pointless to run management skills training when the biggest current skill gap is presentation skills, or customer service training for checkout staff if the business is about to close its stores and move online only. After running the training, measure the same metrics again to see how they have been affected. Has your company won more bids since the team attended the course on successful pitching?

The most effective metrics for measuring success tend to be business process improvement, revenue per employee and proliferation of innovation. Yet these are some of the least used metrics. The most used are course completion rates, employee satisfaction surveys and post-course questionnaires.4 It’s no coincidence that the most used are also some of the easiest and most convenient to collect, even though they have little relevance to what should actually be measured and are much less useful.

Post-course questionnaires are good for seeing whether participants enjoyed the course, whether it was well organised and whether the instructor was good, but not for analysing whether participants internalised anything new or feel confident applying it in their job. Imagine if we only tested how well school children learnt by asking whether they liked their teachers. That is effectively what happens when the success of L&D is judged on post-course questionnaires.

 

Know how to articulate your findings

You might have analysed all the data and discovered that you ran the most successful training programme in history, but if you don’t know how to report this to the relevant people, no one is going to care. L&D isn’t the only industry with a knowledge deficit around understanding and analysing data, so you need to find a way to make your reports simple, understandable and convincing.

Always link back to the key metrics you started with. If the senior management wanted to boost productivity, then you should show them data analysis on this precise issue, such as participants’ work output before and after their training.

Find ways to visualise this clearly. A simple graph showing data from before and after the training can be effective. A picture is worth a thousand words, and an upwards trend in profits can be a more compelling argument than anything else you could say.
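As a rough illustration of that kind of before-and-after visual (the months and figures below are entirely made up), a chart like this takes only a few lines:

```python
# Hypothetical example: bids won per month, before and after a pitching skills course
# delivered at the end of March. All numbers are invented for illustration.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
bids_won = [12, 14, 13, 18, 21, 23]

plt.plot(months, bids_won, marker="o")
plt.axvline(x=2.5, linestyle="--", label="Training delivered")  # between Mar and Apr
plt.ylabel("Bids won")
plt.title("Bids won before and after pitching skills training")
plt.legend()
plt.tight_layout()
plt.show()
```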

Your L&D interventions might have had other unintended results. Perhaps the presentation skills training also helped internal efficiency, as the whole team became clearer communicators. If you can show this with relevant data, then this will help you highlight how training creates more well-rounded workers, but don’t forget everyone’s priorities. If you were asked specifically to increase the number of bids won, then make this the first thing you report back, with all the secondary information afterwards. When someone’s waiting to hear a particular piece of information, they won’t be as interested in anything else you say until they hear about their main priority.

So overall, it’s important to base your learning solutions on good data that is properly and honestly analysed. This will help you to organise learning which is most needed and will be of the most use. You must also put the extra effort into analysing the results of learning, and not just the quality of the session itself. By changing your attitude to data, and becoming data-driven and data-informed, you will see huge improvements in the quality of your L&D and in the effect it has on your company.

 

Check out our training solutions to see which programme is best for your organisation.

 

1 Gartner (2018) How data and analytics leaders learn to master information as a second language

2 LinkedIn (2019) Workplace Learning Report 2019

3 Conduent (2019) The Effectiveness of Learning Measurement

4 Conduent (2019) The Effectiveness of Learning Measurement