Shifting Trends in L&D Technology & Measurement

Welcome to our “Leaders in Corporate Learning” blog series! We sat down with Robert Young, Director of Learning Analytics and Technology at Charter Communications. Robert’s career has traveled an unusual path, starting in law enforcement before bringing his expertise to corporate learning and development. In his current role, Robert has visibility into the impact of training on tens of thousands of employees, which gives him a unique view into some of the trends, challenges, and technologies impacting corporate learning today.

How did you start your path in learning and development?

I started my career as a police officer in the Los Angeles area, where I worked for 11 years. The last few years I spent leading the department’s training function. Moving into training wasn’t the most common path for a police officer, but I took the role because I believed it would allow me to multiply my influence and make a bigger impact on the department’s mission.

When an injury on the job caused me to evaluate new career paths, I returned to school and entered the corporate world as a training facilitator in 2007. I’ve stayed in corporate learning & development ever since, progressing to my current role in analytics and technology.

How has the role of L&D changed since you started?

Over the course of my 14 years in corporate training, I’ve seen the rise and fall of many learning and development trends. But the most notable has been the shift to “consumerization of learning.” This is the shift to thinking of employees as consumers, with access to many nontraditional outlets for both personal and professional learning. Your corporate training is but one of many learning channels competing for their time.

In the old model, training was something that you “went someplace” to do, whether in-person or in a learning management system (LMS). It had some formal stamp of authority from a company or a school. Now, when you want to learn something, you can go online and watch a YouTube video or read WikiHow or Wikipedia, and the primary criterion is “does it work?”. The existence of these alternatives, and this new competition for a learner’s attention, means that companies need to treat learning more like a product. It’s no longer a given that employees have to use the training you deliver. I keep this in mind when developing the strategy and tactics around our current L&D initiatives. Is the training we deliver meaningfully different and valuable enough to command their time and interest? The answer needs to be yes.

Are there any other trends changing the role of corporate L&D?

I’m seeing greater emphasis placed on developing our people and delivering learning that adds real value to their lives. Soon I expect that all corporate learning teams will need to be able to answer questions such as:

  • How do we know our people are growing their skills and careers?
  • What are we doing to support their professional development?
  • Is the learning we’re providing worth it to the learner?
  • How are we supporting the next generation of leadership?

When you consider the future of any organization, you should be thinking about the people on your “bench,” to use a sports analogy. Somewhere right now, in some organization, somebody is starting in a manager role who may be the CEO of Charter in 25 years. You need to have a plan to develop this next generation of leaders currently at the end of your bench.

Your job title is learning analytics, so we have to ask you about training ROI. How do you approach the challenge of measuring the ROI of training programs?

Determining how to measure the impact of training is something every L&D team faces. It’s also one of my biggest responsibilities at Charter.

We often try to frame the value of training in terms of ‘the company made X amount of money’ or ‘we saved X amount of money or time’ as the result of training. Those are good targets, and we pursue them when we can, but it’s often difficult to draw those conclusions definitively. Four factors make the calculation of training value complex:

  • Defining Success: It’s tough to create a consistent approach to showing value in a company the size of Charter, with 96,000 employees and a variety of training programs and learning teams operating simultaneously, all with different objectives. In addition, my internal clients often don’t have a clear metric that they can target to measure the success of a training program. They’re trying to drive a specific behavior within their department or team, but they can’t identify a clear metric or KPI that they feel confident will change as a result of the training. This difficulty in identifying business KPIs that training will credibly impact is often a big obstacle to defining training success.

  • The Applicability of Evaluation Models: We use the Kirkpatrick Model of Evaluation as our fundamental model to measure training effectiveness. Although it works well at the initiative level, it can be hard to extrapolate that method equally across hundreds of initiatives and projects taking place throughout our large company. Not every learning experience can be evaluated all the way to a level 3 or 4. The question I’m asking myself as we push our analytics capabilities forward is: how do we identify, at scale, the value of the training when the standard model of evaluation doesn’t fit the larger need? While I really like Kirkpatrick, I don’t think it’s the answer to that question. The learning field needs a different model to answer that question in a large, multi-faceted organization.

  • Sharing the Success: It’s easy to point to factors within a training program that made it successful, like strong manager support, great design, and the dedication of the learner. But when the targeted business initiative is successful, it’s more challenging to associate that success with the underlying training program. Even if a program is designed for that specific business outcome, people will (rightly) argue that there are too many other contributing factors, such as excellent business strategy or great management, to attribute that success solely to the training program. While this argument is valid, it’s an important conceptual obstacle for L&D teams to overcome. Incontrovertible proof that a learning initiative was the sole or even primary factor in business success is an exceedingly high standard. Well-researched, data-based correlations are often a more realistic target.

  • Data You Can Trust: It can be tough to identify and prove the training metrics that matter. We often default to completion data, because it’s the most consistently available evidence of value, and it’s the easiest data to track at scale. Even when starting with this relatively simple metric, you have to assess the value of completion data in different ways for the different types of media you’re using. For some types of media, you can simply measure completion in a binary fashion: did they access the training or not. But for certain media such as video, you should look deeper into engagement data to find insights. For example, when a learner finds their answer 20 seconds into a 60-second video, you’ve delivered value to the learner, even though the completion percentage may suggest otherwise. That insight might also lead you to consider whether the video should be shortened to highlight the key point.

Although I focus here on completion rate, I recognize that this metric alone is not sufficient for training measurement. But it’s a good starting point to build on.
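The distinction above between raw completion rate and actual value delivered can be sketched in code. This is a minimal, hypothetical illustration (the `VideoView` schema, the `found_answer` signal, and the 90% completion threshold are all assumptions, not Charter’s actual analytics model):

```python
from dataclasses import dataclass

@dataclass
class VideoView:
    """One learner's interaction with a video asset (hypothetical schema)."""
    video_length_s: int   # total video length in seconds
    watched_s: int        # seconds the learner actually watched
    found_answer: bool    # learner signal that they found what they needed

def completion_rate(view: VideoView) -> float:
    """Naive metric: fraction of the video watched."""
    return view.watched_s / view.video_length_s

def delivered_value(view: VideoView) -> bool:
    """Engagement-aware metric: a view counts as valuable if the learner
    either (nearly) finished the video or found their answer early."""
    return view.found_answer or completion_rate(view) >= 0.9

# The example from the interview: a learner finds their answer
# 20 seconds into a 60-second video.
view = VideoView(video_length_s=60, watched_s=20, found_answer=True)
print(round(completion_rate(view), 2))  # 0.33 -- looks like a failure
print(delivered_value(view))            # True -- but value was delivered
```

The point of the sketch is that the two metrics can disagree: a 33% completion rate reads as a failed interaction, while the engagement-aware view correctly counts it as a success.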

When analyzing learning, what data is missing today that you wish you had?

Consistent, meaningful measures of the impact that a training program has on a defined business metric. If I could find a magic button to implement that tomorrow, I would.

What learning technologies do you count on most in your role?

Part of my role is identifying learning technologies that can help improve the organization’s training programs, and subsequently, learning retention.

Like many other organizations spanning all industries, we use an LMS as the fundamental system of record. It serves as the company’s formal training distribution and tracking system but isn’t sufficient in itself. Another tool we rely on, that not everyone is taking advantage of, is a learning content management system (LCMS).

Our LCMS allows our team flexibility and creativity in the creation and distribution of learning. It’s helping us in the push to meet the learner at the point of need. If you can get your content into the learner’s space without them having to chase it, the learning becomes more natural and contextual.

Another way we achieve more seamless learning experiences is through a third-party learning reinforcement platform that delivers training content as microlearning. This allows us to extend engagement, reinforce key points over time, and track whether employees are remembering and applying what they learned.

What’s the next big thing in learning technology?

My money’s on augmented reality. We can already imagine a scenario where a technician in the field points a phone at a piece of equipment and receives all the information they need to fix a problem, including core troubleshooting processes for that specific model of modem. The same AR tools could potentially apply to anything a device’s image sensor can identify. That type of instant access can make work dramatically easier, supporting correct and efficient action in the context of real work. As augmented reality becomes more scalable and cost-effective, using it to solve practical problems is a big part of where learning is headed.

This post was originally published on September 1, 2021.
