
How to Bridge the Gap Between Learning Measurement and Business Impact in a Skills-Based World

Published on November 26, 2025 | 11 min read

Learning leaders have lived with the same frustration for years. The business keeps asking for proof that learning and development (L&D) works. Pros in the corporate learning space point to course completions, smile sheets, and learning hours. Everyone leaves the conversation a little unsatisfied.

Now the pressure is rising. Skills are at the center of talent strategies. AI is reshaping work. Budgets are tighter. L&D is expected to act less like a content factory and more like a strategic partner that can move real business needles.

That shift is precisely what Claude Werder, Senior VP and Principal Analyst at Brandon Hall Group, and Steve Lange, Director of Consulting for Metrics That Matter by Explorance, explored in their recent Excellence at Work podcast. Their conversation focuses on a simple but urgent idea: if organizations want a skills-based strategy, they need a skills-based measurement strategy that connects learning, capability, and business impact.

This article pulls out the key insights from that discussion and turns them into a practical roadmap. It covers how to align skills and strategy, move beyond traditional metrics, connect skills to outcomes, use AI and custom frameworks wisely, and bring managers into the center of skills application.

You can watch the full interview on all major podcast platforms, including YouTube.

1. Strategic Skills Alignment: Turning Skills into Business Outcomes

Most organizations say they care about skills. Fewer can explain how specific skills ladder up to the outcomes that matter most to the business. That is where strategic skills alignment starts.

Steve frames the context very clearly. L&D is under pressure to be more focused, more selective, and more connected to real business problems.

L&D is becoming, and has to become, much more agile with fewer resources. Therefore, they need to know what's working, what's not working, and why.

Strategic alignment means skills are never an end in themselves. They sit within a chain that runs from business goal to critical job to required skill to learning experience to on-the-job application, and finally to measurable outcomes.

Start With Critical Business Problems

Steve recommends starting from the top, not from a generic skills library.

You have to start with what are the most important jobs that need to be done in the business, and then, what are the skills and expertise that you need for those jobs.

That sounds simple, but it is a powerful filter. Instead of trying to maintain a massive skills taxonomy for every role, he suggests coming back to a few critical issues:

  • Identify the three to five most important business problems or priorities in a given time frame.
  • Map the roles that most directly affect those problems.
  • Define the critical skills those roles must demonstrate to move the needle.

Once those are clear, L&D can look at the learning portfolio through a different lens. The question shifts from “What content do we have?” to “Which experiences build the specific capabilities that matter most?”
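The top-down filter described above can be sketched as a simple mapping from business priorities to roles to critical skills. The priorities, roles, and skill names below are hypothetical examples, not taken from the podcast:

```python
# Hypothetical sketch of a top-down skills filter.
# All priorities, roles, and skills here are illustrative.
priorities = {
    "Reduce customer churn": {
        "roles": ["Customer Success Manager", "Support Engineer"],
        "critical_skills": ["root-cause analysis", "proactive outreach"],
    },
    "Ship the new product line on time": {
        "roles": ["Project Manager", "QA Lead"],
        "critical_skills": ["risk management", "test automation"],
    },
}

def skills_in_scope(priorities):
    """Return the deduplicated, sorted set of skills tied to current priorities."""
    return sorted({s for p in priorities.values() for s in p["critical_skills"]})

print(skills_in_scope(priorities))
```

The point of the sketch is the shape of the data: the skills list is derived from the priorities, so when a priority drops off the list, its skills fall out of scope automatically instead of lingering in a generic taxonomy.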

Build A Skills Stack, Not Just Single Events

Skills growth usually does not happen in a single course. Steve gives a helpful way to think in stacked experiences.

First, I can take some microlearning on these three courses, then I can take an advanced course on one of those skills. Then I can take another course that focuses on all of them again.

That stack should align with business outcomes. It also becomes the basic unit of measurement: the path learners take to build and apply targeted skills, not just the last course they completed.
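Treating the stacked path, rather than the last course, as the unit of measurement could look something like the sketch below. The course titles and completion data are hypothetical:

```python
# Hypothetical sketch: a skill-building path as the unit of measurement,
# rather than a single course. Titles are illustrative examples.
path = {
    "target_skill": "six sigma problem solving",
    "experiences": [
        {"type": "microlearning", "title": "Intro to DMAIC"},
        {"type": "course", "title": "Advanced Root-Cause Analysis"},
        {"type": "capstone", "title": "Applied Process Improvement"},
    ],
    "completed": ["Intro to DMAIC", "Advanced Root-Cause Analysis"],
}

def path_progress(path):
    """Fraction of the stacked experiences completed so far."""
    total = len(path["experiences"])
    done = sum(1 for e in path["experiences"] if e["title"] in path["completed"])
    return done / total

print(round(path_progress(path), 2))  # 2 of 3 experiences completed -> 0.67
```

Reporting at the path level answers "how far along is this capability?" instead of "was this one course finished?".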

Practical Takeaways

  • Start every skills conversation with two questions: What business problem are we solving, and which roles matter most to that problem?
  • Build a shortlist of critical skills tied directly to those roles and problems.
  • Organize learning into coherent skill-building paths instead of disconnected courses.
  • Treat skills measurement as part of a bigger chain that includes business outcomes, not as a separate initiative.

2. Beyond Traditional Metrics: From Completions to Capability Evidence

Most L&D dashboards still orbit around volume and satisfaction: enrollments, completions, time spent, “Did you like the course?” scores. Those metrics describe activity. They do not prove capability.

Steve is clear that the core goal of effectiveness has not changed. What is changing is the level of focus.

We've always wanted to measure the elusive transfer of training or on-the-job application of training. That's part of the holy grail of training effectiveness. With the skills, the conversation now is more focused.

Instead of asking whether training in general is effective, the question becomes whether a specific skill has been acquired and applied in a way that matters to the business.

Defining What “Skill Acquisition” Really Means

One of the most significant barriers is definitional. Organizations say they want to measure skills, but they often have not agreed on what counts as evidence. Steve describes the challenge:

What does the acquisition of a skill mean? Is it a score on a written test? Are there application exercises during training so someone can get certified …? Is it a checklist that needs to be verified by an SME or a manager a few weeks after training?

The answer, in his view, is that there is no single indicator. Skills acquisition is a process that plays out over time and across different contexts.

It's really tough. It's a process that needs to be defined, tracked, and scalable, and connected to an outcome. So it's not just one thing. It's a lot of things over a timeline or a stage of events that need to happen.

Expanding Measurement Across The Timeline

Rather than rely on one survey or one test score, Steve advocates a multi-touch approach:

  • Before training: baseline assessments, goal setting, or current performance data.
  • During training: knowledge checks, practical exercises, simulations.
  • Shortly after training: learner confidence and intent to apply.
  • Weeks or months later: manager observations, checklists, performance data, project outcomes.

That kind of timeline requires more coordination, but it produces much more substantial evidence than a single post-course rating.
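One way to keep that multi-touch timeline honest is to track which stages actually have evidence behind them. The sketch below assumes a four-stage schema matching the list above; the stage names, sources, and scores are hypothetical:

```python
from dataclasses import dataclass

# Hypothetical sketch of a multi-touch evidence timeline for one skill.
# Stage names and records are illustrative, not a prescribed schema.
@dataclass
class EvidencePoint:
    stage: str   # "before", "during", "shortly_after", or "follow_up"
    source: str  # e.g. baseline test, simulation, manager checklist
    score: float # normalized 0..1

def coverage(points):
    """Report which stages of the timeline have at least one evidence point."""
    expected = ["before", "during", "shortly_after", "follow_up"]
    observed = {p.stage for p in points}
    return {stage: stage in observed for stage in expected}

timeline = [
    EvidencePoint("before", "baseline assessment", 0.45),
    EvidencePoint("during", "simulation exercise", 0.70),
    EvidencePoint("shortly_after", "learner confidence survey", 0.80),
]

print(coverage(timeline))  # the "follow_up" stage is still missing evidence
```

A coverage check like this makes gaps visible: here the program has no follow-up data yet, so any claim about on-the-job application would rest on early signals only.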

Practical Takeaways

  • Treat skills acquisition as a process that produces several different kinds of data over time.
  • Define, in advance, what will count as evidence of acquisition and application for a specific skill.
  • Keep traditional metrics, but treat them as early signals, not proof of impact.
  • Build measurement points before, during, and after learning, so you can see progression rather than isolated snapshots.

3. Skills And Business Impact: Building The Chain Between Learning And Outcomes

The most important part of a skills strategy is connecting skills to business results. Without that link, organizations risk repeating the same old story: enthusiastic learning, impressive registrant numbers, no clear business case.

Steve captures this bluntly.

It's got to be linked to a business problem. It can't be just, ‘We need more people skilled in AI.’ Okay, but why? What for? What is that going to do? And what is that going to get us?

The same logic applies beyond AI. Earning a certification, completing a program, or finishing a curriculum only matters if those capabilities translate into outcomes the business values.

From Certificates To Real-World Results

Steve uses the example of project management and Six Sigma.

For example, I've earned the Project Management Professional (PMP) certification, or maybe I've earned my Six Sigma Green Belt. These are great skills to have, but what outcome are we trying to achieve for me by having them? Is it more projects on time and under budget? Is it more efficient processes with less waste?

This is where L&D and business leaders need to work together. They need to articulate the result that will count as success for a given skill investment: fewer errors, higher conversion rates, better customer satisfaction, reduced cycle times, higher retention, and so on.

Steve often talks about this as building a chain of evidence.

Do we have enough, I'll say, proof or evidence, or links in the chain that you're building? … Both before and during training, maybe 60 days down the road, you've got certain things, certain levels of rigor showing that you've had skill development in these areas.

That chain allows L&D to present a narrative that feels credible to executives: what skills were targeted, how they were built, how they were applied, and what moved in the business as a result.

Focus Where It Matters Most

Not every program needs a full ROI study. Steve uses Patti Phillips as a reference point on that topic and stresses the importance of focus.

You might not be able to do that with all your skills courses either. You have to start somewhere.

This is where the earlier idea of focusing on the most critical jobs and issues comes back into play. Put your most rigorous measurement around the programs that support those priorities.

Practical Takeaways

  • For every targeted skill, define the specific business outcomes you expect to influence.
  • Work with business stakeholders to identify which measures will count as credible evidence of progress.
  • Treat measurement as a chain of linked data points, not a single number.
  • Reserve your most rigorous, resource-intensive measurement efforts for the few skill areas with the highest strategic value.

4. AI And Custom Frameworks: Making Skills Strategies Smarter, Not Just Faster

It is easy to fall into a narrow view of AI in learning: faster content, easier asset creation, and quick knowledge summaries. Steve warns against seeing AI only as a way to speed up production.

He recalls a conversation with a group of CLOs about this tension.

Everyone's talking about how AI can help L&D provide or produce more content, right? You can do this faster now. You don't need to interview 15 SMEs. You can just interview AI and build all this stuff really quickly.

One CLO in that conversation took a different angle.

Instead of asking how AI can help us build more content, because that's not really what we need, it's about how we can be more efficient at what we're doing … If we get AI to help us do those things, then we can start linking those to business goals and objectives, not just for L&D but in the business overall.

AI As A Pattern Finder And Design Partner

AI is particularly good at spotting trends and connections in complex data sets. That is exactly what skills measurement needs.

AI's really, really good at picking out those patterns, those trends that we might be missing, to let us know that, well, maybe not this direction, maybe that direction, or something along the lines of no one's going over here, so we need to put people over there.

Many HR and learning technologies are already building AI into their capabilities.

I know a lot of the HR tools — like LMSs, LXPs, and other tools — are building these taxonomies with AI, so they're getting smarter.

That means AI can help organizations:

  • Clean up and rationalize large skills libraries.
  • Prioritize the skills that have the strongest relationships with key business metrics.
  • Compare current workforce skills against future needs and highlight gaps.
  • Suggest learning paths that align with those gaps and strategic goals.
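Stripped of the AI layer, the gap-highlighting step in that list reduces to comparing current workforce skills against projected needs. A minimal sketch, with hypothetical skill names and headcounts:

```python
# Hypothetical sketch: highlight gaps between current and needed skills.
# Figures are people per skill and are purely illustrative.
current_skills = {"data analysis": 12, "prompt engineering": 2, "six sigma": 5}
future_needs   = {"data analysis": 15, "prompt engineering": 10, "ml ops": 4}

def skill_gaps(current, needed):
    """Return the skills where current headcount falls short of projected need."""
    return {
        skill: required - current.get(skill, 0)
        for skill, required in needed.items()
        if current.get(skill, 0) < required
    }

print(skill_gaps(current_skills, future_needs))
```

An AI-assisted taxonomy would feed richer inputs into a comparison like this (inferred proficiency levels, adjacent skills), but the output still has to be sanity-checked against real business priorities, as the takeaways below note.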

Custom Portfolios And Frameworks

Steve also sees a future in which organizations manage diverse skill portfolios tailored to specific objectives or populations. Each portfolio can have its own measurement framework, tuned to the outcomes that matter most for that group. AI then helps maintain and refine those portfolios over time.

Practical Takeaways

  • Use AI to sharpen your skills strategy, not just to speed up course creation.
  • Let AI help you find patterns in skills, learning, and performance data that humans might miss.
  • Build tailored skills portfolios for priority objectives or groups, each with its own measurement approach.
  • Ensure that AI-driven taxonomies and recommendations are always cross-checked against real business priorities, not just system logic.

5. Managerial Involvement: Making Skills A Team Sport

No matter how smart the measurement framework or how advanced the technology, skills only come to life on the job. That is where managers become essential.

Throughout the conversation, Steve emphasizes that acquiring and validating skills requires coaching, follow-up, and accountability beyond the classroom.

You've got to have something built into your process that either documents it or showcases it, or something where you can go back and know that, okay, yes, Steve went to this course. He passed this test, but then he also did this project where he applied all that training, and it was a three-person panel, and they checked him off on 35 pieces of criteria.

Managers, mentors, or subject-matter experts are often in the best position to provide that validation. Near the end of the podcast, he describes this as a true partnership between L&D and the business.

I just delivered a workshop today to a client, who talked about how it's a team sport, right? Especially with something like skills. L&D can develop the most excellent training in the world. But the point is that they're applying that back on the job, and we don't see what happens. So, we need that help to be able to say skills were acquired, and yes, they are being applied.

Redefining The Manager’s Role

This pushes managers into a more active role in development, far beyond approving training requests. They need to:

  • Help define the skills and outcomes that matter most for their teams.
  • Create opportunities for employees to apply new skills quickly.
  • Observe and document skill use using checklists, projects, or performance data.
  • Act as ongoing coaches and accountability partners.

That is a mindset shift for many leaders. It also requires L&D to support managers with tools, prompts, and simple processes that fit into the flow of work.
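A manager sign-off of the kind Steve describes (a panel checking a learner against a list of criteria) could be captured in a record as simple as the sketch below. The names, threshold, and criteria counts are hypothetical:

```python
from dataclasses import dataclass

# Hypothetical sketch: a manager/SME sign-off record for observed skill
# application. The 80% threshold is an illustrative assumption.
@dataclass
class SkillObservation:
    employee: str
    skill: str
    criteria_met: int
    criteria_total: int
    verified_by: str

    def is_validated(self, threshold=0.8):
        """Validated when the share of criteria met reaches the threshold."""
        return self.criteria_met / self.criteria_total >= threshold

obs = SkillObservation("Steve", "project management", 30, 35, "panel of 3 SMEs")
print(obs.is_validated())  # 30/35 is about 0.86, above the 0.8 threshold
```

The value of a lightweight record like this is less the arithmetic than the audit trail: who observed the skill, against what criteria, and how long after training.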

Practical Takeaways

  • Build explicit manager actions into every significant skills initiative.
  • Give managers simple, scalable tools for observing and confirming skill use.
  • Encourage post-training check-ins that focus on application, not just “How was the course?”
  • Treat skills development and measurement as a shared responsibility between L&D and the business, not a function that sits in a single department.

Building A Skills Measurement Strategy That Actually Moves The Needle

Skills-based transformation will not succeed on content volume alone. The organizations that win will be the ones that can show a clear, credible line from targeted skills to real outcomes.

The conversation between Claude Werder and Steve Lange points to a future where L&D leaders:

  • Start with the business problems that matter most and align skills from there.
  • Move beyond traditional metrics toward multi-touch evidence of acquisition and application.
  • Treat skills and impact as parts of the same chain, not separate topics.
  • Use AI and custom frameworks to bring clarity, prioritization, and scale to skills strategies.
  • Bring managers into the center of skills development as coaches, validators, and partners.

As Steve puts it, the core measurement philosophy does not need a complete overhaul.

You don't necessarily need to veer too far from what you're doing, but you still want to focus on how to answer the question the business asked: We need to help with these skills.

For L&D, CHROs, and business leaders, the opportunity is clear. Start small. Focus on a few critical skills tied to high-stakes outcomes. Build a realistic chain of measurement. Pull managers and AI into the process.

Do that consistently, and learning measurement stops being a backwards-looking report. It becomes a forward-looking instrument for building the capabilities your organization needs next.
