We’ve said it before, and we’ll say it again: great online learning achieves both high learner engagement and program scale.
But how do you quantify “high engagement”? In a crowded marketplace of platforms, AI tools, and ready-made courses, all claiming to be the most engaging, how do you evaluate an individual learner’s engagement? And how does that engagement change over time?
Our Hierarchy of Engagement scoreboard is designed to make sense of an individual learner’s journey, reveal how cohorts of learners behave, and link learning metrics to business value.
Why engagement is essential for learning
Engagement is required for effective learning, and effective learning is what drives business outcomes.
“We focus so deeply on engagement because it is the single greatest predictor of learning. You don’t learn anything passively. Learning happens when you’re actively involved: thinking, doing, questioning, trying, failing, and trying again,” says Lauren Gould, Sr. Learning Strategist at Studion.
One study by Accenture found that ten hours of high-quality sales training correlated with a 7% increase in quota attainment. When training is voluntary, it must be engaging enough that learners reach the critical mass of learning required for meaningful business impact.
Understanding how engagement starts, how it is maintained, and how it changes over time is therefore key to making informed strategic decisions about how to use learning to achieve business objectives.
Learning’s progression of business value
Thinking about engagement as a hierarchy is a useful model to understand, predict, and influence learner engagement and its impact on a business.
“The key insight is that there are different forms of engagement, that all can be measured, that their impact on the business can be profoundly different, and that there is a progression of increasing value,” says Studion CEO Furqan Nazeeri.
Each stage in the Hierarchy of Engagement represents a qualitatively different type of engagement, and each can be measured in distinct ways. The value of the hierarchy is not only in naming these levels but in showing leaders how to track progress and make interventions that keep learners moving upward.
Studion’s Hierarchy of Engagement

Level 1: Access
This is the baseline of engagement, and also the easiest to track. It may seem overly simplistic to start an engagement framework with mere access, but after building hundreds of digital learning experiences, we know that getting the basics of user login flows right is essential to program success. At this early stage, focus on metrics that capture time spent learning. While these KPIs are readily available in most learning platforms, determining the “right” amount of time will be unique to each learning experience.
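To make this concrete, here is a minimal sketch of how those access KPIs might be computed from a session export. The data and field layout are hypothetical; most platforms report these numbers out of the box, but rolling them up yourself makes it easy to define your own reporting window.

```python
from datetime import datetime

# Hypothetical session export: (learner_id, login_time, logout_time).
sessions = [
    ("amy", datetime(2024, 3, 4, 9, 0), datetime(2024, 3, 4, 9, 45)),
    ("amy", datetime(2024, 3, 6, 14, 0), datetime(2024, 3, 6, 14, 30)),
    ("ben", datetime(2024, 3, 5, 11, 0), datetime(2024, 3, 5, 11, 20)),
]

total_logins = len(sessions)
unique_active_users = len({learner for learner, _, _ in sessions})
avg_session_minutes = sum(
    (end - start).total_seconds() / 60 for _, start, end in sessions
) / total_logins

# Repeat visitors: learners with more than one session in the window.
visits = {}
for learner, _, _ in sessions:
    visits[learner] = visits.get(learner, 0) + 1
repeat_visitors = sum(1 for count in visits.values() if count > 1)

print(total_logins, unique_active_users, round(avg_session_minutes, 1), repeat_visitors)
```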
Level 2: Knowledge and Skills
This level focuses on what most people think of when they think about formal learning: demonstrating understanding of specific knowledge and/or skills. Skills-based learning can actually be quite complex, especially when mapping learner skills to employer needs, but for our purposes we focus solely on the learner’s journey. At this stage, track a range of assessment metrics; content consumption, while important, is not an indicator of understanding. It’s also important to measure learner progress, or change over time, in order to understand the impact of your learning experience on learner knowledge and skills.
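As a simple illustration (with hypothetical scores), progress can be expressed as the change between pre- and post-assessments rather than the post-assessment score alone:

```python
# Hypothetical pre- and post-assessment scores (0-100) per learner.
pre_scores = {"amy": 52, "ben": 61, "cho": 48}
post_scores = {"amy": 78, "ben": 74, "cho": 70}

# Progress is the per-learner change, not the absolute post score.
progress = {learner: post_scores[learner] - pre_scores[learner] for learner in pre_scores}
avg_gain = sum(progress.values()) / len(progress)

print(progress)            # {'amy': 26, 'ben': 13, 'cho': 22}
print(round(avg_gain, 1))  # average gain across the cohort: 20.3
```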
Level 3: Utility
At this stage, theory transforms into practice: learners go from acquiring knowledge to applying it in the real world. This stage will look radically different depending on the topic and goals of the learning experience itself. Medical device training for surgeons cannot be measured in all of the same ways that professional development for accountants is, but some overlap does exist.
It’s helpful to track impact in the same place you track other business KPIs (a CRM, PRM, or similar tool). Again, it’s essential to measure change and progress here to capture impact. Progress might look like sales teams adopting new frameworks or messaging, or a specific product feature showing increased utilization. Change doesn’t happen overnight; expect the process to take time, and measure how long it takes.
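For example, here is a minimal sketch (with hypothetical field names and data) of a before-and-after comparison that ties a CRM metric to training completion:

```python
from datetime import date

# Hypothetical records: training completion date plus the average number of
# deals closed per month in the quarters before and after training.
reps = [
    {"rep": "amy", "completed": date(2024, 1, 15), "deals_before": 3.1, "deals_after": 3.9},
    {"rep": "ben", "completed": date(2024, 2, 2), "deals_before": 2.4, "deals_after": 2.6},
    {"rep": "cho", "completed": date(2024, 2, 20), "deals_before": 4.0, "deals_after": 4.8},
]

lifts = [r["deals_after"] - r["deals_before"] for r in reps]
avg_lift = sum(lifts) / len(lifts)
pct_improved = sum(1 for lift in lifts if lift > 0) / len(reps) * 100

print(f"Average lift: {avg_lift:.2f} deals/month; {pct_improved:.0f}% of reps improved")
```

A real analysis would also account for seasonality and selection effects, but the principle is the same: measure the change, not just the post-training number.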
Level 4: Value
This stage is about perception. If in the Utility stage learners apply their new skills or knowledge, in the Value stage they recognize the larger impact of those skills. Interestingly, there might be no real change in the KPIs from the Utility stage. Instead, the key indicator of this stage is advocacy, with Net Promoter Score (NPS) the gold-standard metric.
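NPS itself is simple to compute from 0–10 survey responses: the percentage of promoters (9–10) minus the percentage of detractors (0–6). A quick sketch with made-up responses:

```python
# NPS: % promoters (scores 9-10) minus % detractors (scores 0-6).
responses = [10, 9, 9, 8, 7, 10, 6, 9, 4, 10, 8, 9]

promoters = sum(1 for score in responses if score >= 9)
detractors = sum(1 for score in responses if score <= 6)
nps = (promoters - detractors) / len(responses) * 100

print(round(nps))  # 42, on the -100 to +100 scale
```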
Level 5: Preference
The Value stage is often the end goal for both the learner and the organization. But stopping at Value neglects a key impact of learning experiences, especially corporate learning experiences designed for partners and customers: preference. At this level, learners have transcended the original objectives of the learning experience. They now see the organization as a trusted source beyond the confines of the course, and they’ll return again and again. Here, learning experiences become brands unto themselves. Metrics of success will look like broader brand signals, such as long-term retention rates and higher win rates. But Preference can and should influence areas that are harder to measure: outperforming competitors, demand for more learning experiences, and a sense of identity around the program, with learners actively promoting the training as part of their professional identity.
Using the Hierarchy of Engagement to measure and improve training
| Level | Measurement | Improvements |
| --- | --- | --- |
| Access | Focus on time spent learning: user logins, unique active users, session duration, repeat visitors | Focus on reducing friction: simplify login workflows, refine messaging, ensure smooth onboarding processes |
| Knowledge & Skills | Data that demonstrates true skill-building and knowledge acquisition: quiz scores, authentic assessments, simulations, self-reported confidence levels, learning progress over time | Apply best practices in learning design to the learning experience, then re-evaluate |
| Utility | Post-learning survey data and business impact: post-training surveys, sales and marketing data in CRMs, tickets in ERP or CSP systems, analytics in sales enablement or PRM systems, product adoption KPIs for SaaS companies | Support real-world outcomes: make it as easy as possible for learners to apply what they’ve learned with checklists, diagnostic tools, or customer-ready assets like slide decks |
| Value | Track advocacy: NPS scores, referrals, confidence ratings, repeat customers | Lean into advocacy: socialize success stories, collect testimonials, provide forums for learners to share their successes |
| Preference | Measure long-term retention rates, cross-program enrollments, survey responses around “go-to” resources, outperformance of competitors, increased market demand | |
How the Hierarchy of Engagement enables High Engagement at Scale™
While the Hierarchy of Engagement measures a learning program’s performance and maturity, it doesn’t tell you what to do with that information. For example, if your learning program is stuck at Level 2: Knowledge and Skills, how do you move to Utility? The Hierarchy of Engagement also focuses only on the program’s learners, not on competitors. That’s where our High Engagement at Scale™ framework comes into play.
We use High Engagement at Scale™ to build differentiated, strategic digital learning experiences. Our High Engagement at Scale™ framework benchmarks learning experiences along five key components of engagement: Learner-Centered Content, Active Learning, Unbounded Inclusion, Community Connections, and Real-World Outcomes.

But these metrics are designed to measure the learning experience’s overall capabilities relative to industry norms and competitors. High Engagement at Scale™ is not designed to track an individual learner’s engagement level, nor does it track the impact on the business. That’s a job for the Hierarchy of Engagement framework.
We use both frameworks, working in concert, to create world-class digital learning experiences.
“You establish the Hierarchy of Engagement scoreboard so you can see where your partners really are. Then you use the HE@S recipe—learner-centered content, active learning, inclusion, community, and real-world outcomes—to create Signature Moments that pull partners upward,” explains Nazeeri.
Use case: Activate the channel partner long-tail
One application of the Hierarchy of Engagement is in assessing and optimizing learning experiences for the long tail of channel partners. It’s something of a universal truth in channel sales that 80% of revenue comes from 20% of partners. (Sometimes it’s more like 95/5!) To increase channel revenue, you can try to drive more revenue from the top 20%, grow your overall partner network, or engage the underperforming 80%. We advocate using training to accomplish the latter. High-performing partners are likely to land at the Value or Preference levels of the Hierarchy of Engagement. First, segment partners by their current engagement level. Next, deploy targeted learning strategies to move each segment up the hierarchy. Partner revenue should follow partner learning engagement.
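As a rough sketch of what that segmentation might look like (partner names, revenue figures, and level assignments below are entirely hypothetical):

```python
# Hypothetical partner records: annual channel revenue and current
# Hierarchy of Engagement level.
partners = [
    {"name": "P1", "revenue": 1_300_000, "level": "Preference"},
    {"name": "P2", "revenue": 120_000, "level": "Value"},
    {"name": "P3", "revenue": 80_000, "level": "Knowledge & Skills"},
    {"name": "P4", "revenue": 60_000, "level": "Access"},
    {"name": "P5", "revenue": 50_000, "level": "Access"},
]

# How concentrated is revenue in the top 20% of partners?
by_revenue = sorted(partners, key=lambda p: p["revenue"], reverse=True)
top_n = max(1, len(partners) // 5)
top_share = sum(p["revenue"] for p in by_revenue[:top_n]) / sum(p["revenue"] for p in partners)

# Segment the long tail by engagement level to target learning interventions.
segments = {}
for partner in by_revenue[top_n:]:
    segments.setdefault(partner["level"], []).append(partner["name"])

print(f"Top {top_n} partner(s) drive {top_share:.0%} of channel revenue")
print(segments)
```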
Increase engagement to impact outcomes
The business impact of learning programs is notoriously difficult to track. By focusing on a spectrum rather than a single KPI, businesses can gain a holistic view of a learning program’s impact on both learners and the business. “Engagement is still the scoreboard. And when companies invest in engagement, the business results follow: better performance, stronger loyalty, and measurable ROI,” says Nazeeri.


