Learning analytics has become one of those phrases everyone uses but few people feel confident with. For learning designers, it can sometimes feel like data belongs to someone else: analysts, platform administrators, or the people in charge of dashboards. But the reality is that learning analytics is most powerful when it informs design rather than just reporting after the fact. When we treat analytics as part of our toolkit rather than an afterthought, it becomes far easier to make meaningful improvements that enhance learner experiences and demonstrate real impact.
The first step is asking the right questions. All the numbers and charts in the world are meaningless if we're not asking the right ones. Oddly, this was hammered home to me when I learned Excel. I thought that if I could do all the formulas, I'd get the information I needed. I was wrong. I can google a formula; I can't google the question I need to ask to get the information. When I'm looking at a course, the questions are about what learners struggle with, if and where they drop off, which activities lead to deeper engagement, and so on.
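To make that concrete, here is roughly what turning one of those questions ("where do learners drop off?") into a query can look like. This is a minimal sketch, assuming a hypothetical LMS export with one row per learner per activity; the file name and column names are my inventions, so adjust them to whatever your platform actually gives you:

```python
import pandas as pd

# Hypothetical export: one row per learner per activity, with a boolean
# "completed" column. These names are assumptions, not a real LMS schema.
events = pd.read_csv("events.csv")  # columns: learner_id, activity, completed

# Share of learners who started each activity but never finished it.
drop_off = (
    events.groupby("activity")["completed"]
    .agg(started="count", finished="sum")
    .assign(drop_off_rate=lambda d: 1 - d["finished"] / d["started"])
    .sort_values("drop_off_rate", ascending=False)
)

print(drop_off.head())  # the activities losing the most learners
```

A dozen lines like this won't replace a proper analytics platform, but they answer a specific design question, which is the point.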
When you approach analytics from a design perspective, the data becomes less intimidating because it has a purpose: it is going to answer some very specific questions. For example, when working on a course recently, I noticed that a high percentage of learners abandoned one specific activity. At first glance it looked like a motivation problem (I liked the activity), but after digging into the analytics and qualitative feedback, it turned out to be a clarity issue. Not great for my ego, but with this knowledge I could look at the task again through a new perspective. It turns out the instructions were too long, and the cognitive load was too high. A simple rewrite and a short example fixed the problem almost instantly. (This aligns strongly with the principles in WCAG Understandable.)
I didn’t get there with numbers alone though. Quantitative data needs to be balanced with qualitative data. Comments, open‑text responses, discussion threads, even the way people interact with scenario activities, can all tell us something about our courses and students. (If you’re exploring ways to strengthen engagement, you might also enjoy The Power of Storytelling in eLearning.)
Analytics should be a blend of both, but again, it can be overwhelming without a clear question to answer. Personally, I find that some of the most telling information appears in the smallest comments. A learner mentioning that they “weren’t sure what to do next” or that they “felt overwhelmed by the information” can explain patterns that numbers alone can’t. This combination of data types gives a fuller picture, one that helps you design with empathy rather than speculation.
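If you're facing a large pile of open-text responses, even a crude keyword scan can surface the comments worth reading first. A rough sketch, where the phrase list and the feedback file are illustrative assumptions rather than any real tool:

```python
# Phrases that tend to signal confusion or overload; extend from your
# own learners' vocabulary. This list is an assumption, not a standard.
CONFUSION_SIGNALS = [
    "not sure what to do",
    "wasn't clear",
    "confusing",
    "overwhelmed",
    "too much information",
]

# Hypothetical file: one free-text comment per line.
with open("feedback.txt", encoding="utf-8") as f:
    comments = [line.strip() for line in f if line.strip()]

flagged = [
    c for c in comments
    if any(signal in c.lower() for signal in CONFUSION_SIGNALS)
]

print(f"{len(flagged)} of {len(comments)} comments mention confusion:")
for comment in flagged[:10]:
    print("-", comment)
```

This only triages; you still read the comments yourself. But it pairs nicely with a drop-off chart: the numbers tell you where to look, the words tell you why.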
Working with analytics also means getting comfortable with iteration. Learning design has traditionally operated on long production cycles, especially in higher education, but analytics naturally pushes us toward continuous improvement. Iterating on shorter timescales makes sense as everything, at least in the spaces I'm aware of, moves towards faster turnaround times. Instead of waiting until the end of a module or academic year, or even multiple-year cycles, small data-informed adjustments can be made as the course runs.
It is important not to overinterpret small sample sizes; early‑stage anomalies can look dramatic but often flatten out once more learners engage. It’s also surprisingly easy to slip into designing for the metric instead of the learning outcome. A shiny dashboard number might go up, but that doesn’t mean the learning has improved. (A nice complement to this idea is explored in How to Implement Kirkpatrick’s Four Levels of Evaluation: A Practical Guide).
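One way to keep yourself honest about small samples is to put an interval around the number instead of trusting the point estimate. Here's a minimal sketch using the Wilson score interval, which is standard statistics rather than anything platform-specific:

```python
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score confidence interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# The same observed completion rate (60%) carries very different certainty
# depending on how many learners it is based on.
for done, total in [(3, 5), (30, 50), (300, 500)]:
    low, high = wilson_interval(done, total)
    print(f"{done}/{total} completed: plausibly {low:.0%} to {high:.0%}")
```

Three of five learners completing looks identical to three hundred of five hundred on a dashboard, but the first is compatible with almost anything; that's the "dramatic anomaly" that flattens out later.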
At the same time, changing the order of two activities, simplifying a scenario, or adding a quick recap can still make a measurable difference, and analytics gives us evidence to experiment with confidence. You're not redesigning based on a hunch; you're redesigning based on patterns that keep emerging. That shift is one of the biggest advantages of embedding analytics in design practice.
Of course, analytics only becomes meaningful when it is contextualised, and interpreting it through the lens of learning theory is crucial. Understanding concepts like cognitive load, retrieval practice, and motivation helps you avoid misinterpreting the data. It also ensures your decisions align with pedagogical purpose rather than surface-level metrics. Embedding analytics within the design process is one way to keep control of the narrative, making sure the numbers tell a learning story rather than a bums-on-seats story. A high completion rate might look good on paper, but if learners are clicking through rapidly without engaging, it's a false indicator of success. Likewise, a challenging assessment might produce lower scores, but if it encourages deeper reflection or better long-term retention, it could still be a win. (If you want a quick refresher, Key Learning Theories Every Designer Should Know expands nicely on these foundations.)
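One concrete check on that "false indicator" problem is to cross-reference completion against time on task. Another sketch, again assuming a hypothetical export with invented column names and an arbitrary 30-second threshold you'd tune to the activity:

```python
import pandas as pd

# Hypothetical log: one row per learner per activity, with how long they
# spent on it. Column names and the thresholds below are assumptions.
log = pd.read_csv("activity_log.csv")  # columns: activity, completed, seconds_spent

summary = log.groupby("activity").agg(
    completion_rate=("completed", "mean"),
    median_seconds=("seconds_spent", "median"),
)

# High completion plus a very low median time suggests click-through,
# not learning. Flag these for a closer qualitative look.
suspicious = summary[
    (summary["completion_rate"] > 0.9) & (summary["median_seconds"] < 30)
]
print(suspicious)
```

The output isn't a verdict, just a shortlist: it tells you which "successful" activities deserve the same scrutiny as the failing ones.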
There are also ethical considerations to keep in mind. Data must be treated with care, especially when it reflects learner behaviour. Analytics shouldn’t be used to surveil or punish but to support and improve learning. Learners should feel comfortable with us having access to their data, trust our intentions, and know what data is collected and why. Transparency is essential. I often include a short explanation at the start of a course about how analytics helps improve the learning experience. When you frame analytics as a partnership, not a judgement tool, learners tend to engage more openly. (If you’re exploring wider ideas around learner equity, A Designer’s Guide to WCAG Perceivable Principles pairs well with this.)
Ultimately, learning analytics is about noticing patterns, asking questions, and using evidence to make learning more accessible, inclusive, and effective. When designers look at analytics with that mindset, when they start being curious about what the data can tell them, the data stops being a burden and becomes a creative ally, a partner that helps us make decisions that genuinely improve the learner experience. (For a view of where this is heading more broadly, The Future of AI and Machine Learning in Education offers a helpful perspective.) And that is where data becomes meaningful.