The end goal for all learning initiatives should be to prove that they have a direct and measurable impact on business performance. By measuring what is working, what isn’t, and whether your initiatives are landing with your audiences, you can understand what needs to improve and which barriers are negatively affecting the performance of individuals and the business.
StoryShare Learn is a micro-learning app that can be personalised to your business, helping you measure and prove your learning success from within our platform. It links in real time to powerful BI tools such as Power BI and Tableau, where data from multiple business applications can be combined to generate views with valuable insights, track trends over time, and help you prove impact. Plus, they’re easy to use.
We believe there are three key areas in which we must measure the impact and results of our learning initiatives: Knowledge, Learner Behaviour, and Improve & Evolve. Let’s take a look at the StoryShare Insights Framework to see how we prove learning through what we measure, where, and how.
1. Knowledge
The whole point of implementing learning initiatives is so the people completing them can learn something new, or reinforce their knowledge of products, skills, systems, and so on. The goal is to prove that knowledge is being transferred to your people and that any skill or competency can be demonstrated in their individual performance.
StoryShare Learn measures this by analysing a combination of data, ranging from short-burst and randomised quizzes, to how many times an individual has viewed a specific content asset, to the amount of time they spend in each session.
From within our platform we analyse every detail and statistic, such as quiz scores, completions, badges earned, page views, time spent on a resource, and much more, supplying every metric you need to measure the performance and impact your learning content is having on an individual or group.
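As a rough illustration of how metrics like these roll up into a per-learner knowledge summary, here is a minimal sketch. The field names (`quiz_score`, `completed`, `time_spent_mins`) and the sample data are assumptions for illustration only, not StoryShare Learn’s actual export schema.

```python
from statistics import mean

# Hypothetical session records; field names are illustrative assumptions.
sessions = [
    {"learner": "amy", "quiz_score": 80, "completed": True,  "time_spent_mins": 12},
    {"learner": "amy", "quiz_score": 95, "completed": True,  "time_spent_mins": 9},
    {"learner": "ben", "quiz_score": 60, "completed": False, "time_spent_mins": 4},
]

def knowledge_summary(sessions):
    """Average quiz score, completion rate, and total time per learner."""
    by_learner = {}
    for s in sessions:
        by_learner.setdefault(s["learner"], []).append(s)
    summary = {}
    for learner, rows in by_learner.items():
        summary[learner] = {
            "avg_score": mean(r["quiz_score"] for r in rows),
            "completion_rate": sum(r["completed"] for r in rows) / len(rows),
            "total_mins": sum(r["time_spent_mins"] for r in rows),
        }
    return summary

print(knowledge_summary(sessions))
```

A summary in this shape is exactly the kind of table a BI tool like Power BI or Tableau can chart and trend over time.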
2. Learner Behaviour
We need to prove that we understand our audience: how they learn best, and how to adapt your approach for certain personas.
Many departments choose to measure content completions in isolation, over anything else. If user experience is so important in the learning industry, why aren’t people seeking to understand their users’ behaviours as a result of the experiences created?
We deep-dive into the data to see exactly what learners do and how they have interacted during their journey: which devices they use, time of day, session duration, popular search terms, where they came from and where they go next, and much more. This gives you a detailed outline of your learners’ behaviour, creating opportunities to make improvements and build more personalised strategies going forward.
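The behavioural signals above can be profiled quite simply once session events are logged. The sketch below is purely illustrative: the event fields (`device`, `hour`, `search_term`) and the counts are invented assumptions, not a real StoryShare Learn data feed.

```python
from collections import Counter

# Hypothetical session events; field names are illustrative assumptions.
events = [
    {"device": "mobile",  "hour": 8,  "search_term": "onboarding"},
    {"device": "mobile",  "hour": 9,  "search_term": "gdpr"},
    {"device": "desktop", "hour": 14, "search_term": "onboarding"},
]

# Which devices learners favour, when they learn, and what they search for.
device_split = Counter(e["device"] for e in events)
peak_hour = Counter(e["hour"] for e in events).most_common(1)[0][0]
top_search = Counter(e["search_term"] for e in events).most_common(1)[0][0]

print(device_split, peak_hour, top_search)
```

Breakdowns like these are what make it possible to tailor content formats and publishing times to each persona.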
3. Improve and Evolve
Lastly, we need to prove that we are improving course content based on individual performance. We need to be constantly looking to evolve learning journeys for individuals, to maximise engagement and enhance their performance against their work objectives.
With a full 360-degree view of your content, from user level to organisational level, we improve and evolve your content with statistics and figures that compare what works and what doesn’t. For example, how does video content engage your learners compared with SCORM?
Within Learn you can find a range of statistics on how your content has performed. Break it down by asset type to compare how often each is engaged with, deep-dive into video analytics to see where people have dropped off, and compare opens versus completions, or text-based versus video-based content. It’s all the data you need to make educated decisions and improvements that evolve your learning journeys.
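To make the opens-versus-completions comparison concrete, here is a minimal sketch of computing a completion rate per asset type. The asset types echo the article’s video/SCORM/text example, but the counts are made-up assumptions for illustration.

```python
# Hypothetical opens/completions counts per asset type (invented figures).
asset_stats = {
    "video": {"opens": 400, "completions": 320},
    "scorm": {"opens": 380, "completions": 190},
    "text":  {"opens": 250, "completions": 150},
}

def completion_rates(stats):
    """Fraction of opens that became completions, per asset type."""
    return {asset: s["completions"] / s["opens"] for asset, s in stats.items()}

rates = completion_rates(asset_stats)
best = max(rates, key=rates.get)  # the asset type with the highest rate
print(rates, best)
```

Comparing rates rather than raw counts is the key design choice here: an asset type with many opens but few completions is engaging attention without holding it.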