Ellen Wagner, from 8 Realities Learning Professionals Need to Know About Analytics:
Learning and development organizations simply cannot live outside today’s enterprise focus on the measurable, tangible results now driving IT, operations, finance, and other mission-critical applications. More to the point, learning and development organizations have emerging opportunities for putting their data to work in new and highly productive ways that will lead to demonstrable impact and alignment with business goals and enterprise strategic directions.
And at the same time…
Learning professionals would be well-served to consider the kinds of decisions that data analyses will be likely to serve, and not put their faith in the misguided belief that Big Data will directly serve up ready-to-go solutions from the mist.
We all know that I’m actively exploring the potential of the Tin Can API (my own experience is why I’m referring to Tin Can in this post rather than any of the other technologies mentioned in Wagner’s article, not because of any particular strength or weakness in Tin Can itself). One use of Tin Can is generating lots of data about what learners are doing inside learning experiences and elsewhere. Something that’s less discussed (in depth, in public, and so far) is how that aggregated data can be used to inform good decision-making and good design. That’s not a criticism of the people who are experimenting with, adopting, shaping, or using Tin Can or any particular technology; for many of them, this is already well understood. For everyone, though, this is a timely and relevant message about moving forward with learning analytics successfully… or not.
As much as I don’t want to insult the intelligence of those in our field, I am also aware that we sometimes tend to buy into easy answers. So, just to take a very popular tool as an example, I’ve tested a Storyline course with Tin Can reporting enabled. It was very easy to do… development, check. But Storyline sends a statement for every page the learner accesses and every click s/he makes. Can you make meaning of that data once it’s delivered? Is the learning experience even designed to send meaningful statements, ones that can lead us toward actionable knowledge?
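To make that page-view problem concrete, here’s a minimal Python sketch of separating signal from noise in a pile of Tin Can statements. The actor/verb/object structure follows the xAPI specification, but the sample statements and the verb whitelist are my own illustrative assumptions, not Storyline’s actual output:

```python
# Verbs that (in this hypothetical design) signal a deliberate, reportable
# event, rather than the page-view noise an authoring tool emits by default.
MEANINGFUL_VERBS = {
    "http://adlnet.gov/expapi/verbs/answered",
    "http://adlnet.gov/expapi/verbs/completed",
    "http://adlnet.gov/expapi/verbs/passed",
    "http://adlnet.gov/expapi/verbs/failed",
}

def filter_meaningful(statements):
    """Keep only statements whose verb was designed to carry meaning,
    discarding the slide-view and click noise."""
    return [s for s in statements if s["verb"]["id"] in MEANINGFUL_VERBS]

# Two illustrative statements: one auto-generated slide view, one quiz answer.
sample = [
    {"actor": {"mbox": "mailto:learner@example.com"},
     "verb": {"id": "http://adlnet.gov/expapi/verbs/experienced"},
     "object": {"id": "http://example.com/course/slide-12"}},
    {"actor": {"mbox": "mailto:learner@example.com"},
     "verb": {"id": "http://adlnet.gov/expapi/verbs/answered"},
     "object": {"id": "http://example.com/course/quiz-3/q1"},
     "result": {"success": True}},
]

print(len(filter_meaningful(sample)))  # → 1: only the quiz answer survives
```

The point isn’t the filtering itself; it’s that someone has to decide, at design time, which verbs belong on that list before the data can answer any question worth asking.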
Wagner outlines eight rules of thumb to keep in mind when considering investing in — and living by — learning analytics. I hope you will read, question, and comment; I agree that the ability to harvest broad analytics could be transformative to what we do, and I also agree that it won’t be… without solid planning.
A follow-up… What happens when you let something sit in your Drafts for two weeks? If you’re very, very lucky, Ellen Wagner shakes the dust off of her blog. Here’s another post from her on this topic: Losing Our Analytics Religion.