Day 2 of the Edusprint on Analytics was devoted to the impact analytics can have on learning and teaching. Marsha Lovett from Carnegie Mellon presented an interesting approach to deriving meaningful inferences about students’ learning states from analytics data.

The session opened with a quick poll, in which participants (most of whom declared themselves IT professionals) showed a clear preference for intuitive judgement in measuring student performance and progress: 59% of some 300 spontaneous respondents said they evaluate how their course is going by “feel”.

Carnegie Mellon have attempted to move beyond this in interesting ways, informed by cognitive learning theories. Their base assumption is that learning is skill-specific and that students draw on their skills to carry out instructional activities. On this they built a quantitative model of skill learning, adding the further assumption that frequent practice leads to better skills. Through this, one can get deeper insights into the learning state of a student than by just looking at raw performance data, such as test scores or attendance. Unfortunately, Marsha did not explain how the digital actions of students (e.g. clicking on a resource) are translated into these skills, but the thought model still looked pretty convincing.
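To make the “frequent practice leads to better skills” assumption concrete, here is a minimal sketch of my own (not CMU’s actual model, which wasn’t detailed in the talk): a power-law learning curve in which a skill’s predicted error rate falls as practice opportunities accumulate. All names and parameter values are illustrative.

```python
# A minimal, illustrative sketch: per-skill error rate modelled as a
# power-law learning curve over practice opportunities.

def predicted_error(opportunity: int, initial_error: float, learning_rate: float) -> float:
    """Power law of practice: error falls as practice opportunities accumulate."""
    return initial_error * (opportunity ** -learning_rate)

# Hypothetical skill with a 60% initial error rate and a modest learning rate.
for n in range(1, 6):
    print(f"opportunity {n}: predicted error {predicted_error(n, 0.6, 0.4):.2f}")
```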

When a student does some digital learning, the interactions are analysed and distilled into an inferred student state, which, if I understand correctly, identifies the learning skills in use and compares them to an expected state. This is presented to the student and instructor in a dashboard, which displays key aspects of the student’s learning state and also gives some recommendations.
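One common way such an inferred state is computed in this field is Bayesian Knowledge Tracing; the talk did not confirm that this is CMU’s method, so treat the sketch below as a hedged guess at the general shape. Each observed attempt updates the estimated probability that the student has mastered a skill, which can then be compared to an expected state for the dashboard. The thresholds and parameters are made up for illustration.

```python
# A hedged sketch of inferring a student state from interactions, using
# Bayesian Knowledge Tracing (a standard model; not confirmed as CMU's).

def bkt_update(p_mastery: float, correct: bool,
               slip: float = 0.1, guess: float = 0.2, learn: float = 0.15) -> float:
    """One BKT step: condition mastery on the observation, then apply learning."""
    if correct:
        posterior = (p_mastery * (1 - slip)) / (
            p_mastery * (1 - slip) + (1 - p_mastery) * guess)
    else:
        posterior = (p_mastery * slip) / (
            p_mastery * slip + (1 - p_mastery) * (1 - guess))
    return posterior + (1 - posterior) * learn

# Hypothetical sequence of attempts on one skill, compared to an expected state.
p = 0.3  # prior probability of mastery
for outcome in [True, False, True, True]:
    p = bkt_update(p, outcome)
print(f"inferred mastery {p:.2f} vs expected 0.80 -> "
      f"{'on track' if p >= 0.8 else 'needs practice'}")
```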

What I liked about the presentation was the mention that analytics need to be actionable. This is close to my own thoughts, as encapsulated in the Learning Analytics framework. It leads to the question: are the dashboard displays of learning skills intelligible enough for students and teachers to determine their next action?

The general idea I see represented here is similar to what some of my colleagues work on: using dashboard displays as reflection amplifiers that make people consider their position and perhaps change direction or effort. It also seems reasonably clear that comparative information about similar students’ performance gives more room for reflection than one’s own data alone.

An important point raised by the audience was the extent to which learning analytics takes web clicks as proxies for learning. The response: we need to look beyond simple clicking behaviour. Students perform linked actions, and we need to perceive them contextually, as patterns of learning and action sequences; the sketch below illustrates the idea.
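As a toy illustration of that last point (my own, with an assumed data shape), one can move from isolated clicks to sequences by looking at which actions follow which, for instance by counting bigrams of consecutive actions in a student’s click log:

```python
# Toy illustration: treat a click log as action sequences rather than
# isolated clicks, and count recurring patterns (bigrams of actions).
from collections import Counter

clicks = ["open_resource", "watch_video", "attempt_quiz",
          "open_resource", "watch_video", "attempt_quiz", "review_feedback"]

bigrams = Counter(zip(clicks, clicks[1:]))
for (first, second), count in bigrams.most_common(3):
    print(f"{first} -> {second}: {count}")
```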
