Mon 27 Jun 2011
I received valuable feedback on the proposed design framework for Learning Analytics. A key question people asked was where pedagogy was in the model. Here is how I see it:
Pedagogic strategies and learning activities are not themselves part of the analytics process, but they are implicitly contained in the input datasets that encapsulate the pedagogic behaviour of users. As we know, this behaviour depends a great deal on the platform and on the pedagogic vision its developers built in (cf. Dron & Anderson, 2011). For example, data from a content-sharing platform will carry a behaviourist/cognitivist pedagogy in the recorded learner behaviour, since this is the pedagogic model underlying the technology. In any case, only the pedagogic patterns exhibited in the dataset can be analysed, and these will vary from platform to platform.
Additionally, pedagogy can be explicitly addressed in the goals and objectives that the LA designer sets. The LA method determines the outcome of the analysis and, together with the interpretation applied, may lead to a wide variety of options for consequences and interventions. If such pedagogic interventions are applied, they lead to new behaviours which, once again, can be analysed through the available data.
A simple analogy is boiling water in a pan. At any time (or continuously) you can stick a thermometer in and measure the water's temperature. The goal would be to determine whether you need to turn up the heat. The result of the analysis can then lead to the actions you want to take. The thermometer is only one method for such an analysis; an alternative would be to observe and wait until the water bubbles. Setting a threshold expectation (in the goals design) can tell you when it is time for the teabag to go in.
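The thermometer analogy can be sketched as a tiny monitoring cycle: take a snapshot (a reading), compare it with a threshold set in the goals design, and map the result to an action. All function names and numbers below are invented for illustration, not part of the framework itself.

```python
# Minimal sketch of the thermometer analogy: one measurement in, one
# suggested action out. Thresholds are illustrative only.

def decide(temperature, boiling_point=100.0):
    """Turn one measurement into a suggested action."""
    if temperature >= boiling_point:
        return "add teabag"          # threshold reached: intervene
    if temperature < 60.0:
        return "turn up the heat"    # too cold: adjust
    return "keep waiting"            # observe and re-measure later

# Simulated readings while the pan heats up (invented data).
actions = [decide(t) for t in (40.0, 75.0, 100.0)]
# actions == ["turn up the heat", "keep waiting", "add teabag"]
```

The point of the sketch is that the analysis method (the thermometer) and the goal (the threshold) are separate design choices: the same readings with a different threshold lead to different actions.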
The model acknowledges that pedagogic success and performance are not the only things Learning Analytics can measure. Learning Analytics are snapshots taken from educational datasets. These snapshots can be used to reflect or to predict, in order to make adjustments and interventions (whether by a human or by a system). By connecting the cornerstones of the design model in different ways, different use cases can be constructed.
A key element of the Learning Analytics process that is not explicitly present in the model is that the outcome of any analysis needs to feed into a decision process which determines the consequences. Whether these are pedagogic or not depends very much on the goals specified. Decision making can be stimulated and executed through the method applied and the algorithms chosen, for example in recommender systems, but decisions can also be taken by a human (e.g. a teacher or a self-directed learner). In any case they lead to consequences, and through a feedback loop the process can be made iterative.
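The iterative analyse-decide-intervene cycle described above could be sketched roughly as follows. All names and numbers are hypothetical placeholders of my own, not an established Learning Analytics API: the analysis is reduced to a mean score, the decision to a comparison with a goal, and the intervention to a uniform score boost.

```python
# Rough sketch of the feedback loop: analyse a snapshot, decide whether to
# intervene, and let the intervention produce new data for the next cycle.

def analyse(dataset):
    """Outcome of the analysis; here simply the mean score (illustrative)."""
    return sum(dataset) / len(dataset)

def decide(outcome, goal=70):
    """Decision process: compare the analysis outcome with the stated goal."""
    return "intervene" if outcome < goal else "no action"

def intervene(dataset):
    """A pedagogic intervention changes behaviour, yielding new data."""
    return [min(100, score + 10) for score in dataset]

data = [50, 60, 70]  # invented performance scores
while decide(analyse(data)) == "intervene":
    data = intervene(data)  # feedback loop: new behaviour, new snapshot
# data == [60, 70, 80]; the mean now meets the goal and the loop stops
```

In a real system the decision could equally be handed to a teacher or a self-directed learner instead of the `decide` function; the feedback structure stays the same.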
Decisions based on Learning Analytics are a critical issue, because they determine the usefulness and the consequences for the stakeholders. It is here that ethics play an enormously important role. Imagine an educational dataset showing that children of immigrants perform worse in reading tasks. Several options present themselves, and in all likelihood will be exploited by political parties or others: (1) more support for immigrant children could be offered; (2) immigrant and non-immigrant pupils could be segregated into separate schools; (3) right-wing politicians could point to a deteriorating quality of schools due to immigration. Hence, data analysis could have dramatic (and unwanted) consequences. We need to be aware of this danger!