Community & Society

This presentation by Richard Palmer at the JISC Digifest is a provocation, not a promotion, for learning analytics! There is so much bias in this piece that it’s hard to decide where to begin taking it apart. It’s full of empty generalisations and predictions, so I start by issuing a WARNING – this read might damage your views on human teachers.

The argument that computers don’t turn up to work hung over, stressed or loaded with bias has been used many times, especially during a period of technological evolution dominated by the illusion that computers are infallible and perfectly neutral. We have long known that neither is the case. Apart from this already idealistic view, computer systems and networks pose the greatest of all security risks to our societies (from petty criminals to cyber warfare). Why then should we, or would we, entrust them with education?

The vision presented is that computers can improve attainment, progression and the educational experience of students, thus leading to better grades and retention rates. This is naive at best! Computer-led education, too, is human-managed – by humans who are not pedagogues but algorithm programmers, statisticians and profit makers, mostly ignorant of human psychology and development. The implication is that human teachers wouldn’t know how to improve attainment, progression and experience, but that’s wrong, for it is the human collective (a.k.a. the education system) that sets these quality indicators of what it means to be successful. They are determined by micro-to-macro economic thinking about the labour market, politics, parents’ ambitions, value systems, social needs, and so forth. Not so long ago, humanistic education and educational selection were seen as the quality benchmark; now it is market education (employability, entrepreneurship, civic compliance) and massiveness that characterise it. Can machines set these values? Should they?

Another dystopian vision promotes services built around the idea of being “all watched over by machines of loving grace”, where systems survey your every move and then bug you with “can I help you…?” messages. This is followed by the thought that machines would judge students’ work effort and “intelligence”, and then bug them with support spam if they perform below their algorithmic prediction. The mention of “objective criteria” makes me laugh in this context, in a world where fake news and post-truth knowledge dominate the headlines every day. The age of objectivism is long over.

A criticism of humans expressed by Palmer is that they are sluggish to change and cling to the ways things have been done in the past. I disagree with such a prejudice. Human history has always been polarised between progress and conservation. We tend to cling to the “known” because routines of the familiar help us be efficient in terms of brain power and energy consumption. Innovation is always connected to risk assessment, but it isn’t fair to say that there has been no evolution beyond the natural. And, after all, we invented machines to change production processes (like the looms mentioned).

Comparing the guidance of a young student with driving a car in complexity actually answers itself – a car is just another machine with simple mechanical responses. A student embedded in society is a complex system that has more to do with chaos theory than with algorithms. True, computers don’t turn up hung over or stressed, but they do crash frequently, and network failures bring entire workplaces to a standstill. What’s better? Well, can you communicate with a crashed computer? And, turning it around, with a hung-over student I can still communicate on some level – even if it is just to buy him time to recover! Can a computer do the same, and understand why he does it?

Here’s a good thing about humans that’s not in the paper: we can think flexibly and context-specifically, which lets us accommodate special needs and wishes. Compare this to the experience with a wifi-governed ordering system in a restaurant (when asking for rice instead of fries) or with a computer till at the supermarket – will it tell you that you can get two for one if the algorithm is geared to maximising profits for the company? Will it send you to the shop opposite because they have a better offer? Such things happen every day between humans. They are not regular programmed events; they are social interactions.

Yes, technologies will improve. Yes, there is a danger that machines will replace human teachers (at least in certain functionalities), but it is preposterous to anticipate that this will lead to a better world!



It’s extremely difficult to characterise different generations and generational shifts in societal attitudes. Not everyone of the pre-1960s parent generation was a frugal family person, nor is everyone of the 2000s an entertainment junkie. Yet there are certain signature traits in behaviours, attitudes and values that make up the mainstream and stand for an era.

This interesting piece of writing looks at the rise in superstitious beliefs and the anything-goes society. The author argues that with the dramatic changes in 1960s society (sexual revolution, hippies, etc.) a new mentality formed in the US that allowed unscientific knowledge to become part of the local and national narrative. In particular, Americans became open to conspiracy theories of all sorts that still surround us today. What’s worse, these now seem to have fully entered politics and government, shaking the foundations of our democracy.

With Star Trek and the like an integral part of my (European) youth, it never occurred to me that most of the space sci-fi ideas emerged in the US, including UFO sightings, aliens, and utopian and dystopian cartoons and documentaries. This is well documented in Hollywood and TV productions from Battlestar Galactica to E.T. Europe at that time (the 1960s and 1970s) was still working through its post-WWII trauma and settled for glorifying the past (empires). Very few, if any, of its film productions portrayed the futuristic, the otherworldly, or the supernatural.

Whether or not the author’s implication is true – that Americans today are prone to believe whatever they want, however implausible – the internet has certainly contributed to the spread of misinformation to the credulous. Recent electoral events indicate a firm rejection of facts and science – to the extent that researchers feel the urge to protest against being sidelined.

What is equally worrying is the decline in the reliability of information through the use of “social media” in journalism. I mean the references in traditional and well-established papers to social media “news” on the basis of its virality. This way, a single tweet can transform the worldview of many people – and are we really able to detect all the “fake news”, propaganda and hoaxes? Losing touch with reality through a variety of (technological) means, e.g. fake news, opinion manipulation via facebook, VR, the dark web, entertainment, and so forth, jeopardises our consensus-based democracy. It seems that different mob-created “worlds” emerge for us to pick and choose from, or to create our own – including delusions of caliphates and other tribes.


It sounds absurd that in the “Knowledge Society” we should struggle to defend fact-based science and research.

Here we are, after all that talk about evidence-based learning, learning analytics, machine learning, big data and mining techniques. All those enormously hyped expectations that this will lead us to a better understanding of learning and learners, and, hence, to better and more customised learning opportunities. It was hailed as opening new insights into learning behaviours and more. Beyond learning, Big Data is now exploited in many ways to maximise customer experiences, space science, active citizenship, the economy, schools and universities, health, the environment, and so forth. It’s the new blueprint for an information- and fact-based world – one that’s sometimes too scary to think of.

Yet here we stand, faced with post-truth politics where facts are worth nothing, rendering science and research useless. Fake news and ignorance of science and its methodical approaches are killing the search for facts and figures that drove much of the past decades’ progress towards the Knowledge Society. We have heard much about the change in management culture away from anecdotal decision making towards fact-based leadership. Well, it looks like the rumour mills are back in business.

Resistance comes from scientists and research professionals. The community is coming together and marching! There’s a bunch of activities to send out the message. But who will listen?! History tells us that it’s no fair fight when science and facts meet rumours, propaganda, beliefs and outright denial. The demotion of knowledge is endangering and flattening our world… until it becomes a disc again, where we fall off either side!


There may finally be some awakening to the tsunami of graduates that has hit the labour market in recent years. The article quite reasonably sums up what is wrong with too many people going to university. Over the past twenty-five years or so, a policy of widening access has led to what the author calls “conveyor belt” education, i.e. the automatic advance from one level of school to the next with an almost unreflective progression into higher education. This situation has been accelerated under the pressure of ranking countries according to their university participation rates.

The assumption (promoted by the likes of the OECD’s “Education at a Glance”) is that higher education brings better jobs, better earnings and more job security. This may still be true for some countries and some professions, but, nationally, it has had various negative impacts, including and in addition to those mentioned in the article:

We could say the OECD argument still holds true simply because graduates occupy non-graduates’ jobs and are hence, relatively speaking, “more secure”, whether a university degree is actually needed for the job or not. At the same time, they push other school leavers into inferior positions, and this downward push-chain continues, leaving the most vulnerable and least educated without jobs or in very precarious situations, while excessive supply turns academics into cheap labour. This drives down the cost of human resources and leads to deteriorating employment conditions (zero-hour contracts, etc.).

Another downside of the all-time-high participation is that the quality of a university education drops to accommodate larger numbers of graduates, while at the same time its cost rises and is passed on to students (and parents), who are often caught in a debt trap. It also prolongs the (unproductive) time before young people earn a living. In some countries (notably Italy and Spain), the age up to which young people still live at home has risen dramatically in the wake of this.

From a sector and vocational point of view, to many businesses a university education adds little productivity or economic value, since higher education is not, and does not aim to be, a vocational education for professional craft businesses or the service sector. Too often, students attend a course that has little to do with their later employment, while, conversely, most jobs have not developed into something requiring more sophistication or intellectual capital (e.g. bank clerks).

No wonder, then, that universities find themselves in an identity crisis, tasked with catering for the entire (upper-level) labour market. The political push towards university participation has, in my mind, neglected other (possible) forms of education that the economy would require. Rather than turning universities into employability factories, away from their mission as think-tanks and innovators advancing human knowledge, they would benefit from a narrower portfolio and target group. Instead of a generalist education that virtually everyone passes and whose learning outcomes strongly resemble those of the previous grammar schools, we need more real diversity – i.e. quality education opportunities to fulfil a specialist role in the economy, offered by different types of institutions for a variety of learners. In Austria and Germany, the “Fachhochschulen” (comparable to FE colleges) fill this role quite successfully and are flexible enough to adapt to labour market needs.

Careers development for vocational professions should no longer be neglected; new (higher) qualification designs outside of university are conceivable, which could also foster more self-esteem in non-university learners, as well as presenting a lifelong learning path for high achievers in non-academic professions.




Ha! Finally, a study that confirms what was general knowledge anyhow – among non-decision-makers! Student evaluations of their teachers have no correlation with their learning. Who would have guessed?! Filling in a couple of questions at the end of term indicates, if anything, popularity at most – not quality or progress. Male teachers seem to fare better overall, confirming a gender bias.

I particularly like this part:

“The entire notion that we could measure professors’ teaching effectiveness by simple ways such as asking students to answer a few questions about their perceptions of their course experiences, instructors’ knowledge and the like seems unrealistic given well-established findings from cognitive sciences such as strong associations between learning and individual differences including prior knowledge, intelligence, motivation and interest. Individual differences in knowledge and intelligence are likely to influence how much students learn in the same course taught by the same professor.”

and this:

“Currently, there is massive production of unnecessary, misleading, and conflicted systematic reviews and meta-analyses. … these instruments often serve mostly as easily produced publishable units or marketing tools.”

I would add that it’s a miserable waste of valuable staff and student time and creates an anxiety that undermines learning.


There is much uncertainty about ethics and privacy in learning analytics, which hampers wider adoption. In a recent article for LAK16, Hendrik Drachsler and I tried to show ways in which trusted learning analytics can be established in compliance with existing legislation and the forthcoming General Data Protection Regulation (GDPR) of the European Union, which will come into force in 2018. In short, four things need to be established:

  • Transparency about the purpose: Make it clear to users what the purpose of data collection is and who will be able to access the data. Let users know that data collection is limited to what is needed to fulfil the intended purpose.
  • Informed consent: Get users to agree to data collection and processing by telling them what data you are collecting and how long it will be stored, and provide reassurance that none of the data will be open to re-purposing or use by third parties. Under the GDPR, consent can be revoked, and the data of individual users must then be deleted from the store – this is called “the right to be forgotten”.
  • Anonymise: Replace any identifiable personal information so that individuals cannot be re-identified. In collective settings, data can be aggregated to generate abstract metadata models.
  • Data security: Store data, ideally encrypted, in a (physically) safe server environment. Monitor regularly who has access to the data.
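As a minimal sketch of the anonymisation point above (using a hypothetical record layout I made up for illustration; note that a salted hash is strictly pseudonymisation, and genuine anonymisation would additionally require aggregation or suppression of quasi-identifiers):

```python
import hashlib
import secrets

# A per-dataset secret salt; without it, known student IDs could simply
# be re-hashed and matched against the tokens.
SALT = secrets.token_hex(16)

def pseudonymise(record: dict) -> dict:
    """Replace the direct identifier with a salted hash and keep only the
    attributes needed for the stated purpose (data minimisation)."""
    token = hashlib.sha256((SALT + record["student_id"]).encode()).hexdigest()
    return {
        "user_token": token,        # stable within this dataset, not linkable outside it
        "course": record["course"],
        "clicks": record["clicks"],  # drop everything else, e.g. the name
    }

# Hypothetical raw records as they might come out of a VLE log
records = [
    {"student_id": "s123", "name": "Ada", "course": "LA101", "clicks": 42},
    {"student_id": "s456", "name": "Bob", "course": "LA101", "clicks": 17},
]
safe = [pseudonymise(r) for r in records]
```

Discarding the salt after processing makes re-identification harder still, though honouring “the right to be forgotten” then requires deleting records by token rather than by ID.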

I am not sure whether the worrying developments in HE play into the hands of those advocating disruptive change or the idea of abolishing the HE system altogether. As you can read below, I am not one of them, as I believe education to be in the common public interest and a matter for society (i.e. the state), not for profit sharks. Still, I note a cumulative deterioration of system components, driven by the implementation of commercial models in HE institutions.

Direct competition between institutions was introduced decades ago, leading to established market thinking, business cases and student “customers”. More recently, however, the university system has developed into a luxury brand for those who can afford it. The state slowly withdrew from the scene via severe cuts and austerity, and, on the student side, via dramatically rising fees and costs – with less and less support from the government.

At the same time, the government eyed private providers, so-called “challenger” institutions, to compete with the public sector (and perhaps later replace it). According to HESA, very little is known about these private providers, which leads to a messy market with bogus degree-awarding entities. Some 220 such unauthorised providers were identified over the last 5 years; 80% of them have been closed. The cost of policing the sector must have exploded too. Judging by the tremendous “success” rail privatisation had for its customers, it is foreseeable that HE will go down a similar path, only with an even more dramatic knock-on effect on the labour market.

If someone now shrugs their shoulders and says “so what”, I can briefly summarise what we have lost in these and similar developments: gone are studies free for all (in previous days universities were open to everyone!), gone are maintenance grants and good earnings for post-grads – this spells the end of the widening-access agenda and the equal-opportunities policy. Long gone, of course, are the days of humanistic, non-profit studies like philosophy, numismatics, Ancient Greek, etc., ever since departments that could not generate money to make up for the loss in government finances were closed.

The question for the future is whether the reductionist approach to higher education, which will inevitably lead to smaller numbers of academics (and institutions), will in fact lead to a rise in value of pre-university degrees like A-levels and apprenticeships.



There is too much information in the Information Society! Un-vetted information, that is. The ready availability of information leads to circular confirmation of misinformation or misinterpretation of so-called “facts”. There are a number of indicators of this situation:

  • information overload: people exposed to too many news sources suffer from anxiety about (a) missing something (as in a facebook news stream), (b) whether to trust the source, (c) whether to trust their own capability to evaluate information and sift out misinformation. It’s connected to the paradox of choice.
  • news loops: news publishers, especially on the internet, are pressed to provide up-to-the-minute news, which leads them to neglect their own analysis and research and instead copy-paste from press agencies. This is why the news in all outlets is 80-90% identical – including their “own” opinion. Or have you never wondered why some geographic areas suddenly disappear from all news channels? It’s news going round in circles. China’s regulator even went so far as to decree the verification of news stories.
  • social media: up-to-the-minute info from news publishers nowadays references, and takes as true, postings on social media channels like twitter or facebook. The assumption seems to be that if many people (only those connected to twitter and facebook) express a strong feeling about something, then this must be a valid quantitative measure of satisfaction with political and other issues. However, as the run-up to the Brexit vote showed, manipulation and propaganda on social media are on the increase.

This kind of information society does not lead to more self-determination for individuals, nor does it empower the powerless. It is steering rapidly towards a 1984 scenario where people are no longer able to distinguish truth from make-believe.


I am always wary when it comes to hyping a new technology. As the recent LAK16 global conference hinted, Learning Analytics may just have reached the peak of the Gartner hype cycle.


Sure, Learning Analytics has its promise of creating new insights into learning and a new basis for learner self-reflection and support services. But it is dangerous to expect it to produce the “truth about learning”! A forthcoming paper I recently reviewed covers the promising influence LA has on the Learning Sciences and rightly demands that more learning theories be put at the basis of LA; but, as Paul Kirschner expressed in his keynote presentation, there are many types of learning, and in LA research and development they are often simplified and generalised.

To ground our expectations in some sort of reality, we only need to look at areas where data analysis and prediction have long been used to “tell the truth” and to predict the future so that appropriate measures can be taken: politics, economics, and the weather forecast. Free of human unpredictability, the weather forecast has become the most accurate of the data-heavy sciences; yet even there, long-term predictions still carry a strong element of randomness and guesswork. Do we want to risk students’ futures by basing them on 75% probabilities?

Even where there is higher accuracy, the question of algorithmic accountability may be raised. Who will be held responsible, and how can anyone make a claim against a failed prediction? This risk isn’t as present in the commercial world, where an inaccurate shopping suggestion in a targeted advertisement can simply be ignored, but in education careers are at stake. From a managerial perspective, while it is scientifically fabulous to have 75-80% accuracy in predictions of highly specific drop-out scenarios, there is a cost-benefit issue attached. Simply proposing that system alerts should draw teachers’ attention to particular students, and that student support services should then call up each such student (which they may like about as much as a phone call from the bank selling new services), doesn’t cut it. As a cheaper alternative, I sarcastically suggested using a random algorithm to pick a student to receive special attention that week.
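For what it’s worth, the sarcastic alternative really is that cheap to build – a sketch, assuming nothing more than a list of (hypothetical) student IDs:

```python
import random

def pick_student_of_the_week(student_ids, seed=None):
    """Uniformly pick one student for 'special attention' - no features,
    no model, no prediction, just chance. A seed makes the draw repeatable."""
    rng = random.Random(seed)
    return rng.choice(student_ids)

cohort = ["s001", "s002", "s003", "s004"]  # hypothetical IDs
chosen = pick_student_of_the_week(cohort, seed=2016)
```

The serious point behind the joke: such a zero-cost random baseline is exactly what an expensive predictive pipeline should have to beat before anyone claims its alerts add value.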

It is also worth contemplating to what extent predictions about learners’ success may become self-fulfilling prophecies. Learning Analytics predictions are typically based on a number of assumptions forming the “student model”. One big assumption is that of a stable teaching/learning environment. If everything runs linearly and “on rails”, then it is relatively easy to say that the learning train departing from station A will eventually reach station B. However, it is nowadays well recognised that learning is situated, and that human teachers didactically and psychologically shape the adaptivity of the learning environment. It would, in my mind, require much higher levels of intelligence for algorithms to achieve the same support as human teachers – but if they did, what would then become of our teachers? What would be the role of human teachers if LA and AI take over decision making? What qualities would they need to possess, or would they simply be obsolete?

We cannot neglect the human social factor in other ways, too: quantifying people inevitably installs a ranking system. While a leaderboard based on LA data could on the one hand be a motivating tool for some students (as is the case in serious and other games), it could also lead to apathy in others when they realise they’ll never get to the top. The trouble is that people become meta-tagged by analytics, and these labels are very difficult to change. They may also exert a reverse influence on the learner, in that such labels become sticky parts of their personality or digital identity.

As so often with innovative approaches, hypes and new technologies, the benefit of Learning Analytics may not lie in what the analytics actually do or how accurate they are, but in a “side-effect” that is somewhat unexpected. I see part of the promise of learning analytics in starting a discussion on how we take decisions.



It is one of those statistically proven facts that young people from more highly educated family backgrounds are more likely to enter higher education than their peers from less-educated families. Having parents with a university degree gives students a greater chance of succeeding in HE themselves, perhaps even reaching higher levels. Such facts and figures have been used in international comparisons like the OECD’s Education at a Glance, but also in national strategies targeting the lower social classes in order to widen participation.

I would like to reflect on this so-called fact, though, because it assumes a very stable idea of what ‘family’ means. It mirrors and perpetuates a society of perhaps the generation before the sexual revolution of the late 1960s. Today, in an era where around half of official marriages break up and single parenthood has become a more frequent situation than the intact biological family, this assumption should at least be challenged. How temporary, patchwork parenthood actually influences educational participation and success is a question that has not yet reached the statisticians.

As the demography of students changes – 60% of students (in Austria) have some kind of job alongside their studies, and lifelong learning raises the average student age – I see many situations that influence HE participation more than pure ancestry.

Leaving the financial aspects aside, participation and success would need to be measured against compatibility with whoever actually shares the home, rather than with biological parents. Having a learner-friendly environment is critical for deciding to study in the first place, and also for persisting over a longer period of time. While women may find it relatively easy to tell their friends they are going to sign up for a course, men find it considerably more difficult to talk about such a move, especially in less-educated environments. On the other hand, for women with less-educated partners, low acceptance at home of going into FE or HE can act as a direct barrier, and many could be actively discouraged and prevented from doing so. All in all, the parent factor, while still present in the figures, may be of lower importance for the current generation than the statistics suggest.

