I’m hoping this will be the first in a new era of responsive posts that link back to the theme of ‘how we think’ and ‘how we learn’ (differently?) in the age of digital platforms and data.
As so often, @BenPatrickWill provided the initial spark, with this thread on the rise and rise of the ed tech industry and its entanglement with the world of finance.
As Ben points out, beyond simple investments, there is now a range of financial instruments available for gambling on a hi-tech future for education, from ‘special purpose acquisition companies’ to ‘exchange traded funds’. These don’t just give investors the chance to benefit from future rises in ed tech stocks (good old-fashioned capitalism). They drive us towards a world in which the ‘imaginary’ of a high tech future classroom is selectively funded into being. The big profits are no longer made from investing in products that turn out to be of future use: they are made by betting on futures (literally and figuratively) in which today’s hype about the future hasn’t yet been debunked.
Ben’s thread mentions AI-in-ed as a key target of this feeding frenzy. AI has always been the fantasist in the IT room – and this isn’t a bad thing in a science and innovation project. The relationship between science fiction and fact can lead to ingenious productions, or at least to interesting philosophical speculations. But the fantasies of AI have proved irresistible to financial speculators too. (This Wikipedia page on AI ‘winters’ tracks what happens when belief in the future of AI falters. Funding falters too. No funding, no future.)
As long ago as the 1990s, Fredric Jameson identified the financialisation of capital markets with cultural postmodernity (‘abstract’, ‘fragmented’, emptied of content, ‘deterritorialized’). He described global finance as ‘a kind of cyberspace in which money capital has reached its ultimate dematerialisation as messages that pass instantaneously from one nodal point to another across the former… material world‘. Where good old-fashioned (GOF) capitalism allowed money to be invested in real bodies at work in the material world, finance capitalism allows the money system to float free of what it supposedly represents, and to generate productivity and profit in its own right. Trillions of dollars are exchanged in financial products and services – on the basis of competing stories about the future, rather than on actual value in the present. AI is one of those stories.
Let me develop the parallel a bit further. Good old-fashioned AI (GOFAI) understood intelligence as reasoning about the world. Examples of GOFAI or ‘symbolic’ reasoning would be production rules in expert systems, or heuristics for describing a micro-world and the actions possible within it. These rules tie GOFAI systems to a ‘real world’ they claim to represent, and (in theory at least) reflect ways that human mind-bodies might reason about the real contexts they inhabit. GOFAI is, in Jameson’s terms, a modernist project – it has narrative as well as spatiality (rules, algorithms and chains of reasoning), and it has referential depth (its codes and symbols point systematically to something else). Of course it has all the problems associated with a modernist project – the mind-bodies it imagines are strangely disembodied, which is to say by default white, male, able-bodied and educated. Alison Adam and Lynette Hunter produced feminist critiques of GOFAI that still stand today.
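To make the contrast concrete: a GOFAI-style production rule system can be sketched in a few lines. This is a toy illustration of the ‘symbolic’ approach, not any historical expert system – the rules and facts here are invented for the example – but it shows what it means for rules to be human-readable and to point referentially at a (micro-)world:

```python
# Toy forward-chaining production rule system in the GOFAI style.
# Each rule is an IF-THEN pair: a set of condition symbols and a
# conclusion symbol. The symbols claim to refer to facts about a world.
rules = [
    ({"has_feathers", "lays_eggs"}, "is_bird"),
    ({"is_bird", "cannot_fly"}, "flightless_bird"),
]

def infer(facts, rules):
    """Apply rules repeatedly until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            # Fire the rule if all its conditions are known facts.
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = infer({"has_feathers", "lays_eggs", "cannot_fly"}, rules)
print(derived)
```

Every step of the chain of reasoning here can be read off and narrated by a human – exactly the referential depth that, on my argument, the postmodern turn abandons.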
Machine learning represents AI’s postmodern turn. Rather than modelling ‘the world to be known’, statistical approaches or neural networks are used to discover patterns in vast data sets. (Critiques of AI on race and gender grounds today are focused as much on the sources of this data as on its statistical and algorithmic processing). What matters for my argument is that the data is not meaningfully ordered: the patterns need not make human sense at all.
Google Translate is a good example of this. I spent some time in the 1980s studying grammatical approaches to natural language processing as part of a degree in cognitive science. When I first encountered Google Translate in the 2010s, I thought perhaps the problem of representing human language – the underlying grammar – had finally been solved! But of course the GT algorithm simply matches phrases with previous translations in the target language (originally by searching millions of documents from international bodies, latterly by searching interactions with Google Translate itself). ‘How human language works’ is of no interest to GT at all. Syntax and semantics remain black boxes, while the vast corpus of linguistic data that swashes through Google every hour is cleverly repackaged and recirculated among its users.
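The point can be caricatured in code. What follows is emphatically not Google Translate’s actual algorithm (which has moved from phrase-based statistics to neural networks); it is a minimal sketch of the phrase-table idea, with an invented corpus, showing how previously seen translations can be recirculated with no model of grammar at all:

```python
# Caricature of phrase-based statistical translation: no syntax,
# no semantics, just counts of previously observed phrase pairs.
from collections import Counter, defaultdict

# Hypothetical parallel corpus of (source, target) phrase pairs.
corpus = [
    ("good morning", "bonjour"),
    ("good morning", "bonjour"),
    ("good morning", "bon matin"),
    ("thank you", "merci"),
]

# Build the phrase table: source phrase -> counts of seen translations.
phrase_table = defaultdict(Counter)
for src, tgt in corpus:
    phrase_table[src][tgt] += 1

def translate(phrase):
    """Return the most frequently observed translation, if any;
    otherwise echo the input back unchanged."""
    candidates = phrase_table.get(phrase)
    return candidates.most_common(1)[0][0] if candidates else phrase

print(translate("good morning"))
```

Nothing in this code ‘knows’ French or English; more data simply means more and better matches – scale of corpus standing in for understanding.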
As with money, so with data. What matters is the size of the network, the scale of the data corpus, and the speed with which transactions can be serviced. (And learning?) Reference (to an external world of value or meaning) gives way to reflexivity; narrative and human-readable rules give way to data relations. New financial instruments (new translations) emerge and proliferate, with an agency that can be attributed neither to the network itself, nor to its human users.
Economics, like education, presents itself as a rational way of knowing the world, while at the same time creating the world it claims to know. Questions about value in economics, like questions about ‘effectiveness’ in learning, depend on who is asking and what power they have to make their answers stick. The more that educators are required to standardise, measure and make transferable the outcomes of learning (or learning ‘gains’, in the TEF), the more like money or like data they become. However GOF education theorists have tried to understand learning in the past – as critical consciousness, social action, human capability, self-realisation – those debates can be quietly set aside. All that matters is what you can do with your credentials.
What if the connection between postmodern AI and postmodern capitalism is more than simply cultural, or metaphorical? After all, in the world of high finance, data is quite literally money (Donald MacKenzie, 2019) and money, data (Carola Westermeier, 2019). A world economy in which money has floated free of real value, in which profit is driven by data volumes and transaction speeds, allied with a technology project based on fantasies of post-human reasoning – this is a potent alliance for sidelining real human life-forms and their material needs.
And where is postmodern education in this picture? Is it really just a question of passing packages of content, like currency, or data, across digital networks in which learners are not so much users as nodes – plugging in to content engines in the morning, and assessment engines in the afternoon? Not yet. But if education is, self-reflexively, whatever the education system produces, by definition it is becoming whatever the platform-university produces. Perhaps this is why the explanations of human learning offered by neuroscience are becoming so persuasive. Learning can be understood as emergent patterns of activity in a network of neurones – when the external conditions of learning match that explanation so well, it becomes the most credible one.