Sunday, 20 August 2017

Potlatch and Education

I've been on a bit of a journey in thinking about the relation between education and economics. It seems to me that the proponents of market-oriented rhetoric in education, and its opponents, are both preaching from economics textbooks which are fundamentally wrong on many important issues, and which certainly do not explain the complex things that happen in education.

Wednesday, 16 August 2017

Bateson on "Pride and Symmetry"

I've been thinking a lot about symmetry recently, and in recommending a student read Bateson's paper on Alcoholics Anonymous ("The cybernetics of self"), I noticed a sub-heading which struck me with more force than it did when I last looked at the paper: "Pride and Symmetry". Bateson was interested in symmetrical relations - particularly social symmetrical relations. This is where his notion of symmetrical and complementary schismogenesis comes from, and he uses this idea to explain the double-bind that the alcoholic is in:

The so-called pride of the alcoholic always presumes a real or fictitious “other” and its complete contextual definition therefore demands that we characterize the real or imagined relationship to this “other.” 
A first step in this task is to classify the relationship as either “symmetrical” or “complementary” (Bateson, 1936). To do this is not entirely simple when the “other” is a creation of the unconscious, but we shall see that the indications for such a classification are clear. 
An explanatory digression is, however, necessary. The primary criterion is simple: If, in a binary relationship, the behaviors of A and B are regarded (by A and B) as similar and are linked so that more of the given behavior by A stimulates more of it in B, and vice versa, then the relationship is “symmetrical” in regard to these behaviors. If, conversely, the behaviors of A and B are dissimilar but mutually fit together (as, for example, spectatorship fits exhibitionism), and the behaviors are linked so that more of A’s behavior stimulates more of B’s fitting behavior, then the relationship is “complementary” in regard to these behaviors. 
Common examples of simple symmetrical relationship are: armaments races, keeping up with the Joneses, athletic emulation, boxing matches, and the like. Common examples of complementary relationship are: dominance-submission, sadism-masochism, nurturance-dependency, spectatorship-exhibitionism, and the like. More complex considerations arise when higher logical typing is present. For example: A and B may compete in gift-giving, thus superposing a larger symmetrical frame upon primarily complementary behaviors. Or, conversely, a therapist might engage in competition with a patient in some sort of play therapy, placing a complementary nurturant frame around the primarily symmetrical transactions of the game. 
Various sorts of “double binds” are generated when A and B perceive the premises of their relationship in different terms - A may regard B’s behavior as competitive when B thought he was helping A. And so on. With these complexities we are not here concerned, because the imaginary “other” or counterpart in the “pride” of the alcoholic does not, I believe, play the complex games which are characteristic of the “voices” of schizophrenics. Both complementary and symmetrical relationships are liable to progressive changes of the sort which I have called schismogenesis (Bateson, 1936).
Symmetrical struggles and armaments races may, in the current phrase, “escalate”; and the normal pattern of succoring-dependency between parent and child may become monstrous. These potentially pathological developments are due to undamped or uncorrected positive feedback in the system, and may - as stated - occur in either complementary or symmetrical systems. However, in mixed systems schismogenesis is necessarily reduced. The armaments race between two nations will be slowed down by acceptance of complementary themes such as dominance, dependency, admiration, and so forth, between them. It will be speeded up by the repudiation of these themes. This antithetical relationship between complementary and symmetrical themes is, no doubt, due to the fact that each is the logical opposite of the other. 
In a merely symmetrical armaments race, nation A is motivated to greater efforts by its estimate of the greater strength of B. When it estimates that B is weaker, nation A will relax its efforts. But the exact opposite will happen if A’s structuring of the relationship is complementary. Observing that B is weaker than they, A will go ahead with hopes of conquest (cf. Bateson, 1946, and Richardson, 1935). 
This antithesis between complementary and symmetrical patterns may be more than simply logical. Notably, in psychoanalytic theory (cf. Erikson, 1937), the patterns which are called “libidinal” and which are modalities of the erogenous zones are all complementary. Intrusion, inclusion, exclusion, reception, retention, and the like - all of these are classed as “libidinal.” Whereas rivalry, competition, and the like fall under the rubric of “ego” and “defense.”
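Bateson's escalation dynamics can be sketched in a few lines of code. This is my own toy illustration, not anything Bateson formalised - it is in the spirit of the Richardson arms-race equations he cites, with invented parameter values: each party responds to the other's level of behaviour, and a damping term stands in for the complementary themes which Bateson says slow the race.

```python
# A toy sketch (my own, Richardson-flavoured) of symmetrical schismogenesis
# as undamped positive feedback, and its correction by complementary themes.

def escalate(a, b, reactivity, damping, steps):
    """Each side responds to the other's level; 'damping' stands in for
    the complementary themes (dominance, dependency, admiration) which
    Bateson says slow the race."""
    for _ in range(steps):
        # Both updates use the old values: the responses are simultaneous.
        a, b = (a + reactivity * b - damping * a,
                b + reactivity * a - damping * b)
    return a, b

# Undamped: mutual stimulation escalates without limit.
print(escalate(1.0, 1.0, reactivity=0.3, damping=0.0, steps=10))
# Damped by complementary themes: the race settles down.
print(escalate(1.0, 1.0, reactivity=0.3, damping=0.4, steps=10))
```

The numbers are arbitrary; the point is only the qualitative difference between the purely symmetrical system and the mixed one.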

Saturday, 5 August 2017

Gombrich on Semiotics

I'm going to begin my talk on Peirce next week with Ernst Gombrich's preface to the 2000 edition of Art and Illusion. This short piece fascinated me as much as the contents of the rest of the book when I first encountered it nearly 20 years ago. Art and Illusion is about the relationship between art and nature, and the Greek idea of mimesis. The relationship between art, pictures and signs is clearly an important sub-topic in this, and this is what Gombrich wrote his new preface about - at a time when semiotics was much discussed in the art schools (late 90s, early 2000s). His aim was to correct the current fashion which argued from a constructivist position that all images were signs, that the Greek idea of mimesis was nonsense. Gombrich begins:

[the] commonsense interpretation of the history of Western art has recently been attacked on the ground that the whole idea of mimesis, truth to nature, is a will-o'-the-wisp, a vulgar error. There never was an image that looked like nature; all images are based on conventions, no more and no less than is language or the characters of our scripts. All images are signs, and the discipline that must investigate them is not the psychology of perception—as I had believed—but semiotics, the science of signs.
Gombrich argues that this reaction is overstated: the thirst for illusion is unabated - the goal of mimesis captivates the imagination. Gombrich, always ahead of his time, points out the technological advances in pursuit of mimesis:
Simulators were developed for the training of pilots, who put on a helmet through which their eyes were fed the appearance of an environment rushing past, which they were asked to control. More recently, so called "virtual reality" has been perfected, which allows us not only to see and hear an invented reality but even to touch it with specially constructed gloves. I do not know whether this device will, or can become a medium of art; all that matters in the present context is the undeniable evidence that images can be approximated to the experience of reality.
Gombrich talks about the 'mental set' - the field of expectations - through which signs are interpreted. He points out the playfulness in the interpretation of signs, and the shifts in mental set. For example, the puppet theatre which might transfix the child's imagination in a story suddenly disrupts this expectation when the giant puppeteer's hand appears in the scene to move a character.

What Gombrich appears to be talking about are the constraints within which signs are interpreted: that a sign is not a construct of some individual mind, but that it is the result of a game played within multiple contexts (or constraints) of sensory stimuli, life experiences, expectations, education, social situations, and so on. The game of mimesis is played between image, perception and illusion, among many other things.

The contributing factors in the game are additional descriptions. He says:

A string of ovals can also be an ornament purely used for decoration, as in this case: 0000. But add the word "PLUM" underneath and you transform the mental set: the oval no longer appears to stand on a neutral background, it is surrounded by an infinite halo of space, because we expect plums to be solid, and not only to be edible, but also graspable—an effect we can further enhance by the suggestion of a foreshortened stalk and leaves.

He comes to the crux of the issue, highlighting the importance of the game that is played in recognising a sign:
We come to realize in such cases that the required mental set did not precede the reading, but followed in a rapid feedback process. Where signs and images appear together on the page the feedback works almost instantly—witness the ease with which our youngest read so-called comics, combining pictures with a simple story. 
The difference between images and signs, then, does not lie in the degree of iconicity or conventionality. Images can function as signs as soon as they are recognized. We need only think of the labels on cans to realize that a perfect iconic image can function as a sign.

Gombrich tells a story about Constable, whose judgement about early photography is very revealing of both his and Gombrich's attitudes to the relationship between the image and nature:
In 1823 Constable visited a sensational display, the diorama constructed by Daguerre, later the inventor of the daguerreotype. "It is in part a transparency," he wrote, "the spectator is in a dark chamber, and it is very pleasing and has great illusion. It is outside the pale of art because its object is deception. The art pleases by reminding, not deceiving."
In reflecting on what Constable might have meant by "outside the pale of art", Gombrich says:

Would we go quite wrong in suggesting that, for Constable, art had become something like a game of skill, with its own rules, which must be kept free of labor saving devices? To deceive the eye is to cheat, for the painter must please by reminding, just as the playwright of Shakespeare's Prologue must work on our "imaginary forces." Fidelity to nature has to be achieved within the limits of the medium. Once this compact between the artist and the beholder is destroyed, we are outside the pale of art. Indeed, as soon as Daguerre's and Fox Talbot's mechanical methods entered the field, art had to shift the goalposts, and move the pale elsewhere. 

There's something very profound in what it is to remind rather than deceive. I wrote something about this with regard to music a few years ago: art reminds by overlaying descriptions on top of one another. I think this interplay of multiple descriptions reminds us of the interplay of multiple descriptions in our lived experience. To deceive us of reality is to identify and reproduce as faithfully as possible the descriptions of actual experience. Since the actual experience of one person and another is different, this deception necessarily abstracts from individual experience the principal descriptions which it takes to be universal - some of these abstracted descriptions can be taken as 'signs'. The complexity of the interaction of abstracted descriptions is never the same as the overlaying of multiple descriptions to produce complexity.

Tuesday, 1 August 2017

Semiotics and Symmetry

Next week I'm giving a talk about Peirce and Quaternions at the Alternative Natural Philosophy Association. It's a fascinating group which was introduced to me by Peter Rowlands at Liverpool, and I've quite enjoyed getting stuck in to thinking about Peirce long after I'd thought I'd left all that stuff behind.

The thing which has dragged me back to Peirce is his interest in quaternions, which Peter Rowlands introduced me to through his physical theory. It was a coincidence that I discovered that Peirce had been fascinated by Hamilton's work too - largely because of his father who was quite an eminent mathematician. In Peirce's writing, the quaternion tables are quite prominent, and I'm pretty convinced that his obsession with tripartite structures derives from this.

What put me off Peirce is a kind of semiotic dogmatism which analysed the stuff of the world as Symbol, Icon and Index, obsessing about interpretants, signs and representamens. There didn't seem to be any ground for the dogmatism. But of course, this was the fault of those who jumped onto the Peirce bandwagon, not of the man himself. Even within more thoughtful scholarship, with an emphasis on semiosis as process (which it clearly is), the Peircian categories are overlaid as if to say "this sign is produced by this process".

What does it mean to say "this sign" anyway? This has got me thinking about contexts, and whether Peirce's sign theory is really a theory about the context of signs.

A tripartite, anticommutative, symmetrical structure like the quaternions is an interesting way of thinking about contexts. We detect sameness and stability through a continually changing context. To say "this is a sign" is to make a declaration about something remaining the same despite changes in the context of its perception. Peirce's distinctions between Sign, Representamen and Interpretant mark different dimensions of the context, and within each dimension there are three further subdivisions: "sign" breaks down into Icons, Indexes and Symbols, for example. His firstness, secondness and thirdness feel like the three dimensions which hold the structure together.
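To make the anti-commutativity concrete, here is a minimal sketch of the Hamilton product - nothing Peircean about the code itself, just the algebra: i·j and j·i give opposite results, which is the kind of order-dependence of "rotations of context" I have in mind.

```python
# Quaternions as (w, x, y, z) tuples; the Hamilton product is
# anti-commutative in its imaginary part: i*j = k but j*i = -k.

def qmul(p, q):
    """Hamilton product of two quaternions."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)
print(qmul(i, j))  # (0, 0, 0, 1)  -> k
print(qmul(j, i))  # (0, 0, 0, -1) -> -k: the order of composition matters
```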

This has significance for the way we think about analogy, sameness and induction (which is dependent on analogy) - Peirce was doing logic after all.

Sameness, counting, induction and analogy are all declarations: we say "this is a chair" because of its sameness with other chairs. We say "there are three chairs" because of the sameness between them. Of course, in making declarations like that, we are producing signs; but the declaration itself is necessitated by the differences between the context of the perceptions of the objects. Nothing is ever really "the same".

However, things may be symmetrical. To make a sign and say "this is a chair" is to respond to the differences of context in which chairs are perceived. Might those changes in context result from an anti-commutative rotational symmetry? I'd like to explore this. What of the anti-commutative rotational symmetry of the statement "this is a chair"? Are the changes in the contexts related? What are their dynamics? How might we investigate them?

The best way we have of investigating a context - or a constraint - is information theory. It's a crude instrument. However, what it does do is allow us to look at the many descriptions of something (the statements people make about it) and see how they relate to one another. The most interesting context in which to do this is time-based media like music or video: constraints change over time, and it is possible to explore the dimensions of constraint over time, and particularly the way that changes in one constraint relate to changes in another.

The technique for doing this is known as relative entropy. A similar technique is used for exploring the presence of entanglement in physics. There, the descriptions of charge, mass, space and time seem fixed - and yet, do these properties also change the context for observation?
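As a sketch of what I mean: given two observers' frequency distributions over the same passage (the distributions below are invented for illustration), relative entropy - the Kullback-Leibler divergence - measures how far one description diverges from another.

```python
import math

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p || q) in bits: a crude measure of
    how much description q diverges from description p. Assumes p and q
    are distributions over the same symbols, with q nonzero wherever p is."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two hypothetical observers' symbol frequencies over the same passage:
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(relative_entropy(p, q))  # small and positive: the descriptions nearly agree
print(relative_entropy(p, p))  # 0.0: identical descriptions
```

Computing this over a sliding window of a piece of music or video would give the time-varying comparison of constraints described above.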

Questions like this are intriguing because they hint at a closing gap between the physical and the social sciences.

Sunday, 30 July 2017


A bit more work to do here (I'm using which is great at hosting and updating the versions of documents. It means that this chapter will magically improve over time!)

Thursday, 27 July 2017

Beer and Illich on Institutional Change: Uncertainty at the heart of the system

Stafford Beer's "Platform for Change" is an extraordinary book which sets out diagrammatically to document the processes by which the world might move from pathological institutions, markets, exploitation and environmental destruction, to a viable world which lives within its means. The diagrams get more complex as the book goes on, culminating in this:
Which is a bit daunting. However, there are things to notice. Within each of those boxes, there is a smaller box at the bottom with a "U" in it: this is "Undecidability". I think it could equally be called "Uncertainty", but it is worth noting that around every heavy-type box (in bold), there is a lighter type box which is connected to the "U" box, and which is labelled with things like "Metasystem", "conscience", "reform", and so on. 

Beer's point in Platform for Change is that the way society manages its "undecidability", or uncertainty, causes pathology. This is clearest in his diagram about the difference between old institutions and new institutions:
What manages uncertainty in the pathological "old" institutions at the top? The "Metalanguage of Reform": the drive for "restructuring", "privatisation", "outsourcing" and so on. And what does "structure" mean in the first place? It sits in the middle of the box: the hierarchical organisation of most institutions. 

Feeding in to the whole thing in the pathological institution is "Homo Faber" - the maker of increasingly powerful tools which dictate how people should live and drive people into increasing technocratisation. On the other side, we clearly see that this comes at the cost of the "Exploitable earth", with exploitable people, and cost-benefit analysis. On the right hand side at the top, Beer sees the "conservation" movement as the management of uncertainty about the exploitable earth with a metalanguage of "conscience" which is managed by the conservationist's discourse. Of course, this is a reaction to the pathology, but it also appears as part of the overall system of the problem. 

What to do about it?

Uncertainty (or undecidability) has to be managed in a different way. In the lower part of the diagram, Beer imagines a different kind of institution which facilitates the coordination of uncertainty among the different people who engage with it. The Undecidability box is connected to a "Metalanguage of Metasystem" - a way of having a conversation about the way we have conversations. 

Technology works with this not as the continual pathological product of Homo Faber who produces ever-more powerful tools, but as an appropriate response to establishing synergy in the system. Feeding it and monitoring it is "Homo Gubernator" - whose actions are dedicated to maintaining viability, providing safeguards and monitoring the eudemony in the system. 

Of course, it all raises questions - but they're good questions. And I've been struck by the similarity between Beer's thought and that of Ivan Illich in his Tools for Conviviality.

For Illich, the problem of the pathological institution (the top of the diagram) is the declaration of "regimes of scarcity": the need to maintain institutional structures in the face of environmental uncertainty, which often takes the form of increasing specialisation, educational certification, division between people in society, and the ever increasing power of tools. This is a positive feedback mechanism whereby increasingly powerful tools generate more uncertainty in the environment which entails a need for more institutional defence, more scarcity declarations, and so on. It is this pathological way of dealing with uncertainty which is the underlying mechanism of the appalling inequality which we are now experiencing. 

For Illich, education lies at the heart of the means to transform this into what he calls a "convivial society". The education system we have produces scarcity declarations about knowledge, and supports professionalisation which alienates people and creates division (we've seen this with populism). 

The solution to this is to invert education - to make knowledge and learning abundant rather than scarce, and to create the conditions for conviviality. Conviviality is an alternative way of managing uncertainty. Its diagrammatic representation is in the "New Institution" box at the bottom of Beer's diagram. Quite simply, conviviality is where each person manages their uncertainty by engaging directly with each other person. Intersubjectivity is the most powerful mechanism for dealing with uncertainty that we have. We do not have to create institutions to manage uncertainty, nor do we need to create ever more powerful tools.

Illich closes the system loop because he sees the limiting of tools as the critical factor in the establishment of a viable convivial society. This limiting is a politicising of technology: it is where a convivial society determines through dialogue what tools are needed, what should be limited, and how it should manage its resources. In effect it is a communitarian approach to managing the commons of education, environment, tools and people - very similar to that which was studied by Elinor Ostrom. 

To do this, educational technology is a critical component. We need abundance of information and skill. We need open education and open resources for learning. 

But the most important thing is to see that the route to viability (and the root of our current pathology) is uncertainty. 

Wednesday, 12 July 2017

Winograd and Flores on Computers and conversation

Winograd and Flores wrote this in 1984. Have things changed much?
Computers do not exist, in the sense of things possessing objective features and functions, outside of language. They are created in the conversations human beings engage in when they cope with and anticipate breakdown. Our central claim in this book is that the current theoretical discourse about computers is based on a misinterpretation of the nature of human cognition and language. Computers designed on the basis of this misconception provide only impoverished possibilities for modelling and enlarging the scope of human understanding. They are restricted to representing knowledge as the acquisition and manipulation of facts, and communication as the transferring of information. As a result, we are now witnessing a major breakdown in the design of computer technology - a breakdown that reveals the rationalistically oriented background of discourse in which our current understanding is embedded. 

[...] Computers are not only designed in language but are themselves equipment for language. They will not just reflect our understanding of language, but will at the same time create new possibilities for the speaking and listening that we do - for creating ourselves in language. (Understanding computers and cognition, p78)

Later on Winograd and Flores defend their argument that computers are tools for keeping track of commitments that people make to each other through recording speech acts. They argue:

New computer-based communication technology can help anticipate and avoid breakdowns. It is impossible to completely avoid breakdowns by design, since it is in the nature of any design process that it must select a finite set of anticipations from the situation. But we can partially anticipate situations where breakdowns are likely to occur (by noting their recurrence) and we can provide people with the tools and procedures they need to cope with them. Moreover, new conversational networks can be designed that give the organisation the ability to recognise and realise new possibilities.   (p158)

I'm curious about this because it resonates with many of the aims of big data today. Winograd and Flores were anti-AI, but clearly the mass storage of speech acts does serve to reveal patterns of recurrence and breakdown which do provide anticipatory intelligence (which is what Google Now does).
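A toy sketch of the idea - my own invention, not Winograd and Flores's design, with invented names and topics: log speech acts as simple records, and let the recurrence of breakdowns point to where the conversational network needs redesign.

```python
# A toy log of speech acts as (speaker, act, topic) tuples. The idea,
# loosely after Winograd and Flores: breakdowns that recur around the
# same commitment are candidates for redesign of the conversation.
from collections import Counter

log = [
    ("alice", "request", "weekly report"),
    ("bob", "promise", "weekly report"),
    ("bob", "breakdown", "weekly report"),
    ("alice", "request", "weekly report"),
    ("bob", "breakdown", "weekly report"),
]

# Count breakdowns per commitment topic.
breakdowns = Counter(topic for _, act, topic in log if act == "breakdown")

# Recurrence (two or more breakdowns) flags where anticipation is possible.
recurrent = [topic for topic, n in breakdowns.items() if n >= 2]
print(recurrent)  # ['weekly report']
```

This is, in miniature, the "noting their recurrence" that the quotation above describes - and, scaled up to mass storage, something close to what big-data anticipation does.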

I think the real issue concerns a deeper understanding of language and conversation, and particularly the inter-subjective nature of conversation - that is, the con-versare nature of it (dancing). 

Saturday, 8 July 2017

Interoperability and the Attenuation of Technological Possibility: Towards Socially Responsible Hacking?

I owe at least 10 years of my career directly or indirectly to generous funding from JISC in the UK and the EU commission. The underpinning rationale which attracted this research money was interoperability in educational technology. It was the presence of the Centre for Educational Technology and Interoperability Standards (CETIS) at the University of Bolton which created the conditions for engagement in a wide range of projects. The University of Bolton, of all places, had the greatest concentration of technical experts on e-learning in the world (something continually reinforced to me as I meet colleagues from overseas: Bolton? You were a world-leader!).

Most of the project funding opportunities have now gone. JISC survives in a very different form, on a mission to keep itself going on a commercial footing which has become problematic; the EU closed its Technology Enhanced Learning strand a couple of years ago (hardly surprising, since rather too many very expensive projects delivered little - even by EU standards!); and CETIS survives as an independent Limited Liability Partnership (LLP), albeit in a role of general IT consultancy for education rather than with a focused mission to foster interoperability. The international agency for interoperability in education, IMS, seems largely to have ceded the debate to the big commercial players like Blackboard, who talk the language of interoperability as a sales pitch but have little interest in making it happen.

Now that I am settled elsewhere and, I'm pleased to say, soon to be joined by a former CETIS colleague, it seems like a good time to think about interoperability again. In my current role, interoperability is a huge issue. It is because of interoperability problems that my faculty (just the faculty!) runs four different e-portfolio systems. It is because of a lack of interoperability that the aggregation and analysis of data from all our e-learning platforms is practically impossible (unless you do something clever with web automation and scraping, which is my current obsession). It is because of interoperability problems that individual parts of the faculty seek new software solutions to problems which ought to require no more than front-end adjustments to existing systems. And interoperability problems, coupled with pathological data security worries, create barriers to systems innovation and integration. Eventually, this becomes unsustainable.
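For what it's worth, the scraping approach can be sketched with nothing but the Python standard library. Everything here - the class name, the HTML snippet, the "score" field - is invented for illustration; a real platform needs its real markup.

```python
# A toy sketch of scraping structured data out of a platform with no
# usable API. The HTML and field names are hypothetical.
from html.parser import HTMLParser

class PortfolioScraper(HTMLParser):
    """Collects the text of every <td class="score"> cell."""
    def __init__(self):
        super().__init__()
        self.in_score = False
        self.scores = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs.
        if tag == "td" and ("class", "score") in attrs:
            self.in_score = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_score = False

    def handle_data(self, data):
        if self.in_score:
            self.scores.append(data.strip())

page = '<table><tr><td class="score">72</td><td class="score">85</td></tr></table>'
scraper = PortfolioScraper()
scraper.feed(page)
print(scraper.scores)  # ['72', '85']
```

In practice the page would be fetched with an authenticated session rather than given as a string, but the parsing logic is the same.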

So given all the effort that went into interoperability (my first JISC project, in 2004, was an investigation of interoperable web services in e-portfolio; it concluded that the available interoperability models didn't work and that something should be done about it), how have we got here?

Any new technology creates new possibilities for action. The ways of acting with a new tool may be very different from the ways of acting with existing tools. This means that if there is overlap in the functionality of one tool with another, users can be left with a bewildering choice: do I use X to do a, b and c, or do I use Y to do a, c and z? The effect of new technologies is always to increase the amount of uncertainty. The question is how institutions should manage this uncertainty.

CETIS was a government-funded institutional attempt to manage the uncertainty caused by technology. It served as an expert service for JISC, identifying areas for innovation and recommending where calls for funding should be focused. CETIS is no longer funded by government because government believes the uncertainties created by technology in education can be managed within institutions. So my university ends up with four e-portfolio systems in one faculty (we are not alone). This is clearly bad for institutions, but not bad in terms of a libertarian philosophy which supports competition between multiple providers of systems. Having said this, the interoperability battle was being lost even when CETIS was flourishing. The dream of creating an educational equivalent of MIDI (which remains the golden child of systems interoperability) quickly disappeared as committees set about developing complex specifications for e-portfolio (LEAP, LEAP2, LEAP2A), the packaging of e-learning content (SCORM, IMS Content Packaging), the sequencing of learning activities (IMS Learning Design, IMS Simple Sequencing) and, more recently, learning analytics (xAPI).

All of this activity is bureaucratic. Like all bureaucratic processes, the ultimate result is a slowing down of innovation (importantly, this is NOT what happened with MIDI). Whilst technology creates new possibilities, it also creates new uncertainties, and bureaucratic processes act as a kind of weir to stem the flow of uncertainty. Institutions hate uncertainty. In the standards world, this is managed by agreeing different-shaped boxes into which different things can be placed. Sometimes the boxes are useful: we can ask a vendor of an e-portfolio system, "does it support LEAP2A?" They might say "yes", meaning that there is an import routine which will suck in data from another system. However, a much more important question is "Does it have an API?" - i.e. can we interact with the data without going through the interface, and do new things which you haven't thought about yet? The answer to this is almost always no! The API problem has become apparent with social media services too: APIs have become increasingly difficult to engage with, and less forthcoming in the data they provide. This is for a simple reason: for each of the clever things you might want to do with the data, the company would rather provide it as a new "premium service".

An alternative to the institutional-bureaucratic approach to the interoperability problem would manage the uncertainties created by technology in a different way: embracing new uncertainties rather than attenuating them, and creating situations within institutions where processes of technical exploration and play are supported by a wide range of stakeholders. One of the problems with current institutionally attenuative approaches to technology is that the potential of technology is underexplored. This is partly because we are bad at quantifying the new possibilities of any new tool. In working with most institutional tools, we quickly hit barriers which dictate "We can't do that", and that's the end of the story. But there are usually ways of overcoming most technical problems. This is what might be called the "Socially Responsible Hacking" approach to interoperability. With the failure of bureaucratic interoperability approaches, this may be the most productive way forwards.

Socially Responsible Hacking addresses the uncertainty of new technology in dialogue among the various stakeholders in education: programmers who see new ways of dealing with new and existing tools, teachers who seek new ways of organising learning, managers who seek new opportunities for institutional development, learners who seek new ways of overcoming the traditional constraints of institutions, and society within which educational institutions increasingly operate as something apart, rather than as an integral component. 

Wednesday, 5 July 2017

Saturday, 1 July 2017

Ivory Towers and the Grenfell Tower: The problem with Evidence

The Grenfell Tower fire represents a collapse of trust in expertise and evidence, and will bring about a reawakening of scepticism. Newsnight's report on "How flammable cladding gets approved" raises questions about the role of evidence well beyond fire safety. In policy on health, education, welfare, economics and housing, evidence is the principal aid to decision-making. What Enid Mumford calls "dangerous decisions" are supported by studies which demonstrate x or y to be the best course of action. The effect of these studies is to attenuate the range of options available to be decided between. Of course, in that attenuation, many of the competing descriptions of a phenomenon or subject are simplified: many descriptions are left out, and some voices are silenced. Usually, the voices that are silenced are those "on the edge": the poor, immigrants and the occasional "mad professor". From Galileo to Linus Pauling, history tells us that these people are often right.

Understanding "evidence" as "attenuation" helps us to see how easily "evidence-based policy" can become "policy-based evidence". Evidence can be bent to support the will of the powerful. The manifestations of this exist at all levels - from the use of econometrics to produce evidence to support austerity to the abuse of educational theory in support of educational interventions (which so many educational researchers, including me, are guilty of). But it helps academics to get published, to raise our status in the crazy academic game - and, once established in the sphere of the University, the habit sticks. Effective decision-making is intrinsic to effective organisation. If organisational pathology creeps in, decision-making within a pathological organisation will be constrained in ways which obscure real existent problems.

The deeper problems concern academia's and society's allergy to uncertainty. We hold to an enlightenment model of scientific inquiry, with closed-system experiments and the identification of causal relations through the production of event-regularities. Too often we pretend that the open systems with which we engage are closed systems, whose event regularities are no longer physical events but statistical patterns. So Stafford Beer's joke that "70% of car accidents are caused by people who are sober" - with its absurd implication that we should all drink and drive - highlights the danger of any statistical measure: it is an attenuation of descriptions, and often an arbitrary one at that.
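Beer's joke can be made concrete with invented numbers (mine, purely for illustration): if most drivers are sober, sober drivers can account for most accidents while still being far safer per driver. The "70%" figure attenuates away the base rate, which is the description that mattered.

```python
# Invented numbers for illustration: most drivers are sober, so sober
# drivers can dominate the accident count while being safer per driver.
sober_drivers, drunk_drivers = 900, 100     # 90% of drivers are sober
sober_accidents, drunk_accidents = 70, 30   # "70% of accidents are sober"

rate_sober = sober_accidents / sober_drivers  # accidents per sober driver
rate_drunk = drunk_accidents / drunk_drivers  # accidents per drunk driver
# rate_drunk is nearly four times rate_sober: the headline statistic
# silently discarded the base rate of sober drivers.
```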

The computer has changed the way we do science, and in almost all areas of inquiry from the humanities to physics, probabilities are what we look at. These are maps of uncertainty, not pointers to a likely successful outcome, or a statistically proven relation between an independent variable and a probability distribution. What is an independent variable, after all? It is a single description chosen out of many. But its very existence is shaped by the many other descriptions which are excluded by its isolation. And we don't seem to care! I review endless depressing papers on statistical approaches to education and technology, and I see these assertions being made without the slightest whiff of doubt - simply because that is how so many other published papers do it. I reject them all (always gently - I hate horrible reviews - and always inviting authors to think harder about what they are doing).

Uncertainty is very difficult (probably impossible) to communicate through the medium of the academic journal article. The journal article format was devised in 1665, with the first Philosophical Transactions, for an enlightenment science which is radically different from our own. Of course, in its time, the journal was radical: the effect of printing on new ways of conducting and communicating science was only just opening up. Printing was doing to the academic establishment what it had done to the Catholic church a century before. Enlightenment scholars embraced the latest technology to harness their radical new practices.

We should be doing the same. The experiments on building cladding are easily demonstrable on YouTube. Equally, uncertainties about scientific findings can be expressed in rich ways using new media which are practically impossible in the journal. The scientists should learn from the artists. Furthermore, technology provides the means to democratise the making of descriptions of events. No longer is the description of an event the preserve of those with the linguistic skill to convey a compelling account in print. The smartphone levels the playing field of testimony.

Our decisions would be better if we became accustomed to living with uncertainty, and more comfortable living with a plurality of descriptions. The idea of "evidence" cuts against this. We - whether in government or academia - do not need to attenuate descriptions. Uncertainties find their own equilibrium. Our new media provide the space where this can occur. Universities, as the home of scholarly practice in science, should be working to facilitate this.

Friday, 30 June 2017

The Paradox of Institutional Change in Universities: The Strategic Need for a Pincer-Movement

The last 10 years have seen most universities in the UK undergo significant restructuring. These processes, which are still ongoing - most terribly at Manchester and the OU at the moment - are intended to transform institutions' financial viability and "market appeal", improve the student experience, and increase competitiveness in research and teaching.

The results from the last 10 years of restructuring tell us quite clearly that NONE of this actually occurs. Departments may be closed, and salaries saved, but within a few years, the salary bill creeps up to exceed what it was before. Staff morale is damaged through the autocratic processes by which friends, colleagues and (most importantly) conversations are broken up. The atmosphere in institutions whilst restructuring occurs is dismal, and this has an impact on students.

The recruitment of new (cheaper, younger) staff can also be highly problematic. Some of these will be adjuncts, paid very little, and struggling to survive, let alone teach their large (and highly profitable) Masters class of overseas students in the Information Systems department. These people are clinging on to the academy in the hope that something better comes up. But things continue to get worse. Other new staff will be recruited on a kind of "metric" basis - those with the most papers win! Never mind what they are like as people, how collegial they are, how much they care about their students. And often, they are appointed by a few senior colleagues, because the junior staff who keep the department going are all at risk of redundancy.

The spirit of despondency turns out to be highly contagious. The new staff - particularly the good ones - leave. The students complain - although they continue to attend in sufficient numbers to keep the thing on the road because almost everywhere else is the same.

Who benefits from restructuring? Usually, only the person who thought the thing up.  There is a real and deep question about institutional change which needs to be addressed.

Organisms change their structure when the structure of their environment changes. What is the environment of the University? With the student-as-customer rhetoric, are students cast in the role of the "environment" of universities? Universities seem to believe this, because they attempt to adapt to meet student expectations.

But many would argue that society at large is the environment of the university. What is the relation between the University and society? Well, it is one of circular causation. The university produces important aspects of society (its knowledge), and society produces the university through society's requirement to think about itself and produce new components like doctors, teachers and government ministers.  Of course, society includes learners... and teachers, administrators, tax payers, voters, Brexiteers, Remainers, banks, Corbynites and Theresa May.

History tells us that universities do change over time. As with biological organisms, change comes about through adaptation to changes in the environment - to changes in society. Francis Bacon's 1605 "The Advancement of Learning" was a wake-up call to universities, just as the Reformation was a wake-up call to the Catholic church. Curiously though, the members of the 17th century "invisible college" beavering away at scientific experiments outside the university had all been through the academic establishment at some point. The early IT pioneers like Gates and Jobs, the military developers of the internet and the Whole Earth Catalogue existed on the periphery of the institution in a counter-cultural bubble. The same might be said of the off-piste development of BitCoin in the late 2000s.

University change takes the form of organic absorption of the counter-culture. Jazz improvisation, for example, moves from seedy strip joints to the university classroom (with its professors of jazz) in the space of 90 years.  The only counter-cultural development which has resisted this seems to be the sex industry, and yet its adoption and development of technology has paved the way to the iPlayer and lecture capture!

What can we learn from this?

  • Institutional restructuring is institutional self-harm. 
  • If institutions change in response to changes in their environment, perhaps they should consider nurturing environmental changes which they might find challenging in the short-term, but to which their adaptation will be fruitful in the future. 
  • The obvious thing here is to develop feasible free personal certificated learning - but this is NOT a MOOC and it is not a marketing exercise. The institution doesn't need to make its presence felt, but to support social movements. 
  • Institutional change is likely to result from a pincer-movement: Constructive internal initiatives to help an institutional culture thrive are good, but they go hand-in-hand with initiatives to develop challenging things in the environment. 

Wednesday, 28 June 2017

Viable Institutions

Still a lot to do here, but it's taking shape...

Saturday, 24 June 2017

Government as Steering: Cybernetics and the Coming Labour Government

The joy surrounding Jeremy Corbyn's success in the election masks a need to do some very difficult work if a left-wing Labour government is going to deliver on its promise to transform society. There is muddle-headedness about the practicalities of government, the way events can overtake good intentions (no politician would have wanted a Grenfell on their watch), or the sheer challenge of keeping a political machine together which always seems hell-bent on self-destruction (all political parties seem to have this tendency).

Now is a golden opportunity to do this. Corbyn has the luxury of opposition where his grip on the party has been strengthened, and public expectation of a Corbyn victory (unthinkable before the election) has shifted significantly. These are real achievements.

Labour, and Corbyn, have got here because the Tories don't know how to govern. They see the world in a linear and hierarchical way, where simple "strong and stable" solutions can solve intractable problems. When things don't work out the way they wished (like the deficit coming down), the Tories tend to carry on regardless: strong and stable. This isn't government. It is ideological extremism.

"Government" and "governor" come from the same Latin root, gubernator. The Watt governor is the simplest idea of governing:

The Watt governor 'steers' the engine, increasing the flow of steam if the engine runs too slow, and decreasing it if it runs too fast. The Greek word for a steersman is kybernetes, from which we get cybernetics. The kybernetes was the steersman on a ship, so cybernetics is about steering. And so is government.
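As a toy sketch (my own, not anything from the cybernetics literature), the governor is a feedback loop that corrects a valve in proportion to the error between actual and target speed; the numbers below are invented:

```python
# A minimal feedback-loop sketch of a Watt-style governor: open the valve
# when the engine runs too slow, close it when it runs too fast.
def steer(speed, target=100.0, gain=0.001):
    """Return a valve adjustment proportional to the error."""
    return gain * (target - speed)

speed, valve = 60.0, 0.5
for _ in range(50):
    valve += steer(speed)                  # the governor adjusts the valve
    speed += 0.5 * (valve * 100 - speed)   # toy engine: speed follows valve
# speed settles close to the target of 100
```

Nothing here "knows" the right valve setting in advance; the setting emerges from continually correcting the error, which is the sense in which steering differs from "strong and stable" open-loop control.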

Stafford Beer is the cybernetic thinker who considered the problems of government (and its related problem, management) in most detail. I have thought about his Viable System Model (VSM) for many years, and the Cybersyn experiment in Chile of 1971-3 remains the most significant attempt to rethink government (apart from some promising experiments in the Soviet Union which never properly got off the ground).

There is a fundamental problem that the VSM addresses: the problem of attenuating descriptions of the world. In hierarchical power structures like governments, or bosses of universities, hospitals or any institution for that matter, the "top" relies on filters to give them the most important information from the ground. This is where the pathology starts, because the filter entails removing most of the other descriptions which are not considered important. This is why the election opinion polls got it so wrong - because they didn't listen to the variety of description that was out there. Technology has made the situation worse - it can filter more effectively than anything else - although this is a stupid way to use technology!

The VSM is a set of nested loops within which there is attenuation of description (there has to be), but at the same time the attenuated descriptions are organised into the production of a generative model whose engagements with the organisation (or country) being managed are continually monitored. The circular loop continually asks "Are we right?", "In what ways are we wrong?", "What have we learnt about the world that we didn't know before?", "How should the model be changed?". In other words, there is attenuation, and there is amplification of the abstracted model, in a continual process of organic adaptation (Beer described his model using the metaphor of the human body). This is steering.
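The loop can be sketched in a few lines (this is my own toy illustration, not Beer's formalism): rich observations are attenuated into a summary, a model predicts, and the error ("In what ways are we wrong?") feeds back to revise the model.

```python
# Toy attenuate-and-amplify loop: the "ground" is a rich set of noisy
# observations; the model is a single abstracted number, continually
# corrected against the attenuated summary of the ground.
import random
random.seed(1)

model = 0.0                            # the abstracted model of the ground
for _ in range(1000):
    ground = [random.gauss(5.0, 1.0) for _ in range(20)]   # rich reality
    summary = sum(ground) / len(ground)                    # attenuation
    error = summary - model                                # "are we right?"
    model += 0.05 * error                                  # revise the model
# the model converges on the regularity in the ground (a mean near 5.0)
```

The pathology the post describes is the loop without the feedback step: attenuation happens, but the model is never questioned against what it missed.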

In theory, this is fine, and the VSM is often used in management consultancy to help heal organisational pathology; I'm hosting a conference at Liverpool in November on this very topic.

But apart from Cybersyn, there has been no real-time empirical attempt to exploit this thinking in government or management. We should do it, because our existing models of government cannot deal with the circular causality which is endemic in our world, from overseas wars and local terrorism to austerity and burning tower blocks. We have to have a practical way of dealing with circular causation, and I worry that Corbyn's Labour isn't prepared.

Beer's Cybersyn was a data-driven operation in a world where data was hard to come by (they transmitted it with Telex machines). Today, we have data everywhere - but we don't know how to use it. Most approaches to "big data" seek to amplify automatic "filters" of complexity - this is basically what machine learning does. That's fine up to a point, but whatever filters are produced are used to create a model which must then be tested and improved. The human thinking about the rightness of these models doesn't appear to happen. All "big data" results are an opportunity for humans to produce new descriptions of the world, and for these new descriptions to feed into higher-level steering processes. But it doesn't happen. Consequently, we allow "big data" to dictate how the world should become without thinking about what we've missed.

One of the critical signs that any government or management should worry about is a decrease in the variety of description about something. This is usually the harbinger of catastrophe. Our Universities are heading straight for this, because they are removing vast chunks of variety in the conversations and descriptions which are made within them as they close departments, sack staff, become fixated on metrics of academic performance which mean nothing, or chase government targets for "teaching excellence" in the hope of getting more money. Nobody is monitoring the richness of conversation in Universities. Yet, the true strength of any university is the richness of the conversations which it maintains.

The same goes for a healthy society. The urgency of thinking about this was impressed upon me a couple of days ago when I received a text message from a bright and brilliant academic and friend in my old institution (one of only a few in that awful place). It's a dismal reminder of how much trouble we are in: "I've just been told I'm being made redundant". So that's another conversation killed.

Monday, 19 June 2017

Technology, Forms and the Loss of Description

When rich descriptions are difficult to bear, methods of attenuating description become attractive. They restrict the mode of expression to that which is permitted by whatever medium is devised for conveying 'standard' messages. We have become so used to this that we barely even notice it.  Paul Fussell identified in "The Great War and Modern Memory", that the means by which descriptions are attenuated emerged from the most brutal and traumatic of events where it was barely possible to articulate how people felt. Before the first world war, there were no "forms" to fill in.

The military authorities did their best to ensure that richer descriptions of the soldier's experiences were not conveyed home, lest it lead to unrest or loss of morale. Fussell describes a letter sent by a young boy in a platoon which went:

Dear Mum and Dad, and dear loving sisters Rosie, Letty, and our Gladys, -
I am very pleased to write you another welcome letter as this leaves me. Dear Mum and Dad and loving sisters, I hope you keeps the home fires burning. Not arf. The boys are in the pink. Not arf. Dear Loving sisters Rosie, Letty, and our Gladys, keep merry and bright. Not arf. 

Today our whole lives are ruled by forms, and even the scope for protesting the restrictions of the medium are curtailed. The best one can do is not fill it in. Such 'data gathering' processes have become part of normal life. We even conduct social research like this. 

Fussell describes "Form A. 2042", the Field Service Post Card, which was sent with everything crossed out except "I am quite well" immediately after a battle which relatives might suspect their soldiers had been in. Such were the hazards of occupying newly blown mine-craters that, according to George Coppard, "Before starting a twelve-hour shift in a crater, each man had to complete a field postcard for his next of kin, leaving the terse message 'I am quite well' undeleted."
Soldiers found ways of using the medium to convey messages that the cards were not meant to convey. Fussell notes:
the implicit optimism of the post card is worth noting - the way it offers no provision for transmitting news like "I have lost my left leg" or "I have been admitted into hospital wounded and do not expect to recover". Because it provided no way of saying "I am going up the line again" its users had to improvise. Wilfred Owen had an understanding with his mother that when he used a double line to cross out "I am being sent down to base" he meant he was at the front again. (Fussell, "The Great War and Modern Memory", p185)
Fussell claims that the Field Service Post Card is the first "form": "It is the progenitor of all modern forms on which you fill in things or cross out things or check off things, from police traffic summonses to "questionnaires" and income-tax blanks. When the Field Service Post Card was devised, the novelty of its brassy self-sufficiency, as well as its implications about the uniform identity of human creatures, amused the sophisticated and the gentle alike, and they delighted to parody it..."

Today we have video, which has, in many ways, levelled the playing field of testimony: one does not have to be a great poet or writer to convey the complex reality of a situation - anyone can do it. Yet the form remains. How could one summarise the complexity were it not for the tick-boxes?

There is a better answer to this question than tick boxes. The form amplifies a particular set of descriptions as a series of choices. Whatever actual descriptions might be made by individuals, these somehow have to fit the provided descriptions. The interpretation of the fit to the provided descriptions adds a further layer of attenuation.

Institutions and governments fail because they fail to listen to the rich variety of descriptions made within the organisations they oversee. Instead, they collect "data" which they attenuate into "preferred descriptions", and implement policy according to their conclusions. Crisis emerges when the effects of policy are the production of more descriptions which are also ignored. 

Sunday, 18 June 2017

Tuesday, 13 June 2017

Open Educational Resources and Book Printing Machines

"Being open" has been a major theme in educational technology for many years. It goes to the heart of why many have been drawn to education technology in the first place: "let's transform education, make it available to all, liberate ourselves from constraints", and so on. There is an associated economic narrative which speaks of "re-use" and highlights the apparent ridiculousness in the redundancy of so much content - why have 100 videos about the eye when one would do?

The opportunity of technology is always to present people with new options for acting: blogging presents new options for publishing, for example. In effect, new options for acting are new ways of overcoming existing constraints. When looking at any innovation, it is useful to examine the new options it provides, and the constraints it overcomes. Sometimes new technologies introduce new constraints.

What new options does OER provide? What constraints does it overcome?

These are not easy questions to answer - and perhaps because of this, there is much confusion about OER. However, these are important questions to ask, and by exploring them more fully, some insight can be gained into how OER might be transformative.

Enormous amounts of money have been spent on repositories of stuff which are presented as lego bricks for teachers to assemble their teaching. Remember learning objects? Remember widgets? Remember JORUM? The rationale behind much of this was that educational content could be assembled by teachers and incorporated as ready-made chunks of knowledge into new courses. So the constraint was the labour of teachers? Or the cost of resources? OER to the rescue!?

But actually none of this addressed the deep constraint: the course. Who cares about courses? Well, universities do... Courses = Assessment = Money.

Of course, away from courses, there are Open Educational Resources on YouTube, Facebook, Twitter, Wikipedia, Stackoverflow, listservs, blogs, wikis, and various other specialised disciplinary forums. Moreover, the tools for searching and retrieving this stuff have got better and better. Email histories are now a major resource of information, thanks to Google's vast storage and the capabilities of its search tools. All of these things have circumvented the constraint of the course.

Universities care about courses. Open Educational Resources cut the costs of setting courses up. And of course, the skill requirements of the teacher might be seen to be lowered to those of a curator, where the cost-saving implications are attractive to university managers: we don't need teachers - we can get this stuff for free, and pay cheap adjuncts to deliver it! So the constraints are financial and organisational.

But... nobody really understands what happens in teaching and learning. Whilst there are ways in which a video on YouTube might be said to "teach", generally teaching happens within courses. But what does the teacher do?

What happens in teaching and learning is conversation. Ideally, in that conversation, teachers reveal their understanding of something, and learners expose their curiosity. This can happen away from formal courses - most notably on email listservs, where (perhaps) somebody posts a video or a paper, and then people discuss it, but it is something that clearly is meant to happen in formal education.

"Revealing understanding" of something is not the same as presenting somebody with ready-made resources and activities (although someone can reveal their understanding of a subject in a video or a book - or indeed, a blog post!). Teachers have always used textbooks, but conversations usually revolve around a negotiation of the teacher's understanding of the textbook. Most textbooks are sufficiently rich in their content to throw up interesting questions for discussion.

Ready-made resources represent someone else's understanding. They can sometimes present an unwelcome extra barrier for the teacher, who is trying to reveal their own understanding of the subject, but is caught trying to reveal their understanding of somebody else's understanding.

Teachers produce resources to help them articulate their understanding. Some very experienced teachers may even write books about their understanding of a subject. When resources are publishable at this level, things get interesting and a new set of constraints emerges. The big constraint is the publishers.

Let's say a teacher writes a book. They send it to a publisher and sign away their rights to it. In signing away their rights to the content, they are restricted in what they might do with the content in future. The book might be very expensive and so the people who a teacher wants to read it, cannot afford it. There may be chunks of text which they might want to extract and republish for a different audience. They can't do it.

I think this is about to change. One of the exciting developments of recent years has been print-on-demand self-publishing. Alongside this, professional typesetting has come within easy reach of anyone: LaTeX-driven tools like Overleaf make a once-esoteric skill accessible to all. And book printing machines like Xerox's Espresso Book Machine are among the most powerful exemplars of 3D printing.

Why will academics exploit this? Because, whilst publishing with a respectable publishing house is often seen as a 'status marker', it also constrains the freedom of the academic to manage their own resources and engagement with their academic community.

A self-published open book can exist on GitHub as a LaTeX file, which an academic can fork, republish, reframe, etc. And why not allow others to do the same? And if copies can be printed for very little, why not do your own print run and distribute your book to your academic community yourself? For all teachers, and for all academics, the point of the exercise is conversation.
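As a purely illustrative sketch (the filename, title and author are hypothetical), such a book need be nothing more than a LaTeX source file in a public repository, which anyone can fork, rebuild, excerpt or send to a print-on-demand service:

```latex
% main.tex - a minimal self-publishable book skeleton (illustrative only)
\documentclass[11pt]{book}
\usepackage[utf8]{inputenc}
\title{An Open Book}
\author{A. Teacher}
\begin{document}
\maketitle
\chapter{Introduction}
Anyone who forks this repository can rebuild, extend or excerpt the book,
and run their own print run for their own community.
\end{document}
```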

More importantly, with the production of high-quality resources that can be exploited in different ways, the teacher is able to express their understanding of not just one but potentially many subjects. What is the difference between a book on methodology in education research and a book on methodology in health research? Might not the same person have something to say about both? Why shouldn't the resources or the book produced for one be exploited to do the other?

Saturday, 10 June 2017

Albert Atkin on Peirce

I have always been a little bit reticent about Peirce's semiotics. It has become another kind of theoretical 'coat-hanger' on which media theorists, communication scholars, educationalists, information scholars, musicologists, and much postmodern theory have draped 'explanations' which, it seems to me, don't explain very much. My suspicion, as with many social and psychological theories, is that the clergy are a pale imitation of the high-priests. It's the same story with James Gibson and affordance theory. And whilst believing that there's much more to Peirce than meets the eye of someone surveying this academic noise, I haven't yet found a way into it. Until now.

I'm reading Albert Atkin's recent book on Peirce. He articulates exactly how I feel about the sign theory, when he first of all points out that philosophy has largely ignored the sign theory - partly due to unreflective criticisms of analytic philosophers (most notably Quine), whereas

"Interest is much livelier outside of philosophy, but a similar problem lurks nearby. One finds interest in and mention of Peirce's sign theories in such wide-ranging disciplines as art history, literary theory, psychology and linguistics. There are even entire disciplinary approaches and sub-fields - semiotics, bio-semiotics, cognitive semiotics - which rest squarely on Peirce's work. Whilst this greater appreciation of Peirce's semiotic marks a happier state of affairs than that which we find in philosophy, there is still a worry that, as the leading scholar of Peirce's sign theory, T.L. Short, puts it, 'Peirce's semiotics has gotten in amongst the wrong crowd'. Short's complaint may be a little hyperbolic, but his concern is well founded considering the piecemeal and selective use of Peirce's ideas in certain areas. From a cursory reading of much work in these areas, one might think Peirce had only ever identified his early tripartite division of signs into icons, indexes and symbols." (Atkin, "Peirce", p126)
Peirce's biography, which Atkin covers elegantly, is extremely important in understanding how Peirce's logic, mathematics, sign theory and metaphysics fit together. A combination of intellectual isolation - he lost his university position in 1884 and never gained another - a unique inheritance from his mathematician father Benjamin Peirce, and a powerful intellectual life in the family home set the scene for a radical redescription of logic, mathematics, cognition and science. The simple fact is that the extent to which this redescription is truly radical remains underappreciated - not helped by noisy dismissals from the academic establishment, not only of Peirce himself, but also of some of the foundational work on which Peirce built (he gained his interest in Hamilton's quaternions from his father; Hamilton's work too suffered some careless dismissals).

If people think they know Peirce, or they know the semiotics, they should think again. I strongly suspect the time for this true original is yet to come. 

Tuesday, 6 June 2017


This is gradually coming together... It helps me to post on here - it's a multiple description!

Monday, 5 June 2017

Peirce on Quaternions

Had it not been for my discussions with Peter Rowlands at Liverpool University, I wouldn't know what a quaternion was. That I took it seriously was because it plays a vital role in Rowlands's physical theory, which unites quantum and classical mechanics; my interest in this has evolved through a desire to tackle the nonsense that is talked in the social sciences about sociomateriality, entanglement, etc. But then there is another coincidence (actually, I'm more convinced there is no such thing - these are aspects of some kind of cosmic symmetry). I got to know Rowlands because he is a friend of Lou Kauffman, who has been one of the champions of Spencer-Brown's Laws of Form.

One of the precursors to Spencer-Brown's visual calculus is contained in the existential graphs of Charles Sanders Peirce. So on Saturday, I went looking in the collected writings of Peirce for more detail on his existential graphs. Then I stumbled on a table of what looked like the kind of quaternion matrix which dominates Rowlands's work. Sure enough, a quick check in the index confirmed that Peirce's work is full of quaternions - and this is a very neglected aspect of it.

To be honest, I've never been entirely satisfied with the semiotics. But its mathematical foundation makes sense of it: it situates the semiotics as a kind of non-commutative (i.e. quaternion) algebra - and in fact what Peirce does is intellectually very similar to what Rowlands does. It means that Peirce's triads are more than a kind of convention or convenience: the three dimensions exhibit precisely the kind of rotational non-commutative symmetry that was described by Hamilton. I'm really excited about this!
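The non-commutativity is easy to check numerically. A minimal sketch (my own) of Hamilton's quaternion product, representing a quaternion as a (w, x, y, z) tuple:

```python
# Hamilton's quaternion product: ij = k but ji = -k, so order matters.
def qmul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i, j, k = (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)
assert qmul(i, j) == k               # ij = k
assert qmul(j, i) == (0, 0, 0, -1)   # ji = -k: non-commutative
```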

So here's Peirce on the "Logic of Quantity" in the collected papers (vol. IV), p110:

The idea of multiplication has been widely generalized by mathematicians in the interest of the science of quantity itself. In quaternions, and more generally in all linear associative algebra, which is the same as the theory of matrices, it is not commutative. The general idea, which is found in all of these is that the product of two units is the pair of units regarded as a new unit. Now there are two senses in which  a "pair" may be understood, according as BA is, or is not, regarded as the same as AB. Ordinary arithmetic makes them the same. Hence, 2 x 3 of the pairs consisting of one unit of a set of 2, say I, J, and another unit of a set of 3, say X,Y,Z the pairs IX, IY, IZ, JX, JY, JZ, are the same as the pairs formed by taking a unit of the set of 3 first, followed by a unit of the set of 2. So when we say that the area of a rectangle is equal to its length multiplied by its breadth, we mean that the area consists of all the units derived from coupling a unit of length with a unit of breadth. But in the multiplication of matrices, each unit in the Pth row and Qth column, which I write P:Q of the multiplier coupled with a unit in the Qth row and Rth column, or Q:R gives:
      (P:Q)(Q:R) = P:R
or a unit of the Pth row and Rth column of the multiplicand. If their order be reversed,
      (Q:R)(P:Q) = 0
unless it happens that R = P.
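Peirce's "matrix units" behave exactly as modern matrix multiplication says they should. A small sketch (my own illustration, not Peirce's notation) using numpy, where P:Q is the matrix with a single 1 in row P, column Q:

```python
import numpy as np

def unit(p, q, n=3):
    """Matrix unit P:Q - a single 1 in row p, column q, zeros elsewhere."""
    e = np.zeros((n, n))
    e[p, q] = 1.0
    return e

# (P:Q)(Q:R) = P:R when the inner indices match...
PQ, QR = unit(0, 1), unit(1, 2)
assert np.array_equal(PQ @ QR, unit(0, 2))        # (P:Q)(Q:R) = P:R

# ...but reversing the order gives zero, unless R = P.
assert np.array_equal(QR @ PQ, np.zeros((3, 3)))  # (Q:R)(P:Q) = 0
```

The second assertion is Peirce's non-commutativity in miniature: the same two units, multiplied in the opposite order, annihilate each other.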

Monday, 29 May 2017

Symmetry, Learning and Sociomateriality

A number of ideas are bombarding me at the moment. The best thing that has happened to me at Liverpool University is the encounter with the theoretical physics of Peter Rowlands. This is what universities are really about: creating the possibility of encounter with radically new ideas. Peter is interested in reconfiguring the relation between classical and quantum mechanics. At the root of his approach are three simple concepts: the 'nilpotent' - a quantity which, when raised to a power, produces zero; quaternions - numbers with three imaginary units, invented by Hamilton, which have peculiar properties; and, most importantly, symmetry.

In grappling with this (and I am still grappling with it... this blog is part of the process!) both symmetry and the nilpotent resonate with me in my thinking about cybernetics, learning, music and emotional life. The nilpotent puts the focus on nothing. The universe is about nothing. Now compare this to the importance of absence in the work of Terry Deacon, or Bhaskar, or Lawson, or the apophatic in the ecological work of Ulanowicz. There is also the category theoretical work of Badiou who places particular emphasis on nothing in his graphs. Is absence nothing? In the sense that absenting is about something "not there"... zero is clearly not there. From Newton's third law (Rowlands has published a number of books about Newton), an obvious point to make is that the resultant force in the Universe is zero. More importantly, the somethings that we see in the universe are the product of constraints of things which we can't see (dark matter/energy). There is a nothingness about dark matter. But there is also nothing in the resultant totality of what we can see and what we can't. The real question is how something emerges from nothing.

In cybernetics, the concept of constraint takes the place of absence - although they are considered to be the same thing (Deacon, Ulanowicz and Lawson are all in agreement about this). The nilpotent idea seems to be mirrored in the tautology of Ashby's Law: a complex system can only be controlled by a system of equal or greater complexity. Something emerges from nothing through the fact that, at any particular level, systems are unstable: the complexity of system A and system B might be greatly different, necessitating systems at higher levels to participate in balancing the variety. This is a dynamic process. In terms of understanding it, this is a process that relies on broken symmetry.

Mathematics struggles to describe symmetry-breaking in a fundamental way. Perhaps the closest we get is in fractal geometry, or in the dynamics of Conway's game of life. But these are the result of heuristics and recursive functions rather than fundamental mathematical properties. The quaternions present a way of thinking about broken symmetry which is fundamental. i, j and k are all square roots of -1, so ii = jj = kk = -1. But ij is not equal to ji. This anti-commutative property gives quaternions the potential to articulate complex matrix structures which have a kind of 'handedness'. This abstract property becomes useful to describe the apparent handedness that we see in the universe, from subatomic particles to DNA to the Fibonacci structures in biology.
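The anti-commutativity is easy to verify directly. A minimal sketch (my own, for illustration): represent a quaternion as a tuple (w, x, y, z) and implement the Hamilton product:

```python
def qmul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)
k = (0, 0, 0, 1)

# ii = jj = kk = -1: each unit is a square root of -1...
assert qmul(i, i) == qmul(j, j) == qmul(k, k) == (-1, 0, 0, 0)

# ...but ij = k while ji = -k: the product is anti-commutative.
assert qmul(i, j) == k
assert qmul(j, i) == (0, 0, 0, -1)
```

Swapping the order of i and j flips the sign of the result - the algebraic 'handedness' described above.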

What's fascinating about this is that nilpotency and broken symmetry combined have remarkable generative properties. For Rowlands, one of the key things is the bridging of the gap between classical mechanics and quantum mechanics. In much social science writing (and in educational technology) it has become fashionable to cite quantum phenomena like "superposition" and "entanglement" as a way of articulating the complex 'sociomateriality' of social life. Many realists object to the woolly language. Scientists like Sokal object to the lack of understanding of physics - although some who promote sociomateriality, like Karen Barad, do so from a scientific perspective. Part of the problem lies within physics. Classical mechanics and quantum mechanics are generally considered (not just by Barad) to be different kinds of thing. Rowlands argues that they are the same kind of thing - and in fact, quantum mechanics can be seen as an entirely logical and consistent extension of classical mechanics. So Newton was more profoundly right about the universe than is widely accepted. Through Rowlands's ideas of broken symmetry, issues like superposition and entanglement emerge as logical consequences of the conservation of mass and energy, and the non-conservation of time and space.

So here's a tantalising question: could learning be seen as a product of broken symmetry and nilpotency? My first instinct with these kinds of questions is to ask it of something like learning but more objectively observable: music. Can music be the result of broken symmetry and nilpotency? There have been many studies of the Fibonacci sequence in music - notably in Bartok and Debussy. I strongly suspect the answer to this question is yes. So learning? What is the symmetry of understanding or thinking? Is there a way of answering this question? At a deep level, these are questions about information - and through taking them as such, methods can be devised for exploring them. The way Rowlands is able to explain the emergence of something from nothing immediately suggests a new approach to one of the fundamental questions in the theory of information: the "symbol grounding problem".

A nilpotent broken symmetry of learning would have to entail a nilpotent broken symmetry of education and other social structures. Might they be investigated in the same way? What about a nilpotent broken symmetry of politics? (Is that dialectic?) Are these too questions about information?


Sunday, 21 May 2017

New technologies and Pathological Self-Organisation Dynamics

Because new communications technologies liberate individuals from the prevailing constraints of communication, it is often assumed that the new forces of self-organisation are beneficent. Historical evidence for massive liberation of the means of communication tells a different story. Mechanisms of suppression and unforeseeable consequences of liberation - including incitement to revolt, revolution, war and institutional disestablishment - followed the invention of printing; propaganda, anti-intellectualism, untrammelled commercialism and totalitarianism followed the telephone, cinema and TV; and the effects of the unforeseeable self-organising dynamics caused by the internet are only beginning to become apparent to us. It isn't just trolling, Trump and porn; it's the vulnerabilities that over-rationalised technical systems with no redundancy expose to malevolent forces that would seek their collapse (which we saw in the NHS last week).

What are these dynamics?

It's an obvious point that the invention of a new means of communication - be it printing or Twitter - presents new options for making utterances. Social systems are established on the basis of shared expectations about not only the nature of utterances (their content and meaning) but on the means by which they are made. The academic journal system, for example, relies on shared expectations for what an "academic" paper looks like, the process of review, citations, the community of scholars, etc. It has maintained these expectations supported by the institutional fabric of universities which continues to fetishise the journal, even when other media for intellectual debate and organisation become available. Journalism too relies on expectations of truth commensurate with the agency responsible for the journalism (some agencies are more trusted than others), and it again has resisted new self-organising dynamics presented by individuals who make different selections of their communication media: Trump.

But what happens then?

The introduction of new means of communication is the introduction of new uncertainties into the system. It increases entropy across institutional structures. What then appears to happen is a frantic dash to "bring things back under control". That is, reduce the entropy by quickly establishing new norms of practice.

Mark Carrigan spoke in some detail about this last week in a visit to my University. He criticised the increasing trend for universities to demand engagement with social media by academics as a criterion for "intellectual impact". What are the effects of this? The rich possibilities of the new media are attenuated to those few which amplify messages and "sell" intellectual positions. Carrigan rightly points out that this is to miss some of the really productive things that social media can do - not least in encouraging academics in the practice of keeping an open "commonplace book".

I'm wondering if there's a more general rule to be established relating to the increase in options for communicating, and its ensuing increase in uncertainty in communication. In the typical Shannon communication diagram (and indeed in Ashby's Law of Requisite Variety), there is no assumption that increasing the bandwidth of the channel affects either the sender or the receiver. The channel is there to illustrate the impact of noise on the communication, the things that the sender must do to counter noise, and the significance of having sufficient bandwidth to convey the complexity of the messages. Surplus bandwidth beyond what is necessary does not affect the sender.

But of course, it does. The communications sent from A to B are not just communications like the Twitter message "I am eating breakfast". They are also communications that "I am using Twitter". Indeed, the selection of the medium is also a selection of receiver (audience). This introduces a more complex dynamic which needs more than a blog post to unfold. But it means that as the means of communicating increase, so does the entropy of messages, and so do the levels of uncertainty in communicating systems.
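A toy illustration of this (my own sketch, not from Shannon): if a sender chooses uniformly among n distinct options, the maximum entropy of the choice is log2(n) bits. Every new medium multiplies the message options by the number of media, so the uncertainty the receiver faces grows:

```python
from math import log2

def max_entropy(n_options):
    """Maximum Shannon entropy (in bits) of a uniform choice among n options."""
    return log2(n_options)

# A sender with 4 possible messages on a single medium:
print(max_entropy(4))        # 2.0 bits
# The same 4 messages, each sendable on any of 3 media, are 12
# distinct message-medium selections - the entropy rises:
print(max_entropy(4 * 3))    # ~3.58 bits
```

The numbers are trivial, but the structural point is the one made above: the selection of the medium is itself part of the message, so new media raise the entropy of the whole communicating system, not just the bandwidth of one channel.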

This is what's blown up education, and it's what blew up the Catholic church in 1517. It's also what's enabled Trump's tweeting to move around conventional journalism and the political system as if it were the Maginot Line. As the levels of uncertainty increase, the self-organisation dynamics lead to a solidification (almost a balkanisation - particularly in the case of Trump) of message-medium entities which become impervious to existing techniques for establishing rational dialogue. Government, because it cannot understand what is happening, is powerless to intervene in these self-organising processes (though it should). Instead, it participates in the pathology.

We need a better theory and we need better analysis of what's happening.

Saturday, 13 May 2017

The Evaluation of Adaptive Comparative Judgement as an Information Theoretical Problem

Adaptive Comparative Judgement is an assessment technique which has fascinated me for a long time. Only recently, however, have I had the opportunity to try it properly... and its application is not in education, but in medicine (higher education, for some reason, has remained remarkably conservative, sticking to its traditional methods of assessment!).

ACJ is a technique of pair-wise assessment where individuals are asked to compare two examples of work, or (in my case) two medical scans. They are asked a simple question: Which is better? Which is more pathological? etc. The combination of many judges and many judgements produces a ranking from which a grading can be produced. ACJ inverts the traditional educational model of grading work to produce a ranking; it ranks work to produce a grading.
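A minimal sketch of the inversion described above (my own illustration - real ACJ systems fit a Bradley-Terry-style statistical model and adaptively select which pairs to present, rather than simply counting wins):

```python
from collections import defaultdict

def rank_by_wins(judgements):
    """Rank items from a list of (winner, loser) pairs produced by
    'which is better?' comparisons, by proportion of comparisons won."""
    wins = defaultdict(int)
    seen = defaultdict(int)
    for winner, loser in judgements:
        wins[winner] += 1
        seen[winner] += 1
        seen[loser] += 1
    return sorted(seen, key=lambda item: wins[item] / seen[item], reverse=True)

# Three scans, compared pairwise by several judges:
judgements = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "B"), ("B", "C")]
print(rank_by_wins(judgements))   # ['A', 'B', 'C']
```

The ranking emerges from the aggregate of many small judgements; a grade boundary can then be drawn anywhere along it.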

In medicine, ACJ has fascinated the doctors I am working with, but it also creates some confusion because it is so different from traditional pharmacological assessment. In the traditional assessment of the efficacy of drugs (for example), data is examined to see if the administration of the drug is an independent variable in the production of the patient getting better (the dependent variable). The efficacy of the drug is assessed against its administration to a wide variety of patients (whose individual differences are usually averaged-out in the statistical evaluation). In other words, traditional clinical evaluation assumes a linear model,
      P (patient) + X (drug) → O (outcome)
where outcome and drug are shown to be correlated across a variety of patients (or cases).

ACJ is not linear, but circular. The outcome from ACJ is, one hopes, a reliable ranking: that is, a ranking which accords with the judgements of the best experts. But it is not ACJ which does this - it is not an independent variable. It is a technique for coordinating the judgements of many individuals. Technically, there is no need for more than one expert judge to produce a perfect ranking. But the burden of producing a consistent ranking would fall entirely on that single judge (however good they are), and consistency would suffer. ACJ works by enlisting many experts in making many judgements, reducing the burden on any single expert and coordinating the differences between experts in a kind of automatic arbitration.

Simply because it cannot be seen to be an independent variable does not mean that its efficacy cannot be evaluated. There are no independent variables in education - but we have a pretty good idea of what does and doesn't work.

What is happening in the ACJ process is that a ranking is communicated through the presentation of pairs of images to the collective judgements of those using the system. The process of communication occurs within a number of constraints:

  1. The ability of individual judges to make effective judgements
  2. The ease with which an individual judgement might be made (i.e. the degree of difference between the pairs)
  3. The quality of presentation of each case (if they are images, for example, the quality is important)

An individual's inability to make the right judgement amounts to the introduction of "noise" into the ranking process. With too much "noise" the ranking will be inaccurate.

The ease of making a judgement depends on the degree of difference, which in turn can be treated as a measure of the relative entropy between the two examples. If they are identical, the relative entropy between them will be zero. Equally, if the images are the same, the mutual information between them will be high, calculated as:
      I(a;b) = H(a) + H(b) - H(a,b)
If the features of each item to be compared can be identified, and each of those features belongs to a set i, then the entropy of each case can be measured simply as the Shannon entropy H across all the values of x in the set i:
      H = -Σ p(x) log p(x)
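These two quantities can be computed directly from feature counts. A sketch (the feature names are hypothetical, purely for illustration):

```python
from math import log2
from collections import Counter

def H(values):
    """Shannon entropy (bits) of the empirical distribution of a list of values."""
    counts = Counter(values)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Hypothetical feature readings for two scans, feature by feature:
a = ["dense", "dense", "sparse", "dense"]
b = ["dense", "sparse", "sparse", "dense"]
joint = list(zip(a, b))          # the joint distribution over both scans

# I(a;b) = H(a) + H(b) - H(a,b)
mutual_information = H(a) + H(b) - H(joint)
print(round(mutual_information, 3))   # 0.311 bits
```

The closer two images are, the higher the mutual information between their feature distributions - and, on the argument above, the harder the judgement.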

The ability to make distinctions between the different features will depend partly on the quality of images. This may introduce uncertainty in the identification of values of x in i.

ACJ deals with issues 1 and 2. Issue 3 is more complex, because it introduces uncertainty as to how features might be distinguished. ACJ deals with 1 and 2 in the same way that any information-theoretical problem deals with problems of transmission: it introduces redundancy.

That means that the number of comparisons needed from each judge depends on the quality and consistency of the ranking which is produced. This can be measured by determining the distance between the ranking produced by the system and the ranking determined by experts. Ranking comparisons can be made for the system as a whole, or for each judge. Through this process, individual judges may be removed or others added. Equally, new images may be introduced whose ranking is known relative to the existing ranking.
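One standard way to measure the distance between two rankings is the Kendall tau distance: the number of pairs of items the two rankings order differently. A self-contained sketch (my own illustration; a library implementation such as scipy.stats.kendalltau would serve equally well):

```python
from itertools import combinations

def kendall_distance(rank_a, rank_b):
    """Number of item pairs ordered differently by the two rankings
    (0 means the orderings are identical)."""
    position = {item: i for i, item in enumerate(rank_b)}
    return sum(1 for x, y in combinations(rank_a, 2)
               if position[x] > position[y])

system_rank = ["A", "B", "C", "D"]
expert_rank = ["A", "C", "B", "D"]
print(kendall_distance(system_rank, expert_rank))   # 1 - only (B, C) disagrees
```

Computed per judge, the same measure identifies which judges are introducing "noise" into the ranking and might be removed or retrained.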

The evaluation of ACJ is a control problem, not a problem of identifying it as an independent variable. Fundamentally, if ACJ doesn't work, it will not be capable of producing a stable and consistent ranking - and this will be seen empirically. That means that the complexity of the judges performing ranking will not be as great as the complexity of the ranking which is input. The complexity of the input will depend on the number of features in each image, and the distance between each pair of images.

In training, we can reduce this complexity by having clear delineations of complexity between different images. This is the pedagogical approach. As the reliability of the trainee's judgements increase, so the complexity of the images can be increased.

In the clinical evaluation of ACJ, it is possible to produce a stabilised ranking by:

  1. removing noise by removing unreliable judges
  2. increasing redundancy by increasing the number of comparisons
  3. introducing new (more reliable) judges
  4. focusing judgements on particular areas of the ranking (so particular examples) where inconsistencies remain
As a control problem, what matters are the levers of control within the system. 

It's worth thinking about what this would mean in the broader educational context. What if ACJ was a standard method of assessment? What if the judgement by peers was itself open to judgement? In what ways might a system like this assess the stability and reliability of the rankings that arise? In what ways might it seek to identify "semantic noise"? In what ways might such a system adjust itself so to manipulate its control levers to produce reliability and to gradually improve the performance of those whose judgements might not be so good? 

The really interesting thing is that everything in ACJ is a short transaction. But it is a transaction which is entirely flexible and not constrained by the absurd forces of timetables and cohorts of students.

Wednesday, 10 May 2017

The Managerial Destruction of Universities... but what do we do about it?

As I arrived at the University of Highlands and Islands for a conference on the "porous university", there was a picket line outside the college. Lecturers were striking over a deal agreed with the Scottish Government to establish equal pay among teaching staff across Scotland, which college management had reneged on. The regional salary difference can be as much as £12,000, so this clearly matters to a lot of people. It was a good turnout for the picket line (always an indication of how much things matter) - similar to the one when the University of Bolton sacked their UCU rep and his wife, which made the national press.

It is right to protest, and it is right to strike. But sadly, none of this seems to work very well. Bad management seems to be unassailable, and pay and conditions continually seem to get worse.

At UHI, the porous university event was an opportunity to take the temperature of the effects of over five years of managerial pathology in universities across the country. The collective existential cry of pain by the group was alarming. The optimism, hope, passion and faith which are the hallmark of any vocation, and were certainly the hallmark of most who worked in education, have evaporated, replaced by fear and dejection. Of course, an outside observer might remark "well, you've still got jobs!" - but that's to miss the point. People might still be being paid (some of them), but something has been broken in the covenant between education and society which has destroyed the fabric of a core part of the personal identities of those who work in education. It's the same kind of breaking of covenant and breaking of spirit that might be associated with a once-healthy marriage destroyed by a breakdown of trust: indeed, one of my former Bolton colleagues described the spirit of those working for the institution as being like "the victims of an abusive relationship".

Lots of people have written about this. Stefan Collini has just published his latest collection of essays on Universities, "Speaking of Universities", which I was reading on the way up to Scotland. It's beautifully written. But what good does it do?

In the perverse monetised world of universities, the writing and publishing (in a high ranking journal) of a critique of the education system is absorbed and rewarded by the monetised education system. In its own way, it's "impact" (something Collini is very critical of). Weirdly, those who peddle the critique inadvertently support the managerial game. The university neutralises and sanitises criticism of itself and parades it as evidence of its 'inclusivity' and the embrace of opposing views, all the time continuing to screw lecturers and students into the ground.

A good example of this is provided by the University of Bolton, who have established what they call a "centre for opposition studies". There are no Molotov cocktails on its front page - but a picture of the House of Commons. This is sanitised opposition - neutralised, harmless. The message is "opposition is about controlled debate" rather than genuine anger and struggle. Fuck off! This isn't a progressive way forwards: it is the result of a cynical and endemic conservatism.

I wouldn't want to accuse Collini of conservatism in the same way - and yet the symptoms of conservatism are there, in the way they exist in the kind of radical "history man" characters who pepper critical discourse. The main features of this?
  • A failure to grasp the potential of technology for changing the dimensions of the debate
  • A failure to reconcile deep scholarship with new possibilities for human organisation
  • A failure to suggest any constructive way of redesigning the system
If I was to be cynical, I would say that this is because of what Collini himself admits as the "comfortable chair in Cambridge" being a safe place to chuck bricks at the system. It is not really wishing to disrupt itself to the point that the chair is less comfortable. 

The disruption and transformation of the system will not come from within it. It will come from outside. There's quite a cocktail brewing outside the institution. One of the highlights of the UHI conference was the presentation by Alex Dunedin. Alex's scholarly contribution was powerful, but he himself is an inspiration. He exemplifies insightful scholarship without ever having set a "formal" foot inside a university. His life has been a far richer tapestry of chaos and redemption than any professor I know. Meeting Alex, you realise that "knowledge is everywhere" really means something if you want to think. You might then be tempted to think "the university is redundant". But that might be going too far. However, the corporate managerialist "nasty university" will not, I think, hold sway for ever. People like Alex burn far brighter.

Another bright note: just look at our tools! The thing is, we have to use them differently and creatively. I did my bit for this effort. I suggested to one group I was chairing that instead of holding up their flipchart paper with incomprehensible scribbles on it, and talking quickly in a way that few take in, they pass a phone over the paper and make a video drawing attention to the different things on it. So paper became video. And it's great!