Wednesday, 27 December 2017

A Logic of Learning

I don't know how anyone can say anything defensible about learning. Learning is like an "itch" - it is what Searle calls an aspect of "epistemic subjectivity" - something we know about in our individual consciousness, but which provides no direct object for shared social inspection and agreed definition. Yet in the dreary world of educational research, so many academics insist at some point on defending their educational innovation with some kind of statement about what learning is. What they imply by such a statement is what learning isn't - and what learning isn't is the particular practice in education that they don't like, as opposed to the one that they "sell". How can they possibly know?

The fact that we think we have some idea of what learning is is important. It impacts on our educational practice. I once asked a friend (who is a leading education academic) my favourite question, "Why is education so crap?" and he said "bad theory". But that raises the question as to what a good theory might look like. Since we can say nothing defensible about what learning is, how could we establish any ground for good theory?

Theory generates expectations. Bohm pointed out that the word theoria has the same root as "theatre". Theory, he says, is a "theatre of the mind" - where our expectations about what might happen play out. But whilst it might be impossible to agree a single "play", it might be possible to agree on the logical principles upon which all our different plays are constructed. There is, after all, a logic to the plays of Shakespeare, to the politics of Machiavelli, to the music of Bach or the military tactics of Julius Caesar.

To be more precise, there is "logic" in the sense that we learn about on philosophy and mathematics courses. It belongs to the classical world of Aristotle. It involves principles like the law of the excluded middle. This logic is also the logic which underpins the way in which we think about computers and technology, and in turn it drives our thinking about social organisation, big data, statistics, metrics and so on.

But the logic of nature is not this. It works differently. The logic of Shakespeare, Bach, Machiavelli and even Caesar embraces contradiction. Only recently have such logics been explored formally, partly through the discovery of logical principles in nature (in quantum mechanics and biology) which appear similarly to embrace contradiction. At the moment, I am exploring the logic of Stephane Lupasco and the work of Joseph Brenner, whose 'Logic in Reality' presents itself as a new way forward in logical thinking: one which might express a deeper logic uniting aesthetics, biology, quantum mechanics and learning.

So whilst we might not (and cannot) agree about what learning is, we can unpick the logic upon which our propositions about learning are formed. Doing this is to tunnel under the foundations of our current mad discourse in education. It's a strategy for reformulating an approach to education which acknowledges learning as metaphysical whilst embracing it within a transformed scientific approach.

Tuesday, 26 December 2017

Christmas TV and the Entropy Pump

I had a nice family Christmas with everybody being together. This year, it was noticeable that we didn't watch TV. There were a couple of moments where someone said "What's on telly?", and after perusing the available 100+ channels, we concluded that the answer was "nothing"! When I think back to our childhood when my brothers, sister and I had opened our presents, we inevitably settled down to watch the TV, and usually, there'd be something on that we could all watch (even if we didn't fully agree). Then there were 4 channels to choose from, and the programming between those channels was carefully planned so as to gain the best possible audience.

The other striking thing about modern TV is the sheer complexity of turning the thing on. Ever since satellite broadcasting, we have had to work out which remote control to use, how to get to the programme guide, and so on. We used to simply turn the thing on and that was it. The business of choosing something from 100-plus channels has become the process of watching: a process in which eventually (after about 10 minutes of deflation) we decide there is nothing to watch. Then someone says "What about Netflix? Or iPlayer?", and round we go again...

Technology adds to the available options for doing things. The uncertainty involved in choosing anything, as a result, increases. Another way of looking at this increase in uncertainty is to say it is an increase in disorder, or entropy. More technologically driven choice increases entropy: it is an entropy pump.
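Shannon's information theory makes this claim concrete: a uniform choice among N equally plausible options carries log2(N) bits of uncertainty, so more options means strictly more entropy. A tiny sketch (my own illustration of the point, not a formal claim about television):

```python
import math

def choice_entropy(n_options: int) -> float:
    """Shannon entropy (in bits) of a uniform choice among n equally likely options."""
    return math.log2(n_options)

# Four terrestrial channels vs. 100+ channels plus streaming services
print(choice_entropy(4))    # 2.0 bits
print(choice_entropy(128))  # 7.0 bits -- the "entropy pump" at work
```

The numbers are small, but the direction is the point: every added channel or service raises the uncertainty the viewer must resolve before anything can be watched.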

Entropy pumps are useful for controlling people. Where totalitarian regimes used to ensure through propaganda that everybody got the official message, now social control can be effected by ensuring that there is so much noise, nobody gets any message! When the entropy pump is focused on a family group deciding about what to do with their time, then it results in a pointless 10 or 15 minute activity of arguing about nothing, and in the end deciding to do something else (whilst still feeling disappointed that somehow they must be missing something). When the entropy pump is focused on the individual, the result is different.

What limits the family discussion is a balancing of the chaos presented by the TV with a collective awareness of each other and an exploration of other possibilities for communication. When we retreat into our mobile devices, we are faced with another kind of entropy pump... but we seem to get hooked on it rather like a drug! Why is this?

An increase in entropy in the environment leads to a search for identity of the system that finds itself in that environment. When the device we are using is both the source of entropy, and presents itself as the means of finding identity, preserving one's sense of self, then the relation between the individual and the device will be addictive. Even by writing this blog, this is what is happening in me: I am defining or reinforcing my identity in the face of the electronic noise around me.

All systems exhibit this behaviour in the face of the increasing complexification produced by technology. The most dangerous responses are by traditional institutions as they engage in all kinds of pathological measures to try and keep their structures stable. In some cases (government, media companies), the command "generate more entropy!" is given.

What we do as individuals to defend ourselves against this is a critical question. It has, I suspect, a simple solution: we need to look at each other. Christmas is such an interesting time because, for all its faults and distractions, we cannot avoid doing that!

Thursday, 21 December 2017

Marion Milner's "The Human Problem in Schools"

Marion Milner was a psychotherapist who, in 1938, undertook a research project on the nature of schooling by studying a girls' school which was part of the Girls' Public Day School Trust, which had been established in 1872. Milner focused on a range of dimensions of schooling including:

  1. The physical conditions
  2. Arrangement of the Time Table
  3. Teaching Methods
  4. Mental Health of Teachers
  5. Intelligence and Vocational training
as well as a programme of dissemination of findings (what Milner calls "lecturing to parents and staff"). 

She captured in note form the difficulties faced by various members of staff. For example, 

Difficulty in form mistress in getting to know her form when very often she does not teach them all. Thinks that children are not putting in their share of the work, they don't really work in class. 'We try so hard to make it amusing but certainly when they leave no one is going to do that' [...] Thinks the children are spoonfed. Thinks that fatigue partly due to fact that 'if you let up attention for a minute, you've lost them'.
Problem of change of regime when a form passes up to another form mistress with different methods of discipline... transition from strict disciplinarian to one who believes in independence, resulting period of apparent unruliness[...] Children take so long to settle down nowadays...
For the children, she created a questionnaire in which she asked what might be improved about their experience. She captured a list of "miscellaneous worries" which are fascinating:

  • When people are cross
  • Telling lies
  • Being the eldest in the class and the least brilliant
  • Quarrels
  • Feels her worries are too ridiculous to mention
  • Life generally
  • Losing things
  • Making younger sister go to bed when told to
  • Being late
  • If she's forgotten to do something she promised to do
  • Gets very depressed
  • Suffering in the world
  • Worries over trifles
She then tabulated these responses:

One of the big innovations in schooling at the time was the emergence of the intelligence test. Milner appears to support this, although she documents the thoughts and feelings of both staff and students towards it. One child said she didn't like it: "it's ridiculous and I don't believe in psychology and I know a girl who got ill through it"; another said that "it tired my brain too much" (p.62).

Milner conducted a series of deeper interviews with the girls. One of the techniques used was what Milner describes as a postcard sorting technique:
A set of about 40 picture postcards was prepared, showing different kinds of people in a variety of different situations. When each girl came for her interview it was explained that a study was being made of the 'different kinds of things people are interested in' and she was asked if she would sort the cards into three piles, according to whether she would 'like to be one of the people in the picture, or hate to be, or not mind either way'. When she had done this she was asked to go through the 'likes' pile and the 'dislikes' pile and say why she had placed each card in that particular pile. (p.79)
In more detailed questioning, children were asked about their day-dreaming. One child, described by her teachers as having an "antagonistic attitude in class, indolence and lack of ambition" reported her day-dreams like this:

In bed I imagine I am diving in the Olympic Games, and doing extraordinary fancy dives absolutely perfectly. In school I imagine I am taking a Gym class. When listening to the wireless I wonder what it would be like to act, or sing into a microphone, or perhaps sometimes I feel I am broadcasting myself.
 Another child, reported as being aloof, has a different set of daydreams.
I imagine I am far away in some unknown land, I fancy it may be Utopia. The fountains play and in their spray there forms a cottage smallest of the small, I always think. The oak beams after many years have warped and now are bent and in the crevices grow moss of all shades. This place, once a home, is now an empty field.
The roses, pink and white have spread over the doorway so that I cannot enter in. I know what is inside because through the lattice windows lovely visions play across my mind. The house is mine, I say, no one shall even know what I see in there! I shall always remember how a tall Poppy bowed down to me and said, "It is yours for ever". This is one of my thoughts that comes to me when I am tired. I call it the Home of the Unknown.
Milner points out the contrast between these two descriptions: the first demonstrates "emotional assertiveness", while the second shows a lack of interest in domination or successful performance. She relates this to Jungian ideas of extroversion and introversion.

Milner then looked at ways of 'dispersing anxiety'. Using the Jungian categories of sensation, intellect, intuition and emotion, she focused specifically on "finding a social function" and "dispersing anxiety through creative work" as two practical avenues through which the emotions could be dealt with. She also considered the environment for the growth of the individual, including:

  • The nature of the parents' interests
  • Amount of change in the environment
  • Opportunities available for the multi-level solution of conflict
  • Companionship of equals
  • Amount of emotional stress in relationship with adults (p.189)
There's some stuff in Milner's book which is of its time, and to us would seem quite offensive. For example, her solution to reducing fatigue for the staff is to recognise "the intellectual limitations of the non-academic child"! However, she also supports "abolition of numerical marking", and comments that (in 1938!) "several parents mentioned, quite incidentally, that it is a recognized practice for girls to help each other over the telephone while doing their preparation".

She also comments that 
"much of the time now spent in exhortation is fruitless; and that the same amount of time given to the attempt to understand what is happening would, very often, make it possible for difficult girls to become co-operative rather than passively or actively resistant. It seems also to be true that very often it is not necessary to do anything; the implicit change in relationship that results when the adult is sympathetically aware of the child's difficulties is in itself sufficient."

Thursday, 14 December 2017

Personal Learning, Technology and the end of the Curriculum

I'm learning Russian at the moment. I have an excellent tutor, and I think Я делаю хороший прогресс! ("I am making good progress!") Since I've been very interested in the mediating role of objects in learning - particularly in how objects illuminate the understanding of both the teacher and the learner - I've been particularly fascinated by the way that Google Translate can be used to loosen up the learning conversation so that it follows a more natural line of human inquiry.

All of a sudden, I find myself back in the world of the Personal Learning Environment - but with a twist. It is not that we learn through personal tools. But rather computer tools (like mobile phones) are objects which can be used to summon-up other objects (like an automatic translation or Wikipedia). In a face-to-face learning conversation about language, technology becomes an interlocutor whose flexibility and sheer variety of behaviour prods both teacher and learner into revealing more about themselves.

So, for example, a conversation may start with talking about the different cases of Russian grammar (genitive, dative, accusative, etc). With the mobile phone in the centre, the question becomes "does Google Translate deal with cases correctly?". This turns the process of learning a language (which is often presented as a dull exercise in remembering stuff) into a process of inquiry about the behaviour of the tool. Sometimes the tool gets it wrong. I will ask my tutor why it's wrong. I learn something more about the tutor. I am always studying the tutor, not the content.

All objects illuminate the understanding of people engaging with them. It is through the use of objects that we produce multiple rich descriptions of our understanding. What is learnt are the underlying patterns which generate the variety of descriptions: so one moment we talk about google translate's attempt to translate cases properly; the next we talk about the news in Russian or the weather in Vladivostok.

Education has yet to catch up with the generative power of the technological objects at its disposal. When it does so, it will see the "curriculum" to be a redundant concept. The curriculum is a very crude object which expresses the organisation of knowledge in some form. Good teachers seek to redescribe the curriculum "object" in such a way that their own understanding (or lack of it) is revealed more to their students. But more usually, teachers hide their understanding (or lack of it) behind the curriculum, its assessments, and their Powerpoints.

Objects as technologies should be the organising focus of education, not curriculum. We should create ways in which objects can be manipulated so as to create a natural flow of inquiry between teachers and learners and between learners and each other. The ridiculous thing is that I don't think this is hard to do. But to achieve it we have to deal with that other pernicious object in education: the assessment. Assessments are where everybody hides their lack of understanding! In an authentic world of object-human relations, there may in fact be no need for assessment. But that's an unthinkable thought in the education system of today.

Wednesday, 13 December 2017

Bohm on Nilpotency and Quaternions

My interest in physics and its relation to phenomenology, education and sociology stemmed from meeting Peter Rowlands at Liverpool University. This is unquestionably the most important thing that has happened to me in Liverpool (although I have done a lot of other stuff, including converting a few people to cybernetics!). But really we work and study in a university to mix with people from whom we learn new things and gain insights we wouldn't otherwise have gained.

Peter's work is based on the mathematics which underpins physical law. It addresses fundamental problems which beset quantum mechanics and relativity theory, including the relationship between classical mechanics and quantum mechanics (often seen as a radical paradigm shift - something which has done the social sciences no good, as people have jumped on the "entanglement" bandwagon). In Peter's arsenal of mathematical devices, two things stand out. The first is the "nilpotent": a mathematical entity which, when raised to some power, equals zero (it's like the square root of minus 1, but with zero). In Peter's universe, everything grows from nothing.
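A simple concrete instance of nilpotency (my own illustrative example, not drawn from Peter's formalism) is a strictly upper-triangular matrix: the matrix itself is not zero, but its square is.

```python
import numpy as np

# A strictly upper-triangular matrix is nilpotent:
# N is not zero, yet N squared is the zero matrix.
N = np.array([[0, 1],
              [0, 0]])

print(N @ N)
# [[0 0]
#  [0 0]]
```

This is the sense in which "something" can square to "nothing", which is the structural idea behind the nilpotent.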

The other element is Hamilton's quaternions. These form a four-dimensional extension of the complex numbers, with three imaginary units, and they have the property of anti-commutativity: if i and j are elements of the quaternion, then i * j is not the same as j * i. This anti-commutative behaviour introduces powerful and complex symmetries which, when coupled with the nilpotent, allow everything to arise from nothing. These ideas have changed me (and I notice that Peirce was also interested in quaternions).
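The anti-commutativity is easy to verify directly. A minimal sketch of the Hamilton product (my own illustration), representing a quaternion as a tuple (w, x, y, z):

```python
def hamilton(q, r):
    """Hamilton product of two quaternions, each given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)

print(hamilton(i, j))  # (0, 0, 0, 1)  -> i * j =  k
print(hamilton(j, i))  # (0, 0, 0, -1) -> j * i = -k
```

Reversing the order of multiplication flips the sign: exactly the anti-commutativity described above.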

Now, I have been looking much more closely at physics and quantum mechanics more particularly. David Bohm is a fascinating figure because he connects physical theory with a theory of consciousness and communication. A coherent connection to learning and education isn't far behind, although Bohm didn't quite go there - but that's where I'm interested in going!

But Bohm was ahead of the game. In this passage, he seems to prefigure Peter Rowlands work:
We do not regard terms like 'particle', 'charge', 'mass', 'position', 'momentum', etc as having primary relevance in the algebraic language. Rather, at best, they will have to come out as high-level abstractions. [...] the real meaning of quantum algebra will then be that it is a mathematization of the general language, which enriches the latter and makes possible a more precisely articulated discussion of implicate order than is possible in terms of the general language alone. (p 163)
He then discusses some of the properties of the algebra, and hits on two of the key features which are also important in Rowlands's work. The first is the nilpotent:
It is important to emphasise that the 'law of the whole' will not just be a transcription of current quantum theory to a new language. Rather, the entire context of physics (classical and quantum) will have to be assimilated in a different structure, in which space, time, matter, and movement are described in new ways. Such assimilation will then lead on to new avenues to be explored, which cannot even be thought about in terms of current theories. 
First, we recall that we begin with an undefinable total algebra and take out sub-algebras that are suitable for the description of certain contexts of physical research. Now, mathematicians have already worked out certain interesting and potentially relevant features of such sub-algebras.
Thus, consider a given sub-algebra A. Among its terms A(i), there may be some A(n) which are nilpotent, i.e., which have the property that some powers of A(n) (say A(n)^6) are zero. Among these, there is a subset of terms A(p) which are properly nilpotent, i.e. which remain nilpotent when multiplied by any term of the algebra A(i) (so that (A(i)A(p))^s = 0) (p. 169)
This is a slightly different take on the nilpotent from Rowlands's: Bohm is saying that invariance in structure depends on the absence of what he calls proper nilpotency ("we should have an algebra that has no properly nilpotent terms" (p.170)). Rowlands, by contrast, sees invariance in terms of conservation, an aspect of the different dimensions associated with mass (a scalar), charge, space (a vector) and time (an imaginary scalar).

Bohm also addresses the use of quaternions:
It is significant that by mathematizing the general language in terms of an initially undefined and unspecifiable algebra, we arrive naturally at the sort of algebras used in current quantum theory for 'particles with spin', i.e. products of matrices and quaternions. [....] the quaternions imply invariance under a group of transformations similar to rotations in three-dimensional space. (p.170)
Why does this matter to education? It is because Bohm makes the connection between quantum mechanics and consciousness. His "implicate order" is nothing short of what we perceive in our emotional life minute by minute. Bohm's mechanics and his algebra give us a technical way of thinking about questions of phenomenology as we reach for apprehension of the implicate order from analysing the explicate order. Education itself, and university particularly, deals ostensibly with issues of the explicate order (because they are ostensible), but it sits on an inner logic about which both Bohm and Rowlands have powerful things to say. That their techniques and approach are similar is not, I think, a coincidence.

Tuesday, 12 December 2017

David Bohm on Music

I'm finding my current obsession with David Bohm quite mind-changing. His insights are profound. I have not been this affected by academic work since I discovered Alfred Schutz a few years ago.  The common denominator is that both Bohm and Schutz say some penetrating things about music. This is Bohm on music in Wholeness and the Implicate Order, pp198-200:
Consider what takes place when one is listening to music. At a given moment a certain note is being played but a number of the previous notes are still 'reverberating' in consciousness. Close attention will show that it is the simultaneous presence and activity  of all these reverberations that is responsible for the direct and immediately felt sense of movement, flow and continuity. To hear a set of notes so far apart in time that there is no such reverberation will destroy altogether the sense of a whole unbroken, living movement that gives meaning and force to what is heard. 
It is clear from the above that one does not experience the actuality of this whole movement by 'holding on' to the past, with the aid of a memory of the sequence of notes, and comparing this past with the present. Rather, as one can discover by further attention, the 'reverberations' that make such an experience possible are not memories but are rather active transformations of what came earlier, in which are to be found not only a generally diffused sense of the original sounds, with an intensity that falls off, according to the time elapsed since they were picked up by the ear, but also various emotional responses, bodily sensations, incipient muscular movements, and the evocation of a wide range of yet further meanings, often of great subtlety. One can thus obtain a direct sense of how a sequence of notes is enfolding into many levels of consciousness, and of how at any given moment, the transformations flowing out of many such enfolded notes interpenetrate and intermingle to give rise to an immediate and primary feeling of movement. 
This activity in consciousness evidently constitutes a striking parallel to the activity that we have proposed for the implicate order in general. Thus [...] we have given a model of an electron in which, at any instant, there is a co-present set of differently transformed ensembles which inter-penetrate and intermingle in their various degrees of enfoldment. In such enfoldment, there is a radical change, not only of form but also of structure, in the entire set of ensembles[...]. and yet, a certain totality of order in the ensembles remains invariant, in the sense that in all these changes a subtle but fundamental similarity of order is preserved. 
In the music, there is, as we have seen, a basically similar transformation (of notes) in which a certain order can also be seen to be preserved. The key difference in these two cases is that for our model of the electron an enfolded order is grasped in thought, as the presence together of many different but interrelated degrees of transformations of ensembles, while for the music, it is sensed immediately as the presence together of many different but inter-related degrees of transformations of tones and sounds. In the latter, there is a feeling of both tension and harmony between the various co-present transformations, and this feeling is indeed what is primary in the apprehension of the music in its undivided state of flowing movement.
In listening to music, one is therefore directly perceiving an implicate order. Evidently, this order is active in the sense that it continually flows into emotional, physical and other responses, that are inseparable from the transformations out of which it is essentially constituted. 

Friday, 8 December 2017

The Dynamics of University Corruption

From reading the press today, one would be forgiven for thinking that all universities are corrupt fiefdoms, exploiting the young, who are not of an age to know the full implications of the financial commitments they enter into, or the risk of getting very little in return. The focus on VC salary is important - some of these people have displayed an astonishing arrogance and sense of superiority. But it's very important to look deeper.

The most penetrating critic of the present situation in universities lived over 100 years ago, and saw in the American education system a manifestation of atavistic madness. Thorstein Veblen was right, but we have made no progress in untangling the mess created by what he called the "leisure class" and the capitalist system in which it lived, as it assuaged its own ontological insecurity.

Ontological insecurity characterises the mindset of most of the university system today. Nobody - students, teachers, Vice-Chancellors - is comfortable to "be". Academics will often boast of how much more they could earn in industry: they have adopted an "industrial" mentality - always thrusting, getting the next grant, recruiting the next bunch of gullible students. When push comes to shove (and we've seen a lot of shoving!), University managers will hold a cosh over teachers, saying "do what I say or you'll never work again": this was effectively the message given by a particular Vice-Chancellor during a stunt with staff in which the pro-Vice-Chancellor counted out £10 notes in an attempt to demonstrate the financial woes of the institution. He was, of course, projecting his own fears.

University has been distorted. Vice Chancellors will still cite the lofty ideas of Newman, staking a claim to a more illustrious and thoughtful heritage. But it's either a manipulative lie as they parade their gold (or silver!) TEF rating, or a desperate attempt to quell the existential anxiety of marketised education with some mystical past glory. What they want to say is "Come and buy your certificates here!" (and keep me in the manner to which I've become accustomed).

Salaries are important in the sense that they enslave individuals to capitalism. The VC with the big salary will have a big mortgage, kids at private school, status in society, invitations to high-level political gatherings and a sense of self-importance. That's a lot to give up. When we explore the corruption of Universities, we have to explore the psychoanalysis of loss.

It doesn't just apply to the VC. It applies to the whole management team, and to many academics. The boast that "I could earn more in industry" is usually not true. In fact, it is usually not true that "I could earn more in a different University". So, in fact, it's "This job or we sell the house and the kids leave their schools" - unless such an individual has access to private means (which creates additional problems).

Now, in the senior management team, the degree of ontological insecurity is greater. Many of these people have risen to their position through naked ambition and a desire to please the boss, rather than through acting with integrity and honesty. Many of them will have done dirty work for the boss at some time in the past. Some will know dirty secrets and the boss wants to keep them close. Talent for the job is not a criterion for career advancement! What this all means is that they are committed to the success and happiness of the leader.

The astonishment that remuneration committees have approved eye-watering salaries becomes much more understandable when the collective and inter-dependent ontological insecurities of senior staff are taken into account. "I'll approve your £800k, and I'll continue to get senior approval to stay in my job".

The other dimension to this is the network of ontological insecurity in the local community outside the university. The University has prestige coveted by local business people, leaders of the local council, the local football club, the leading law firms, etc. In each of these institutional structures, similar dynamics will play out, and for each of the people at the top of these structures, association with the University can similarly ameliorate the existential angst of modernity, whilst reinforcing their own positions. The Vice Chancellor has a powerful card up his or her sleeve: the "honorary degree". We should look at the recipients of this de-facto honours system (as if the proper one wasn't bad enough!). Many of them will be on advisory committees for the University, or advising on the latest corporate wheeze as the University uses student income to build some new facility (which will not benefit those who paid for it).

The scale of the problem and the nature of its dynamics are very complex. It is not down to a few rogue University Vice Chancellors (although they exist and thrive). It is not even down to governance: whatever new models of governance are invented, they will be corrupted in the same way. The problems are existential and organisational. Truth and Reconciliation is required - put the students shafted by the system in front of the Vice Chancellors who bought yachts with their money! But then we need to rethink how science and knowledge are preserved and developed in our society, to save them from the disaster that Universities have become.

Tuesday, 5 December 2017

The Intersubjective Foundation of Economic Confidence: Why Bitcoin will crash...

Simmel understood that money was a codification of inter-human relations - it codified expectation. Marx had a similar view. How the codification actually happens can vary. Money might be linked to the actual value of the precious and scarce metal in a coin. Equally, it might be linked to a promise made by an institution on a piece of paper issued by that institution (fiat money).

Codifying expectations is a way of establishing trust between humans, and that essentially means that communications can be made with some security that conversations can be managed by both parties, that irrational acts are unlikely, and that space exists for negotiation and manoeuvre if something goes wrong. The bank's promise acts in a similar way to the universal recognition of the value and scarcity of a diamond. One is a speech act (what Searle calls a "status function"), the other involves many other levels of description (aesthetics, etc) in addition to speech acts.

The context within which expectations are established is critical to the whole thing working. Technology is changing the context within which we deal with money, and the shared expectations it brings. First of all, the model of "flexible pricing", used by airlines, holiday booking sites, Uber and so forth, looks set to become more widespread. What this means is that the bank may promise to pay the bearer the sum of £5, but the exchange value of that £5 might depend on the time of day, the insurance group of the owner, and the extent to which the owner really wants to buy a particular thing. What might this do to codified expectations? Nobody knows yet.

At the same time, technology has been used to change the rules of fiat money. Bitcoin is fiat money with a difference: the promise to pay the bearer is not made by an institution, but the requisite trust for the currency is established through the way the algorithm works. In this shifting of parameters around fiat currency, some variables are overlooked which might render Bitcoin worthless. For example, banks are subject to political forces where Bitcoin isn't. Some see this as a strength of Bitcoin. I'm not so sure. Political forces themselves result from individuals grouping together for a common cause: in the face of catastrophic uncertainty, they look to each other for support, and organise themselves to change their environment. The best way of managing catastrophic uncertainty is to look into the eyes of another human being facing the same thing. This is the essence of trust, not a shared ledger.
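To see how an algorithm can stand in for an institutional promise, here is a toy sketch of a hash-chained ledger - my own illustration, not Bitcoin's actual protocol, which adds proof-of-work, mining and distributed consensus on top of this idea. The point it demonstrates: tampering with any past entry breaks every later link, so the record can be checked rather than trusted.

```python
import hashlib
import json

def block_hash(block):
    """Deterministically hash a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    """Append a block which commits to the hash of the previous block."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "data": data})

def verify(chain):
    """The chain is trusted by checking it, not by trusting its keeper."""
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")
assert verify(chain)

chain[0]["data"] = "Alice pays Bob 500"  # rewrite history...
assert not verify(chain)                 # ...and every later link breaks
```

Note what the sketch does not contain: any human being. The checkable object may mediate trust, but (as argued below) it does not constitute it.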

Human trust is an intersubjective phenomenon. It may be mediated by objects - a coin, say, or a shared ledger - but objects themselves merely illuminate the humans who deal with them. It is through human engagement with objects that humans understand each other better. We may think that we trust the ledger, or the bitcoin, or even the £5 note. But this is to miss the point that it is each other that we really get to know better, and through this illuminated "getting to know" we establish trust.

The problem with hype - whether around Bitcoin, or around universities (another bubble about to burst) - is that the codification of expectation it manufactures is only indirectly the result of a particular object. Like the Emperor's new clothes, trust finds a way of establishing itself with the removal of the object too.

Bitcoin has got no clothes on. The illumination it brings to our understanding of each other can be equally well-established by its absence, and as circumstances become more and more intense, the search for new ways of establishing human trust and the codification of expectation also intensifies. Indeed, if trust and intersubjectivity is what it's all about in the end, we may ultimately have little need for "shared objects" like money at all...

Monday, 4 December 2017

The Computer Programmer's Superego and Mental Illness

The second book of Ehrenzweig's Hidden Order of Art makes reference to imagery in Frazer's Golden Bough and relates it to Freudian interpretation. It's a move of some brilliance because it presents the primeval mythological forces that all art draws on and relates them to an understanding of therapy and conscious life. He begins chapter 13 by making a bold statement about the role of the superego over the ego:
The exact role of the superego's aggression in creative work will probably be fully understood only when we have found out more about its role in causing mental illness. In many ways creativity and mental illness are opposite sides of the same coin. The blocking of creativity through ego rigidity is apt to unleash the self-destructive fury of the superego, which is otherwise absorbed and neutralized by the periodic decomposition of the ego during creativity. An increased measure of the superego's oral and anal aggression against the ego is utilized for deepening the normally shallow oscillation of the ego as it swings down to less differentiated levels. 
When Ehrenzweig uses Freudian terminology like "oral" and "anal" aggression, he is referring to processes of attenuation and assertion of distinctions. Sometimes the superego breaks down the distinctions of the ego and drives them into the unconscious: Ehrenzweig calls this "anal scattering", after the anal stage of development where the child doesn't control their bowel movements. Alternatively, the superego can cause rigidity in the ego by becoming authoritarian in the distinctions that are made: Ehrenzweig calls this "containment". Deeper delving into the unconscious "undifferentiated levels" (the primary process) occurs through a process of generating and enforcing new distinctions: scattering melts things down, containment gathers things up. New distinctions drill deeper into the undifferentiated levels, drawing up new material for the conscious mind to work on. He continues:
The superego's anal scattering attacks drive the ego inexorably towards an extreme oceanic depth until the process of dedifferentiation even suspends the distinction between ego and superego. Then the ego can shake itself free from the superego's aggression. 

This is the creative struggle which all artists and scientists must deal with. Insanity, Ehrenzweig argues, "may be creativity gone wrong" (Ch.15, p257).

Since I've been doing a lot of computer programming recently, I'm asking myself where computer programming sits in this creative process. Is it creative in the same way that Picasso was creative? Does it probe the oceanic depths of consciousness and bring forth new distinctions? In much recent writing about creativity, the consensus seems to be that creativity in software is the same as creativity in art: the designs of Apple are creative in the same way that Jackson Pollock's paintings were. Coupled with this question is a question about why I personally have found it so difficult to compose music using technology (which I will write about in a later post, inspired by Marion Milner's "On Not Being Able to Paint").

The deep question in this stuff is "where do distinctions come from?". In forms of activity like computer programming, distinctions are normatively defined: the syntax of a language, the right way of doing something, the threats of doing it wrong, and so on. Are the normative distinctions of code the same as the affordances of paint, or the properties of sound?

I don't think they are. The difference, I believe, has to do with singularity and multiplicity of distinction. A sound comprises many distinctions (many frequencies, for example); paint has many properties including viscosity, colour, luminosity, and so on. In the logic of code, syntax is syntax - formally defined in the rulebook of the language: there is no multiplicity at a deep level, and no flexibility for redefining it. All great painting redefines paint at some level or other. In Ehrenzweig's language, the material is scattered, dedifferentiated, and then contained and reconstructed.

It is possible to code like this, but it is not what most people would take to be computer programming. It is the difference between the sculptor working with metal girders to create some new edifice, and the architect working with girders to create a bridge which won't fall down.

The superego of the computer programmer is the compiler + the expectation of the customer. Both of these are containing forces, and their pressure causes a rigidity of the ego. When I am subject to these forces while writing code, I find that my creative spirit dries up. It's interesting to reflect that much of education - even creative education - has this same desiccating effect. The result is mental anxiety and stress which we've learnt to put down to modern life, but which is really a symptom of ego dissociation and madness.

Could we escape this? Could our manipulation of symbols be a scattering as well as a containing? I'm beginning to see this as a very important question regarding the human relationship with technology.

Wednesday, 29 November 2017

The Psychoanalysis of Institutional Madness

So, Glynis Breakwell has gone. The arrogance of superiority finally got her. How many others now, I wonder? I suspect other VCs, including my favourite - George Holmes at the University of Bolton - will be glad that the axe has fallen, blood has been spilled, and hopefully the whole "What the fuck are you paid?" thing dissipates. It won't. The reason is that it's not about money, or even about corruption: it's about hierarchy. Inflated salaries and corruption (and there are both) are symptoms of hierarchy, and of the pathology whereby hierarchies seek to reinforce themselves in an environment ever-more ruled by uncertainty and ambiguity.

I've become more interested in this on a psychoanalytic level. Freudian language gave us the idea of "anal retention", and basically, this is what we see from all those at the top of institutions. They are so bound by "rules" and "prescriptions" - what Freud calls the domination of the superego over the ego - that they are afraid of expressing themselves freely, just as the child who is chided for shitting inconveniently will seek to withhold their bowel movements. This has another effect too: the anal retentive will find other, more sinister ways of expressing their libidinous instinct. Harvey Weinstein was at the top of a hierarchy; he was anally retentive, and his sexual predation was driven by a need to descend to the unconscious and dark level to expel all the shit that had been building up. I'm wondering if this isn't going on in all hierarchies.

Hierarchy is a fossil of human organisation. It was an established social structure which could deal with the uncertainties of life. It was formed against a context of communication which was largely restricted: before print (and what a difference print made!) hierarchy served as a route for society to manage its uncertainty through institutions like the church, government and the universities. Alongside it went deference, the law, torture, and military power.

Technology has given us untrammelled growth in uncertainty. Hierarchies are tied to a pathological positive feedback mechanism: they use technologies to help to manage uncertainty, but in doing so feed the uncertainty which they attempt to deal with.

Stafford Beer, in Platform for Change, has some fascinating diagrams (the versions here are my own). They are all tripartite: there is an institution, or a thing-with-identity; there is uncertainty arising from that thing-with-identity; and there is a metasystemic function which seeks to manage the uncertainty. Religion can be drawn this way, with the catechism as the function managing the uncertainty.
But then I was thinking how similar this looks to the relation between the Ego, the Id and the Superego in Freud.
Where does Beer go with this? Well, he says that the metasystem is pathological: the superego eats the ego (as Freud says), and the Catechism rules the religion. How do we solve this? We need a new metalanguage. 

For a psychotherapy, I was wondering what the metasystem might be. I have sketched something along the same tripartite lines, to which I might add a "metalanguage of global consciousness and creativity". Ehrenzweig's "Hidden Order of Art" is taking on a new significance for me.

Tuesday, 28 November 2017

Empirical Phenomenology

How much time do we spend staring at oblongs? As I'm writing this, I am staring at an oblong - a black border with lights in the middle. When I look at my phone, it's the same thing - and in fact, the extent to which the oblong fills my field of view is roughly the same, since I hold the phone closer to me. Do we know what prolonged oblong exposure does to people?

I asked my 17-year-old daughter. She said, "but you look at windows, you don't worry about that..." I said, "I never look at windows. I look through them. It's different."

We have amazing technologies for exploring the phenomenology of perception in a way which can be far more precise than anything available to the phenomenological theorists of the early 20th century. We should be using these technologies. I'm interested in how the synchronic and diachronic aspects of perception can be studied using spectrographic analysis of visual, audio and haptic experience.
Plotting the entropies of the dimensions of experience reveals patterns in their inter-relationships: the graph looks like a kind of counterpoint to me. This is part of an analysis of one of Vi Hart's videos. Perception is contrapuntal, and I think the counterpoint is quite explicit in Vi Hart's work. But watching her video is still staring at an oblong...
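The entropy behind such plots can be sketched very simply. This is a minimal illustration of my own, assuming the perceptual channels have already been reduced to streams of symbols (e.g. quantised spectrogram bins): Shannon entropy over successive windows gives one "voice" of the counterpoint per channel.

```python
import math
from collections import Counter

def shannon_entropy(samples):
    """H = -sum over x of p(x) * log2(p(x)), for the observed distribution."""
    counts = Counter(samples)
    if len(counts) == 1:
        return 0.0  # no variation, no surprise
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def entropy_trajectory(stream, window=8):
    """Entropy of successive windows: one 'voice' of the counterpoint."""
    return [shannon_entropy(stream[i:i + window])
            for i in range(0, len(stream) - window + 1, window)]

# A droning, unvarying channel (the oblong?) stays flat at zero, while a
# varied channel rises and falls; plotted together, the trajectories
# read like voices moving against each other.
drone = [0] * 16
dance = [0, 1, 0, 2, 3, 1, 2, 0, 0, 0, 1, 0, 3, 3, 2, 1]
print(entropy_trajectory(drone))  # [0.0, 0.0]
print(entropy_trajectory(dance))
```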

So if we spend all our time staring at oblongs then we're doing something to the counterpoint of perception. It would be like having a kind of drone going on in the background. Something fixed and insistent upon which any other 'dancing' takes place. If it was a real drone, I think eventually we would get tired of it and want it to change. But we can't seem to escape our oblongs!

Saturday, 25 November 2017

Pauline Oliveros in Huddersfield

As is often the case with great artists, Pauline Oliveros was late for the artistic celebration of her work at the Huddersfield Contemporary Music Festival. It unfortunately seems that dying is the prerequisite graduation task for being taken seriously by the artistic establishment. Oliveros, having spent a life transforming the way we think about music, politics, perception, gender and being, successfully died last year, prompting various art establishments to look her up on Wikipedia and organise a festival. Which, I guess, is something...

I was in Huddersfield for an education conference on Power and Professionalism in FE. It was a good conference, with some thoughtful contributions - notably by the brilliant Alex Dunedin, who recommended I turn up. I'm glad he did. There was some great stuff, including a brilliant satirical "letter by Machiavelli to the Principal of an FE college". I enjoyed it very much, but in the corridor noticed an invitation for an event happening next door: a Pop-up art school, based around the work of Oliveros. That looked more fun still!

And this is what they were doing - and it was great! A meditation room, some sonic stuff with bongos and Ableton Live, and a brilliant guy with a turntable, some vinyl records, some cardboard and a pen.

We can talk about education for ever. But really, it's very simple to have fun, stimulate minds, and change worlds. Oliveros knew that.

I was lucky enough to meet her at the 2011 American Society for Cybernetics conference. She didn't want to talk much. She wanted to make music. She did her sonic meditations with us standing around the pool of a hotel in the mid-west. I've never been the same since!

Thursday, 23 November 2017

The Reality Conundrum in Education

I ran a short session yesterday on "Uses of Critical Realism in Education" with some Education Masters students. To be honest, I should have changed the title - whilst I think pursuing ontology is urgently needed in education research, "Critical Realism" has become just another topic on the curriculum rather than a process or a movement. Those kinds of things are best avoided.

The other problem with Critical Realism is the invitation it provides for "teachers of Critical Realism" to talk endlessly about it, bore their students to death, sound pretentious with long words, and so on. I provided a short printed summary of "Why Ontology in Education", and said "I'm not really going to talk about this. But what I want to do is some activities with you which hopefully will disturb your equilibrium sufficiently to make you curious about what's in the leaflet."
So this is what we did. The value of ontology in anything is that it should put you in direct contact with the perceived phenomena, and a shed-load of questions. Behind all the questions is the fundamental question that Bhaskar asks: "Given that such-and-such occurs, what must the world be like?"

So the class is an opportunity to explore phenomena: we did some singing and explored the multiple frequencies in a single sound, we watched David Bohm explain his thoughts on multiple description and perception, and we watched a short series of videos of social dynamics which might be called "learning": mother-baby relations, very boring university teaching (boredom is really interesting, isn't it?!), crows playing with cats, children picking up worms, and a string quartet playing Beethoven. I asked "What's going on? Is it different things in each case, or is there a common principle at work?"

I said that the value of Critical Realism for me was not the explanations it provides, or the methods it provides for investigation, but the discussion with those who disagree with it (like some social constructivists). The value of Critical Realism for me was that it took me into a contested place.

My biggest problem in CR is the dogmatism: it appears that Critical Realism is only critical up to a point. One academic put it elegantly a few years ago: "Critical Realism isn't sufficiently critical of the assumed facticity of its own categories". Yes. More simply, I would say "it has an observer problem".

Bohm's message is that there is no single description of any mechanism. There is instead a kind of harmonic coordination between multiple descriptions which is revealed in dialogue. If I say "now, I think this is right", what I am saying is that Bohm's description resonates with a series of other multiple descriptions which are both generated by it, and co-exist with it.

There is no single thing.

I had a quick chat with Tony Lawson at the Realist Workshop in Cambridge a couple of weeks ago. Much of the discussion in Geoff Hodgson's talk was about consensus. I said to Tony "I think we all see things in different ways". His face lit up, and quick as a flash he pointed at me and said "I absolutely agree with you!". We laughed. Although it's a joke possibly at the expense of the "multiplicity view" of someone like Bohm, I suspect this was precisely what Bohm was getting at!

Wednesday, 22 November 2017

Diabetic Retinopathy and Adaptive Comparative Judgement Grant Success

Very shortly after I started at Liverpool University, I was invited to a meeting with a doctor from the Eye and Vision Science department of the University who was also a surgeon in the hospital. He explained his passionate desire to do something to prevent blindness in China by implementing a proper diabetic retinopathy screening programme. No such programme currently exists, and there is much ignorance about the condition: there are no symptoms (only a retinal scan can reveal problems), and blindness is sudden and irreversible. The scale of the problem is staggering: there are 110 million people with diabetes in China.

Discussions had got as far as thinking that a MOOC might be the thing to train people to diagnose the condition by grading retinal scans. I said this probably wouldn't work, and that the real issue was finding an effective way to deal with the complexity of scale of the problem. The challenge of diabetic retinopathy grading is a straightforward cognitive problem. There are numerous initiatives (including in Liverpool) to use machine learning to do it, but these attempts have had limited success. The sensitivity and specificity of the diagnosis are critical (i.e. ensuring that false negative and false positive results are minimised), and the machine learning does not always perform well - although it can improve if it is effectively connected to human learning.

The problem of grading is one of assessment on the one hand, and hierarchy on the other. Experts do grading, and experts have to be trained. The scale at which educational assessment now operates has led to a search for new models of assessment and creative uses of technology. Adaptive Comparative Judgement is one of the most interesting. It enlists a large group of assessors to make simple, low-stakes judgements about which of a pair of artefacts (student work) is better. It produces a ranking from which grades can be established. I asked whether grading by an expert could instead be ranking by a group. I suggested that if this was the case, then the complexity of scale of China could be managed by a crowd-based approach using Adaptive Comparative Judgement. Fortunately for me, this idea completely transformed the discussion - particularly in the vision of the doctor leading the project.
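As a hedged sketch of how a ranking emerges from pairwise judgements: ACJ implementations typically fit a Rasch-family model and choose pairs adaptively, whereas this toy version does neither, and the name `acj_rank` is my own. It fits Bradley-Terry strengths from a list of (winner, loser) judgements and returns the items best-first.

```python
from collections import defaultdict

def acj_rank(judgements, iterations=200):
    """Rank items from pairwise (winner, loser) judgements.

    Iteratively fits Bradley-Terry strengths:
        s_i = wins_i / sum over j of n_ij / (s_i + s_j)
    A half-win prior keeps never-winning items off zero.
    """
    wins = defaultdict(lambda: 0.5)
    faced = defaultdict(lambda: defaultdict(int))  # item -> opponent -> count
    for winner, loser in judgements:
        wins[winner] += 1
        faced[winner][loser] += 1
        faced[loser][winner] += 1
    items = list(faced)
    s = {i: 1.0 for i in items}
    for _ in range(iterations):
        s = {i: wins[i] / sum(n / (s[i] + s[j]) for j, n in faced[i].items())
             for i in items}
        total = sum(s.values())            # normalise so strengths
        s = {i: v / total for i, v in s.items()}  # stay comparable
    return sorted(items, key=s.get, reverse=True)

# Four low-stakes judgements over three hypothetical scans:
judgements = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "B")]
print(acj_rank(judgements))  # ['A', 'B', 'C']
```

The design point matches the argument above: no single judge grades anything; the grade emerges from many simple comparisons made by a crowd.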

An EU bid followed in 2015 which was unsuccessful, but it served to stimulate interest across a consortium, and made the connection between ACJ, blockchain and xAPI. This year, I joined a group in Liverpool going for a "long-shot" bid to the EPSRC for £1m to develop a training programme based on ACJ, coupled with machine learning and the development of a new low-cost scanning device. The EPSRC had 150 submissions to work through and could only fund a handful of projects. It was a long shot.

Well, it looks like it wasn't such a long shot after all! I suppose what this is making me think is that thinking remains the most important thing in universities. Universities need thinkers, not people who are going to toe a corporate line. The disasters of managerialism and marketisation have done their best to turn many universities (I think particularly of my former institution, Bolton) into fiefdoms where thinkers are sacrificed like heretics of the "corporate religion".

A powerful and simple idea can go a long way. 

Monday, 20 November 2017

Technology, Objects and Dialogue: Using technology to keep things simple

Technology usually makes things complicated... Over the last couple of weeks, the power of simplicity in education has impressed itself upon me.

First up, I organised a conference on "healing organisations" for the Metaphorum group - a research group formed around the work of Stafford Beer. Beer warned about the "Homo Faber" mode of being, where innovation is seen as the answer to problems. During the conference, a number of "innovative" approaches to the problems of health were suggested: each innovation would ultimately lead to increased complexity. In other words, it would feed the pathology from which the innovation attempted to escape. This kind of positive feedback is symptomatic of the "iatrogenic disease" (healer-induced sickness) which Illich (and John Seddon, who spoke at the conference) warn about. Education suffers from its own disease of complexification through innovation.

The conference was organised over three days, with day 1 focused on critique ("what's wrong with the system?" - there was a lot of that); day 2 on possible solutions to address problems; and day 3 focused on conversation. For both days 2 and 3 I asked presenters to do activities with delegates rather than simply talk. The best presentations did precisely this. Day 3 was particularly great - we sat in a circle and explained the meaning of various objects which we had brought to the conference (I asked people to bring an object which illustrated their understanding of "healing organisations").

For a while now, I've been interested in how objects illuminate the understanding of the individual talking about them. Since conversation (con-versare - "to turn together") depends on our understanding of each other, objects are a powerful prop to self-revealing. The conversation was visceral, and the revealing of one another was in some cases deeply emotional. There were tears.

Maturana said (in a conference at Asilomar in 2012) that "What we learn, we learn about each other". It is a beautiful summary of things which he has said before - but never so clearly. I don't think he's ever written it down! But it's right.

We learn maths... we learn about a maths teacher or somebody else who does maths. We learn the piano, we learn about a pianist (or a number of them). We learn sociology, we learn about other sociologists.... and so on.

The key to teaching and learning is self-revealing of the teacher. This self-revealing is usually accompanied by objects. Bad teachers will hide behind their powerpoints. Good ones will reveal who they are as people through them. Such teachers embrace a critical principle: that any object is subject to multiple descriptions. There are always many possible interpretations.

A teacher may generate many possible descriptions of an object: "you can think about quadratic equations like this... or like this... or alternatively...". Equally, they may invite descriptions from others: "what do you think?". The point is that any object - whether a body of knowledge or a skilled performance - is a multiplicity of different descriptions. To understand is to acquire the capacity to generate multiple descriptions. Teaching is a performance of understanding.

Last Thursday, I led a session at the Ragged University on Objects, Perception and Communication. It was, in many ways, the same idea as the conference. I asked people to take a photograph of something in the room which revealed something about themselves. We sat in a circle and presented our photographs to each other. Then I illustrated the point about multiple description with music. Using a real-time spectrum analyzer, I showed how a single note is a patterned multiplicity of frequencies like this:
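The same demonstration can be reproduced numerically. A minimal sketch, assuming numpy: synthesize a "single note" from a fundamental and two quieter harmonics, then recover its patterned multiplicity of frequencies with an FFT.

```python
import numpy as np

sample_rate = 8000
t = np.arange(sample_rate) / sample_rate  # one second of samples

# A "single note": a 220 Hz fundamental plus two quieter harmonics.
note = (np.sin(2 * np.pi * 220 * t)
        + 0.5 * np.sin(2 * np.pi * 440 * t)
        + 0.25 * np.sin(2 * np.pi * 660 * t))

spectrum = np.abs(np.fft.rfft(note))
freqs = np.fft.rfftfreq(len(note), d=1 / sample_rate)

# The three strongest bins recover the note's multiplicity of frequencies.
peaks = sorted(float(f) for f in freqs[np.argsort(spectrum)[-3:]])
print(peaks)  # [220.0, 440.0, 660.0]
```

One "note", three simultaneous frequencies: the single distinction heard by the ear is already a multiplicity.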

I think this patterned multiplicity is what occurs in the communicating around objects. In illuminating the understanding of each individual, they create the conditions for a "resonant polyphony" of alternative descriptions. Quite simply, we get to know each other better. I followed the singing with Augusto Boal's human statue exercise - another example of objects where people are the objects. Multiplicity of description can be investigated in many ways - with many descriptions!

Now I'm planning something bigger with the Far Eastern Federal University in Russia (Vladivostok). We are developing a course in "Global Scientific Dialogue" drawing on the ideas of David Bohm. 300 students in the University will participate in it next year. This is a radical experiment - and weirdly, something that could possibly only happen on the other side of the planet where the pathologies of EU/US education are less marked. In the 60s, we went to California to do new cool things. Now I think it's 10 hours flying the other way... (actually, it's 13 to Vladivostok).

Why Bohm? Well, he knew about multiplicity of description. This is very powerful:

Tuesday, 14 November 2017

Information and Syncretism: from Floridi to Piaget

Luciano Floridi has appealed for an "ethics of information" (he has written a book about it). His basic argument is that since we all live in an "information environment", information ethics should be seen as a variety of environmental ethics. So putting out "wrong" information onto social media is like dumping mercury into a river. I wouldn't be surprised if Floridi has been consulted with regard to the UK's stance on Russian hacking. But whatever the Russians (plus quite a few others) have been doing on social media, I think there's an ontological error in Floridi's argument. Information is not mercury: unlike mercury, information's effects depend on the beliefs of those receiving it.

Among the central presuppositions of belief in society today is a view of logic which upholds the principle of the "excluded middle": either the statement "it is raining" is true, or the statement "it is not raining" is true. They cannot both be true. What this means is that a collection of statements, each taken to be true or false, can together leave the impression of an indisputable fact. By virtue of this principle, the more facts which can be brought to bear in support of other statements, the more "objective" or "scientific" the conclusions drawn from their combination. For example, the demand for "evidence" in social science is rather like this: a demand for more statements whose truth or falsehood can be established, in order to identify more precisely the truth or falsehood of a more complex statement.

Some medieval philosophers puzzled over the excluded middle because this aspect of Aristotelian logic did not fit their theology. It occurred to John Duns Scotus that something could conceivably be true and false at the same time. He called his principle "synchronic contingency": Antonie Vos has explored this brilliantly - personally I am indebted to Prof. Dino Buzetti for drawing my attention to it. What's so fascinating is that in quantum mechanics exactly the same principle has a name: superposition. Scotus saw synchronic contingency as a dimension co-existing with what he saw as Aristotle's "diachronic contingency" - where something may be true at one moment and false at the next, but never both at the same time.
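One way to make "true and false at the same time" computable - my own illustration, in the spirit of Belnap's four-valued logic rather than anything Scotus or Vos propose - is to value propositions with subsets of {true, false}, so that "both" becomes a legitimate value alongside the classical two:

```python
# Truth values are subsets of {"T", "F"}: BOTH models synchronic
# contingency; NEITHER models a gap (no verdict either way).
T = frozenset({"T"})
F = frozenset({"F"})
BOTH = T | F
NEITHER = frozenset()

def neg(v):
    """Negation swaps the evidence for truth and the evidence for falsity."""
    return frozenset({"F" for x in v if x == "T"} |
                     {"T" for x in v if x == "F"})

def conj(a, b):
    """A conjunction is true iff both are true, false iff either is false."""
    out = set()
    if "T" in a and "T" in b:
        out.add("T")
    if "F" in a or "F" in b:
        out.add("F")
    return frozenset(out)

# Classically (diachronic contingency), p and not-p can never hold together:
assert conj(T, neg(T)) == F
# But a proposition valued BOTH really is true-and-false at the same time:
assert conj(BOTH, neg(BOTH)) == BOTH
```

In such a logic the "triangulation" of facts described above no longer collapses everything to a single verdict - which is exactly the syncretistic space the next section explores.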

In the world of synchronic contingency, information looks very different. I think it also looks much more like our deeper human creative processes and spirit.

In my reading of Ehrenzweig's Hidden Order of Art, I've been struck by the emphasis that he places on another theological word: syncretism. Actually, Ehrenzweig cites Piaget as the originator of the use of the word in a scientific context:

Piaget has given currency to the term "syncretistic" vision as the distinctive quality of children's vision and of child art. Syncretism also involves the concept of undifferentiation. Around the eighth year of life a drastic change sets in in children's art, at least in Western civilization. Whilst the infant experiments boldly with form and colour in representing all sorts of objects, the older child begins to analyse these shapes by matching them against the art of the adult which he finds in magazines, books and pictures. He usually finds his own work deficient. His work becomes duller in colour, more anxious in draughtsmanship. Much of the earlier vigour is lost. Art education seems helpless to stop this rot. What has happened is that the child's vision has ceased to be total and syncretistic and has become analytic instead. (p6)

In theology, syncretism refers to the holding of many contradictory ideas at the same time. Ehrenzweig argues that the creative process is precisely a process of holding many contradictory ideas at the same time. When he talks about dedifferentiation (see my previous post) he is describing the process of blurring the boundaries between true and false so that something new may be brought into being.

Our problem with "information" - whether it's in big data, learning analytics, or the stock market - is that we don't consider the creative potential of a syncretic approach to it, whereby such machine-generated information could be a powerful spur to more authentic creativity. Instead, we uphold the excluded middle, and seek "triangulation" between different "truths" and "falsehoods". It is because we are so bound to this that our social media networks have become so vulnerable to "wrong" information - whether it's placed there intentionally or by mistake.

The world of creativity and the world of "data" feel very different. One enlivens the soul and warms the heart. The other tightens the stomach muscles and ties us in knots - both as individuals and as a global society! Syncretism is the difference between the artistic mode and the analytic: the distrust of syncretism is the root of the pathologies of management and government.

Saturday, 11 November 2017

Ehrenzweig on Objects and Creativity: Symmetry and Entropy at the heart of the heart

Objects are important in education. Institutions sometimes seem to believe that objects are the things which they "sell": the learning content, notes, powerpoints and other media... the manufactured products of education, contact with which it is sometimes believed produces learning.

Constructivists might deny the importance of objects, but the concreteness of a cool video or a textbook is hard to deny: "this is a great video!" we say. Others lose sight of the fact that the making of such an utterance is where the learning intrinsic to human coordination begins. Like anything of fascination or beauty, the expression of emotion, feeling, intellect or curiosity is a fundamental human reaction which is communally shared. In the art gallery, we often gaze at pictures together. In the concert hall, we all have emotional experiences which, in the silence and ritual of the place, we somehow manage to convey to others; in the cinema we gasp together as somebody escapes imminent death; and so on. Today, media objects get shared online: the common expression of feeling happens diachronically (sequentially) rather than synchronically - but it still happens. "A cool game! What's your score?", and so on.

What happens in these human reactions? I think the answer is simple: we understand something more about each other. Maturana made the point that "what we learn, we learn about each other". Yes, that's it. I will refine this: "What we learn, we learn about the symmetry that exists between us". Why is learning about each other important? Simply because we cannot communicate successfully unless we do know more about each other. The better we know each other, the more effective our social coordination will be. I took two friends visiting from Russia to see "The Death of Stalin" this week. It was a case in point - as we revealed much about ourselves in our different responses to the film.

Alfred Schutz calls this revealing process "inter-subjectivity", and Talcott Parsons (and later Niklas Luhmann) calls it "double contingency". Despite Parsons's and Schutz's disagreements, there is a core principle at work, but an important difference in how they understand it. In double contingency, we communicate because we have some idea of who we are communicating with, how they will respond to our utterances, and so on. Parsons differs from Schutz in that he emphasises the importance of the selection of communications (what we mean to say) and the selection of utterance (how we choose to say it). Luhmann developed this further.

I've been re-reading Anton Ehrenzweig's "The Hidden Order of Art" recently (after nearly 20 years). What an amazing book! Ehrenzweig is interested in artistic communication, and he believes that artistic creation does not emerge out of selection.  Ehrenzweig draws his inspiration from the Freudian concept of the primary process - the undifferentiated formless state of consciousness from which conscious experience (distinctions) emerge. He introduces a concept called dedifferentiation where "the ego scatters and represses surface imagery" in creative acts. He also draws on Paul Klee's distinction between two kinds of attention, one on the figure and the other on the ground. Ehrenzweig argues:

What is common to all examples of dedifferentiation is their freedom from having to make a choice. Whilst the conscious gestalt principle enforces the selection of a definite gestalt as a figure, the multi-dimensional attention of which Paul Klee speaks can embrace both figure and ground. Whilst vertical attention has to select a single melody, horizontal attention can comprise all polyphonic voices without choosing between them. Undifferentiated perception can grasp in a single undivided act of comprehension data that to conscious perception would be incompatible. 

I'm interested in this from a more technical perspective - which is certainly not how I would have read it 20 years ago. From a technical perspective, the central issue is the symmetry of relations. Whilst the perception of figure - or rather the identification of the distinction between figure and ground - is an epiphenomenon, there are symmetries in deeper mechanisms underpinning perception which might become better known to us. Parsons and Luhmann took the epiphenomenon as the phenomenon. But if we think like them, we lose all creativity (and in the process, we risk our humanity). This is not, however, to put anyone off from engaging with their ideas: they are powerful - but they flatten the symmetry.

Schutz, on the other hand, is much closer. His "pure we-relation" - where human beings communicate face-to-face - is a different kind of coordination which is not based on selection. Ehrenzweig calls the alternative to selection "syncretism" - but that, I think, is another word for symmetry. Symmetry emerges in the space between multiple descriptions of things. It emerges in the space between my understanding (and my descriptions of my understanding) and your understanding. It emerges in the ways that a melody, a harmony, a timbre, or a rhythm all draw out the same form.

Sometimes, different descriptions adopt similar patterns. Sometimes the change in their complexities coincides: for example, at the end of a piece of music, final chords eliminate rhythmic complexity, tonal complexity too disappears with the repetition of a tonic chord, alongside the melody which now emphasises a single note. Then, everything is silent. Another way of putting this is that the change in entropy of different descriptions coincides; their relative entropy increases. Now imagine a rich and busy counterpoint: ideas are thrown from one voice to another, different things are happening. There is a rich interplay between the entropies of description.
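This coincidence of entropies can be sketched numerically. In the toy illustration below (the note and rhythm sequences are invented), each "description" of the music is encoded as a symbol sequence and its Shannon entropy measured: in the busy passage the entropies of melody and rhythm are both high; at the cadence every description collapses to repetition and every entropy falls to zero together.

```python
from collections import Counter
import math

def shannon_entropy(seq):
    """Shannon entropy (in bits) of a sequence of symbols."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A busy contrapuntal passage: each description varies richly.
melody = ["C", "E", "G", "B", "D", "F", "A", "C"]
rhythm = ["q", "e", "e", "q", "s", "s", "e", "q"]

# The final cadence: every description collapses to repetition.
cadence_melody = ["C"] * 8
cadence_rhythm = ["q"] * 8

print(shannon_entropy(melody), shannon_entropy(rhythm))                  # both high
print(shannon_entropy(cadence_melody), shannon_entropy(cadence_rhythm))  # both zero
```

The point of the sketch is only that the *changes* in the entropies of the separate descriptions move together - the silence at the end is the limiting case.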

Ehrenzweig's mode of thinking is fundamentally musical: syncretism happens across the diachronic domain of counterpoint, and the synchronic domain of harmony. Schutz, also a musician, also thought about social relations musically. The syncretic is the same as the coordination of Schutz's "pure we-relation": it is a recognition of symmetrical relations.

The more we engage with objects, the more we reveal ourselves to others, and the more we recognise the symmetry that lies between us.

Friday, 10 November 2017

Illich and the Experts: Whose fake news do you want?

With so much concern about truth and falsehood in social media, and about the role of Universities in defending knowledge or fighting fake news, the defence of the "experts" by Universities should be seen for what it is: a defence of existing hierarchy.

Ivan Illich was on to this in the 1970s - particularly in his book "Disabling Professions":

The Age of Professions will be remembered as the time when politics withered, when voters guided by professors entrusted to technocrats the power to legislate needs, the authority to decide who needed what, and a monopoly over the means by which those needs should be met. It will be remembered as the Age of Schooling, when people for one third of their lives were trained to accumulate needs on prescription and for the other two-thirds were clients of prestigious pushers who managed their habits. It will be remembered as the age when recreational travel meant a packaged gawk at strangers, and intimacy meant training by Masters and Johnson; when formed opinion was replay of last night's talk-show, and voting, an endorsement to the salesman for more of the same.
Illich's recipe is to overturn the hierarchy. We need to think about what that means for "experts", and particularly about the difference between the experts "declared" by institutions (who are often merely the product of institutional management - "professor" has become a synonym for "manager") and "intellectual authority", which is something different: the community elder who has read more, thought more, and is often more uncertain and open in their thinking than anyone else.

Thursday, 9 November 2017

Education is simple. Why have we made it so complex?

I've been taking stock of the range of things that I've been doing as part of my role as an educational technologist. Much of it involves struggles with software to do things which the institution believes are necessary in modern education. So there are technologies for assessment, technologies for analysis, technologies for content delivery and so on. Each of them can (and does) go wrong, and each of them demands considerable labour in keeping the system going. From an educational perspective, none of them are particularly effective.

Learning itself is an inter-human activity which involves conversation. Without conversation, there is little learning - a fact which I have to keep reminding those who believe somehow that "content" will "deliver" learning. The only real value of content (Powerpoints, videos, etc) is that it illuminates the understanding in another human being, and that might be the precursor to a conversation. However, if we believe content to be some kind of magical "learning producer", it creates all sorts of chaos and complexity in its production: huge amounts of time are invested in creating sexy animations, vast resources put into audio and video post-production, and whilst what results looks pretty, it inevitably represents the understanding of a committee - not the easiest thing to have a conversation with!

Content, then, is a path to complexification. But it is not the only one.

What inevitably makes content complexify is that it is inherently hierarchical. It is the joint product of expertise and quality audit: the first a result of the academic status machine which manufactures "professors" (who are not always representative of intellectual authority), and the other, a function of the university's bureaucracy. These two functions are related.

The university hierarchy is both a mechanism for apportioning blame for things that might go wrong (like all hierarchies), and a mechanism for dividing knowledge. One of the principal barriers to inter-disciplinary working is the negotiation as to who is responsible (i.e. who can be blamed) for which bit. The quality processes of the university, which are another arm of the hierarchy, uphold these structures. With technology, the university has reinforced its mechanism.

Now there is a curious thing about communication in hierarchies. Hierarchies have "lines of command" - even in their loosest form. These are channels for communicating simple messages from top to bottom: "assessments must be marked by....", "the timetable is published...", etc. These are not conversations, although they might be the cause of conversations further down the system. Sometimes education exploits this for learning: the command "your assignment is to..." is the cause of conversation among students. In these conversations students will often learn about each other. They won't necessarily learn about the teacher, whose utterance might only be "your assignment is to..."

By virtue of the hierarchical structures the teacher finds herself in, her conversational utterances are sometimes restricted to particular forms of delivery: lectures, seminars, assessments, etc. The teacher's position is upheld by compliance with the institution's rules, not the learners' needs (although the institution pretends that it represents the learners' needs, it does nothing of the sort - it represents its own needs!).

This all gets incredibly complex. How could it be simpler?

The alternative to hierarchy is either heterarchy (many leaders) or anarchy (no leaders). Both I believe are preferable. In order to achieve them, we have to deal with the twin structural problem: on the one hand, expertise and the status mechanism which gives rise to it; and on the other hand, the institutionalised apportionment of blame and the carving up of knowledge to fit institutional structures.

This is not to say that we ignore intellectual authority. If anything, it is to say that intellectual authority is privileged over the baubles of job title. Intellectual authorities are the elders in the community. They are the source of the best questions; the best guides towards a conversation. But they offer an articulation of uncertainty, not answers: "the best lack all conviction, while the worst are full of passionate intensity," as Yeats put it.

Technology today gives us new lines of communication. We haven't yet learnt how to reorganise our social structures to exploit them; we have instead reinforced our social structures with stupid uses of technology. I'm increasingly convinced that hierarchies persist because of impoverishment in communication, and hierarchies exacerbate this impoverishment. Technology gives human beings new ways of coordinating themselves with richer channels of communication. This is what we should be doing. At its heart are the communicative principles of redundancy which characterise the inner workings of the brain: what Warren McCulloch called "the redundancy of potential command". He also coined the term heterarchy.

Education would be simple in a heterarchy.

Monday, 6 November 2017

Power, Hierarchy, and the Sexual Harassment Scandal - Bateson's attempt to clarify categories

I'm recovering from organising the Metaphorum Conference on "Healing Organisations". After a lot of anxiety in preparation, it was both an intellectually dynamising and deeply heartfelt conference. It was possibly a lot else too - everyone seemed to enjoy it. I'll say more about the speaker contributions from John Seddon, Liz Mear, Gerald Midgley, David Welbourn, David Shiers and Allenna Leonard at a later point. They were all brilliant. There was something cathartic about the whole thing...

A lot of discussion at the conference concerned the pathology of hierarchy and what we do about it (heterarchy? telepathy?). In the news, hierarchies are in trouble: the sexual abuse/harassment scandal is toppling men at the top of hierarchies, whose positions have enabled them to behave appallingly towards those they had power over, and to become unchallengeable.

Hierarchies have a "top", and the top has 'power' over the rest. It also exercises crap management. It takes courage to challenge it. Universities particularly have become increasingly hierarchical in recent years. Something is in the air at the moment that is giving women (and some men) courage. What it is, I think, is overwhelming environmental uncertainty which has been stoked-up by austerity and other attempts by hierarchies (and those at the top of them) to preserve themselves. At the conference, John Seddon pointed out that every attempt to cut costs ends up raising them. This is probably why the deficit doesn't come down, why the health service is on its knees and why those same hierarchies are under attack. It's a positive feedback loop, and like all positive feedback loops, eventually it goes "snap!".

The power inherent in the hierarchy is a strange thing: power is a controversial concept - particularly in cybernetics. Behind it lie certain assumptions about the way the world works which may be incorrect. The first one concerns evolutionary dynamics. This has sent me back to reading Bateson. In his paper "The Pathologies of Epistemology", which is in "Steps to an Ecology of Mind", he moves his argument from Darwin to thoughts about what he calls the "myth of power". On Darwin he says:

In accordance with the general climate of thinking in mid-nineteenth-century England, Darwin proposed a theory of natural selection and evolution in which the unit of survival was either the family line or the species or subspecies or something of the sort. But today it is quite obvious that this is not the unit of survival in the real biological world. The unit of survival is organism plus environment. We are learning by bitter experience that the organism which destroys its environment destroys itself. 
If, now, we correct the Darwinian unit of survival to include the environment and the interaction between organism and environment, a very strange and surprising identity emerges: the unit of evolutionary survival turns out to be identical with the unit of mind.
Formerly we thought of a hierarchy of taxa—individual, family line, subspecies, species, etc.—as units of survival. We now see a different hierarchy of units—gene-in-organism, organism-in-environment, ecosystem, etc. Ecology, in the widest sense, turns out to be the study of the interaction and survival of ideas and programs (i.e., differences, complexes of differences, etc.) in circuits.
Let us now consider what happens when you make the epistemological error of choosing the wrong unit: you end up with the species versus the other species around it or versus the environment in which it operates. Man against nature. You end up, in fact, with Kaneohe Bay polluted, Lake Erie a slimy green mess, and "Let's build bigger atom bombs to kill off the next-door neighbors." There is an ecology of bad ideas, just as there is an ecology of weeds, and it is characteristic of the system that basic error propagates itself.
That's the epistemological error: choosing the wrong unit. The critical thing is to include the environment. This is exactly what John Seddon said about the health service (although he was slightly reluctant to be so abstract as to say "environment"). He said "The health system doesn't understand its demand". It assumes demand is ever-growing, where analysis shows that it's stable. The system's increasing inability to cope with what appears to be increasing demand is iatrogenic (iatros = doctor) - a healer-induced sickness, an organisational failure. This is critically important.

Of course, at the root of the iatrogenic disease is misused power. So what does Bateson say about this?

They say that power corrupts; but this, I suspect, is non-sense. What is true is that the idea of power corrupts. Power corrupts most rapidly those who believe in it, and it is they who will want it most. Obviously our democratic system tends to give power to those who hunger for it and gives every opportunity to those who don't want power to avoid getting it. Not a very satisfactory arrangement if power corrupts those who believe in it and want it. 
Perhaps there is no such thing as unilateral power. After all, the man "in power" depends on receiving information all the time from outside. He responds to that information just as much as he "causes" things to happen. It is not possible for Goebbels to control the public opinion of Germany because in order to do so he must have spies or legmen or public opinion polls to tell him what the Germans are thinking. He must then trim what he says to this information; and then again find out how they are responding. It is an interaction, and not a lineal situation. 
But the myth of power is, of course, a very powerful myth and probably most people in this world more or less believe in it. It is a myth which, if everybody believes in it, becomes to that extent self-validating. But it is still epistemological lunacy and leads inevitably to various sorts of disaster.
I've wondered about this for many years. Is power a myth? It feels pretty real to me... But what Bateson is saying is that power is an epiphenomenon of systemic failure. If you heal the system, power-as-a-myth disappears. In its place, one would hope, we have wisdom.

Tuesday, 31 October 2017

Is Life Simple or Complex? Some reflections on John Torday's Evolutionary Biology

Recently, I've been studying the work of evolutionary biologist John Torday, after he posted a fascinating contribution to the Foundations of Information Science mailing list. I wasn't alone among my friends in seeing this as something different, and potentially important: the conversations between friends when they say to each other "do you see...?" are very important indicators of what needs to be investigated further. This has been followed by a rich email exchange with Torday, prompted by my pointing out the similarities between his position and Stafford Beer's arguments for a Copernican shift in the way that institutions organise themselves, which Beer wrote about in "Platform for Change" (and which I have blogged about previously).

Torday insists:
"Life is simple. We complicate it due to our subjectively evolved senses". 
A more comprehensive articulation of this is contained in his published work (importantly, it is open access!). The second sentence above might be changed to "We complicate it due to our discursively evolved senses", but I haven't yet encountered a systems view which states that the discursive environment in which we all operate is epiphenomenal to more fundamental underlying mechanisms. Having said this, I suspected that with all the complexity of Luhmann's theory or Pask's conversation theory, etc (and their manifest failure to really make a better world), we were missing something.

Torday thinks that the fundamental thing that we miss is cellular communication. In saying this, he is saying something also articulated by "bio-semioticians" like Jesper Hoffmeyer. But Torday's theory is not the same as Hoffmeyer's. He is a physiologist, and the empirical work he cites in support of his argument seems compelling to me. He cites the evolution of cholesterol from lipids carried to earth by asteroids, argues for a fundamental role of cholesterol in consciousness, and points to the connection between the skin and the brain. He argues that:
"All of the neurodegenerative diseases have skin homologs. And the Defensin mutation that causes asthma also causes atopic dermatitis in the skin."
These claims are referenced in the empirical literature. Torday's basic mechanism of cellular organisation through cell-cell communication is specifically a response to environmental ambiguity. This may be what a cybernetician would say: cybernetically, cells self-organise to mop up variety - at least if we can say that variety is ambiguity (is it? - it might be...).

Doesn't the same thing happen in economics? Don't institutions reorganise their components to mop up the extra variety (new options) created by technological development and a discourse which reflects this? In other words, it's not a direct causal connection between increased options and discourse and transformations of practice in institutions. It's an indirect connection where innovation increases options, and institutions self-organise in response to the increased variety (and uncertainty).

Discourse, then, is an epiphenomenon of cellular evolutionary mechanisms which are much deeper than our exchange of messages. Torday says complexity itself is an epiphenomenon: he's theorising at a much deeper level than Luhmann, but in a related cybernetic/mechanistic way. The current state of academic Babel would support his arguments, wouldn't it?

The work marks a scientific advance on the work of Bateson, Maturana and Robert Rosen who are the main cybernetic figures in biology. It's a reminder (to me) of the importance of the systems sciences staying close to field work in biology, physics, maths and technology.

Monday, 30 October 2017

MozFest Technology Coolness!

I attended Mozfest in London at the weekend, at the suggestion of Beck Pitt from the OU. There wasn't a large contingent of educational technologists there (although I did bump into Maren Deepwell (who did an ALT-themed session with Martin Hawksey) and Josie Fraser). There really should be more educational technologists at this kind of thing - and the odd institutional manager ought to take note.

Not that there was much room - the place was packed, mostly with the young: the average age was about 20, and there was a very large and encouraging contingent of school age kids, getting wired-up on the decentralised web, blockchain, fighting surveillance and injustice, and feeding the world. There was a real buzz about the place.

MozFest took over 9 floors of the Ravensbourne college building opposite the O2. It felt like an occupation. I haven't seen anything quite like this since I saw the University of Amsterdam occupation in 2015.
There was a distinctly low-tech approach to coordination and organisation. Walls and stairwells were full of hand-drawn posters for different events and talks:

Future of scientific publishing? There was a clear technological move away from the established practices of publishers and institutions.

Worried about Fake News? What might technology be able to do about it?

Blockchain for bug fixing?

Google docs with IPFS?! Cool!

I asked some of the people doing these things why they didn't just use the technology without doing all the practical badge-making, stamp printing, type-writing (yes, there was a typewriter!) stuff. The response basically pointed out the immediacy of experience, the importance of physical contact, and so on.

Whatever the reason, it was the right decision. The physical activities were a social and fun way of contextualising more complex technical discussions.

The most important technical themes were about web decentralisation. The drive for this is partly technical, partly practical (how to distribute internet access to parts of the world where building vast infrastructure isn't viable), but mostly political (fighting surveillance). It's not just Blockchain, but the Inter-planetary file system (, and a few similar decentralising protocols like DAT (
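The common thread in these protocols is content addressing: data is identified by a hash of its bytes rather than by the server that happens to host it. A minimal sketch of the principle (real IPFS uses multihash-encoded CIDs, not raw SHA-256 hex digests):

```python
import hashlib

def content_address(data: bytes) -> str:
    """The address of a piece of content is a hash of its bytes."""
    return hashlib.sha256(data).hexdigest()

doc = b"a shared document"
addr = content_address(doc)

# Anyone holding the data can verify it matches the address...
assert content_address(b"a shared document") == addr
# ...and any change to the content yields a different address,
# so there is no single server that must be trusted (or that can tamper).
assert content_address(b"a shared document!") != addr
```

This is why a decentralised network can serve the same document from any peer: the address itself proves the content is authentic.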

The hardware to support decentralisation is also developing. The Gotenna device is a small radio repeater which carries a signal for 4 miles, and can easily form a mesh network with other Gotenna receivers in the neighbourhood. I found this incredibly exciting. Basically, we're heading for an off-grid internet.
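A toy simulation shows why mesh relaying matters (the node names, positions and flat 4-mile range are invented for illustration): a message floods hop-by-hop through whichever nodes are in radio range, so it can travel well beyond a single link.

```python
from collections import deque

RANGE_MILES = 4.0
# Hypothetical node positions in miles (x, y).
nodes = {"alice": (0, 0), "bob": (3, 0), "carol": (6, 0), "dave": (20, 0)}

def in_range(a, b):
    (x1, y1), (x2, y2) = nodes[a], nodes[b]
    return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 <= RANGE_MILES

def reachable(source):
    """Breadth-first flood: who eventually receives a message from source?"""
    seen, queue = {source}, deque([source])
    while queue:
        n = queue.popleft()
        for m in nodes:
            if m not in seen and in_range(n, m):
                seen.add(m)
                queue.append(m)
    return seen

print(reachable("alice"))  # carol is 6 miles away, but reachable via bob;
                           # dave is off-grid in every sense
```

No tower, no ISP: coverage is simply a property of how the people holding the devices are distributed.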

As with all of these things, the question for an educational technologist is "What does this mean for institutions?"

The answer is, "We don't know", except that what it will spell out is "Change". The decentralised web is a threat to institutions. That's a message educational institutions need to hear right now, because they are all behaving as if technology has been "done", as if it's all about MOOCs and the VLE, and as if all they need to worry about is "policy" with regard to technology.

Frankly, that's bollocks - ask any 14 year-old at MozFest!