Friday, 23 March 2018

Education as Music: Some thoughts on the “Vladivostok Experiment”

I’m in Vladivostok at the moment. I’ve had a long-term relationship with the Far Eastern Federal University (FEFU) here, and this has led to a large-scale educational experiment, one of the most exciting things I have been involved in. The experiment takes the form of a course which:
  • is free of any specific curriculum, although it revolves around “systems thinking”
  • is a conference, concentrated into two weeks rather than spread over 14 weeks
  • is assessed by patchwork text and a form of comparative judgement
  • is driven by conversation
  • is coordinated with video
  • is oriented around resources or objects

It’s intended for students in the Management and Economics school to help them prepare for the world of the future, and to help them fill the gaps between their disciplinary knowledge and the skills they will need in the workplace. There will be nearly 300 students involved in it in October. In order to make it all work, we need teachers to be facilitators. 

For many teachers who see themselves as disciplinary experts, this is a challenge, so this week and next I am coordinating some activities with teachers whilst also trying out some of the ideas for the course. Helping me is Sebastian Fiedler from the University of Hamburg, as well as key staff from FEFU. 

It’s all going rather well… and that leads to the question “What are the design features of this course which are making it work?” I’m slightly uncomfortable with the question, because “design” feels too stiff a term to describe the process that has led to its creation. In many uses of the word “design”, intentions are stated at the outset about “how things will work” and, for whatever reason, they never work out like that (think of “learning design”).

This course feels more like making music together: We do something together, everyone has a good time… nobody quite knows why or exactly what’s happened, but everyone feels changed in some way. Music isn’t really “designed”, but it is “created”. What’s the difference? 

One of the key features of any musical activity is the amount of redundancy that is involved. Music is highly redundant: repetition of rhythm, melody, harmony and so on is its fundamental constitution. Educational design doesn’t “repeat” in the same way – partly because it doesn’t seem rational to do this. But this emphasis on redundancy is important, and particularly relevant for this course in Vladivostok. I came to Vladivostok as a visiting professor three years ago at the invitation of a young Russian academic whom I had met through collaborating with Loet Leydesdorff. Her name is Inga Ivanova, and she had made an important contribution to Loet’s work.

Inga's contribution was to highlight the importance of redundancy in innovation networks, and to suggest ways in which mutual redundancy between different agencies could be calculated. Ever since, I have been fascinated by redundancy in teaching and in music. (Ironically, my trip to Vladivostok came a week after I was made redundant by the University of Bolton – something which, it turned out, was rather a good thing – but it didn’t feel like it at the time). Redundancy is a technical term for the production of multiple descriptions of things. I’ve since realised that the process of producing multiple descriptions of the same thing is fundamental to the process of teaching. Human communication relies on the production of redundancy, just as machine communication has to add redundancy in order for signals to overcome noise (as in Shannon’s theory). Good teaching involves saying the same thing in many different ways. Student understanding is expressed through the generation of many descriptions of what is understood. Our assessment processes rarely recognise this.
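
The point about machine communication adding redundancy to overcome noise can be sketched with a toy repetition code (this is my own illustration of Shannon's general idea, not anything from Inga's or Loet's calculations):

```python
def encode(bits, n=3):
    """Add redundancy: repeat each bit n times (a simple repetition code)."""
    return [b for b in bits for _ in range(n)]

def decode(coded, n=3):
    """Recover each bit by majority vote over its n copies."""
    return [int(sum(coded[i:i + n]) > n // 2) for i in range(0, len(coded), n)]

message = [1, 0, 1, 1, 0]
coded = encode(message)

# Simulate noise: corrupt one copy in every group of three.
# The redundant copies let the majority vote recover the original message.
noisy = coded[:]
for i in range(0, len(noisy), 3):
    noisy[i] = 1 - noisy[i]

assert decode(noisy) == message
```

With three copies per bit, any single corrupted copy in a group is outvoted by the other two; more redundancy tolerates more noise. The analogy with the teacher who says the same thing many different ways is loose but suggestive.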

Redundancy may be generated in many ways. Conversation among students is one way of doing it: a discussion about an object invites many different descriptions of that object. Conversation is a mechanism for coordinating the many descriptions to establish a shared meaning among a group. In the activities we are doing in Vladivostok, staff either create or are presented with different kinds of object: sometimes it's photos on their phones, other times it's pictures or unusual artifacts. Each time, groups are asked to express descriptions of these objects and coordinate their different descriptions into a coherent narrative.

There are a number of side-effects of this which are powerfully educational. Firstly, individuals get to know each other better: they discover things about each other which they didn't know before, and this leads to new conversation (for example, two academics in the session yesterday, realised that they had a shared employment history). These conversations continue long after the event.

Secondly, some of the objects they are presented with stimulate curiosity in the conversation which leads to further reading and research: the objects serve to disturb the equilibrium of individuals such that new learning becomes necessary. 

The assessment strategy of patchwork text provides a mechanism for participants to keep a record of what happens to each individual, how they are changed by things, and what new research they do about things. It all seems to work!

Our conventional understanding of education focuses on information - the opposite of redundancy. But it seems to me that redundancy - and its music - may be far more powerful.

Saturday, 17 March 2018

Do Universities need Vice Chancellors? Some thoughts on the pension dispute...

One of the ironies of the pension dispute is that it centres on risk, which was the topic of expertise of the University of Bath's overpaid former Vice-Chancellor, Glynis Breakwell. Her book on risk, which I got cheap in an Amsterdam bookstore, is on my desk. Earlier today I, along with other USS pension scheme members, received an email about USS's assessment of risk in their pension deficit calculations. Everyone agrees that risk is not an exact science, and the battle is about whose interpretation you believe. This is compounded by the fact that trust between the academics (for which read UCU) and the management (UUK) has broken down not just on the issue of the pension, but on a whole host of issues related to the running of the academy over the last 10 years, where we have seen the closure of departments, zero-hours contracts, students as customers, compromise agreements, outrageous salaries, ridiculous expenses, VC globe-trotting and a complete absence of humility. In the view of many academics, it's all gone to shit.

All these problems are the fault of management, not teachers or researchers. So why do we need them? Is it an unthinkable thought that we rid ourselves of vice-chancellors and their management cronies, and that universities run as academic cooperatives? How could such a thing be possible? Martin Parker's point in his "Against Management" is absolutely right: we need to think about the organisation of education, not its management.

How have we ended up here? Like most crazy aspects of capitalism, it's the fault of the Americans. Even in the early 20th century, it was obvious that US universities were taking a more commercial path to higher education than their older European cousins. Veblen saw it first, commenting (in 1899) that:
"it may be remarked that there is some tendency latterly to substitute the captain of industry in place of the priest, as the head of seminaries of the higher learning. The substitution is by no means complete or unequivocal. Those heads of institutions are best accepted who combine the sacerdotal office with a high degree of pecuniary efficiency" (Theory of the Leisure class)
He could have been writing about today, when literal "captains of industry" (a term which Veblen coined) are making a mint out of universities. What do they do, exactly? What do they do which is worth the £200k - £400k+ salaries they are paid? Well, one thing they do is join a club called Universities UK...

What if they all went? Would universities fall down? No. But if students don't get taught, or can't sit their exams (particularly all those foreign students who pay a fortune for the privilege of sitting in classrooms in the UK rather than in their home countries), do things start to fall apart? Well, probably yes they do.

The people to blame for the current strike are the Vice-Chancellors. I'm not surprised that some VCs are talking of the need for "compromise" from UUK. They know they are on a sticky wicket. The VC of Cambridge even blamed government policy for turning Universities into businesses. Quite right. Except that such strong criticism of government proposals was not voiced at the time the government introduced the policy. The VCs then pushed for the highest fees, from which they ramped up their own salaries.

We have a moment of reckoning. Something's gone badly wrong in higher education.

The Creative Process

Of the stages of artistic creation, beginning something appears less difficult than continuing it. I think continuing is generally more difficult than finishing, but at each stage the artist has to make choices, and the choices made at the beginning shape the choices made when continuing and when finishing.

In the beginning, a distinction must be made. "Let there be light" is a distinction. The world begins with distinctions. The context of this initial distinction is an undifferentiated totality - it is something drawn up from Freud's "primary process". How this decision is made is quite mysterious. Something is required to attenuate the possibilities to make the first distinction. For Leonardo, when preparing a fresco, it was the "cracks in the plaster". There'll be some observed constraint in the material which makes that first moment of making a reality. Gombrich talks about the way that Picasso tears a piece of paper: the form of the tear, the fibres hanging out in the initial moment then give him a way forwards. The first distinction creates constraints for subsequent distinctions. 

Any first distinction is taken with a view to how subsequent distinctions might be made. Everything has possibilities, creates expectations. There's something about a first distinction which resonates with possibilities, with the ideas expressed in the culture, with other aspects of the material or form. The criterion for selecting an appropriate first distinction is symmetry. When Stravinsky talks of conceiving works as wholes (he's not the only one to say this), he is referring to the discovery of a symmetry which connects a first moment of creation with the completed artifact. David Bohm would call this the perception of "implicate order": an awareness or consciousness of totality - not just totality in the moment of creation, but totality through history. It's a symmetry of diachronic and synchronic dimensions.

"Continuing" is then an unfolding of the first distinction. That makes "continuing" sound easy - which, of course, it isn't. What typically happens in "continuing" is that we decide that the first distinction was no good, and so we make another one. Like Amédée the playwright in Ionesco's absurd drama of the same name, it's not unusual to have a creative process which continually writes beginnings, crosses them out, and writes a new one. For most people, this becomes exhausting, and whatever impulse there was to create something new dissipates in the frustration of abortive beginnings.

When it works, beginning and continuing are connected by something deeper. The first distinction isn't simply a mark on the paper or a crack in the plaster. It is the identification of a generative principle. The creative process is one of discovering a deep generative principle which connects the first moment of creation with the unfolded form. Every person engaged in an attempt at creativity experiences the frustration of abortive attempts at beginnings. Not every person understands what they are in, or that to understand the form of the process one is in is to understand the deeper nature of the search and purpose that they are engaged in. Disorientation kills the creative process. Successful creation results from having a compass.

All of this interests me partly because I am frustrated by my own creativity. After 8 years, I really am now finishing my book. It's taken so much longer than I anticipated. But I had to go through the process of identifying what it was about, what its generative principle was, what the first distinction should be. But a book is easy compared to writing music, which is what I always wanted to do. When I was a teenager, music flowed out of me much more easily than it does now. When we are young we are much more attuned to the implicate order and its generative principles than when we get older. Academic knowledge hides the implicate order. My experience of university was that it stultified creativity: where there was energy, curiosity and passion, it created concepts and discourse. That was the death of creativity for me - particularly on a music degree! (I was fortunate that my professor, Ian Kemp, who was head of department at Manchester, knew this all too well: "You'll never learn anything in a place like this," he said. I admired his courage for saying it at the time, without considering exactly how right he was.)

Getting the book done is a big deal. But it is an academic book - it's about concepts and discourse. When we talk about human creativity, however, whether in art or science, this is not where the action is. The problem is that the way we are taught to think in University is fundamentally synthetic: we are taught to aggregate and synthesise different presentations of phenomena and different theories. We're taught to say "x says this, and y says that", and we are taught to see that "a can be explained by x and b can be explained by y". Stephen Hawking is a good example of a synthetic thinker - a product of the university system. He sought to unify quantum theory with relativity. But really, he failed, just as everyone else has. It's not that they're stupid. It's because they are starting from the wrong place. Artists know this more clearly than scientists.

The opposite of a synthetic approach is an analytic approach. My colleague Peter Rowlands argues that this lay at the heart of Newton's scientific approach. I'm convinced he's right. Newton was able to identify deep generative principles; he didn't seek to synthesise available theories and phenomena. This is ironic, because the Universities modelled themselves on what they believed to be "Newtonian" science. And the artists - like William Blake, who was Newton's antagonist - knew this was wrong. Peter argues that Blake got Newton wrong, and that had he understood how Newton worked, he would have recognised a kindred spirit. It was the institutions that screwed it up.

The issue at the heart of this has to do with how we think about "selecting" a course of action from a set of possibilities. It is about how we think about "information". We tend to think that selecting something involves consideration of the synchronic context: the options available at a particular moment. Technology encourages us to think like this.  But it doesn't work like this. Selecting involves identifying the symmetry between synchronic and diachronic dimensions. This, I think, has profound implications for the way we think about information and technology. We need ways of thinking about diachronic and synchronic symmetry. The generative principle is the source of an unfolding symmetry.

Tuesday, 13 March 2018

Science at the heart of the system

The “student as customer” should not be the driving force for the development of universities. But the government is determined to pursue a policy of shaping Universities in the image of student desires. Since everybody – students, academics, managers, politicians - is confused about what education is for, what university is about, what matters and what doesn’t, it would be foolish to let any single group determine the direction of universities. The latest wheeze is to brand courses as “gold”, “silver” and “bronze”, as if it is "the course" which is the independent variable in the life and career of the student. This is nonsense – there are no independent variables!

How have we got here?  

We have pursued an ideology which turns everything into money. The easiest route to turning everything into money is to identify a group of people as “customers” and another group as “providers” and the interaction between them as the provision of a “service” which is charged for. In reality, nobody really agrees who is a customer of whom, who provides what and what on earth a “service” is. Everything gets blurred in the complexity of intersubjective engagement. Consequently, the distinctions “customer”, “service” and “provider” need reinforcement if the financialisation process is to work at all – even in its own terms.

What we see in every effort by the government to “regulate” education – from the REF to TEF to NSS to the latest “gold and bronze courses” is an effort to reinforce infeasible distinctions. This is a positive feedback loop. Every effort to codify the uncodifiable results in new confusion. New confusion leads to new efforts to reinforce the distinctions. So some new even more granular metric will always be around the corner. And the effect of this on the system? Inevitably it changes institutional and individual behaviour. The education system has become financialised because it has sought to fit the distinctions that are determined for it, and increasingly to ignore the fundamental problem of the impossibility of making clear distinctions.

This creeping ignorance is the most serious problem for universities. Multiplicity of description and difference of interpretation are the cornerstones of academic discourse. Universities have always been places where ambiguity and confusion are coordinated in the conversations between members of the institution – students and staff. In a world of government-determined, clearly codified distinctions, where failure to comply results in personal disaster, the space for discussion disappears in an environment of fear.

Science only survives and advances in an environment of openness to difference and ambiguity, in much the same way that Amartya Sen argues that economic development depends on democracy. This is why the Arabic world could not capitalise on its extraordinary scientific discoveries, and instead they passed to Europe. The government is killing the universities, and with it, it is killing the foundation of social flourishing.

The kids aren’t stupid though. They can see this is ontologically wrong. My daughter complained the other day about the bronze and gold courses: “This is why I don’t want to go to university. They’ve become as bad as school”. She’s right. There’s hope in that she can see it.

Saturday, 10 March 2018

Good, Bad and Ugly Universities

Today I've encountered the best and the worst of Universities. The best I found in my own institution: a lovely moment of serendipity which is what these places ought to be about. The worst I heard from my former institution - hardly a surprise given the regime there, but very sad nonetheless.

I started working in Higher Education at the age of 33, having previously been a rather unhappy computer programmer for a few years, and a teacher in Further Education colleges and schools. In the years since graduating from my music degree in Manchester, I always stayed close to the library. I always felt that my job was to read and to think - even if I had to do something else to earn money. It was only when I became a computer science lecturer at the University of Bolton that I could legitimately read and think for a living.

There are many thinkers and readers out there who remain in the position I was in before being employed by Bolton: committed to staying close to the library, but having to do other things to stay alive. Nowadays, some of these people find adjunct positions in Universities on pauper wages - but having no money is no good either, particularly if you've got a pile of mentally exhausting marking: they'd be better off working in Sainsbury's or driving cars for Uber.

I was very lucky with Bolton in 2002, because it led me to cybernetics which, it turned out, I had been deeply interested in throughout all my academic wanderings without knowing what it was (I wish someone had introduced me to it when I was 18). I wouldn't get the job now without a PhD, which I would never have been able to afford. Those who get such jobs in the future will come from more monied backgrounds than I did. I was paid quite well, the work was relaxed and I had time to be creative, making new pieces of software, doing cool things with local schools, accompanying a violinist colleague in local concerts, and getting involved in educational technology projects.

Yesterday I learnt that the department I joined was being restructured. All but a couple of the staff have effectively lost their jobs as senior lecturers, and will be invited to apply for (fewer) lecturer grade positions. This coincides with a merger with the local FE college, and no doubt there is an agenda to transfer much of the undergraduate teaching to FE (cheaper) staff. I feel very sad for my former colleagues, particularly as I was also the victim of the dreadful regime that has ruined Bolton. They will be better off out of it.

What happens when this kind of restructuring takes place is that conversation is destroyed. Universities are all about conversation. That's what happens in the classroom, and it is what happens in science. Those who destroy conversation do not really believe it matters. All they believe in is the reproduction of knowledge which can be assessed, certified and (most importantly) charged for. This is pretty much what they do in the FE college over the road from Bolton University. It's not higher learning; it is schooling.

One of the great giants of Liverpool University, where I now am, was Charles Sherrington (my office is just below where his labs were). After seminal work on neurophysiology at Liverpool in the early 20th century, Sherrington moved to Oxford, where he had this to say about education:
"after some hundreds of years of experience we think that we have learned here in Oxford how to teach what is known. But now with the undeniable upsurge of scientific research, we cannot continue to rely on the mere fact that we have learned how to teach what is known. We must learn to teach the best attitude to what is not yet known. This also may take centuries to acquire but we cannot escape this new challenge, nor do we want to."
The strange thing about this kind of statement is that you could ask any of the great academics of the past or present, and they would say pretty much the same thing. What Sherrington means by "the best attitude" has a lot to do with conversation. What is not yet known is what is not yet codified: it exists in many descriptions in many peoples' heads, and our job as academics is to coordinate these many descriptions by talking and listening to each other.

The kind of management that Bolton currently has clearly does not understand this.

In Liverpool, meanwhile, I received an email from an eminent friend in the physics department. He's the best thing in Liverpool, although he's also struggled with modern academia. It was a forwarded email from another colleague in the architecture department who was previously unknown to my physicist friend. It said something along the lines of "Professor x from the University of Illinois sends his regards". Now, my friendship with my physicist friend stemmed from the fact that we both know Professor x (who is something of a giant in cybernetics). Suddenly, I find someone in architecture also knowing Professor x. Moreover, I have recently been talking to other people in the architecture department about cybernetics. So a few more emails later, and the world starts to look different: how many more possibilities for doing exciting things we all have!

This is what Universities are about. They are about conversation. Liverpool hasn't been spared the madness of managerialism (although it's not as bad as Manchester!), but it hasn't damaged the deep structure which remains pregnant with possibilities. Loet Leydesdorff (who is also responsible for the friendship with the physicist) calls this "redundant options". Universities need to maintain redundancy: the most destructive thing is to make redundancy redundant!

Unfortunately, the money-God leads us to do precisely the wrong thing. If Bolton's management had more insight they would realise their mistake. Instead, they have created a machine for eating the university.

Monday, 5 March 2018

"Provost": A definition

This is for my friends at the University of Bolton, who may be curious as to what a "Provost" is.

(a) a derogatory term for an individual who mistakenly believes themselves capable of running a school.
(b) a more general term of abuse. e.g. "He's a complete and utter provost", "What a provost", "What happened to that other provost? You know, the Greek one..."
(c) a short-lived academic position awarded to an individual who causes a lot of problems in an institution and eventually disappears

Do you suspect someone of being a provost near you? Don't delay - call the Education and Skills Funding Agency. Their anti-provost team will swing into action immediately - like it did here:

Next time: definitions for "President and Vice-Chancellor", "Deputy Lieutenant", "Presidents' Club Member", "Former Bishop of Manchester" and many more!

Saturday, 17 February 2018

The Cybernetics of Competence and Capability: Revisiting Pask

Enid Mumford noted that the difference between competence and capability lay in the difference between attenuation and amplification between complex systems. Competence involved the attenuation of an environment to fit the acceptable parameters of an individual who had learnt a set number of appropriate responses. Capability involved the production of multiple descriptions of understanding of a complex situation such that a solution to a new situation may be creatively generated rather than retrieved from memory.

Gordon Pask noted that the production of multiple redundant descriptions of a single thing was a fundamental part of the learning conversation. In clarifying his "teachback" process, he went much further than Laurillard's simple comparator approach of checking whether the learner has taught back what they were taught. Pask says (in "The Cybernetics of Human Learning and Performance" (1975)):

Teachback goes as follows: the teacher says of the student (or ‘subject’) that the student understands a topic to the extent that he can teach it back to the teacher. That is, understanding is inferred if the student can furnish an explanation of the previously discussed topic and can also explain why he gave that explanation of how he constructed it. The crucial point is that the student’s explanation and the teacher’s explanation need not be, and usually are not, identical. The student invents an explanation of his own and justifies it by an explanation of how he arrived at it (in fact an identical explanation is generally rejected unless the student can give a reason why the teacher’s explanation was particularly good).

The difference between the teacher's utterances and the student's is critical in the teachback process. Pask goes on to say:

the resilience of a memory will depend upon the number of explanations produced in teachback; for example, that a student impelled to give many explanations will fare better at session 2 than a student required to give only one. He has many ways of reconstructing a concept and this redundancy will combat the effect of interfering and incompatible learning experiences during the intervening week.
What this suggests is that redundancy is the principal indicator of learning, not information. In Shannon information theory (which underpinned Pask's thinking), redundancy is the inverse of information: it is the context within which messages are formed. Education is not about the message; it is about the context!
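
As a rough sketch of this inverse relationship (my own simplification, using first-order symbol frequencies only rather than Shannon's full measure):

```python
from collections import Counter
from math import log2

def entropy(message):
    """Shannon entropy H of the symbol frequencies in a message (bits per symbol)."""
    total = len(message)
    return -sum((c / total) * log2(c / total) for c in Counter(message).values())

def redundancy(message):
    """Shannon's redundancy: 1 - H / H_max, where H_max = log2(alphabet size)."""
    h_max = log2(len(set(message)))
    return 1 - entropy(message) / h_max if h_max > 0 else 1.0

# A stream that keeps saying (almost) the same thing is highly redundant;
# a stream where every symbol is equally likely carries maximal information
# and zero redundancy.
assert redundancy("aaaaaaab") > 0.4
assert abs(redundancy("abcdabcd")) < 1e-9
```

On this reading, a teacher's many repetitions lower the "information" of each utterance while raising the redundancy, and it is the redundancy, the patterned context, that does the educational work.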

How might this work?

Well, compressing Pask's diagram into the exchanges between teacher and learner, it might be drawn as a progression through four stages.
At the stage of "rigidity", the teacher presents many alternative descriptions of what they are trying to convey (this is generally what teachers do!). The learner is only able to reproduce one of the descriptions they are given, if that. Gradually, they acquire greater flexibility to utter more descriptions. Having two descriptions of the same thing is very powerful, and builds towards a generative capacity to create (or guess) other explanations.

Then there comes a stage when the learner is able to respond to the different descriptions of the teacher with effective matching alternative descriptions. At this point, we might say that the learner is competent. 

Finally, the true generative power of the learner's understanding is revealed at the stage where from a simple prompt by the teacher, the learner is able to generate all manner of descriptions - some of which may not have been conceived by the teacher. At this stage, we can say that they are capable, confident and adaptable in their knowledge.

What happens from stage 1 to stage 4 is a gradual awakening to the constraints which lie behind the teacher's generation of their own descriptions. By the end, what the learner has learnt are not facts, but the mechanisms of transduction within the teacher whereby the teacher is able to generate the descriptions and skilled performances that they demonstrate.

If only we'd thought of this when people sent their students off with their dreadful e-portfolio systems, we would have done it all very differently!

Friday, 9 February 2018

When is a musical note a different note?

I've been talking a lot about transduction recently.  Transduction is the process whereby a distinction gets maintained.  In engineering, it is the process of taking one form of energy and turning it into another - like an electric transformer. But the point is it produces a boundary.

The idea of transduction is useful because it turns what we usually think of as fixed 'categories' into processes. So think of a category or a subject: 'maths', 'geography', 'chair', 'happy', 'ill', etc... Now think of the process which makes that category. It turns out that any category has two sides: a category is a boundary. The process of maintaining it works from both sides of the category. If you want to change a category, you have to change the process.

So you want to change the culture in an organisation? You need to understand where the transduction is happening and 'tweak' it.

Transductions are recursive. One category depends on many, many other categories. More to the point, a category is not necessarily something that can be expressed in language. All perception is transduction: to perceive a difference is to experience a transduction.

That's useful when we think about music. To differentiate one note from another is to experience a transduction. What does that tell us about how transduction actually works? Well, to distinguish one note from another depends partly on there being multiple descriptions of a note. A note is never a simple thing: it is a multiplicity of frequencies to start with, which give it a timbre or colour. It also has a beginning and an end, it has a volume, and so on. A note is different from a silence. So a single note is perhaps a kind of transduction between a silence and the note. Since a silence also has multiple descriptions (silence has many qualities), a note is the difference between one set of multiple descriptions of something and another set of multiple descriptions of something.

If we were to compare one note with another, some of their descriptions might be the same - they might have the same volume, for example. But they might have different frequencies, or different timbres. 'Another' note is a change in the arrangement of multiple descriptions. That is what the transduction does: it shifts from one set of descriptions to another.

What about detecting that a note is "the same" note as another? That's an interesting transduction. To say that it is "the same" is to still detect that it is "different", but it is different in a way where the boundary between one and the other produces a new category of "the same".

A new category? Ah! A new transduction!
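Out of curiosity, here is a toy sketch in Python of the idea above. It is entirely my own illustration, not a formal model: a note is represented as a bundle of descriptions, and the "transduction" is the process which compares two bundles and maintains a distinction between them.

```python
# A toy sketch: a "note" as a bundle of descriptions, and a transduction
# as the comparison which maintains (or dissolves) a distinction.

def transduce(a: dict, b: dict) -> dict:
    """Compare two bundles of descriptions; return those that differ."""
    keys = set(a) | set(b)
    return {k: (a.get(k), b.get(k)) for k in keys if a.get(k) != b.get(k)}

silence = {"fundamental_hz": None,  "volume": 0.0, "timbre": "none"}
note_a  = {"fundamental_hz": 440.0, "volume": 0.7, "timbre": "bright"}
note_b  = {"fundamental_hz": 440.0, "volume": 0.7, "timbre": "bright"}
note_c  = {"fundamental_hz": 392.0, "volume": 0.7, "timbre": "dull"}

print(transduce(silence, note_a))  # many differences: a note against silence
print(transduce(note_a, note_b))   # {} - no differing descriptions
print(transduce(note_a, note_c))   # only frequency and timbre differ
```

Comparing note_a with note_b yields an empty set of differences - and it is exactly that "no difference" result which constitutes the new category of "the same".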

Thursday, 25 January 2018

A Statement Issued by the University concerning the attendance at the Presidents' Club Event by its Vice Chancellor

The University can confirm that its President & Vice Chancellor, Professor Screwtape DL, attended The Presidents’ Club Charity Dinner event held at The Dorchester Hotel, London on Thursday, 18 January 2018. Professor Screwtape DL has never previously attended any event held by The Presidents’ Club and is not a member (he prefers to work "by proxy"). His attendance was arranged under the  guise of an "invited guest" to a charity fundraising dinner. He likes to look good. More specifically, he was a guest of one of the University's key business sponsors who hosted and personally paid for the table with their soul.

Professor Screwtape DL has confirmed that he was approached whilst dining and served drinks by several hostess staff allocated to his area of the dining room. Coincidentally, he now (in hindsight, shitting himself) recognises one of those staff who spoke to him as being the undercover reporter now identifying herself as Maddison Marriage. Professor Screwtape DL recalls expressing at the time (to the woman now known as Maddison Marriage) that both personally and in the current context (in 2018, particularly post the Harvey Weinstein allegations) he was uncomfortable with the totally unexpected influx of hostess staff and certain auction lots. The undercover reporter has subsequently confirmed to a fellow journalist and editor that Professor Screwtape DL 'looked pretty shocked' on twigging that she might possibly be a journalist.

Professor Screwtape DL has further confirmed that another member of the hostess staff also served him a drink whilst he was dining and, when he asked her what she thought of the event, she similarly indicated to him that she was uneasy with it. She stated that she was particularly nervous about descriptions she had been given of the 'after party', which she had heard from other hostess staff who had worked at the event in previous years. This prompted Professor Screwtape DL to seek out and speak with one of the event staff team leaders, expressing his concern that if any press were present at the event, his career would be over. Professor Screwtape DL turned a blind eye to the assaults subsequently alleged in the press, and he chose to leave as soon as was politely possible at the end of the charity auction, after he had fulfilled his role of converting a number of key influential individuals to Satanism, as required of him when he attends such public events.

Professor Screwtape DL has confirmed that he did not and, more importantly, chose not to participate in the post dinner ‘after party’ which he had been unaware of when he accepted the invitation to the black tie dinner, and he returned to his family who were with him in London.

Tuesday, 23 January 2018

Psychodynamic pathologies

The ultimate purpose of higher learning is individuation - that wholeness of being which is the essential component of wisdom. The deep problem with this is that individuation is ontological, not epistemological. It is not about knowledge. It is about being. Modern universities are all about knowledge, and pay scant regard to ontology/being/experience/etc.

In many prestigious universities, the most powerful departments are medical or related in some other way to the life sciences. There are many professors within those departments whose achievements in academic life relate to their acquisition of large amounts of knowledge. These disciplines are dominated by epistemology: "higher learning" becomes equated with "knowing stuff" rather than anything ontological. It could be argued that medicine has never really been a discipline of higher learning - despite the fact that it has been present in the academy since the beginning (although in earlier times, it had more of an ontological basis than it does now).

There is an interesting Freudian distinction to be made between those academics possessed of an "epistemic bias" and those possessed of an "ontological bias". It lies in the dynamics between id, ego and superego within the psyche of the individual. An obsession with "correct facts" is really domination of the ego by a rigid superego. Dreams and fantasies are repressed, kept away from conscious intellectual life. Formal Aristotelian logic keeps things in check: the superego says "you've got to do it like this, or it's wrong". Pedantry, lack of critique and conservatism are all negative side-effects of this. Rigour, planning, organisation and delivery are all virtues. Richard Dawkins, together with politicians like Michael Gove, typifies this.

The academic with ontological bias is sceptical and imaginative with knowledge. Most famous intellectuals fall into this camp, although it's usually balanced with something which ensures things actually get done. But it is because they have an ontological bias that they are celebrated. It is because of the dynamic relationship between the superego and the id that they are creative and do new world-changing things. David Hume, Paul Feyerabend, David Bohm, William Blake, Jean Piaget are among good examples.  Insight, creativity, openness and spontaneity are virtues. There is the risk that lack of focus could result in failure to deliver.

Universities today are more favourable to the "epistemically biased" academic. This is partly because management of universities has turned into a branch of politics where the skills of dreary politicians like Gove become useful in the University management too. Universities must 'produce' like factories, and this is the domain of the superego.

The problem is that while the superego's hold over the ego destroys such academics' own creativity, they then occupy positions of power where they amplify this ego-rigidity and inflict it on others. A psychodynamic pathology in one individual becomes a psychodynamic pathology in the institution.

The warning is that this can happen anywhere today, in any institution - however prestigious. Prestige itself becomes the ruling superego, and if its measurement is seen to be the result of rule-following, rigour and "doing it properly", then the superego's grip on the institutional life will be tightened. This is precisely what happened (and continues to happen) in Bolton. It is a psychodynamic pathology.

Sunday, 21 January 2018

Psychodynamics, Creativity and Mental Health

Anton Ehrenzweig is, I'm sure, right to identify the distinction between the Freudian primary and secondary process as fundamental to creativity. Freud, in articulating the dynamics of psychic processes, needs to invent terminology for the "force" of emotion in instinctive behaviour. "Cathexis" is the word he and Breuer used to describe emotional energy. In "Beyond the Pleasure Principle", Freud has this to say about the primary and secondary processes.

I described the type of process found in the unconscious as the 'primary' psychical process, in contradistinction to the 'secondary' process which is the one obtaining in our normal waking life. Since all instinctual impulses have the unconscious systems as their point of impact, it is hardly an innovation to say that they obey the primary process. [...] it is easy to identify the primary psychical process with Breuer's freely mobile cathexis and the secondary process with changes in his bound or tonic cathexis. If so, it would be the task of the higher state of the mental apparatus to bind the instinctual excitation reaching the primary process. A failure to effect this binding would provoke a disturbance analogous to a traumatic neurosis; and only after this binding has been accomplished would it be possible for the dominance of the pleasure principle (and its modification, the reality principle) to proceed unhindered.

He's basically saying that untrammelled emotional energy leads to madness. In section VII, he says
We have found one of the earliest and most important functions of the mental apparatus is to bind the instinctual impulses which impinge on it, to replace the primary process prevailing in them by the secondary process and convert their freely mobile cathectic energy into a mainly quiescent (tonic) cathexis. 
Ehrenzweig's insight is to see that this process is fundamentally the same as the act of creation, but that it is a continual process of binding-up the primary process and disintegration of the secondary process. Furthermore, he identifies how this process relates directly to pedagogical techniques for teaching creativity. Devices for artistic production function to disintegrate the material of the secondary process (over which the Superego has a powerful grip) and send it back into the primary swamp. Those same devices give new form to the process of binding the primary process as new things are brought into consciousness.

Freud and Ehrenzweig suggest that if this stops working mental illness follows. Most particularly, if the Superego's grip on the primary process is so strong and unshakeable that nothing can lead to the fragmentation of its binding, then repression will result. Freud says "The essence of repression lies simply in turning something away, and keeping it at a distance, from the conscious". The psychotherapeutic approach is to bring out repressed instincts into contact with the conscious mind. It's like jump-starting a motor which has stopped working.

The creative imagination of the artist uses various techniques for challenging the Superego's dominance. Starting from a distorted surface is one, of which serialism in music is a simple example.

I've been experimenting with improvisation using fragments of notated music - in this case Bach and a bit of Schubert. I find that when I improvise freely with no boundaries, what comes out tends to follow set patterns - things which I am already thinking, clichés imposed on my subconscious by my superego. The results are a bit flat. But with the disruption of a "broken surface" of musical extracts, I've found that I become more creative and inventive. It's an interesting experience...

But there's a level at which maybe Freud misses something. There's something about history - about the fact that it is Bach's material that I am manipulating. This leads us to Jung's theory...

In "The concept of the Collective Unconscious", Jung writes:

In addition to our immediate consciousness, which is of a thoroughly personal nature and which we believe to be the only empirical psyche (even if we tack on the personal unconscious as an appendix), there exists a second psychic system of a collective, universal, and impersonal nature which is identical in all individuals. This collective unconscious does not develop individually but is inherited. It consists of pre-existent forms, the archetypes, which can only become conscious secondarily and which give definite form to certain psychical contents. 
I've always been attracted to Jung. Now perhaps he's saying the same thing as David Bohm - that somewhere, there is an "implicate order" - some kind of fundamental origin in the symmetry of the universe. And Bohm also notes that through music, we come into direct contact with it.

Thursday, 18 January 2018

An Educational Techno-Utopia

Last week, one of my favourite sociologists, Christian Smith, published an angry piece in the Chronicle of Higher Education entitled "Higher Education is Drowning in BS". I've been fascinated by Smith's work for some time, and there are two things that strike me on reading his Chronicle piece.
  • First, it is no ordinary rant from any ordinary academic: this is someone who is an authority on human experience.
  • Second, I doubt that the senior management of his institution have read his work or have anything like the high opinion I and many others have of him. Some of those senior managers will call themselves "professor" and consider themselves to be intellectual authorities (since this is what "professor" denotes). In reality they will simply have been ambitious enough to acquire the title of highest academic rank without having to have read or thought that much.

There are some serious qualitative distinctions that need to be made and which are becoming blurred. Smith says it in his piece:

BS is universities hijacked by the relentless pursuit of money and prestige, including chasing rankings that they know are deeply flawed, at the expense of genuine educational excellence (to be distinguished from the vacuous "excellence" peddled by recruitment and "advancement" offices in every run-of-the-mill university).
For me personally, this disaster is coupled with a very bright 18-year-old daughter who is adamant she doesn't want another "three years of school" - and that is pretty much what all universities have become. So the bright kids are starting to desert the academy. The intellectual authorities in the institutions (the ones who know their way around the library) have either retired or have "had enough". What hope is there?

Among the many factors which have fed this decline, confusion over what "educational experience" actually is ranks high on the list of culprits. Because of the sheer difficulty of examining experience, we have allowed ourselves to be convinced that the only reliable methods are "by proxy" - questionnaires, surveys, etc. Yet these things do nothing to measure experience. As Roger Brown says, university is an "experience good". That means "you can't know it until you've experienced it" (after having parted with £9,250). That's an experience in itself!

In truth, Universities do their best not to be honest about the experience of university. Everyone knows that photographs of smiling students are a lie. Universities never tell you what it's like to struggle to get assignments done (or even, exactly what assessed work will be expected) or be bored rigid in a lecture. Why don't they publish their assessments up-front and let students decide when they feel they are ready? Because that wouldn't be in the commercial interests of the institution, even if it clearly is in the interest of the students.

In Dennis Potter's final play, Cold Lazarus, a dead man's experience is available for others to enjoy (or at least experience too). Might technology deliver something like this to us one day?

I'm beginning to wonder if it's not impossible. I've been doing some experiments analysing the dimensions of real-time experience as a kind of "counterpoint". At the moment it takes a lot of processing power to produce a map of the interplay of different domains of experience (visual, auditory, haptic, kinaesthetic, proprioceptive, etc). But as with any data processing, it will get quicker, to the point of becoming instant. That would change things.

There could be no hiding of experience. One person could know another's consciousness. Would we still talk? Probably - but it would change. I don't think capitalism would survive this innovation, let alone universities. But it would usher in a completely new era of learning and communicating. We would have tools to amplify the tuning-in to one another that is essential to communication. Assessment and certification would disappear as trust (which is what those things are about) becomes an explicit pattern of consciousness. Would we still lie? Maybe - but equally, we would know that we do it, and understand it better in others.

This isn't as far away as I once thought. It is really the flip-side of AI and machine learning. Those tools contribute to objects which transform themselves, presenting automatically generated multiple descriptions of themselves to the consciousness of individuals. Individual experience contextualises these automatic multiple descriptions, and situates them within the many other multiple descriptions which comprise the context of conscious life.

I doubt Christian Smith will be able to look into the crystal ball like this - he is, after all, longing for the disappeared old academy. But here we see a new academy. It's not a hierarchy of professors and managers, but a heterarchy of intersubjective insight.

Learning and teaching will take care of itself.

Tuesday, 16 January 2018

Learning Analytics, Surveillance and Conversation

In the noisy discourse that surrounds learning analytics, there are some basic points which are worth stating clearly:
  1. Learning Analytics, like any “data analysis” is basically counting: complex equations which promise profound insights are in the end doing nothing other than counting. 
  2. Human beings determine what is to be counted and what isn’t, and within what boundaries one thing is said to be the same (and counted as the same) as another. 
  3. Learning analytics takes a log of records – usually records of user transactions – and re-represents it in different ways.
  4. The computer automates the process of producing multiple representations of the same thing: these can be visual (graphs) or tabular 
  5. Decisions are facilitated when one or many of the representations automatically generated by the computer coincides with some human’s expectation. 
  6. If this doesn’t happen, then doubt is cast over the quality of the analysis or the data.
  7. Learning analytic services typically examine logs for multiple users from a position of privilege not available to any individual user. 
  8. Human expectations of the behaviour of these users are based on biases surrounding those aspects of individual experience to which a person in a position of privilege has access: typically this will be knowledge of the staff ("the students have had a miserable experience because teacher x is crap"). 
  9. Often such high-level services exist on a server into which data from all users is aggregated with little understanding by users as to what might be gleaned from it. 
  10. The essential relationship in learning analytics is between automatically generated descriptions and human understanding.  
  11. Data analytic tools like Tableau, R, Python, etc all provide functionality for programmatically manipulating data in rows and columns and performing functions on those rows and columns. Behind the complexity of the code, this is basically spreadsheet manipulation. It is the principal means whereby different descriptions are created. 
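To make points 1-4 concrete, here is a minimal sketch (the log records and field names are invented purely for illustration): the same transaction log, re-represented three different ways, and all of it, in the end, counting.

```python
# Learning analytics as counting: one log, multiple descriptions.
from collections import Counter

log = [  # one record per user transaction (invented data)
    {"user": "amy", "day": "Mon", "action": "view"},
    {"user": "amy", "day": "Mon", "action": "post"},
    {"user": "ben", "day": "Tue", "action": "view"},
    {"user": "amy", "day": "Tue", "action": "view"},
]

# Three different "descriptions" of the same log:
by_user   = Counter(r["user"] for r in log)
by_day    = Counter(r["day"] for r in log)
by_action = Counter(r["action"] for r in log)

print(by_user)    # Counter({'amy': 3, 'ben': 1})
print(by_day)     # Counter({'Mon': 2, 'Tue': 2})
print(by_action)  # Counter({'view': 3, 'post': 1})
```

A decision gets made (point 5) when one of these automatically generated representations coincides with what some human being expected to see.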

So the real question about learning analytics is a question about automatically-generated multiple descriptions of the data, and how those multiple descriptions influence decision-making. 

Of course, decisions made from good data will not necessarily be good decisions, nor are decisions made with bad data necessarily bad. What matters is the relationship between the expectations of the human being and the variety of description they are presented with. 

In teaching, communication, art, biology or poetry, multiple descriptions of things contribute to the making of meaning. Poets assemble various descriptions to convey ideas which don't have concrete words. Composers create counterpoint in sound. When we discuss things, we express different understandings of the same thing. And teaching is the art of expressing a concept in many different ways. What if some of these ways are generated by machines?

AI tools like automatic translators or adaptive web pages are rich and powerful objects for humans to talk about. As such tools adapt in response to user input, people talking about those tools understand more about each other. Each transformation reveals something new about the people having a discussion. 

This is important when we consider analytic tools. The richness of the ability to generate multiple descriptions means that there is variety in the different descriptions that might be created by different people. The value of such tools lies in the conversations that might be had around them. 

With the emphasis on conversation, there is no reason why analytic tools should be cloud-based. There is no reason why surveillance is necessary. They could be personal tools, locally-installed instead. Their simple job is to process log files relating to one user or another. Through using them in conversation, individuals can understand each other's understanding better. They should be used intersubjectively.

Recently I've been doing some experiments with personally-oriented analytical tools which transform spreadsheet logs of activity into different forms. The value in the exercise is the conversation. 

Whatever we do with technology, it is always the conversation that counts!

Saturday, 13 January 2018

Learning as an Explanatory Principle - a response to Seb Fiedler

Seb Fiedler (University of Hamburg) wrote a response earlier last week to my post about a "logic of learning".

My original post was about the impossibility of saying anything sensible about learning. Bateson's idea of "explanatory principles", which Seb uses, was his way of pointing out the essentially relative nature of anything we say about anything. Gravity? It's an explanatory principle!

Seb highlights Jünger's view that "learning is an explanatory model for the explanation of change".

The effect of any explanatory principle is to allay uncertainty about the environment. We are generally uncomfortable with uncertainty, and seek to explain it away. If it's not God, it's the Government, or "human nature"... Because we attribute learning to so many aspects of change in the world about which we are uncertain, we have established institutions of learning to do an industrial-scale mopping-up of this uncertainty!

Explanatory principles - particularly when they are institutionalised - wash over the details of different people's interpretations of an explanatory principle. When the institution defines what learning is, individuals - learners and teachers - can find themselves alienated from their own personal explanatory principles. A common experience in education is for a learner to be told that they've learnt something when they feel just as confused (or more so) about the world as they did before they started.

At the heart of Bateson's argument about explanatory principles was the epistemological error which he feared would lead us to ecological catastrophe. He believed, as many believe in cybernetics, that one has to correct the epistemology. Bateson's attempt to articulate the logic upon which the epistemological error was based revolved around his work on the "double-bind". Double bind logic is a dialectical logic of levels of contradiction and resolution at a higher level. This is the logic which I think we should be looking at when we look at education and the discussion about learning. 

The use of the explanatory principle of "learning" is a bit like a move in a strategic game. When x says "this is learning" they are maintaining a distinction through a process of transducing all the different descriptions of their world and what they observe into a category. They then seek to defend their distinction against those who might have other distinctions to make. It's not the distinction that matters. It's the logic of the process whereby the distinction comes to be made and maintained. 

The logic behind the double-bind which produces the distinction is not Aristotelian. Bateson did not fully explore the more formal properties of the double-bind logic. Lupasco did, and Joseph Brenner is able to tell us about it. I think Nigel Howard's theory of Metagames can also articulate a very similar kind of logic in a formal way using game theory.

Tuesday, 2 January 2018

Partial Notation of Improvisation and Creative Processes

I experimented with creating an instrumental voice (a flute) using some music notation software (Staffpad) and then improvising some kind of accompaniment to it on the piano. The notation process was interesting because it was effectively a process of creating space in the score. The gaps between the instrumental sections were more important than what occurred in those sections. I improvised into the gaps.

This worked quite well. It struck me that the process is a bit like doing a drawing where you demarcate the background and work towards the figure. The instrumental sections were pretty random - but it was just a frame. The colour was filled in with the improvisation.

I listened to the ensemble and started to add another voice which reinforced some of the features of the piano. Eventually I imagine I could dispense with the improvised bit completely.

When we sing along, or improvise with existing music, what is happening is the making of an alternative description of it. It's rather like taking Picasso's bare skeleton of a bull, and gradually filling in the bits which are missing. The bare bull is still a bull. What we add are alternative redundant descriptions.
This is what my improvisation is in relation to the fragments of notated melody on the computer. Gradually more and more description is added, and more and more redundancy is created.
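As a rough illustration of what "adding redundancy" might mean measurably - my own sketch, borrowing Shannon's definition of redundancy as 1 − H/Hmax - repeating and echoing material from a bare melodic skeleton drives the measure up:

```python
# Redundancy of a symbol sequence in Shannon's sense: 1 - H/Hmax,
# where H is the entropy of the symbol distribution and Hmax its maximum.
from collections import Counter
from math import log2

def redundancy(seq):
    counts = Counter(seq)
    n = len(seq)
    h = -sum((c / n) * log2(c / n) for c in counts.values())
    h_max = log2(len(counts)) if len(counts) > 1 else 1.0
    return 1 - h / h_max

melody = list("CDEFG")                # bare skeleton: every symbol equally likely
elaborated = melody + list("CCEEGG")  # improvised echoes of the same material

print(redundancy(melody))      # ~0.0 - no repetition, maximum surprise
print(redundancy(elaborated))  # > 0  - repetition adds redundancy
```

The bare bull is still a bull either way; what the elaboration adds is not new information but redundant description.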

One further point: thinking about my interest in Ehrenzweig's work on psychotherapy and the creative process, the notated score with its bare bones and large gaps is a means of creating what Ehrenzweig calls "dedifferentiation" in the psyche. It breaks things up and creates a framework for the drawing up of new forms and ideas from the oceanic primary process. Ehrenzweig talked about serialism doing this. This is the first time I have had the feeling that technology might actually be able to do it too. My experience with technology and musical creativity has generally been that it gets in the way, because it reinforces the superego's "anal retentive" demand that things must be done in such and such a way.

I have not felt this with this particular exercise. Of course, it's not great music. But the process promises something...