Saturday, 17 March 2018

Do Universities need Vice Chancellors? Some thoughts on the pension dispute...

One of the ironies of the pension dispute is that it centres on risk, which was the topic of expertise of the University of Bath's overpaid former Vice-Chancellor, Glynis Breakwell. Her book on risk, which I picked up cheap in an Amsterdam bookstore, is on my desk. Earlier today I, along with other USS pension scheme members, received an email about USS's assessment of risk in their pension deficit calculations. Everyone agrees that risk is not an exact science, and the battle is about whose interpretation you believe. This is compounded by the fact that trust between the academics (for which read UCU) and the management (UUK) has broken down not just on the issue of the pension, but on a whole host of issues related to the running of the academy over the last 10 years, where we have seen the closure of departments, zero-hours contracts, students as customers, compromise agreements, outrageous salaries, ridiculous expenses, VC globe-trotting and a complete absence of humility. In the view of many academics, it's all gone to shit.

All these problems are the fault of management, not teachers or researchers. So why do we need them? Is it an unthinkable thought that we rid ourselves of vice-chancellors and their management cronies, and that universities run as academic cooperatives? How could such a thing be possible? Martin Parker's point in his "Against Management" is absolutely right: we need to think about the organisation of education, not its management.

How have we ended up here? Like most crazy aspects of capitalism, it's the fault of the Americans. Even in the early 20th century, it was obvious that US universities were taking a more commercial path to higher education than their older European cousins. Veblen saw it first, commenting (in 1899) that:
"it may be remarked that there is some tendency latterly to substitute the captain of industry in place of the priest, as the head of seminaries of the higher learning. The substitution is by no means complete or unequivocal. Those heads of institutions are best accepted who combine the sacerdotal office with a high degree of pecuniary efficiency" (The Theory of the Leisure Class)
He could have been writing about today, where literal "captains of industry" (a term which Veblen did much to popularise) are making a mint out of universities. What do they do, exactly? What do they do which is worth the £200k–£400k+ salaries they are paid? Well, one thing they do is join a club called Universities UK...

What if they all went? Would universities fall down? No. But if students don't get taught, or can't sit their exams (particularly all those foreign students who pay a fortune for the privilege of sitting in classrooms in the UK rather than in their home countries), do things start to fall apart? Well, probably yes they do.

The people to blame for the current strike are the Vice-Chancellors. I'm not surprised that some VCs are talking of the need for "compromise" from UUK. They know they are on a sticky wicket. The VC of Cambridge even blamed government policy for turning Universities into businesses. Quite right. Except that such strong criticism of government proposals was not voiced at the time the government introduced the policy. The VCs then pushed for the highest fees, from which they ramped up their own salaries.

We have reached a moment of reckoning. Something's gone badly wrong in Higher Education. It's time to fix it.

The Creative Process

Of the stages of artistic creation, beginning something appears less difficult than continuing it. I think continuing is generally more difficult than finishing, but at each stage the artist has to make choices, and the choices at the beginning shape the choices made when continuing and when finishing.

In the beginning, a distinction must be made. "Let there be light" is a distinction. The world begins with distinctions. The context of this initial distinction is an undifferentiated totality - it is something drawn up from Freud's "primary process". How this decision is made is quite mysterious. Something is required to attenuate the possibilities to make the first distinction. For Leonardo, when preparing a fresco, it was the "cracks in the plaster". There'll be some observed constraint in the material which makes that first moment of making a reality. Gombrich talks about the way that Picasso tears a piece of paper: the form of the tear, the fibres hanging out in the initial moment then give him a way forwards. The first distinction creates constraints for subsequent distinctions. 

Any first distinction is taken with a view to how subsequent distinctions might be made. Everything has possibilities, creates expectations. There's something about a first distinction which resonates with possibilities, with the ideas expressed in the culture, with other aspects of the material or form. The criterion for selecting an appropriate first distinction is symmetry. When Stravinsky talks of conceiving works as wholes (he's not the only one to say this), he is referring to the discovery of a symmetry which connects a first moment of creation with the completed artifact. David Bohm would call this the perception of "implicate order": an awareness or consciousness of totality - not just totality in the moment of creation, but totality through history. It's a symmetry of diachronic and synchronic dimensions.

"Continuing" is then an unfolding of the first distinction. That makes "continuing" sound easy - which, of course, it isn't. What typically happens in "continuing" is that we decide that the first distinction was no good, and so we make another one. Like Amédée, the playwright in Ionesco's absurdist drama of the same name, the creator can end up continually writing beginnings, crossing them out, and writing new ones. For most people, this becomes exhausting, and whatever impulse there was to create something new dissipates in the frustration of abortive beginnings.

When it works, beginning and continuing are connected by something deeper. The first distinction isn't simply a mark on the paper or a crack in the plaster. It is the identification of a generative principle. The creative process is one of discovering a deep generative principle which connects the first moment of creation with the unfolded form. Every person engaged in an attempt at creativity experiences the frustration of abortive beginnings. Not everyone understands the process they are in, or that to understand its form is to understand the deeper nature of the search and purpose they are engaged in. Disorientation kills the creative process. Successful creation results from having a compass.

All of this interests me partly because I am frustrated by my own creativity. After 8 years, I really am now finishing my book. It's taken so much longer than I anticipated. But I had to go through the process of identifying what it was about, what its generative principle was, what the first distinction should be. But a book is easy compared to writing music, which is what I always wanted to do. When I was a teenager, music flowed out of me much more easily than it does now. When we are young we are much more attuned to the implicate order and its generative principles than when we get older. Academic knowledge hides the implicate order. My experience of university was that it stultified creativity: where there was energy, curiosity and passion, it created concepts and discourse. That was the death of creativity for me - particularly on a music degree! (I was fortunate that my professor, Ian Kemp, who was head of department at Manchester, knew this all too well: "You'll never learn anything in a place like this," he said. I admired his courage for saying it at the time, without considering exactly how right he was.)

Getting the book done is a big deal. But it is an academic book - it's about concepts and discourse. When we talk about human creativity however, whether in art or science, this is not where the action is. The problem is that the way we are taught to think in University is fundamentally synthetic: we are taught to aggregate and synthesise different presentations of phenomena and different theories. We're taught to say "x says this, and y says that", and we are taught to see that "a can be explained by x and b can be explained by y". Stephen Hawking is a good example of a synthetic thinker - a product of the university system. He sought to unify quantum theory with relativity. But really, he failed, just as everyone else has. It's not that they're stupid. It's that they are starting from the wrong place. Artists know this more clearly than scientists.

The opposite of a synthetic approach is an analytic approach. My colleague Peter Rowlands argues that this lay at the heart of Newton's scientific approach. I'm convinced he's right. Newton was able to identify deep generative principles; he didn't seek to synthesise available theories and phenomena. This is ironic, because the Universities modelled themselves on what they believed to be "Newtonian" science. And the artists - like William Blake, who was Newton's antagonist - knew this was wrong. Peter argues that Blake got Newton wrong, and that had he understood how Newton worked, he would have recognised a kindred spirit. It was the institutions that screwed it up.

The issue at the heart of this has to do with how we think about "selecting" a course of action from a set of possibilities. It is about how we think about "information". We tend to think that selecting something involves consideration of the synchronic context: the options available at a particular moment. Technology encourages us to think like this.  But it doesn't work like this. Selecting involves identifying the symmetry between synchronic and diachronic dimensions. This, I think, has profound implications for the way we think about information and technology. We need ways of thinking about diachronic and synchronic symmetry. The generative principle is the source of an unfolding symmetry.

Tuesday, 13 March 2018

Science at the heart of the system

The “student as customer” should not be the driving force for the development of universities. But the government is determined to pursue a policy of shaping Universities in the image of student desires. Since everybody – students, academics, managers, politicians - is confused about what education is for, what university is about, what matters and what doesn’t, it would be foolish to let any single group determine the direction of universities. The latest wheeze is to brand courses as “gold”, “silver” and “bronze” – as if it is "the course" which is the independent variable in the life and career of the student. This is nonsense – there are no independent variables!

How have we got here?  

We have pursued an ideology which turns everything into money. The easiest route to turning everything into money is to identify a group of people as “customers” and another group as “providers” and the interaction between them as the provision of a “service” which is charged for. In reality, nobody really agrees who is a customer of whom, who provides what, or what on earth a “service” is. Everything gets blurred in the complexity of intersubjective engagement. Consequently, the distinctions “customer”, “service” and “provider” need reinforcement if the financialisation process is to work at all – even in its own terms.

What we see in every effort by the government to “regulate” education – from the REF to the TEF to the NSS to the latest “gold and bronze courses” – is an effort to reinforce infeasible distinctions. This is a positive feedback loop. Every effort to codify the uncodifiable results in new confusion. New confusion leads to new efforts to reinforce the distinctions. So some new, even more granular metric will always be around the corner. And the effect of this on the system? Inevitably it changes institutional and individual behaviour. The education system has become financialised because it has sought to fit the distinctions that are determined for it, increasingly ignoring the fundamental problem: that clear distinctions are impossible to make.

This creeping ignorance is the most serious problem for universities. Multiplicity of description and difference of interpretation are the cornerstones of academic discourse. Universities have always been places where ambiguity and confusion are coordinated in the conversations between members of the institution – students and staff. In a world of government-determined, clearly codified distinctions, where failure to comply results in personal disaster, the space for discussion disappears in an environment of fear.

Science only survives and advances in an environment of openness to difference and ambiguity, in much the same way that Amartya Sen argues that economic development depends on democracy. This is why the Arabic world could not capitalise on its extraordinary scientific discoveries, and instead they passed to Europe. The government is killing the universities, and with them, it is killing the foundation of social flourishing.

The kids aren’t stupid though. They can see this is ontologically wrong. My daughter complained the other day about the bronze and gold courses: “This is why I don’t want to go to university. They’ve become as bad as school”. She’s right. There’s hope in that she can see it.

Saturday, 10 March 2018

Good, Bad and Ugly Universities

Today I've encountered the best and the worst of Universities. The best I found in my own institution: a lovely moment of serendipity which is what these places ought to be about. The worst I heard from my former institution - hardly a surprise given the regime there, but very sad nonetheless.

I started working in Higher Education at the age of 33, having previously been a rather unhappy computer programmer for a few years, and a teacher in Further Education colleges and schools. In the years since graduating from my music degree in Manchester, I always stayed close to the library. I always felt that my job was to read and to think - even if I had to do something else to earn money. It was only when I became a computer science lecturer at the University of Bolton that I could legitimately read and think for a living.

There are many thinkers and readers out there who remain in the position I was in before being employed by Bolton: committed to staying close to the library, but having to do other things to stay alive. Nowadays, some of these people find adjunct positions in Universities on pauper wages - but having no money is no good either, particularly if you've got a pile of mentally exhausting marking: they'd be better off working in Sainsbury's or driving cars for Uber.

I was very lucky with Bolton in 2002, because it led me to cybernetics which, it turned out, I had been deeply interested in throughout all my academic wanderings, but didn't know what it was (I wish someone had introduced me to it when I was 18). I wouldn't get the job now without a PhD, which I would never have been able to afford. Those who get such jobs in the future will come from more monied backgrounds than I did. I was paid quite well, the work was relaxed and I had time to be creative, making new pieces of software, doing cool things with local schools, accompanying a violinist colleague in local concerts, and getting involved in educational technology projects.

Yesterday I learnt that the department I joined was being restructured. All but a couple of the staff have effectively lost their jobs as senior lecturers, and will be invited to apply for (fewer) lecturer grade positions. This coincides with a merger with the local FE college, and no doubt there is an agenda to transfer much of the undergraduate teaching to FE (cheaper) staff. I feel very sad for my former colleagues, particularly as I was also the victim of the dreadful regime that has ruined Bolton. They will be better off out of it.

What happens when this kind of restructuring takes place is that conversation is destroyed. Universities are all about conversation. That's what happens in the classroom, and it is what happens in science. Those who destroy conversation do not really believe it matters. All they believe in is the reproduction of knowledge which can be assessed, certified and (most importantly) charged for. This is pretty much what they do in the FE college over the road from Bolton University. It's not higher learning; it is schooling.

One of the great giants of Liverpool University, where I now am, was Charles Sherrington (my office is just below where his labs were). After seminal work on neurophysiology at Liverpool in the early 20th century, Sherrington moved to Oxford, where he had this to say about education:
"after some hundreds of years of experience we think that we have learned here in Oxford how to teach what is known. But now with the undeniable upsurge of scientific research, we cannot continue to rely on the mere fact that we have learned how to teach what is known. We must learn to teach the best attitude to what is not yet known. This also may take centuries to acquire but we cannot escape this new challenge, nor do we want to."
The strange thing about this kind of statement is that you could ask any of the great academics of the past or present, and they would say pretty much the same thing. What Sherrington means by "the best attitude" has a lot to do with conversation. What is not yet known is what is not yet codified: it exists in many descriptions in many peoples' heads, and our job as academics is to coordinate these many descriptions by talking and listening to each other.

The kind of management that Bolton currently has clearly does not understand this.

In Liverpool, meanwhile, I received an email from an eminent friend in the physics department. He's the best thing in Liverpool, although he's also struggled with modern academia. It was a forwarded email from another colleague in the architecture department who was previously unknown to my physicist friend. It said something along the lines of "Professor x from the University of Illinois sends his regards". Now, my friendship with my physicist friend stemmed from the fact that we both know Professor x (who is something of a giant in cybernetics). Suddenly, I find someone in architecture also knowing Professor x. Moreover, I have recently been talking to other people in the architecture department about cybernetics. So a few more emails later, and the world starts to look different: how many more possibilities for doing exciting things we all have!

This is what Universities are about. They are about conversation. Liverpool hasn't been spared the madness of managerialism (although it's not as bad as Manchester!), but it hasn't damaged the deep structure which remains pregnant with possibilities. Loet Leydesdorff (who is also responsible for the friendship with the physicist) calls this "redundant options". Universities need to maintain redundancy: the most destructive thing is to make redundancy redundant!

Unfortunately, the money-God leads us to do precisely the wrong thing. If Bolton's management had more insight they would realise their mistake. Instead, they have created a machine for eating the university.

Monday, 5 March 2018

"Provost": A definition

This is for my friends at the University of Bolton, who may be curious as to what a "Provost" is.

(a) a derogatory term for an individual who mistakenly believes themselves capable of running a school.
(b) a more general term of abuse. e.g. "He's a complete and utter provost", "What a provost", "What happened to that other provost? You know, the Greek one..."
(c) a short-lived academic position awarded to an individual who causes a lot of problems in an institution and eventually disappears

Do you suspect someone near you of being a provost? Don't delay - call the Education and Skills Funding Agency. Their anti-provost team will swing into action immediately.

Next time: definitions for "President and Vice-Chancellor", "Deputy Lieutenant", "Presidents' Club Member", "Former Bishop of Manchester" and many more!