Tuesday 29 January 2013

Playing with Utopias

Andrew Feenberg makes an important claim in his book "Between Reason and Experience: Essays in Technology and Modernity" that ours is not the only technological society that is possible. To believe that it is can be construed as a technologically determinist position - Feenberg charges Heidegger in particular with this (and since he was a student of Marcuse, who in turn was a student of Heidegger, he at least has some pedigree for saying this).

Technology was a dominant concern throughout Heidegger's career, from the early "Being and Time" through to the late essays "The Question Concerning Technology" and "Building Dwelling Thinking". The late works argue that the essence of technology is 'enframing' and that man is bound by a technological world as the "setting upon that sets upon man", and that the only escape is a mystical retreat to poetry and art.

To what extent is Feenberg right? To what extent is there merit in Heidegger's position?

Feenberg's book is essentially about the relationship between abstraction and experience, and the role of technology in that relationship. It is really a book about where politics sits in the unfolding of this relationship. These are themes very close to my own preoccupations (see http://dailyimprovisation.blogspot.co.uk/2013/01/technology-abstraction-and-experience.html). Feenberg advocates a critical approach to technology as a way of engaging political discourse in the technological realm. I find that this theme has echoes in Ulrich Beck's work too.

What may be wrong in Heidegger (and, interestingly, in Heidegger-inspired technological work such as that of Winograd and Flores) is that the view of technology is fundamentally irrealist. Heidegger, after all, was a phenomenologist. To him, individual experience was to be privileged. It should also be said that much 2nd-order cybernetic work is phenomenological in this way - it's all about individual-biological perception. As I have argued in the past, this kind of methodological individualism has social and political implications. Adorno, whose project of Negative Dialectics was essentially a competing project to Heidegger's, dismissed Heidegger's work as being fascist - no doubt sticking the knife in with some relish following Heidegger's inexcusable war record! But individualism can lead to fascism - it can lead to the denial of the reality of the social. And with the denial of the reality of the social goes the denial of the possibility of building a better world.

This is where I find common ground with Feenberg. I believe that work with learning technology is precisely about exploring what a better world might be like. It is a process that Ronald Barnett wrote about a couple of weeks ago in the Times Higher (http://www.timeshighereducation.co.uk/story.asp?sectioncode=26&storycode=422221): universities are about creating the conditions for experimentation with 'feasible utopias'.

However, whilst Feenberg concerns himself with technology, I would argue that teaching and learning are just as important. Technology is experiential - it allows us to escape from abstraction - but so is teaching. Indeed, it is precisely in the combination of technology and pedagogy that the "feasible utopia exploration project" can be undertaken - not one in favour of the other.

There is a deep social need for this to take place. Just at a time when the institutions we originally established for the free exploration of ideas begin to marketise, professionalise and determine narrow purposes for themselves (much narrower than in their original form), something new needs to happen which opens things up again. Neither technology nor pedagogy is restricted to the University. But finding new ways of exploring feasible utopias - either within or outside universities - needs to become a major political priority.

The alternative is slavery to the fate that Heidegger described. Yet we will only become slaves if we believe that we are in the best of all possible worlds.

Saturday 26 January 2013

Towards Negative Cybernetics - A home for serious thinking about Educational Technology?

In his talk to the American Society for Cybernetics at Asilomar in 2012, Terry Deacon asserted that 'Cybernetics isn't enough', as he attempted to argue the case for absence to be taken seriously in information theory. This didn't go down terribly well, with one very eminent member of the ASC heard to mutter "what you just heard was a sham!" - despite much of what Deacon was arguing for being prefigured in the work of Bateson and a few others.

To me, Deacon was reacting to a peculiar kind of stasis that has struck the cybernetic discourse in recent years - really in the years following the deaths of the most significant thinkers in the discipline: Heinz von Foerster (died 2002), Gordon Pask (died 1996), Stafford Beer (died 2002), Niklas Luhmann (died 1998), Gregory Bateson (died 1980) and Ernst von Glasersfeld (died 2010). It's unfortunate that stasis has set in alongside the biggest global economic crisis since the 2nd world war... in fact, since the original crisis which was the source of the remarkable synergistic, trans-disciplinary creativity of the Macy conferences. But exactly what "isn't enough"?

My view is that what "isn't enough" for Deacon is what might be called "positive cybernetics": the study of theoretically-proposed actual feedback mechanisms which are seen to be responsible for the phenomena of the world. In 2nd-order cybernetics, the existence of these mechanisms calls into question the ontological status of matter. Biologically-inspired totalisations characterise material experience and psychological phenomena with individual mechanisms of coordination of coordinations (so my coordinations coordinate your coordinations - and vice-versa - and the dynamics of these coordinations produce the experience of a shared reality). However, despite these mechanistic operations calling into question the nature of reality, the ontological status of the mechanisms themselves remains untouched. They merely assert themselves by their capacity to reduce highly complex phenomena to highly logical recursive formulae.
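
To see the shape of such a recursive formula in miniature, here is a toy (the averaging rule is my own minimal stand-in, not any canonical model from the cybernetics literature): two agents each nudge their state towards the other's last act, and a stable 'shared reality' duly emerges.

    # A toy 'coordination of coordinations': two agents, each adjusting
    # towards the other's last act, converge on a stable shared value.
    # The update rule is an invented stand-in, not a canonical model.
    def coordinate(a=0.0, b=1.0, rate=0.3, steps=40):
        for _ in range(steps):
            a, b = a + rate * (b - a), b + rate * (a - b)
        return a, b

    print(coordinate())   # both settle near 0.5: a 'shared reality'

The point is the paragraph's own: the mechanism is seductively simple, reducing a complex phenomenon to a recursive formula, while the ontological status of the rule itself goes unexamined.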

There is much of value in these ideas. They are ingenious, rich and fascinating. But at root, there is an assumption that reasoned abstraction of all the complexity of life is conceivable by an individual but operates at a deeper ontological level than individual perception. This is a heavy-duty metaphysical assertion, and Deacon finds this hard to swallow - and so do I. As Wiener put it over 60 years ago,
"the whole mechanist-vitalist controversy has been relegated to the limbo of badly posed questions"
The problem lies, I believe, in concentrating on the 'actual' operation of mechanisms. There is simply more to life than what is 'actually' there. Our whole yearning to know more, to be curious, to explore is driven by a sense of incompleteness - of something missing. Whatever we can actually see - be it traffic jams, or economic forecasts, or disease symptoms, or exam grades - we sense questions within ourselves: "what more? what's missing?" Importantly, our interpretation of the actual always takes into account "the missing": we read more into the learner's absences than anything else; the messages that aren't there are causal.

My reaction to the mechanistic totalisations of cybernetics has always been "what more?" For me, this sense of incompleteness has come from a lifetime being fascinated by and thinking about music. Music is the epitome of incompleteness - waves of rich questions which interlock with one another eventually reaching some sort of reconciliation - but always a reconciliation that raises many more new questions.

But we don't want to throw the baby out with the bathwater: actual mechanisms are important! We have made remarkable machines through our understanding of them; some policy interventions actually work! But is there a way of situating the actual mechanisms of cybernetics against the context of missingness that surrounds them?

This is where I've started to think about the possibility of a 'Negative Cybernetics'. I'm consciously thinking of Adorno's 'Negative Dialectics' as I suggest this, and it is worth first saying a bit about that. Adorno was reacting to two poles in German philosophy: on the one hand, Kant's "negative" critical endeavour and the negative aspects of Hegel's dialectic; on the other, the overall totalising and positive tendency in Hegel's dialectical method. Alain Badiou argues (in "Five Lessons on Wagner") that Adorno was interested in purging the positivising, identity-producing forces in this, by removing the positive aspects of Hegelian dialectic and refocusing those aspects of Kantian critique which determine the limits of thought. Badiou makes the point that what became known as 'critical theory'
"was given the name "negative dialectics" by Adorno as a yoking together of critical theory and negative dialectics, of Kant and Hegel now transcended." 
Cybernetics is essentially also a child of German idealism. A negative cybernetics demands transcending the positive identification of a mechanism. A critical cybernetics demands consideration of the limits of conceiving a mechanism, of the impact of the context both on the action of a mechanism and on its abstraction and conceptualisation. A range of questions emerges which, for starters, might include:
  1. How might absence be constitutive of an abstract description of a mechanism? (for example in a theory)
  2. What are the social implications of a mechanistic metaphysics?
  3. What are the causal relations between absence and presence as a constitutive force on being?
  4. How can the causal relationships between absence and presence be known?
  5. Do these problems force us to consider the relationship between reason and experience?
  6. At an experiential level, what is the place of technology and how should we approach it?
  7. At an experiential level, what is the place of teaching and learning and how should it be approached?
I think a "negative cybernetics" may be a good starting point for the serious study of educational technology. This is because both technology and education are concerned with experience - they are both places where abstractions are 'played out' in society. In educational technological practice, whilst idealised abstractions epitomise positive identities, the essence of experience with technologies in education is nonidentitical - it emphasizes absence. Educational Technology cannot be content with abstraction alone. Asserting the ontological privilege of abstract mechanism (such as those suggested in 'positive cybernetics') quickly reveals itself to be a mistake.

There is a balance between abstraction and experience. Technology and education are at the pivot point. But in order to see this as a living process, we have to make ourselves vulnerable enough to see that our ideas and abstractions are no more than "comfort blankets" veiling nothingness. Indeed, they may do harm.

It is interesting to reflect that the driving force behind Adorno's theorising was the 'break in history'. I'm wondering whether in our daily experimentation with educational technology, there aren't continuous small 'breaks with history', except that we've chosen to overlook the discontinuity in favour of an identity-bound, rational narrative. Bourdieu would call such pedagogic moments acts of 'symbolic violence'. We should take these seriously. Certainly letting a runaway identity principle loose in education is unlikely to do anyone any favours. 

Tuesday 22 January 2013

Understanding Face-to-face, One-to-one Learning

Imagine a situation where a teacher is supporting a single learner. In my experience, this is most common with things like music lessons, although any kind of supervision could be considered. What are the characteristics of this situation? How can we explain the learning process? How is this distinct from the characteristics of group learning, or distance learning?

I've made some strong criticism of Conversation theory recently, and I think that the one-to-one situation highlights some of the explanatory deficiencies in the theory. I could be accused of overstating my case (!) but I want to defend this position because at the root of my criticism is a complaint about the way we tend to think about thinking and learning. I am not anti-technology!

The centrepiece of Pask's theory is the concept of teach-back. In essence this is a mechanism whereby the teacher examines the utterances of the learner in response to their teaching and questioning, and determines the extent to which the learner has understood what they are being taught: Pask thought of it like a 'comparator'. Fine in principle; but what happens in reality?
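
To make the shape of the claim visible, here is a deliberately crude caricature of teach-back as a comparator; the token-overlap measure and the threshold are my inventions, not Pask's formalism.

    # Teach-back caricatured as a 'comparator'. The token-overlap
    # measure and threshold are invented illustrations, not Pask's own.
    def similarity(taught: str, teachback: str) -> float:
        """Overlap between the words taught and the words taught back."""
        a, b = set(taught.lower().split()), set(teachback.lower().split())
        return len(a & b) / len(a | b) if a | b else 0.0

    def next_move(taught: str, teachback: str, threshold: float = 0.5) -> str:
        """The comparator's verdict is all the teacher acts upon."""
        return "move on" if similarity(taught, teachback) >= threshold else "re-teach"

    print(next_move("entropy measures uncertainty in a distribution",
                    "entropy is how uncertain a distribution is"))  # re-teach

Notice what the comparator cannot see: everything in the next paragraph - the shuffling, the blushing, the "errr..." - has no representation here at all.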

Think what it actually feels like to be in that situation. We say something. The learner may say something, or shuffle around a bit, or look away, or blush, or go "errr....". What do we make of all that? What do we 'read' and what do we filter out? Is the shuffling insignificant? etc. The astonishing thing is really that with such complex information coming their way, teachers have the ability to decide what to do next at all!

Related to this, I had a conversation this morning (at least I've provoked people to argue with me!) where my complaint that text-based communication was deficient was challenged by considering 'the telephone' (in terms of reading richer signals) or "a super-high-definition-3d-with-smell-and-touch-a-phone" - what about that, eh?! Surely these technologies mediate rich coordinations almost as well as face-to-face? I replied "can you enumerate all the particular sensory qualities that go to making a realistic impression?". I think the enumeration process never ends - we never quite get to reality. The super-high-definition-3d-with-smell-and-touch-a-phone is an attempt at such an enumeration. This is not to say that increasing detail in enumerating the sensory deficits cannot lead to improvements - but it's all a bit like Achilles and the Tortoise.

This is all about what's missing. It's also all about the fact that considering what's missing leads us to what is perhaps an important conclusion: educational experience is stratified into levels which are irreducible to one another. In information science, there is a lot of work going on around the irreducibility of social structures: Critical Realism strongly emphasises this; more recently we find it in Terry Deacon's work; and Floridi also suggests the irreducibility of his 'levels of abstraction'. There is a tendency for communication theories like Pask's (Maturana suffers the same fate) to collapse levels into one another. It's like dissolving biology and chemistry into physics (why don't we go the whole hog and dissolve sociology?!! - Pask comes close to that!). What we end up with is a flat totalisation.

How does what's missing (absence) lead to irreducible stratification of learning experience? Absences are all around us from the moment we are born. If they are causal on our development (which I think they are) then the shared absences between mother and child create the original spaces where thought develops. This is the grounding for language and action. Language and action create absences on a social level - between siblings, between friends, etc. Shocking events, rites of passage and the like are all powerful in their formative influence. But what emerges are distinct spaces where people grow and become attached to one another (which may mean they coordinate around the things that they lack). The social level of the family and friends is distinct from the level of the baby: one cannot be explained in terms of the other - it has its own rules, its own dynamic. And so we move up the social systems, each of them distinct, emerging from the other, but irreducible to it.

This makes me think of Luhmann. But Luhmann's dynamic - rather like Pask's - is based on a coordination of coordinations of communication: in the end, the perturbation is king at all levels. So Luhmann can create his distinct social structures, and show how they persist, but he always relies on the obscure processing of the 'psychic system' in order to make it work. If instead we consider that whatever we are looking at, there is something we are not looking at which shapes what we see and how we think about it, then there is a chance to move away from the global totalisations towards something that can be brought into focus a bit more clearly: the irreducible stratified levels of the internal and the external world.

So the question of what happens in face-to-face learning isn't a simple one. Who is it who's talking? Mother and child? Teacher and student? Brother and sister? It matters because the absences are different in each case. Regarding mediated communication through technology, we can never escape the absences of the technology itself (it tries hard enough to remind us of them, after all!) We both might share those absences and choose to ignore them, or discuss them - how much online communication is about the technology itself, or about the technology as part of the drama of the conversation? (see http://en.wikipedia.org/wiki/La_voix_humaine for a beautiful example of this!). But I'm never going to get my mum to Skype!!

And with our learners in so many different spaces, coordinating with their absences is almost impossible in formal education. Some learners who are awkward in social company may find an outlet online - but maybe they have really found a way of talking to themselves (I know a few people on online networks like this!). For those learners that really don't know their way, online learning is unlikely to reap any lasting reward: what works is the determined re-parenting of digging into someone's absences and making ourselves vulnerable enough that they can see ours. Those are the best music lessons and supervisions I remember.

Sunday 20 January 2013

Technology, Abstraction and Experience

There is a problem with my thinking. Like all academics, I like to think in abstractions. I look at the world, think about my experiences, and look for neat explanations of things. It's a kind of therapy and of course, I enjoy doing this (some people are rude about this kind of thing and call it 'intellectual masturbation' - it's unkind, but not entirely untrue!)

But abstraction is always problematic for a number of reasons.

  1. Real experience is lived in time; abstraction compartmentalises time (if it acknowledges it at all) as a kind of divine mechanical clock, where successionism rules (b follows a..)
  2. Abstractions are an individual's ideas which form from the individual's experiences, habits and history; they are not merely rational creations, nor is their form purely the result of their internal logic.
  3. Abstractions must be taught if they are to have any effect in the world (there is no point in having a new theory if you can't teach it!); the teaching of abstractions is typically a situation where learners are indoctrinated with the categories of the abstraction: it can tend to exhibit sage-on-the-stageness...
  4. To be taught an abstraction is a lived experience for the learner. This experience is fundamentally different from the experiences which led to the formation of the abstraction in the first place. 
  5. Were one to formulate an abstraction of the learning experience of an abstraction, it would be a very different kind of abstraction!
  6. Then there are socialisation problems associated with abstractions, and the institutional, corporate, political and legal frameworks which form the context within which abstractions are generated and within which they are taught.
  7. The fundamental changes underway in HE (the process of marketisation) have a powerful bearing on these processes of abstraction, teaching, learning and the necessity for critique for the advancement of knowledge.

I think that from the Enlightenment to the decline of manufacturing industry in the 1980s, it was possible to overlook these problems with abstraction. Now, I think not.

The deep problem with abstraction is "time" and successionism. It's as if we have convinced ourselves of the ontological status of clocks. Computers haven't helped - they epitomise causal successionism in their internal operations. But in their social effects they expose the fallacy of causal successionism.

My deep concern is that I feel the need for something more than abstraction. There are, I think, two things which we should consider much more seriously if we are to escape our Enlightenment chains. They are:

  1. Technology
  2. Play

Technology, of course, is the product of abstraction. But technology also provides experience in time. Technologies used to teach with can provide rich experiences of simulated problems from which abstractions emerge. The best example of this is Lovelock's DaisyWorld simulation. There Lovelock used technology as a way of allowing people to explore the problem situation he was considering and play with the parameters.
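
For anyone who wants to play along, here is a minimal sketch of DaisyWorld (after Watson and Lovelock's 1983 paper); the constants are conventional textbook values, and the linearised local temperatures and crude Euler stepping are simplifications for the sake of play.

    # A minimal DaisyWorld sketch (after Watson & Lovelock, 1983).
    # Constants are conventional textbook values; the linearised local
    # temperatures and the Euler step are simplifications for play.
    SIGMA = 5.67e-8                  # Stefan-Boltzmann constant
    S = 917.0                        # solar flux constant from the paper
    ALBEDO = {"bare": 0.5, "black": 0.25, "white": 0.75}
    T_OPT, K, DEATH, Q = 295.5, 0.003265, 0.3, 20.0

    def growth(T):                   # parabolic growth, zero outside ~5-40 C
        return max(0.0, 1.0 - K * (T_OPT - T) ** 2)

    def daisyworld(L, a_black=0.01, a_white=0.01, steps=500, dt=0.1):
        for _ in range(steps):
            bare = max(0.0, 1.0 - a_black - a_white)
            A = (bare * ALBEDO["bare"] + a_black * ALBEDO["black"]
                 + a_white * ALBEDO["white"])
            T_e = (S * L * (1 - A) / SIGMA) ** 0.25   # planetary temperature
            T_b = Q * (A - ALBEDO["black"]) + T_e     # black daisies run warmer
            T_w = Q * (A - ALBEDO["white"]) + T_e     # white daisies run cooler
            a_black += a_black * (bare * growth(T_b) - DEATH) * dt
            a_white += a_white * (bare * growth(T_w) - DEATH) * dt
            a_black, a_white = max(a_black, 0.001), max(a_white, 0.001)
        return T_e - 273.15, a_black, a_white

    for L in (0.8, 1.0, 1.2):        # play with the solar luminosity
        T, b, w = daisyworld(L)
        print(f"L={L}: {T:.1f} C, black={b:.2f}, white={w:.2f}")

Varying the luminosity makes the point of the exercise: within a band of solar forcing the daisies regulate the planetary temperature, and outside it the regulation collapses - the abstraction (Gaia) is encountered through play rather than through lecture.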

This process of play avoided the need for sage-on-the-stage. Instead, the abstraction of Gaia was presented in a way that invited participation. This leads me to the conclusion that Technology is a "solvent": the problem of the abstraction of time dissolves in effective use of technology through play.

This overcomes some of the other problems with abstraction. Considering that an abstraction is an individual's ideas, what is it that a learner of the abstraction learns? I think they learn about the originator of the abstraction: how do they think? how did they reach that conclusion? Learners also learn about each other as they all learn about the teacher of the abstraction. Technology gives us not only play, but convivial play. Playing together around complex ideas affords powerful opportunities for the social impact of learning.

Most importantly, perhaps, playing together with powerful ideas can help us to come together, to unite, to organise ourselves. If we are to really address the challenges that face us in this current crisis, then new ways of finding togetherness and purpose must be created. For that we will need to use our "solvent" wisely!

Saturday 19 January 2013

3 Bad Assumptions of E-learning

The value of e-learning to me has been in the opportunity to look at the mysterious process of human development afresh through the lens of technology. We had such optimism, and had so much support (and money) from government sponsors. But where's it got us?

Whilst technology cannot bear the full blame for the current crisis in HE, it lurks in the background as a causal factor - even if it's just the whiff of 'inefficiency' when we see a lecturer performing the same lecture they always have done to a small group of students. The steamroller of technology says "why not video it?", put it on a social network, etc. Of course, this question is never taken as a challenge... "Why NOT video it?"

The problem is one of perceived "functional equivalence". Or rather, the mismatch between perceived functional equivalence and ontological and phenomenological difference. In short, we've been good in spotting (and defending with 'evidence') functional equivalence. We've been less good in investigating ontological distinctions. Those are the things which come to bite us when we look at miserable MOOCs.

This is not to defend the status quo, or to argue for the maintenance of inefficient practice. It is to attack shallow, lazy and expedient thinking. It is to critique the traps that 'functional equivalence' has led us to. These, I think, are 3 significant assumptions that lurk behind our current problems:

  1. Communication is the exchange of information between individuals;
  2. Community is a network of individuals bound by communications (as defined above);
  3. Learning is adaptation.

For each assumption, I think we need to insert "more than", so that "Communication is more than the exchange of information", etc. But the obvious thing to say then is "in what way 'more'? How can we investigate it?" The dynamics of communication have been modelled in various ways, and in e-learning one of the principal models is Pask's Conversation theory as it was adapted by Laurillard.

The problem with conversation theory is that it posits the existence of a mechanism whereby individuals coordinate themselves with the communications of each other. The coordinations are based on the communications (the result of other coordinations) of the teacher, or of other learners. It must be said that this is not an unsophisticated model; indeed, Pask developed a theory which accounted for 'concepts' themselves in a remarkable (but rather impenetrable) way.

But the bottom line is, if Pask was right, then the experience of MOOCs would be fantastic. They offer the epitome of constructivist theory in action, where adaptation, concept formation and communication work together in an environment which is geared towards supporting it. (The connectivist metaphor is, of course, where the MOOC started in the first place, and this is entirely consistent with Pask's more sophisticated theory.)

Pask is wrong for an important reason: thinking itself, not just communicating, is a social and environmental process. The world in its entirety, including what is and what isn't there, is constitutive of individual humanness; it is not a 'constraint' within which concrete processes of cognition operate. Merleau-Ponty's talk of "the flesh of the world" is perhaps the most visceral expression of this. Pask thinks that individual humanness is merely constituted by the coordination of communications. He ends up positing ideal humans. We have developed our technologies (not just learning technologies) based on this misapprehension too.

This brings me on to the second problem: community. If the individual is constituted by their environment, then it is a mistake to conceive of an individual as a dot in a network. It is worse to misrepresent communities as clusters of dots. Each dot represents acts which are recognised to be there, recognised to be individual. And whilst each act may relate to other acts of other people (for example, I give you a present and you say "thank you!"), the act is not the person, and the sequence of acts is only a sign of something deeper that might be going on. Each act (the act of joining a forum, for example) is the result of a deliberation of the individual which, as I have just indicated, is itself social at a deep level. Community lies in the deep bonds of trust, care, understanding and love that develop between people as they grow together. Community does not lie in patterns of acts between people per se. Teaching, at its best, is about community. But isn't teaching at its very best face-to-face (at least with someone, although not necessarily with the teacher themselves...)? Or even one-to-one?

Finally, we come to learning. The Piagetian idea of learning as adaptation needs challenging. Rather like Pask's conversation model, Piaget envisaged entities reacting to concrete perturbations in the environment. In a way, his is a biological model with a physical characterisation of causality. For Piaget, what we learn is an epiphenomenon of our adaptational processes. This model is wrong, I believe, because it conflates biological adaptation processes with complex psychological and social performances. The problem lies in biological reductionism - the flattening-out of structure.

Learning and teaching are like music. At any moment we may determine 'elements' to which we might attribute an experience, or even reduce the gamut of our experiences to that element: it's all about the harmony, or all about the melody, or the rhythm, etc. In reality, however much we might want to bracket-out things which we don't want to consider in order to understand our experience, those things are still there. They have not been reduced, just overlooked. Indeed, the challenge of music notation has been to codify the uncodifiable: to establish a normative representation that connects intuitive practice with intentions.

Whilst I write this, I am aware of a recent document about 'learning design': http://www.larnacadeclaration.org/. I should talk more about this later, but the document contains an analogy between music notation and 'educational notation' which I found a bit superficial. The document states, regarding the notation of music:
As a result, a musician living hundreds of years later, in a very different context, can still understand the musical ideas of a composer long ago, and with appropriate skills, can reproduce those musical ideas.
This is an inappropriate causal attribution. The notation does not do this. Music lives through practice, interpretation and study. Indeed, early notations appear really to have been focused on the coordination of group practice. These things took place in monasteries, universities and (much later) orchestras and opera houses. (It is only relatively recently that some medieval notations have been interpreted so that they can be performed.) It may however be true that notation created a common language whereby the study of this rich and mysterious art could be conducted and coordinated amongst practitioners. The processes of study (like the hermeneutic study of ancient texts) have to consider what is not there as well as what is. Performance takes place in this context, and always against the context of its own time, not some ossified recreation of the past. Literacy is not just about reading and writing; it is about the organisational, institutional, political, educational, technological and legal structures that facilitate the passing-on of the understanding of texts from one brain to another.

Efforts to establish notations of (and technologies for) learning have assumed a conflated, reductionist model of learning and teaching, and have consequently produced deficient experiences as the ignored levels of educational ontology assert themselves (in ways never anticipated by the learning designers). This can be organisationally disastrous for Icarus-like attempts to promote new technologies (like the University of California blowing $4.9m on online courses and recruiting one student! http://www.sfgate.com/education/article/UC-online-courses-fail-to-lure-outsiders-4173639.php). Just as music's normative practice is always social (even if it's merely a dead composer's score and a solo performer), with the magic in the communion which results, so too with learning. Learning exists in an ontological context which includes society, politics, care, attachment and love. A functionalist notation (and functionalist technology) brackets these things out, or simply flattens them.

Our challenge now is in coming to terms with the stratified and irreducible levels of learning, communities and communication and finding ways in which our technologies can address and support these levels, rather than reduce and conflate them. 

Saturday 12 January 2013

What Does Creativity Do? - a Speculation using Positioning Theory

There's an awful lot of bleating on about creativity these days. "oh, it's really important!", "innovative economies need creativity!", etc, etc. It's a bit like saying sex is important. Well, I guess we talk a lot about that too. But does it get us anywhere? "What, Sir Ken, are we doing differently because we've talked about creativity so much recently?"

Not a lot, I think. Except that we've given lots of attention (and Knighthoods) to people talking about creativity.  Here's a more sensible (and grounded) talk about creativity from John Cleese - who can at least reflect on his own experiences of being creative. It's an entertaining talk, but he acknowledges most of what he says is "completely useless". The best bit is at the end, when he says:

"now I come to the really important part and that is how to stop your subordinates becoming creative, because that is the real threat! No-one appreciates as I do what trouble creative people are and how they stop decisive hard-nosed bastards like you and me running businesses efficiently!"

This is interesting on a number of levels. First, it really is the most authentic thing that Cleese says in the whole talk: it goes to the heart of the issue. Creativity is political, and those in power often fear creativity as a challenge to their position. For creative subordinates, the only way round this is to come up with creative ideas and then convince the boss that the ideas were the boss's own. Then something might happen. But otherwise, you stand a good chance of being fired!

But that suggests that creativity DOES SOMETHING.

If we are serious about changing what we do in the light of recognising the importance of creativity, we need to UNDERSTAND WHAT IT DOES!
  • It does something for individuals
  • It does something for organisations within which those individuals operate
  • It often does something for society
  • It often does something for economies

I've been thinking about this in the context of Harré's Positioning Theory. Positioning Theory articulates a triadic relation between Normative Positions, Individual Storylines and Speech Acts (or more broadly, I guess, agency):

Positions condition the thoughts and ideas of individuals. Those thoughts and ideas take the form of a 'storyline' - the way individuals see the world. From the storyline, agency emerges through speech acts which reproduce and transform the normative situation.
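
For what it's worth, the triad can be caricatured in code as a feedback loop; the representation of norms and beliefs as sets, and the 'reposition' rule, are illustrative inventions of mine, not Harré's formalism.

    # A toy rendering of the positioning triad as a feedback loop.
    # Sets of norms/beliefs and the reposition rule are invented here
    # purely for illustration; Harré offers no such formalism.
    from dataclasses import dataclass

    @dataclass
    class Position:                  # the normative situation in force
        norms: frozenset

    @dataclass
    class Storyline:                 # how an individual sees the world
        beliefs: frozenset

    def speech_act(p: Position, s: Storyline) -> frozenset:
        """Agency: assert what the storyline holds but the norms don't yet admit."""
        return s.beliefs - p.norms

    def reposition(p: Position, act: frozenset) -> Position:
        """Speech acts reproduce and transform the normative situation."""
        return Position(norms=p.norms | act)

    p = Position(frozenset({"lectures are efficient"}))
    s = Storyline(frozenset({"lectures are efficient", "play teaches better"}))
    p = reposition(p, speech_act(p, s))
    print(p.norms)   # the assertion has entered the normative field

On this caricature, a creative (transformative) speech act is exactly one for which the difference between storyline and position is non-empty.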

A creative act changes positioning. It might be thought of as a transformative speech act. But what gives rise to it is some kind of transformation in the storyline.

The storyline is the least well elaborated part of Harré's theory. But I think thinking about creativity can be useful in this regard.

A creative person becomes aware of absences both in their environment and in themselves. They know that what they think is the result of what they cannot think, and that the absences they are immersed in have a causal bearing on patterns of thinking which might be 'stuck'. What they need to do to think new things is to determine absences so that they are no longer absent and can be directly addressed. A determination of absence changes the constraints of thought.

Monty Python is a superb example of this. So often, Cleese and his colleagues pinpointed absences - the things that constrained everyone's thinking but they couldn't articulate. The "Life of Brian" is full of determined absences.

Absences need not be external. Creative people most usually draw on their inner life, trying to find a reconciliation between inner experience and outer experience. This is a way of reconfiguring the storyline (much like reconciling the 'dilemmas' of Nigel Howard's drama theory which I wrote about yesterday).

The creative actor knows that their acts are potentially dangerous in the political environment within which they operate. This causes them to be subtle, subversive - even devious. But subtlety and subversion pay off better than outright challenge. The normative situation does gradually change, and the landscape becomes more amenable to the collective determination of the absences that the creative people first identified. This is why Ezra Pound says of poets that they are the "antennae of the race".

I may be wrong on this. But I certainly think that understanding the mechanism of the causal efficacy of creativity is more important than trying to understand what creativity is, how you can become creative, etc. We need to understand creativity as political behaviour.

There are creative people amongst us. They do a job for everyone. On the day after the suicide of Aaron Swartz, we must be concerned about protecting the conditions within which they operate.

Friday 11 January 2013

Datadrama

Nigel Howard's "drama theory" is highly relevant to data analytics. Any piece of data is the result of deliberation and decision. Behind each item of data, there is (at least) psychological drama, if not real drama of events. Behind each item of data are the characters who form the context within which an act is taken. Each decision is taken in the light of a set of expectations that certain events will follow. If it is a good decision, the events will be in accord with expectations.

In Howard's methodology, things begin with a kind of 'agon' where each character articulates their position relative to the others (see http://en.wikipedia.org/wiki/Drama_theory): what they want; what their fall-back position is. When everyone is aware of everyone else, the characters work through their positions. This usually results in 'dilemmas' (in his theory of meta-rationality, Howard called these 'paradoxes') as the positions taken aren't tenable, threats aren't believable, etc. This causes emotional and psychological change, with characters developing their position as the drama unfolds. The Wikipedia article describes how:

a character with an incredible threat makes it credible by becoming angry and finding reasons why it should prefer to carry out the threat; likewise, a character with an incredible promise feels positive emotion toward the other as it looks for reasons why it should prefer to carry its promise. Emotional tension leads to the climax, where characters re-define the moment of truth by finding rationalizations for changing positions, stated intentions, preferences, options or the set of characters.
There are specific 'dilemmas' or paradoxes, which Howard articulates (see the wikipedia article). His argument is that resolution in the drama cannot be reached until all the paradoxes or dilemmas are dissolved.

It is important to note that the dilemmas arise through particular strategic and meta-strategic priorities of the actors. What informs the strategy-making (and meta-strategy-making) are pieces of information or data which the actors either willingly release or unwittingly reveal. The willing release of data will be a strategic act whose consequences will have been considered. The unwitting release of data will be a by-product of some other kind of act.

A strategic act will depend on anticipating the reaction of others. It will depend on guessing the strategies and meta-strategies (the anticipations) of others.

A strategic release of information (the kind of data that gets analysed in data analytics) will be based on a calculation of how others will respond. However, it may be so calculated as to mislead others into thinking that the intentions behind the act (and so the strategy) are something other than they really are. An information leak can easily hide true intentions.

Misinformation will lead to certain paradoxes. One piece of information with an assumed strategy behind it will contradict another piece of information. Those on the receiving end of this kind of information will wonder what the originator of the information can be thinking, and how they can be consistent.

What can happen here is that the apparent meaning of the data can be questioned in the light of what might be in the mind of the originator. "Maybe they said this to put us off the scent... Maybe they don't really care what happens, but want us to believe that they do!" and so on. Those on the receiving end of this stuff will construct a storyline which most closely fits.

What really is going on here is the business of identifying the kinds of constraints which might produce the assorted items of information. Is it madness? Is it self-interest? Is it a genuine mistake? and so on.

Analytically, this produces some interesting possibilities. Imagine we are presented with some data. Email messages sent, policy decisions made, reports of discussions, etc. Can we reconstruct the kind of psycho-drama that produced these events? This is a bit like psychological profiling. What this should reveal is the extent to which the overt meaning of speech acts, email messages, etc matches the intentional meaning behind those acts and their intended purpose. Where there is a mismatch between the intentionality behind messages and the expressed meaning of those messages (the meaning that might be initially constructed by recipients), then a refined view towards the interpretation of future messages is necessary. This would take into account the reasons why there is a mismatch. The 'refined' view is actually a shift in the modelling of the originator of those messages - a more accurate identification of their absences.

Resolving the paradoxes means finding a "storyline which fits". But if we do arrive at something which fits, is there further analysis that might be done? Can the process be re-applied to an emergent narrative? Doing this might, I think, shed new light on new items of data to be searched for. It is rather like exploring a new theory of physics. The theory is an invention, but the invention suggests the existence of new physical phenomena - so we go looking for them.

Our tendency when analysing 'big data' is for the high-level global viewpoint. A drama-theory inspired perspective would:

  • start from particular items of data
  • suggest a psycho-drama that might produce the data
  • identify the conflicts between the suggested psycho-drama and the overt meaning of the messages (identify where the anticipations of the utterer of the messages are different from the anticipations of the audience)
  • redefine the psycho-drama
  • with a refined psychodrama, identify new possible items of data to support the theory
  • go and find them, together with randomly chosen new items of data which might challenge the theory.
So the trawl through the big-data set is a step-by-step exploration driven by suggested contexts for the production of the data.
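
A skeletal, runnable caricature of that loop follows; the toy data, the notion of 'conflict' (one actor's messages implying different motives) and the 'misdirection' redefinition are all invented for illustration, and the final step of searching for newly predicted items is elided.

    # A skeletal caricature of the drama-theoretic analytics loop.
    # Data, 'conflict' and the 'misdirection' rule are invented; the
    # search for newly predicted items is elided.
    from collections import defaultdict

    dataset = [
        {"actor": "minister", "motive": "efficiency"},
        {"actor": "minister", "motive": "cost-cutting"},
        {"actor": "minister", "motive": "public good"},
        {"actor": "union", "motive": "jobs"},
    ]

    def analyse(items, rounds=5):
        drama = {}                                 # actor -> imputed context
        for _ in range(rounds):
            overt = defaultdict(set)               # start from particular items
            for item in items:
                overt[item["actor"]].add(item["motive"])
            conflicts = [a for a, ms in overt.items()
                         if len(ms) > 1 and drama.get(a) != "misdirection"]
            if not conflicts:                      # a storyline that fits
                break
            for actor in conflicts:                # redefine the psycho-drama:
                drama[actor] = "misdirection"      # contradictory signals imply
                                                   # a hidden strategy
        for actor, motives in overt.items():       # consistent actors are taken
            drama.setdefault(actor, motives.pop()) # at face value
        return drama

    print(analyse(dataset))  # {'minister': 'misdirection', 'union': 'jobs'}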

To me, something like this puts the heart back into the data - something which I find sorely missing in most data analysis I encounter!


Thursday 10 January 2013

Do we need "Decision Analytics"?

For all the talk of 'big data' and analytics of various sorts, I'm wondering if we're looking at the right thing. Don't get me wrong, stats are great - in terms of self-gratification, I find my blog statistics as fascinating as games or porn! But that's probably an indication of something unhealthy about it...

The key is in the fact that blog statistics are open to interpretation. Examining them is rather more like examining Tarot cards or the I-Ching than the bigwigs of big data would like to admit. As such, what they offer is an opportunity to reflect on 'signs'. The fact that your processes of reflection involve opportunities to interact and 'drill down' into the data only adds to the fascination. All of which is not without value, but it's not quite what the learning analytics people (for example) claim.

The problem with data is that it is there; it is actual. Trying to reach conclusions about reality just from looking at the actual is prone to error. Each piece of data represents an act, or a decision, and behind each decision are all the things that we can't see. The reality comprises both that which we can and that which we can't see. [this of course is also a big problem with social science methodology, but that's another topic!]

Which leads me to think that it is decision, not data, which we need to analyse. If we treat data as markers of decisions, then what we need to look for is not the data itself, but its 'negative image'. We need analytical methods for accumulating a coherent negative fabric which can account for the constraints likely to have produced the positive (actual) acts and decisions that generated the data.

"Meaning" may be part of the negative fabric. Meanings are related to the anticipations of the likely responses to a particular decision. (My posting of this message is a decision based on the likely responses of the community to this message). To get  at the meaning of data, you have to dig deeper than the surface representation of the data. It's not about tweets connected to tweets, or messages exchanged between individuals on Facebook. It's about an individual's decisions.

We should then think about which individuals we are concerned about. I find the answer to this easy, although perhaps I haven't thought about it hard enough: "it's the people in power, stupid!" I certainly think the inspection of the negative fabric behind those making decisions that affect everyone else is an extremely important place to start. With the growth of management techniques like NLP (Neuro-Linguistic Programming), which rival the tricks of advertising executives for their manipulative power, the tricks of politicians and managers have got harder to read. We need to be more sophisticated.

Ironically, this may turn the tables on the global tech firms who have given us all this data in the first place. Their analytics are exploited to manipulate us, to sell us stuff, to make bigger profits for them. Negative analytics might reveal the ground behind their decision-making. It might even reveal their naked self-interest early enough for us to do something about it.

Now that would be fascinating!

Wednesday 9 January 2013

Seducing TINA

Some people find TINA incredibly attractive: the promise of the single door opening before you with a big red flashing neon sign saying "There Is No Alternative". All the deliberation, complexity, confusion and debate peels away in a moment of clear-sighted, hard-nosed decision-taking.

Of course, other people find TINA hideous and dangerous. Unfortunately, lust for TINA is most common amongst those in power.

Part of TINA's appeal lies in the way that the path to her single doorway can be manipulated by those in power. It takes the form of:

  1. an engineered crisis, or the exploitation of an existing crisis;
  2. the suppression of debate, opposition, humour and creativity;
  3. the cumulation of small, apparently insignificant moves, which gradually transform the situation to one where TINA becomes more likely (but too slowly for anyone to see it coming);
  4. the manipulation of decision-making bodies and regulatory mechanisms to ensure we stay on the path to TINA.
Eventually TINA arrives, nobody can do anything about it, and when she is truly revealed, what we see is naked self-interest. But by then, There Is No Alternative.

I think we're seeing this process all around us at the moment. In government, in the banks, in the media, in our institutions... everywhere. And people seem powerless to do anything about it. 

One of the major features of our current crisis (as Oleg pointed out to me yesterday) is that this time everyone is knackered. No-one's got the energy to fight when the fighting most needs to be done. It's as if we've awoken from a wild party with a terrible hangover, burdened with guilt because we've done things we know we shouldn't have done, and we simply can't deal with our own demons to muster the energy to take on the bad people.

This time there is no external demon that we can objectify as an 'other' and attack. After Saddam and Osama, we're done with that kind of thing. The 'other' is in us. We have to deal with ourselves. This crisis is existential in a way which is entirely new in the modern era.

Our politics needs a better way of defending itself against the TINA-merchants. We need to get better at reading the signs.

Those in power make decisions. Every decision reveals something in their hinterland. Whilst each decision may seem inconsequential, irrelevant and unconnected, each decision is an indicator of the constraints which produce it. Each decision, each thought is the product of what cannot be thought. If what can only be thought is the self-interest of the thinker, then the decisions will reveal this... and we will be in TINA trouble before long. 

The people need a means of performing a kind of negative analysis on those in charge. It is not only the regulation of society we should be concerned with; it is the virtue of the regulators. It may be that a negative analysis of decisions may provide an early-warning sign of emergent self-interest. I think we need to explore this possibility urgently. There is enough data from government and the draconian changes in our public and private institutions for an investigation.

Saturday 5 January 2013

Is online communication communication?

Email discussions are often hopeless because they reveal little of the person behind the messages. This is particularly noticeable on forums - which are the dominant mode of communication in e-learning platforms, including MOOCs.

I believe that what we learn, we learn about each other. Because of this, text messages in a forum are a deficient means of learning anything: they reduce the 'whole person' to their textual utterances. There is an opposing position to mine that says "what we learn we learn about ourselves." This position too has its merits, but it seems rather too solipsistic to me, and potentially dangerous, since the fundamental human acts of promising and of forgiving cannot involve just one person. Arendt argues:
“The possible redemption from the predicament of irreversibility - of being unable to undo what one has done - is the faculty of forgiving. The remedy for unpredictability, for the chaotic uncertainty of the future, is contained in the faculty to make and keep promises. Both faculties depend upon plurality, on the presence and acting of others, for no man can forgive himself and no one can be bound by a promise made only to himself.”
So to believe we only learn about ourselves is to say we forgive ourselves, and we validate our own promises. In the hands of a psychopath, this clearly is dangerous, and my biggest fear of this position is that it either directly leads to a self-validating fascism, or to an idealism which becomes blinded to the potential for evil in the world - which similarly lets in the bad people.

I've been thinking more deeply about this need for other people in cognitive process. I think there is a determination of absence that takes place. Each utterance we make is a decision which is reached not just by what we can think, but by what we cannot think. Each utterance is a calculation based on our ability to coordinate and anticipate the direction of the communication, and consequently, our ability to anticipate the responses of those we converse with. That means we have to determine the thinkable and the unthinkable of other people too. Indeed, I suspect that it is what we cannot think that has the greatest bearing on the decisions we take.

A discussion is a shared experience. Moves in the discussion depend on calculating the success of communications which entails anticipating what others will say (what Luhmann calls the double-contingencies of communication). Where the capacity to predict communications is hampered, and where one feels forced to utter something but unable to predict what will happen, confusion creates stress. This is the situation of a classic double-bind. Nothing is clearly demarcated, the absences of the other person (the unthinkables) cannot be determined, and consequently, their utterances cannot be anticipated, and we therefore have no grounds for making a decision ourselves.

For good conversation to take place, where each party learns something about the other, there needs to be the acknowledgement of some sort of "shared absence"; something that we both recognise we lack; something which demarcates the thinkable from the unthinkable in each of us. A conversation may then proceed to determine an aspect of that shared absence and establish a new concept. The learning about the other person is the acknowledgement and determination of what we both lack. A new concept transforms the world views of both participants. But this is an ideal situation.

Online it is hard to assess the absences of the other person. I can only judge their absences by taking decisions to make utterances which might elicit information about their absences. One way of doing this is to make assertions and see how the other person reacts to the assertion. Often in forums, however, the stakes of a communication are not high, and so assertions will tend to be ignored (which will of course tell us something about what's missing!)

If the stakes in online discussion are higher (e.g. if there is a threat to personal reputation in some way) then conversation may become unpleasant to the point that both parties seek a "way out". They cannot just stop communicating because they are caught in the double bind of the communication situation (needing to determine common absence but being deprived of the means to do this effectively). The stress and confusion of this situation will create complexities and difficulties in making communicative decisions - a situation that at the very least needs addressing in some way. One way out is to recognise that the absence which is shared is a "dislike of the present discussion". An agreement to differ is the identification of a "question" as the determination of an absence: we agree, in the final analysis, that the absence amounts to "is it x or is it y?". Such a new concept-question can offer some respite and an escape, because the complex decision-trees can be reorganised around the new concept so as to lead to a mutual restructuring of expectations.

Face-to-face contact gives much more information about absences than online media. The eyes reveal far more about an absence than words. In fact technology presents its own absences. This may be what can make it seductive, since what is missing in the technology can only be substituted by the user. In this way, an attenuated communications medium does indeed lead us to reflect on ourselves, on our own absences. We might think we identify the absences of others, but these may in fact be detected in ourselves. But it can make technologically-mediated communication narcissistic.

This makes me think about my argument that an online community is not a real community (see http://dailyimprovisation.blogspot.co.uk/2012/02/myth-of-online-community.html). Is online communication real communication? How do we distinguish our understanding of communication from our understanding of media?

Maybe thinking about absences can help us grasp the varieties of non-neutrality of media in communication.

Friday 4 January 2013

Virtual Theatres

I've been playing with some software called "Crazy Talk" (see http://www.reallusion.com/crazytalk/Animator/help/animator/pro/) with which I've produced a video to introduce some software for the TRAILER project, and all this has set me thinking about some of the new possibilities for articulating ideas which technology is making available to us.

I've always felt that communication technologies are a bit like 'masks': the real 'me' hides not just behind my language, but behind the word-processor, the blog, the email, etc. There's masking in my video not just in the digital puppets, but in the sound too.

In the online environment you have to wear a mask. But even in the real world, letters have always masked the individual (although hand-written letters reveal more than word-processed ones); CVs still mask us as part of the process of gaining employment. Sometimes we see real people behaving robotically behind other kinds of mask - like cars, for example - or even sunglasses.

Another word for 'mask' is 'persona'.

One thing that has happened as technologies have developed is that the sophistication of the masks that are available to ordinary people has increased. Skills that were once only the domain of the typesetter become available to anyone with a word-processor; online publishing which was once the domain of geeks is now available to anyone who sets up a blog. But as ubiquity replaces novelty, so the drive for new novelty continues. Technological progress appears driven by continual critique and the determination of deficiency.

So my latest obsession with deficiency has led me to consider the potential of 'virtual theatres' and the presentation of information as dialogue rather than monologue. Crazy Talk enables me to do things only professional animators were able to do a few years ago. Actually, I want to go further, because I want my virtual theatre to be 'interactive' - that would enable me to virtually produce some of the drama games that Augusto Boal (see http://en.wikipedia.org/wiki/Augusto_Boal) developed and which I've found to be so powerful in face-to-face interactions. I wouldn't be surprised to see this kind of stuff becoming ubiquitous soon.

Will I ever be satisfied?

No.

There is always deficiency - not just because the world changes, but because there were deficiencies in us to begin with. That's why we have technology in the first place. It's sticking-plaster over deep absences which are in us, not in our tools. But if playing with technology is a way of accepting that, then maybe it needn't be as pathological as I sometimes fear!