Genres of Resonance


Resonance might usefully be split into different genres: those that reinforce, those that are cross-modal, those that are modality-free, and those that cross gross semiotic boundaries, i.e. from text to embodiment, and back.

Or to look at it another way, we might distinguish between resonances that close down possibilities and horizons, versus resonances that - paradoxically - open up, and open out, what is possible, as well as the range of the possible. (Cross posted from 14/5/17, here ...). What is important is the relationship between context on the one hand, and the balance and 'fit' between structure and agency on the other hand.

1. Structure and Agency


Resonance is a matter of convergence between different waves, patterns, energy, thoughts, practices, etc. When this is captured in texts, it builds knowledge, or "the capacity for (more) effective action". This convergence results in a mutual strengthening of the different components/events, and/or actors. This in turn is based on the extent to which the components (e.g. sounds from different instruments, or parts of an instrument) share 'structures' - whether these are sound or light waves, or patterns, thoughts, or practices. The more convergence, the more coherence, the simpler (and metaphorically, the more simplistic) the resonance - or, conversely, the more pronounced the dissonance, in which case the result is tensions, paradoxes and ambiguity.
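The wave version of this convergence can be made concrete with a minimal sketch (plain Python, illustrative values only, not a claim about any particular physical system): two unit sine waves reinforce when their structures converge (in phase), and cancel towards silence when they diverge (fully out of phase).

```python
import math

def combined_amplitude(phase_shift, n=1000):
    """Peak amplitude of two unit sine waves summed, offset by phase_shift (radians)."""
    return max(abs(math.sin(t) + math.sin(t + phase_shift))
               for t in (2 * math.pi * i / n for i in range(n)))

in_phase = combined_amplitude(0.0)          # convergence: the waves reinforce (~2.0)
out_of_phase = combined_amplitude(math.pi)  # dissonance: the waves cancel (~0.0)
```

The more the 'structures' (here, phases) converge, the stronger and simpler the combined signal; the less they converge, the more the result tends towards cancellation, tension, and ambiguity.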

The resulting resonant structure closes down, or opens up, the space for agency (or for building up knowledge, and the capacity for independent action), either for compliance, or for initiative, innovation, exploration, and creativity. Or to put it another way, structure and agency define the range of affordances for the self to engage with the social world.

[Aside: ... There is also a distorted kind of political meta-'resonance', loyalty ... see the FBI Director Comey's distinction between allegiance and loyalty. In January 2017 Trump asked him for 'loyalty', and Comey, correctly reading this as blind allegiance, said he could offer the President only honesty [loyalty, in the FBI, should be solely to the Constitution]. Comey was duly fired before the end of May 2017. Perhaps what Trump understood by loyalty was even further from Comey's set of values, i.e. for Trump 'loyalty' actually meant collusion: the type of joint enterprise where authority determines truth/iness, and where truth has no authority. Loyalty usually means a broader set of values than collusion, or personal/tribalist 'loyalty', as in the Mafia, and other secretive corporate practices. McLuhan presciently said, many decades ago, that the global village we were already headed for would be tribal and fractured. Perhaps we need to keep in mind that 'tribal' easily elides into 'tribalist', which could be the key virus/meme to watch out for - in the political tribes of the West and East, just as much as in the '3rd World'.]


2. Algorithmic Identity

Changes to Marx's modes of production are always disruptive. They change the dynamics of structure and agency, and they change the 'fit' between different types of actors/agents and socio-economic context. For instance, after many millennia, the rule of the alpha male has been supplanted, to quite some extent, by the rule of the geeks, even though this is accompanied by anger and resentment from the (dying??) spasms of alpha male culture (see the South African campaign against violence on women: #MenAreTrash, and Donald Trump's rust-belt tango).

Or, to put it another way - as Thomas Kuhn rephrased it (about 120 years after Marx) - it's all about changing the paradigms for the production of intellectual capital. Power is in the gift of those who control the production of capital - financial and/or intellectual (preferably both) - and the modes of production of both types of capital have been substantially disrupted and reconfigured by the internet: e.g. in Knorr-Cetina's micro-global structures [add link ...] of the 24/7 globalisation of financial markets as well as transnational terrorism, and in McLuhan's fractured, tribal/-ist global village [add link ...] of connectivity, which has morphed into social media - from the freemium business model to the globalised commoditization of social capital (including political capital too, now). By 2017 Facebook had reached over 2 billion users, and a quarterly profit of $3.89 billion (second quarter, 2017).

When we wake up - somewhere within the cycle of global 24/7 'time' (there are no absolute days/nights any more) - and reach for our interfaces to the www, we plug ourselves straight into this; the addiction and dependency are already complete. Consequently, the real nightmare is not old fashioned nuclear weapons (they are probably too big to be used) but rather the next generation of weapons, the EMP (electromagnetic pulse) weapons, nuclear and non-nuclear. High altitude EMPs (HEMPs), for instance, simply fry all the electronic systems on the ground, over a very wide range, from an immense height. An interesting, new kind of digital-cleansing / dystopia. (Off-grid is just too easy, too pretentious: it takes just a tap on a button to reconnect, and we are all already complicit in the 'net anyway. And whether the internet proves to be as resilient to this kind of attack as its designers planned remains moot.)

In the meantime, in the provisional absence of any of these modes of old fashioned nuclear war, or bombardment by the newer e-weapons, we need to get on with understanding the structures of our internetted social reproduction: i.e. increasingly within the global algorithms of 'social' media, or electronic bureaucracy (or perhaps 'e-rocracy' - that's got a nice buzz to it, no?).

2.1 e-rocracy
This is a new morph of bureaucracy into rule by algorithm, which has ambitions to displace consent, and insert itself into the space vacated by the discredited social engineering projects of the 1960's and 70's (see the political adage that it is preferable to rule by consent rather than by brute force, although force 'can never be ruled out'). The big difference of course is that e-rocracy is privately owned, and run for private, off-shore, shareholders' profit. Old fashioned public-sector bureaucracy, with the public as shareholders, and at least some democratic accountability, is being quietly pushed aside.

e-Rocracy goes much further than consent, and is a fusion of what used to be called 'intelligence profiling' and the freemium business model. Data/surveillance is the new capital, and the new proletariat is us - we contribute our labour and surplus value, in the form of our postings (like this one), for 'free' - the capitalist's dream scenario. Or as Theresa May says, "the fight against ISIS ... [and for hegemony, generally, one would presume] ... is shifting from the battlefield to the internet". She is correct, and perhaps even chillingly so. In surveillance terms, we all unwittingly work for the 'special branch' now.

The problem May addresses here is of course real; the question is not whether she is correct, but whether she will resist the temptation for over-reach. The fictional 1984 scenario worked so well because, well, it just worked so well, and the temptation was just too great.

2.1.1 The fusion of AI and bureaucracy.
The questions that e-rocracy raises are about the factual, political, and regulatory limits of AI (and, by the way, whether Trump is just an unflattering beta-version AI bot, created by Silicon Valley). The missing link in 'AI' is the lack of subjectivity and inter-subjectivity (and self-awareness), which it shares with old fashioned computerised bureaucracy.

(Cross posted, and edited, from 13/5/17, here )
The simplest way to think about it is to consider the one, crucial, aspect of abstraction that is, by definition, not possible to programme into AI: i.e. the token swapping that occurs when you say “I think x, what do you think?” and I reply: “I think x.1, does that make sense to you?”. Conversation is premised on passing around, and exchanging, the 'empty' signs, “I” and “you”, which in a sense are 'meaning-less' (because they are so unstable: their meaning reverses each time an exchange takes place), but which are also, paradoxically, the tokens and the mechanisms for constructing our inter-subjective commitment to a common, semiotic, meaning, which we are both committed to maintain / develop further, based on our understandings of a much wider, dynamic, social context.

Such meaning is, paradoxically, not primarily abstract, in the sense of being ‘context- and subject- free'. It is meaningful precisely because it is grounded in shared context and in shared (and jointly created) inter-subjective histories and aspirations.

That tiny point in time in which minds touch each other, and share aspirations, is denied to machines. They can make sense, they might even be able to create something sensible, but they cannot make meaning.

The question “what do you think?” cannot be asked of a computer, because the computer has no “I” to exchange. And simulating a ‘voice’ for a computer that would say “I” is a non-sequitur, which is perhaps why David Walliams's comedy sketch about bureaucracy always has the computer saying “computer says no”, rather than “I say no”. Which points to the crux of the problem, and of the absurd humour in Walliams's sketch: if the computer ever did say “I said no” you really, really couldn't ask “And who the .... do you think you are?”

Of course this is nothing new. This is precisely the point of bureaucracy - whether computerised or not. Namely, to create a totally abstract set of rules and procedures, abstracted to such an extent that you can't ask ‘why’, because bureaucracy is precisely context- and subject- stripped; it has no “I”. It is constructed in such a way as to be immune from questioning, because it is an embodiment of a kind of amorphous 'common sense', which has no individual subject. That is precisely the value and the frustration of bureaucracy: the 'subject' of bureaucracy, if it has one at all, is supposed to be the public / good - which already includes you, and it makes no sense to question yourself, does it? That would merely be an infinite recursion.

The 'threat' is not the machines, or AI, but rather the people promoting AI (and 'big' data / 'big' alpha males) as the solution to human evolution, a solution leading to a "robot society which might keep us humans as pets [in which] the subjective recognition, done by diverse resonating individuals, is replaced by algorithms whose supervision is thought to be optimized by rationality (or by a dictator) to find the one true solution" (from Matthias, by email).

This is just warmed over 'social engineering' [with algorithm-sauce added] from the 1970's, but it smells of 1984, and, ironically for a machine, pseudo-religious teleology. So we are all trapped, and complicit, in the bureaucracies (and worse, now, e-rocracies) of the 'social contract' that we (willingly?) vote for, again and again, and delegate blindly to our democratically elected 'leaders' until they call the next election. This is what we have to resolve.


2.2 Identity politics has been automated
Rob Horning (May 17, 2017) analyses the problem quite succinctly. It has long been the case that the governors of the world conjure up hypothetical avatars of 'social actors', design social policy for them, and then try to realise corresponding behavioural change in real people without the wheels falling off (as this would invite dissent or insurrection against the governors). Large amounts of largesse and money are spent on (unelected) 'special advisors' to make social policy 'work' in this way. And in the UK this increasingly takes place behind the closed doors of No. 10, with few or no members of the UK 'House' being invited (thanks to Tony Blair's sofa-government meme). At least in the US, Trump has to talk to Capitol Hill, as he is finding out in his own bumpy 'apprenticeship'.

The 'avatars' of social policy are constructed using a range of information gathering tools, from (sp)eyes on the ground to numerous forms of data collection and harvesting. Statistics and surveys were once used for 'sampling at a distance' of large, amorphous samples, which were used to create broad, generalised interventions. Big data on the other hand claims to provide real time, 24/7, interactive engagement - up close and intimate - both in the data harvested, and in the micro-targeted interventions. It's 'in your face/book', in more ways than one. It is a campaign manager's dream - whether it is a commercial or political campaign.

That may sound like warm-fuzzy social media collaboration, but the intimacy is intrusive; data mining is more like voyeurism, and it's no more interactive than the oil companies' mining of the planet's resources. And people like Trump (just one of many 'leaders' doing this) support it with legislation [add link to the latest Internet laws] encouraging a hybrid of surveillance and the privatisation of intrusion and the harvesting of data, which creates a global free trade - for the state/capital - in all our communications: Foucault's worst nightmare, of the panopticon intruding its gaze, 24/7, into the most intimate capillaries of power. (Who says Trump is anti-globalisation?)

What is perhaps worse, still, is that this is done within dark-ware (the software algorithms of big companies and the security sector) none of which the public can see, let alone exercise any oversight over. Hypothetically, that might be something that the public would be prepared to take on good faith, but the consequences (which is all we are allowed to see) are not pretty.

  • "Just as we don’t know how these systems calculate our identity and rank it for various purposes, we often don’t know why either. This means they can be used behind our back to mark us as persons of interest to police and border agents, or to single us out as an insurance risk or for other categorical forms of discrimination without any human agents having direct knowledge." (All quotes from Horning: Sick of Myself: Algorithmic identity is a means of control and consolation, unless otherwise stated.)

  • "They can render certain concatenations of data to be normal and others to be deviant and socially disqualifying". These machine-'learned' prejudices may not even have human names, which makes it harder for people to unite and fight against them. The labels cannot be reclaimed as principles of solidarity. "You might find yourself on a terrorist watch list or in quarantine for a flu you don’t have, merely because of data associations".

  • "As more information about ourselves is captured within Big Data systems by phones, social media platforms, fitness trackers, facial recognition software, and other forms of surveillance, algorithms assign identity markers to us, place us in categories based on correlations to patterns drawn from massive data sets, regardless of whether these correspond to how we think of ourselves."

  • "We become, to an extent, what other people do, as their data contributes to how ours is interpreted. The system will infer our identity, according to categories it defines or invents, and use these to shape our environments and further guide our behavior, sharpen the way we have been classified, and make the data about us denser, deeper. As these positivist systems saturate social existence, they nullify the idea that there is something about identity that can’t be captured as data". (And that 'something' could be summed up as a unique culture).
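The mechanism Horning describes can be caricatured in a few lines of Python (the categories, markers, and weights below are entirely hypothetical, not any real platform's code): identity is assigned by correlation to aggregate patterns drawn from other people's data, and self-identification carries no weight at all.

```python
def assign_category(user_markers, category_patterns):
    """Label a user with whichever aggregate pattern their data markers best correlate with."""
    def similarity(markers, pattern):
        # dot product of the user's markers against a category's pattern
        return sum(markers[k] * pattern.get(k, 0.0) for k in markers)
    return max(category_patterns,
               key=lambda label: similarity(user_markers, category_patterns[label]))

# Hypothetical patterns, drawn from 'massive data sets' of other people's behaviour:
patterns = {
    "suburban-parent": {"garden-ads-clicked": 0.9, "late-night-activity": 0.1},
    "night-owl-gamer": {"garden-ads-clicked": 0.1, "late-night-activity": 0.9},
}
user_markers = {"garden-ads-clicked": 0.2, "late-night-activity": 0.8}
self_identification = "keen gardener"   # never consulted by the system

label = assign_category(user_markers, patterns)   # → "night-owl-gamer"
```

The point of the toy is structural: `self_identification` exists but is never read; the label comes entirely from correlations against patterns the subject neither made nor can inspect.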

e-Rocracy is yet another layer of semiotic alienation, over and above the potential alienation of meta-semiotics. Meta-semiotics [add link ... ] includes all the systems and structures (memes and temes, if you like [add links]) that are context- and subject- stripped: from counting systems to mathematics, money, financial markets and futures markets, bureaucracy, representative democracy and referenda, and even the digital internet itself (although the internet has other attributes too).

The important thing about meta-semiotics is that it intersects with mundane human activity; even though its algorithms might be complicated, they are in principle reasonably transparent, which means that instances of meta-semiotic use, like money, include both the rather abstract meta-semiotics of exchange value, and the plain old semiotics of use value. And the two values are always 'equivalent' (even though dynamic and changing), and can be used as affordances [add link to article] for action by anyone, who can appropriate them for building their capability for effective action, their knowledge, and their identities.

This is not true of e-rocracy; its substance (data correlations and interpretations) is not made by human action, nor can it be openly used to exercise agency and create knowledge and identity. That is now done by a black box, which produces statistical and algorithmic sense, but not necessarily any meaning. On the contrary, it may well produce irritants, false accusations, lies, etc; or, as the computer might say, "false positives", which sounds alarmingly like Trump's "fake news" or Theresa May's "fake claims" (from opposition parties, of course). (See Horning's examples of false listings on a terrorist list, or quarantine programme, above.)
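The watch-list style of false positive just mentioned can be sketched minimally (hypothetical names and data markers; a deliberate caricature of guilt-by-data-association, not any real system): anyone whose data overlaps with a flagged person's data gets flagged too, with no human agent in the loop.

```python
def flag_by_association(people, watch_list):
    """Flag everyone whose data markers overlap with any watch-listed person's markers."""
    watched_markers = set()
    for name in watch_list:
        watched_markers |= people[name]
    return {name for name, markers in people.items()
            if markers & watched_markers}  # any shared marker is enough

people = {
    "alice": {"cafe-wifi", "flu-forum"},        # merely used the same cafe wifi as bob
    "bob":   {"cafe-wifi", "watchlisted-site"},
    "carol": {"home-wifi"},
}
flagged = flag_by_association(people, {"bob"})  # → {"alice", "bob"}: alice is a false positive
```

Alice is flagged purely "because of data associations"; nothing in the box records why, and there is no "I" to put the question to.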

These 'false positives' are created ...

  • "because what gets calculated by algorithmic systems to be race, gender, age, or political affiliation is a selection of data markers that may have no connection to the social indicators used to determine those categories — it may neglect even how individuals self-identify. ... If all the content on Facebook is tailored to suit the company’s construction of who we are, then consuming it is like consuming a [synthetic, algorithmically-modified], 'coherent' version of ourselves. It also reinforces the idea that the best place to glimpse your stable social identity is on Facebook. Engagement with social media then signals our assent to this algorithmic figuring of the self, an identity we step into when we access platforms that feels as if it has always already been inside us somehow".

  • "The way our self-expression gets ranked in likes and shares in social media would seem to subordinate identity to competition over metricized attention, dividing peers into winners and losers. And the creation of identity in the form of a data archive would seem to fashion not a grounded self but an always incomplete and inadequate double — a 'self partially forced from the body' (a quote from R.D. Laing's Divided Self). The neoliberal demand that we convert our lives into capital and grow it systematically seizes on the ideal of self-expression and strips it of its dignity and allure".

Precisely. Facebook as the (latest) apogee of what Robin Cook (the former UK Foreign Secretary who resigned from Blair's cabinet over Iraq) called 'feral capitalism'. God help us. Identity politics has just been appropriated and automated by AI.

It goes without saying that these tools for automated harvesting and intervention are structures (or memes/temes) that close down agency for the subjects of the communication, and open agency up for alienating and transforming experience into intellectual capital, and influencing (if not controlling) a wide range of behaviours for the benefit of strangers - or identity hedge-fund managers - who extract and accumulate identity-capital.


It might be best to explore this in terms of specific examples, of real contexts, such as gardening, and learning ...


3. The Constant Gardener

Like all other forms of capital, you can't 'make' identity capital, you can only accumulate it, and even then, you have to keep a constant eye open for shifts in the capital markets. However, what you can do is to create, manage, regulate, sanction (etc) what happens with the structures (laws, regulations, markets) that facilitate particular types and genres of agency.

One way to think about this is to use the example of gardening / horticulture / agriculture, in, for instance:

Virgin gardens: Jan Smuts, ecologist (and Prime Minister of South Africa in the second quarter of the 20th century), lived in
Meadow gardening:
Formal gardening, Landscaping
Mono-agriculture / Mixed farming:
GM crops (genetically modified agriculture),
Organic farming, etc.



4. Learning and autonomy

Learning and autonomy are another way to think about structure and agency.

... depending on: context, intent, aims, affordances ... WIP












4.1 Montessori pre-schools
The ... WIP ...