Saturday, 30 October 2010

Today's action is at the NCP's

Friday, 29 October 2010

Modern reality - a product of thought fragmentation


Coherence requires hierarchy. A single system in terms of which all other systems can (at some level of approximation) be harmonized.

Modernity is - according to Niklas Luhmann - progressively increasing functional differentiation.

In modernity, there is no hierarchy, instead there is segmentation. Society is a mosaic of discontinuous pieces placed side by side.

And the segments get ever smaller.


So that (say) society starts out with a hierarchy - religion at the top, everything else subordinated. Law, education, health, entertainment - all are subordinated to religion, and ideally all will promote the religion (or will be suppressed if they conflict with the religion).  And there are not many specializations.

But in modernity, first philosophy became independent of religion - from about 1200 religion and philosophy had merely to be compatible - from Descartes (or so) philosophy had its own system of evaluations - from then on it began to fragment into science, then ever more sciences. Each with its own system of evaluation.

Now science (even science!) is utterly fragmented, different sciences have utterly different systems of experiment, proof, evaluation etc. Even in biology, even in neuroscience there are thousands of people pursuing utterly independent programs - nobody tries to put these programs together and if they do try, nobody takes any notice. And if a program is criticized by another program (using different evaluations) this is ignored, and indeed specialization continues and each becomes ever more irrefutable.


We are used to this in science, we are used to it in life.

Life is a succession of micro-specialized evaluations. Nobody tries to add them up, but if they did nobody would take any notice (it would be missing the point, it would merely display ignorance, incompetence).

Life is broken up into ever smaller chunks, each of which is 'explained' (or conceptualized) by ever smaller specialized disciplines. A piece of prose will go through these sequentially - it doesn't need lots of specialists to do this, many people can mimic the process; and anyone can assemble the quotes.

But it can be done with quotes. An old-style documentary was one man expounding an idea, a thesis - a modern documentary is a series of talking heads and disembodied voices, and diagrams and pictures - each looking at a little bit of - what? Well there is no thesis exactly, maybe there is a take home message, but no thesis - the various parts are not subordinated to an overarching thesis.


But why not? Because there is no hierarchy.

Modernity is a mosaic not a hierarchy, so there can be no integration. Integration involves subordination.

Subordination involves suppression of some field, selection from other fields - it involves orientation, teleology - pointing many fields in a single direction, evaluating them in terms of their contribution to this goal.

Without conceptual hierarchy, life is just one damned thing after another. There is no sense to be made of it - neither in principle nor in practice.


The sound bite, the short attention span, the rapid cut, the restless appetite for the new, for a change.

The tolerance of unrelated segments (the modern news program, the modern magazine or newspaper, web browsing).

The blend of sheer distraction with emotional self-manipulation.

These phenomena are intrinsic to modernity, and necessary to such incoherent, vapid trash as political correctness.


Dream experience - the product of thought fragmentation


From Karl Jaspers' General Psychopathology, page 99:

"Suddenly things seem to mean something quite different. The patient sees people in uniform in the street; they are Spanish soldiers. There are other uniforms: they are Turkish soldiers. Soldiers of all kinds are being concentrated here.

"There is a world war. Then a man in a brown jacket is seen a few steps away. He is the dead Archduke who has resurrected.

"Two people in raincoats are Schiller and Goethe. There are scaffoldings up on some houses; she knows at once he is an old lover of hers; he looks quite different it is true; he has disguised himself with a wig and there are other changes.

"It is all a bit queer."


This is an account of delusional perceptions; it could equally be the account of a dream.

The mood behind this is perplexity.

"Patients feel uncanny and that there is something suspicious afoot. Everything gets a new meaning. The environment is somehow different (...) some change which envelops everything with a subtle, pervasive and strangely uncertain light."


In dream experience there is an alternation between puzzlement - that something is not quite right, absurd indeed; and a bland acceptance of whatever emerges.

Indeed some dreams are just the one, or the other. In some dreams there is no perplexity but only a series of weird events which flow into each other and are accepted without question. Other dreams are just like a slice of waking life, logical, maybe mundane.


Likewise with delusions. Some delusions are mundane, logical, in clear consciousness; others are utterly bizarre alternative worlds - in an otherworldly state of altered and clouded consciousness, where the patient lives in imagination.

But the characteristic situation for psychotics is an in-between state of perplexed, semi-rational bizarreness.

Underpinning this is disordered thought, underpinning disordered thought is fluctuating consciousness - delirium which comes and goes.

Segments of dream alternate with segments of rationality which try to make sense of the dreams but are interrupted by further dream segments before progress can be made - the world is bizarre and it is puzzling.


The primary delusional nature of PC - holding-together the Left coalition


In what follows I am talking about the process by which a false belief is created and sustained at a group level, and not about individual people.

Obviously, it would be plain silly (as well as wrong) to assert that PC is a mass primary delusion which happened to arise simultaneously among many individual people.

What I am talking about is the mechanism of PC delusion formation at the group level.

Individuals participate in the PC group delusion (indeed they are now forced to participate); but that is not its origin, nor is it individuals who sustain the process of thinking imposed by PC.


While some roots of PC are innate, and some trends of modernization can be traced back a long way (at least 1000 years, with acceleration from about 300 years ago) - nonetheless PC crystallized and grew to become large and powerful from the late 1960s.

People tend to forget how rationalist was mainstream Old Left political thinking up to the mid 1960s - it was essentially a highly intellectual, Marxist-based thing.

In the UK the dominant Leftism was from the Fabian Society: an upper-middle-class-led, meritocratic, non-revolutionary gradualism, based on the assumed superiority of the planned economy, the assumed superiority of bureaucracy over markets, and therefore the nationalisation of all significant parts of the economy.

This was highly meritocratic, based on equality of opportunity (enforced by the state), and egalitarian in terms of economic distribution - the state would substantially equalize incomes.

There was a belief that economic hardship damaged life chances, and that there was therefore more talent among the poorer classes than was being allowed to emerge.

But for Fabians there was no question about the fact that people were themselves unequal in character and ability. Socialism was not about denying the obvious facts - it was about rewarding people more or less the same, and giving them the same life chances, regardless of their character and ability.

So Old Left socialism was limited in scope mainly to economics (at least in its aspired scope - in practice it became corrupted into totalitarianism) - and it was not necessarily, nor even usually, atheist - there were many Christian socialists.

The Fabians were not trying to build heaven on earth, merely trying to alleviate some of the more extreme suffering on earth.


So mainstream Fabian socialism was very different indeed from political correctness - although, as a matter of observation, many Fabian socialists became PC, went along with it, as it emerged.


The other strand of socialism came from idealist roots, in people such as William Morris and, further back, the Romantics.

However, it did not come from specific people, it was not primarily rational - it was an irruption of animism, of id, of the craving for connection with the world - it sought a society in which alienation would be abolished, a society of positive bliss where people could live naturally, at their fullest human potential.

It hated industrialisation, hated bureaucracy, hated rationality - it sought a situation where for everyone impulse and instinct could flow swiftly and unimpeded into action.

It was about remaking society as paradise upon earth; and when existing humans impeded this, it was about re-making humankind.

It was this strand of unreasonable, irrational, Utopian leftism which irrupted in the 1960s among the intellectual elite and which threatened to sweep away the Old Left.


But the Old Left was already in deep trouble, although few had yet realized it.

The moral force of the Old Left came from the idea that lower class people of brains and ability were being kept down, while upper class people - many or most of whom lacked brains and ability - were ruling the country.

Simply from the perspective of efficiency this was undesirable.

However, by the mid-twentieth century (with improvements in psychometric testing) it gradually became clear that modern societies already had equality of opportunity, the meritocracy was already in place - or at least as equal as could reasonably be aspired to, in an imperfect world.

So Old Left socialism - if it was honest - had lost its main moral driving force.


At around the same time, or a bit later, the Old Left also lost its economic legitimacy; as it became ever-clearer that economic planning, the replacement of markets with bureaucracy, was less- rather than more- efficient.


So the Old Left were, as a movement, en route to electoral destruction - caught between the hammer of the idealistic Utopian New Left and the anvil of the Right. The Old Left had no idealism, and they had no rationale - and they were rapidly losing their electoral base.

The New Left electoral base was not rooted in the working class/ proletariat - but in gathering all kinds of groups (a 'rainbow coalition') who felt (or could be made to feel) oppressed - and added these to the traditional proletarian (Trade Union) left, with its upper class Marxist leadership.

This involved systemic dishonesty - and it was from the mid-1960s that the Left became based upon dishonesty (not just accidentally dishonest, or corrupted into dishonesty - but dishonest at its deepest root.)


Political correctness emerged as a modus operandi between the Old Left and the New Left.

Although a rainbow coalition of the disaffected made electoral sense as a way for the Left to survive, there seemed to be no way that anything could possibly hold together groups whose interests were - in the end - mutually exclusive and even in the short-term were in conflict.

To survive, as it has, the Left evolved that group cognitive process which we call PC, which alternates between Old Left bureaucracy and New Left/ romantic Utopianism.


The psychological basis of the PC Left is simply, merely, being anti-Right - merely the idea that the Right is evil.

At an individual level this ascribed evil on the Right could be of many, contradictory origins - its fundamental basis might be economic oppression, racism, sexism, environmental destruction, hereditary aristocracy, free market libertarianism, some personal slight, nationalism, the soul-less materialism of the Right, the Christianity of the Right: the specific grudge does not matter, so long as the Right-haters continue to work together, vote together.


Nobody invented political correctness, but over a relatively short period its amazing explanatory power dawned on more and more people on the Left.

In order to mobilize passion, PC can talk like a 1960s hippie revolutionary; in order to get power and stay in power PC can act like a 19th century Prussian bureaucracy. It simply flips back and forth in response to threats from the Right.

PC will use whatever works to defeat the Right, here and now, in this particular situation, regardless of the implications - because none of the implications will be followed through but merely swept aside in the next fight.

PC is a bundle of tactics. But PC tactics do not ever add-up to a strategy.


PC is hard to defeat, and impossible to defeat rationally, because it is not aiming anywhere in particular in any sustained way - it is aiming to defeat the Right and to keep-together its coalition in this fight.

Sometimes PC is anti-racism, sometimes anti-sexism; sometimes pro-worker, sometimes pro-shirker; sometimes it is nationalist, sometimes it is internationalist; sometimes pro-economic growth, sometimes engaged in wrecking the economy; sometimes it aims to cover society in a blanket of bureaucracy, sometimes it wants a society of utterly free and spontaneous individuals - and so on.

Each of these happens for a segment of time, then is followed (but not consecutively) by something else happening; and the earlier is pushed aside.

PC is like the evening news, and then and then and then - something shocking, then something inspiring, then some politics, then a bad person, then a funny item, then a bit of gossip - it never adds-up, it cannot be added up, nobody tries to add it up; it is just a means of attracting and holding attention.

PC is functionally just a means of attracting and holding-together a short-term effective anti-Right alliance.  

These irreconcilable demands are not to be balanced, nor to be compromised, but instead a mode of disrupted social cognition is to be imposed in which apparently all of this is possible, or rather not-impossible.


Delusions are, at root, not about belief, but about a mode of thought, a style of cognition - they are rooted in thought disorder where consecutiveness is disrupted.

Political correctness is an insane mode of discourse. It cannot be argued-away because its mode of thinking is not rational.

If you argue rationally against PC you will be met by gut feelings; if you argue emotionally this will be discarded and forgotten, and you will be met by rationalist bureaucratic speak.

If you argue from the needs of proletarian workers, you will be met by the needs of immigrants and the underclass; but if you reverse the basis of the argument then PC will reverse its focus of concern.

PC knows that the Right is evil and that anything they do to defeat it here and now is justifiable.


That's it. PC is a delusional system which arose, evolved in order to preserve the power of the political Left; it must be intrinsically (not accidentally, not remediably) delusional (irrational, fragmented) in order to hold-together the disparate and conflicted members of the Old/ New Leftist rainbow coalition in their opposition to the Right; and the dishonesty of PC is an intrinsic (although unintended, because unperceived) by-product of this cognitive irrationality.


Thursday, 28 October 2010

Is political correctness an un-understandable primary delusion?



Or, at least, political correctness shares at the group level many properties of the un-understandable primary delusion in an individual with psychotic illness.


A primary delusion is something which arises without external cause, and is contrasted with a secondary delusion that explains something such as why a patient is hearing voices or having other strange experiences. (e.g. the secondary delusion that the patient is hearing voices inaudible to others because aliens are projecting them with x-rays, or the strange sensation under the skin is caused by an infestation of insects.)

The classic primary delusion is something like: "I saw the traffic lights turn green, and knew I was the son of God".

Comes from nowhere, without rational connection, and cannot be understood by the outsider. 

But for the patient, the primary delusion then goes on to be elaborated as a 'master key' for understanding the world.


But although un-understandable, the primary delusion does not come out of nowhere - it is usually the resolution of a period of time of delusional 'mood' or 'atmosphere', of perplexity and suspicion - that "something is going on".

In this state everything is experienced (to quote Andrew Sims in Symptoms in the Mind page 125) "as sinister, portentous, uncanny, peculiar in an indefinable way. He knows that he is personally involved, but cannot tell how. He has a feeling of anticipation, sometimes even of excitement, that soon all the separate parts of his experience will fit together to reveal something immensely significant."

The primary delusion provides exactly that: "When the delusion becomes fully formed, he often appears to accept it with a feeling of relief from the previous unendurable tension of the atmosphere."


My understanding is that this delusional mood is itself a consequence of thought-disorder, of fragmentation of the train of thought such that reasoning is disrupted, ideas cannot be followed through: short periods of consecutive logic continually being derailed.

And my understanding of thought disorder is that it is a delirious, dreamlike state which comes and goes - so the patient experiences fragments of dream illogic interspersed with fragments of waking reason; fragmentary meanings which never get to add up.


The analogy with political correctness is that PC is a primary delusion which derives from the perplexed atmosphere resulting from a fragmented and discontinuous mode of thinking - except that this is primarily at the social level rather than the individual level.

I regard PC as a combination of Marxist atheist materialist rationalism - the desire for social efficiency, for the sake of efficiency; but continually disrupted by the dreamlike illogic of primary process thinking, of id, of spontaneous instinctuality.

In other words, a mode of thinking in which Old Left Marxian or Fabian ultra-logical planning is continually disrupted by New Left, counter-cultural, utopian idealism; when hard-nosed bureaucracy experiences irruptions of un-bounded hippie fantasy; where cold rationalism alternates with warm-hearted wishful-thinking.

This is the perplexed delusional atmosphere characteristic of the ruling intellectual elite in the 1960s, the feeling of anticipation and excitement, that unendurable tension, from which the primary delusion of political correctness arose to fit together the separate parts of experience, revealing something immensely significant.


So political correctness makes no more real world sense than a primary delusion in schizophrenia - and it is equally immune to reality-testing; but PC provides the same sense of relief, of resolution to the unendurable tensions of leftist politics.

PC makes psychological sense, not political sense. It can only partly be understood by logic, and only partly understood in terms of wishful fantasy - but is able in practice to fuse these two irreconcilable modes.

But so profound and totalizing a delusion is only possible due to the profound thought-disorder operative in modernity; a situation in which the ruling intellectual elite can neither abandon logic nor live wholly by it; neither abandon animistic fantasy nor live wholly by it - and are doomed to alternate wildly, unpredictably, uncontrollably between the two.


Green grow the rushes O - and oral transmission


JRR Tolkien saw the myth-making process at work in his lifetime when he got back, indirectly (via an identification query sent from the USA to Oxford University) an orally-transmitted version of his poem Errantry which had preserved the 13th century word 'sigaldry' (= enchantment).

"...bore out my views on oral tradition (at any rate in early stages): sc. that the 'hard words' are well preserved, and the more common words altered, but the metre is often disturbed." 

(Letter to Rayner Unwin, 22 June 1952).

This corresponds to the way in which the garbling process of oral transmission yet retains a fascinating core; based around the preservation of strange words or phrases.


I find this quality in many nursery rhymes - for instance in the counting song 'Green grow the rushes O' which has always cast a spell over me since childhood.


This is the final verse as I knew it:

I'll sing you twelve O
Green grow the rushes O

What are your twelve O?

Twelve for the twelve Apostles
Eleven for the 'leven who went to heaven
And ten for the ten commandments

Nine for the nine bright shiners
And eight for the April Rainers

Seven for the seven stars in the sky
And six for the six proud walkers

Five for the symbols at your door
And four for the Gospel makers

Three, three, the rivals

Two, two, the lily-white boys
Clothèd all in green, Yo Ho
One is one and all alone
And evermore shall be so


Rather than regarding this lyric as nonsense, I assumed that it meant something.

Probably it was ('just') a Christian mnemonic (although this never crossed my mind as a child - e.g. I had no idea what was meant by 'Gospel makers'), and certainly it has been brushed-up by revivalists (including re-regularising the metre - because ordinary people have tin ears when it comes to scansion, and fail to make even the simplest and most obvious changes to maintain or restore regular metre).

But after all this explaining-away, there remains an extraordinary sense of a world of mysterious numbers and symbols.


Five for the symbols at your door! Was this (I wondered) about somebody coming to the door of my house - something like carol singers, but showing a symbol?

Robert Graves guessed that this meant the pentangle, or interlaced five-pointed star; this is indeed a Christian symbol in Sir Gawain and the Green Knight - although any such tradition has since been lost, and the pentangle is now the premier symbol of neo-pagans.


'Real' history becoming myth



Wednesday, 27 October 2010

"Whom the gods would destroy, they first make mad" - true, but how?


I am thinking, here, about the Western elites.

That they are mad is clear, that they are en route to utter destruction I believe.

That is where the 'madness' comes in.

That 'the gods' wish to extirpate the Western elites seems obvious, and the reasons why are also obvious; but for the requisite irreversible annihilation certainly to happen requires that the elites become insane - not merely in losing their spontaneous sense of self-preservation, but actively willing self-annihilation.

Otherwise, sans lunacy, a sane elite would pull-back just enough to avert nemesis, and survive to wreak further havoc.

Since the gods want to ensure this does not happen, they first ensure that the elite are mad.


Robin Hood - what is the appeal?


Robin Hood is the premier English legendary hero. Which is somewhat frustrating, because there never has been a satisfactory depiction of the legend.

While the Scottish-English border had ballads which were in the first rank as poetry, the English mainland ballads of the Middle Ages were almost all about Robin Hood and his adventures - and were pitched, pretty much, at the level of Medieval pop songs or soap operas.

Yet Robin Hood does have a powerful and enduring appeal - what is the essence?


The setting is important - a pastoral idyll, a gang of friends living in the woods: marvellous.

Then there is the idea that each Merry Man has a different and complementary character and role.

Men's groups differentiate in this way spontaneously: there is always a wise and brave leader, a big strong slow one, a clown, a brainy one, a fat jolly one, a scary psychopath-berserker, a singer-poet; and since there may not be any very good examples of these types in a particular group, the men just do their best (e.g. the clown may not actually be funny - but there will nonetheless always be a clown; the brainy one may not really be very smart - but he only needs to be a bit smarter than the others).


For me, Robin Hood was blended with Red Indians - specifically the Hiawatha type who lived in the North East Woodlands of the US, around the Great Lakes, or in New England.

So I liked the elements of woodcraft, hunting, making stuff from bark, thongs and sinews, being able to stalk silently - without cracking a twig.


At the heart of the story is the heroic idea of being in just rebellion against unjust rule - Robin Hood is not really an 'outlaw', since it is the rulers (King John and the Sheriff of Nottingham) who flout the law.

The Merry Men are merely trying to restore the rightful King.

And then there is the idea (which I think Walter Scott propagated) that Robin Hood (maybe himself an aristocrat gone native) is leading Anglo Saxon rebels against the Norman oppressor.


So there are ingredients enough and more here for a genuine folk hero; but even at its root there is an element of playful holiday make-believe about Robin Hood, which deprives the Robin Hood tales of the dignity and depth of the Arthurian legends.


Come all you little streamers - a mysteriously garbled folk song


Oh come all you little streamers wherever you may be
These are the finest flowers that ever my eyes did see.
Fine flowery hills and fishing dells and hunting also
At the top of yonder mountain where fine flowers grow.

At the top all of the mountains where my love's castle stands
It's over-decked with ivory to the bottom of the strand.
There's arches and there's parches and a diamond stone so bright;
It's a beacon for a sailor on a dark stormy night

At the bottom of the mountain there runs a river clear.
A ship from the Indies did once anchor there,
With her red flags a-flying and the beating of a drum
Sweet instruments of music and the firing of her gun.

So come all you little streamers that walks the meadows gay
And write unto my own true love wherever he may be
For his sweet lips entice me, but his tongue it tells me “No!”
And an angel might direct us and it's where shall we go?    



These lyrics are found, sung to a lovely tune by Shirley Collins, on an album by the Etchingham Steam Band (album of that name) - which was a small, almost-forgotten folk group who had a big influence on me in my mid-teens.

I was playing the piano accordion and a bit of mouth organ, and learned all their stuff I could find (not much) and tried to emulate the general sound and musical philosophy in some lunchtime performances at school with my friend Gareth Jones (who played bass guitar and flute). We also did versions of Mike Oldfield's folksy singles of that era (In Dulci Jubilo and Portsmouth).

Anyway... this particular song and tune, Come all you little streamers, was only issued much later - and I only came across it about 5 years ago. I loved everything about the performance, including the way that after the song they used the tune as a dance, done double speed. 

The lyrics are fascinating. Of course they are more-or-less nonsense, and have been garbled and mis-remembered by the hazards of oral transmission - in a 'Chinese whispers' kind of way - until collected from an old man called Ned Spooner in Sussex.  

There are, in fact, other versions of the same song, which make clear what it was originally about - but I prefer this one, with its weird suggestions at meaning. 


My point? 'Poetry' can be accidental, the product of accident - so long as the intention is pure. 

At least, this can happen by the oral process, when it is not intended to happen but does anyway. 

When hinted meanings etc are artificially contrived by professional intellectual poets, the results are (to my mind) despicable: for example The Waste Land by TS Eliot is a despicable 'poem' (deliberately obscure and fragmentary, full of show-off referencing) which has had an appalling influence.

Hints at depths: that is what lyric poetry and song can do - and often we never really know whether or not the depths are there, but we need to be fascinated that they might be. 


Recall, the whole of Tolkien's mythology was triggered by a single suggestive and obscure phrase, indeed a single word: Earendel:

"Eala, Earendel engla beorhtost
ofer middangeard monnum sended"

Hail Earendel, brightest of angels, above the middle earth sent unto men!

In the mouth of Lowdham, in The Notion Club Papers page 236: "When I came across that citation in the dictionary I felt a curious thrill, as if something had stirred in me, half wakened from sleep. There was something very remote and strange and beautiful behind those words, if I could grasp it, far beyond ancient English".


Tuesday, 26 October 2010

Dark Eyed Sailor - a ballad


As I roved out one evening fair
It bein' the summertime to take the air
I spied a sailor and a lady gay
And I stood to listen
And I stood to listen to hear what they would say.

He said "Fair lady, why do you roam
For the day is spent and the night is on."
She heaved a sigh while the tears did roll
"For my dark-eyed sailor
For my dark-eyed sailor, so young and stout and bold."

" 'Tis seven long years since he left this land
A ring he took from off his lily-white hand
One half of the ring is still here with me
But the other's rollin'
But the other's rollin' at the bottom of the sea."

He said "You may drive him out of your mind
Some other young man you will surely find
Love turns aside and soon cold has grown
Like the winter's morning
Like the winter's morning, the hills are white with snow."

She said "I'll never forsake my dear
Although we're parted this many a year
Genteel he was and a rake like you
To induce a maiden
To induce a maiden to slight the jacket blue."

One half of the ring did young William show
She ran distracted in grief and woe
Sayin' "William, William, I have gold in store
For my dark-eyed sailor
For my dark-eyed sailor has proved his honour long"

And there is a cottage by yonder lea
This couple's married and does agree.
So maids be loyal when your love's at sea
For a cloudy morning
For a cloudy morning brings in a sunny day.


I know this ballad from the version sung on the Steeleye Span album Hark! The Village Wait, which is simply sublime.

But the words alone illustrate for me the tremendous power of stock phrases in poetry, specifically oral poetry. Pretty much the whole ballad consists of stock phrases, yet it has immense elegiac strength.

This has a 'eucatastrophe' at "One half of the ring did young William show / She ran distracted in grief and woe", which produces that 'lift' of the spirit from which we gain such inspiration.


King Arthur, and legends of the British Isles


I am English (mostly) and have lived in Scotland - but what about Britain?

The energetic reality of Britain, at a gut level, comes almost wholly from the Legends of King Arthur - the matter of Britain; which crops-up all around England, Scotland, Wales and even Brittany.


Like so many people, I have a fascination for Arthur. My main basis for this - its crystallization - was The Once and Future King by TH White, which had a big impact on me in my late teens into early twenties.

But even now I consume quite a lot of Arthurian stuff in the realm of popular art and high art (indeed, I have never read Malory, nor any of the old sources).

An Arthurian setting does not have to be perfect to work its charm on me, even slight hints and pictures are effective enough to keep me interested - so long as I detect a seriousness of intent. I like to visit places with Arthurian associations.

On the other hand I strongly dislike exploitative, crude, vulgar use of the Arthurian legend: the Camelot musical, the idea that JFK's smug coterie was 'like Camelot', Mark Twain's Yankee in King Arthur's Court (or whatever it is called), and - although parts of the movie are very funny (notably the two stupid guards) - I did not like the Arthurian links in Monty Python and the Holy Grail.


I like the character of Merlin very much, and also Arthur (as a good, just, somewhat unimaginative but well-meaning Englishman - not, of course, as admirable as the real King Alfred the Great - but better than any other real monarchs!).

I also like the idea of hot-blooded Scots or Welsh knights, and seductive Celtic witches and sorceresses like Nimue and Morgan le Fey.

But I have reservations. I have never liked Lancelot, or Guinevere, and I never liked the Grail plot (a big thing not to like!).


So, King Arthur is the main British legend; indeed (for all its flaws) the only one.

The main English legend is Robin Hood; indeed (for all its flaws) the only one - until Tolkien.

Except that there was the mythic-reality of the Anglo Saxon golden age, upon which (covertly) so much of Englishness was built - St Cuthbert, St Bede, Alfred and so on.


The other dimension - to which I alluded in yesterday's posting of the Border Widow's Lament - is the Border between England and Scotland; which for me has a rather distinct identity, based on the cohesion of Anglo Saxon times and again during the Middle Ages. 

Presumably this feeling about the Borders is hereditary - since three grandparents are Northumbrian (two going way-back) and the other came from Ulster, ex-Scots borderers.

Do I believe in such familial influences? Yes, like Tolkien, I do.

I don't try to prove them, nor do I try to explain how such influences work, but they are real.

Ancestral influences can be denied, of course, and existence does not end when they are denied: it merely becomes flat and dead.


The daimonic force of great myths and legends



Living in the past, or a fantasy


It is striking that so many people spend so much of the time living - psychologically and to some extent emotionally, otherwhere than they physically are.

This is not really wishful thinking (which instead focuses mainly on the future). It may be escape, but if so it is interesting that escape is often into imagined situations of hardship and terror.

What it mostly is, is escape into myth and away from materialism.


A person's real life - in terms of the life of the mind - may have little or no connection with the observable facts of their life.

Because the facts of a person's life are dead (not unimportant, not without powerful effects - but dead).

The most immediate and distressing deficiency of modern life is its deadness, the disconnection of the human soul from human experience.

Of course, life is also experienced by most as meaningless and purposeless; but this deadness comes before that - a person disconnected from real life, or denying of the reality of life, cannot even begin to think about meaning and purpose.


The modern experience is of materialism (dead facts and events) and future hope (of different dead facts and more pleasurable anticipated experiences).

This is the reason why the modern condition is so intractable. The deadness and disconnection generate desperation: an urgent desperation which is unbearable and which leads to desperate measures in order to distract or to obliterate - or to escape.

Distraction of the mind and obliteration of thinking are dead ends, they are the driving force behind hedonism and the driving force behind the growing economy and technological neophilia - measures leading only to the need for more of the same; but escape keeps life alive, keeps the soul connected with the imagined world - albeit in an encapsulated fashion. A holding operation.


Perhaps this is why the modern elite (whatever they may think or do in private) publicly scorn the world of imagination and fantasy, and openly despise those who inhabit it.

This is, of course, what the system requires for perpetual growth. 


Living in the past or a fantasy - a situation in which the realest life is experienced in the imagination - is not a sufficient answer; but it is the first step in retaining a sense that life has been or could feel real, and a person might connect with life - have a relationship with reality.

Meaning and purpose are absent; there is only the here and now of imagined, immersive actuality stretching back into memory of similar times.

A state of limbo, rather than a heavenly state - but a state with much more potential than the usual alternative.

At any rate, it seems clear that their 'secret life' of fantasy - living in the past or imagination - is what some people, and not the least lucky people, may look back upon as their real life.

When the hundreds of hours of office work and passively-experienced entertainments and forced-gossip have faded, certain bright moments of imagination (pregnant, it seemed, with something greater) may yet remain.


Monday, 25 October 2010

The Border Widow's Lament


A Scottish-English Border Ballad - Anon.


My love he built me a bonny bower,
And clad it a’ wi’ lilye flour
A brawer bower ye ne’er did see,
Than my true love he built for me.

There came a man, by middle day,
He spied his sport and went away;
And brought the king that very night,
Who brake my bower, and slew my knight.

He slew my knight, to me sae dear;
He slew my knight and poin’d his gear;
My servants all for life did flee,
And left me in extremitie.

I sew’d his sheet, making my mane,
I watch’d the corpse myself alane
I watch’d his body night and day;
No living creature came that way.

I took his body on my back,
And whiles I gaed and whiles I sat;
I digged a grave, and laid him in,
And happ’d him with the sod sae green.

But think not ye my heart was sair,
When I laid the moule on his yellow hair?
O think na ye my heart was wae
When I turn’d about, away to gae?

Nae living man I’ll love again,
Since that my lovely knight was slain,
Wi’ a lock of his yellow hair,
I’ll chain my heart for evermair.



Anon was one of the best of poets, better than any now alive and writing in English it seems.

This ballad poem is almost unbearably moving, each stanza contains a superb phrase or more than one.

Like the greatest ballads, and presumably as a consequence of being honed through oral transmission, there is no padding: the story is told with extraordinary concision and lack of elaboration, assembled mostly from traditional lines, I guess.

But I wonder where that remarkable section came from "I took his body on my back,/ And whiles I gaed and whiles I sat."

It sounds horribly like personal experience.


Religious autism - a developmental disorder common among intellectuals



Religious autism is a developmental disorder - present from birth or before, which emerges clearly about the time that language develops.

Rather as ordinary autism includes a defect in 'theory of mind' - that is to say an inability mentally to model the dispositions, motivations and intentions of other people; so religious autism involves a deficiency in the ability to feel religious impulses, and consequently an inability to model the religiousness of other people.

Sufferers are superficially normal, and can even fulfill complex roles in modern society; however on closer examination they exhibit subtle behavioural problems - although that these are indeed problems may explicitly be denied.

The clearest problem is an aversion to reproduction - low fertility rates are a reliable sign of severe underlying pathology. Other less clear-cut signs of disorder include an affinity for liberal, libertarian and left-wing politics; and 'negativism' - the tendency (for no good reason) to think and do the opposite of common sense.

Critics have suggested that the overall picture is one of low-grade, chronic psychosis due to a lack of reality-testing, deficiency of insight and indifference to personal survival.

The best current hypothesis concerning a putative aetiology is along the lines of the ancient Greek proverb that "those whom the gods wish to destroy they first make mad."


General autism is sometimes conceptualized as mind-blindness, the inability to recognize the distinctiveness and reality of other minds. And this inability is caused by the inability to feel certain types of emotion. The autistic world seems to be populated by mind-less robots.

Similarly, religious autism may be conceptualized as a blindness to religious dispositions, motivations and intentions caused by the inability oneself to feel these emotions - consequently the world of the religious autistic seems to be populated by materialist animals, responding purely to basic biological motivations such as libido, hunger and status-seeking.

The sufferer from religious autism therefore simply cannot comprehend religion. Due to a presumed defect in brain structure or functioning, this aspect of universal humanity is missing from their mental map of reality.

Religious autism results in a serious distortion of understanding of social reality for sufferers - since the massive role played by religion in human affairs is invisible; and the effects of religion in individuals and society must for them be explained by other causal mechanisms (biological, psychological, economic, political, pathological etc).



However, just as some general autism sufferers exhibit compensatory gifts - such as amazing feats of memory, or artistic ability, or the performance of 'impossible' mathematical calculations - so the sufferers from religious autism often show high abilities in other areas of life.

Examples from history include many great philosophers and scientists - such as the mathematician and philosopher Bertrand Russell. Famous modern sufferers include the biologist and writer Richard Dawkins, in whom extreme abilities in comprehending and explaining science exist alongside a total inability to feel religious emotion. Philosopher Daniel Dennett is capable of brilliance in polemical prose and knock-about dialectics - but has suffered a life-long inability to take religion seriously at a gut level.

Indeed, some have claimed that the whole intellectual elite in North America and Western Europe may be suffering from a high prevalence of endemic religious autism, exacerbated (in recent years) by inbreeding.

The paradoxical effect of such a high prevalence of the disorder is that - in intellectual circles - the defect is often taken as normal, and propagated as rational.


Experts on religious autism suggest that the current situation resembles that of an aristocracy afflicted with colour-blindness who (based on their own abnormal perceptions) deny any real difference between green and blue; and teach this as fact in schools, government propaganda and the media.

Indeed, being surrounded by colour blind people operating on that basis, they eventually come to insist on colour blindness as an essential pre-requisite for membership of the ruling class - those not afflicted with this blind spot are regarded as either dumb or crazy.

By building a society which functioned on the basis that colour-blindness is true, and that blue and green are indeed the same colour; a colour-blind elite are eventually able to claim (without contradiction from social reality) that colour-blindness is normal, and that those who claim to see a colour difference between blue and green are deluded.

And this claim seems at least superficially correct, since making blue-green discrimination has no immediate benefits in a 'real world' that has been carefully constructed to exclude such distinctions.


Unfortunately, the sufferer from religious autism can never be fully normalized; it is a lifelong affliction - probably due to something missing from the brain.

However, there is hope because a sufferer can come to recognize their affliction, and can learn to live with the disorder.

Experts say that the first, and essential, step for the religious autistic is to stop being proud of their defect and instead to acknowledge that they are sick and deluded; only after such a candid admission can the necessary psychological and social adjustments be made. 

Just as a person with autism can never model the minds of others - but can learn about their deficiency, and compensate by learning psychological mechanisms or 'workarounds' - in other words over-riding subjectivity with reason; so the religious autistic can over-ride their own delusional subjectivity by the application of reason.

But the news is not all good. Some of the more hard-line religious say that this practice of using rational workarounds to 'simulate' religiousness is not acceptable - is indeed hypocritical and deceptive.

And this view is shared by some of the most deeply deluded religious autistics; who say that subjectivity is the primary reality, and it is therefore dishonest for someone who cannot feel spontaneous and powerful religiousness to attempt to 'pass' as a normal, religious person.


However, some of the most highly qualified religious experts say just the opposite; that making a sincere effort to be religious despite suffering a religious blind spot is sufficient - is indeed the whole point.

They argue that normal people should be tolerant and forgiving of religious autism, and need to recognize that it is very difficult for intellectuals in particular to over-ride their subjective convictions purely on logical and rational grounds.

After all, the experts point-out, many of the ruling elite have been trained since childhood to regard their whims and impulses as the primary reality. The existence of a real world beyond their own gut-feelings of what is important to them is an alien concept.  

But there does seem to be unanimity on one matter: that a stop-must-be-put to the all-too-common practice of religious autistics denying their illness, claiming that their pathological state is superior to normal, and working to change society to fit-around their disability!


It is one thing to suffer a defect, it is quite another matter deliberately to inflict this defect on others. 

Religious sages are quite clear on this. On the one hand religious autistics are deserving of sympathy and help - but on the other hand society must not lose sight of the fact that although religious autistics 'cannot help it', they are nonetheless wrong; and the influence of their wrongness needs to be opposed.

Any society which based-itself on the comforting autistic delusion that religion did not matter because it felt unreal, and which structured society such that religious distinctions were rendered ineffectual, would be a society doomed to self-destruction.

Kindness to religious autistics, and the understandable wish to make them feel more comfortable, must not be allowed to over-ride the absolute necessity for basing personal and social conduct upon truth.



In the meantime, awareness of the problem of religious autism is increasing.

Indeed, a few sufferers are beginning to 'come-out' in hope that the condition will begin to attract more attention and lead to more effective action.

But most religious autistics remain too self-conscious, or too worried by the social sanctions from an intellectual elite who stigmatize those who admit their deficiency. 

"After all" says one sufferer, who prefers not to be named but goes by the pseudonym of 'bgc'; "I have nothing to be ashamed of simply because I was born lacking an essential part of my brain. Naturally, I strongly regret having propagated my defect in early life, that was wrong; but as from now, all I can do is sincerely to do my best to be religious. Luckily, that seems to be enough."


Sunday, 24 October 2010

The scope and nature of epidemiology


Bruce G. Charlton. Journal of Clinical Epidemiology 1996; Vol. 49, No. 6, pp. 623-626.

Introductory comment for the memorial conference for Petr Skrabanek (1940-1994): My friendship with Petr Skrabanek was conducted entirely through the medium of the written word. I initially made contact after reading Follies and fallacies in medicine [1], and in the all too brief time before his death we exchanged letters on a regular basis - our correspondence fuelled by regular indulgence in the quaint academic habit of swapping publications. I never met the man, yet few individuals have had more influence on my intellectual development. It soon emerged that we shared a background of laboratory work in the biological sciences, a love of literature (both pursuing parallel academic activities in English studies) and an attitude of scepticism concerning the common claims of epidemiology and public health. The boldness, wit, and incisiveness of his papers gave fresh impetus and inspiration to my already established interest in group, or population, studies. The current essay will argue on methodological grounds that the abuses of epidemiology, so lucidly exposed by Petr Skrabanek, are a direct consequence of misunderstanding the scope and nature of epidemiology.



The value of epidemiology as an approach to understanding and improving health is frequently subject to exaggeration by its practitioners, and by those involved in health promotion and public health.

Partly as a consequence, epidemiology increasingly sees itself as an autonomous “scientific” discipline - with its own approach, techniques, departments, conferences, journals, and, most significantly, intellectual standards of proof. Proposals have been made in the United Kingdom for basing a comprehensive and detailed national program of preventive medicine and health promotion entirely on epidemiological evidence [2,3]. I will argue that such autonomy is impossible in principle, and the attempt to achieve it will only result in scientific artifacts and political abuses.

The standard definition of the subject describes epidemiology as the study of health in populations [4]: it can therefore be considered the study of health at the group level of analysis.

Epidemiology comprises both observational and experimental methods, and includes megatrials (those “large, simple” randomized clinical trials [5] with “pragmatic” aims [6,7]).

The methodological unity of such disparate epidemiology techniques as the survey, the case-control study, the cohort study, and the megatrial is derived from a characteristic mode of inference by induction, based on generalizing from a “sample” [8,9]. This mode of inference can be contrasted with a “scientific” mode of inference based on devising and testing causal hypotheses.

The specific impetus behind the rise of epidemiology seems to be a striving for ever increasing precision in the measurement of health states. As therapy has advanced, clinicians have come to seek quantitative rather than qualitative improvements in management.

A further, and perhaps more urgent, demand has come from those concerned with health policy, who need precise estimates for use in statistical models designed to monitor and control health service performance.

Epidemiology appears to offer a way of quantifying the magnitude of health risks and therapeutic interventions. Precision can be enhanced with a power seemingly limited only by the size of studies. Studies have grown progressively larger; and more recently there has been a fashion for aggregating trials in the process called meta-analysis [11].


The major problem that besets quantification in medicine is the large variation between patients: in therapeutic terms this translates as excessive unpredictability in prognosis. The tacit assumption that lies behind the epidemiological practice of averaging populations in order to enhance precision is that the major barrier to attaining valid biological estimates is excessive random error in measurement.

Epidemiology presupposes that the underlying nature and quantity of a variable is obscured by “noise” that can be removed by averaging and other statistical adjustments, on the basis that - given adequate numbers of instances - errors in one direction will cancel errors in the other direction.

And here lies the root of the problem. The implicit assumption that noise, or random error, is the major obstacle to biological understanding is incorrect. The major difficulty in measuring the true value of biological phenomena, and the principal cause of variation between individuals, is typically not random but systematic error [8,9].
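The contrast between random and systematic error can be sketched in a toy simulation (all numbers here are invented for illustration, not drawn from the paper): averaging ever more measurements shrinks the random component without limit, but leaves a fixed bias exactly where it was.

```python
import random

random.seed(1)

TRUE_VALUE = 10.0   # the quantity we are trying to measure (invented)
BIAS = 2.0          # a fixed systematic error present in every measurement (invented)

def measure(n):
    """Return the mean of n measurements, each with zero-mean noise plus a fixed bias."""
    readings = [TRUE_VALUE + BIAS + random.gauss(0, 3) for _ in range(n)]
    return sum(readings) / n

for n in (10, 1000, 100000):
    est = measure(n)
    print(f"n={n:6d}  estimate={est:6.2f}  error vs truth={est - TRUE_VALUE:+.2f}")

# As n grows, the estimate converges - but to TRUE_VALUE + BIAS, not TRUE_VALUE:
# precision improves indefinitely; validity does not improve at all.
```

The design point: no sample size rescues an estimate from a systematic error, which is precisely why "bigger" is not the same as "better" in the argument that follows.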

Systematic error is due to qualitative differences between either the entities being compared or the causal processes operating on them. Such qualitative differences produce the problem of bias or distortion of comparisons between instances due to the “unlikeness” of instances.

And systematic error may be difficult to deal with, because the complexity of causal interactions at the level of biological phenomena is often extremely great [12]. Even if all the relevant causes are understood, it may not be possible to control them. This difficulty is compounded in human studies by a mass of subjective factors (such as placebo effects), as well as by ethical constraints on study design.

It is this problem of intractable bias - rather than random noise - that accounts for the bulk of observed variation in medicine.

Another common error is to assume that strict randomization of large numbers of subjects is able to control all important forms of bias; yet randomization does nothing to eliminate systematic differences between subjects from the experiment, it merely distributes the systematic error equally between comparison groups [9].
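A minimal sketch of this point, using invented subjects of two qualitatively different types: random allocation balances the types between the two arms on average, but eliminates nothing - each arm still contains the full heterogeneous mixture.

```python
import random

random.seed(2)

# Invented heterogeneous pool: two qualitatively different subject types
# with systematically different outcomes.
subjects = [("type_A", 10.0)] * 50 + [("type_B", 0.0)] * 50

random.shuffle(subjects)            # randomized allocation
arm1, arm2 = subjects[:50], subjects[50:]

def mean_outcome(arm):
    return sum(value for _, value in arm) / len(arm)

# The arm means are similar: randomization has distributed the systematic
# error roughly equally between the comparison groups.
print(mean_outcome(arm1), mean_outcome(arm2))

# But neither arm is homogeneous - the systematic between-subject
# differences are still present inside each arm.
print({t for t, _ in arm1}, {t for t, _ in arm2})
```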


For averaging to have the effect of increasing precision, the instances averaged must differ only randomly so that errors will cancel [9]. In other words, instances should be qualitatively identical, or homogeneous in all respects relevant to the circumstance. If averaging is to be used to increase precision, each subject in a study should be de facto a “duplicate” of any other subject.

If, on the other hand, these assumptions do not hold, and instances within a group are heterogeneous, then averaging will not simply increase precision, but will also create an artifact - an entity that does not correspond to any individual instance and is real only at the group level of analysis [13]. An artifact is the inevitable consequence of summarizing dissimilar instances in a single statistic.
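As an arithmetic illustration (with made-up figures): pooling two qualitatively different subgroups yields a group mean that describes no individual in either subgroup.

```python
# Invented outcomes for two qualitatively different kinds of patient.
responders = [8.0, 9.0, 10.0, 9.5, 8.5]      # large benefit from treatment
non_responders = [0.0, -0.5, 0.5, 0.0, 0.0]  # essentially no benefit

pooled = responders + non_responders
mean = sum(pooled) / len(pooled)
print(f"group mean benefit = {mean:.2f}")  # prints: group mean benefit = 4.50

# No individual experienced a benefit anywhere near 4.5: the summary
# statistic is real only at the group level of analysis - an artifact
# of averaging dissimilar instances.
```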

In the basic empirical biological sciences, great effort is expended to attain control of relevant causes and ensure homogeneity between instances - for instance, by inbreeding and identical rearing of laboratory animals and by subjecting them to rigorous experimental protocols [9]. Each population or group studied is then composed of interchangeable subjects.

Averages of these “duplicate” instances are a valid way of enhancing precision in measurement because the process serves merely to reveal underlying uniformity in the data. In the human sciences, subjects within a population cannot usually be regarded as interchangeable and the use of averaged data is correspondingly beset with hazards: the poorer the control, the greater the hazard.

Yet, epidemiological methods routinely involve creating statistical summaries of instances that are heterogeneous - indeed, in some cases the heterogeneity is an inevitable by-product of the need to recruit large numbers of subjects.

For instance, the megatrial methodology requires large numbers of subjects; and criteria for entry, experimental protocols, and the outcome measures are deliberately simplified with this aim in mind.


Such deliberate simplification corresponds to deliberate reduction in experimental control, and means that bias is wittingly introduced into the experimental situation by allowing the incorporation of systematic differences between subjects and the causes operating upon them.

The averaging of heterogeneous instances produces summary statistics of populations that are artifactual from the clinical perspective of individuals. Yet the assumption is routinely made that group data are predictive of individuals.

Indeed, the practice of “evidence-based medicine” regards megatrials as the “gold standard” of guidance for the treatment of individual patients [14]. Such misapplication of group data to individual instances is sometimes called the ecological fallacy [4].

On formal methodological grounds, the estimate derived from an epidemiological study including heterogeneous subjects (such as a megatrial, or a meta-analysis of such trials) tells us nothing about the experience of individuals either within that study (internal validity) or in other circumstances (external validity).

This lack of generalizability arises because enhanced precision has been attained only at the cost of reduced validity: narrow confidence intervals around invalid estimates.

Misunderstandings of the megatrial methodology are due to the tacit conflation of systematic and random error so that a large, simple megatrial and a small, rigorously controlled study are regarded as equivalent when they share the same size of confidence interval.

But experiments with different protocols are different experiments!


The distinction between random and systematic error forms the crux of the argument that asserts that epidemiology is not a natural science. Science strives to describe the underlying structure of phenomena: the nature of entities and how they are causally interrelated [15].

But when epidemiology reduces random error at the cost of increasing systematic error, the “populations” that form the entities of epidemiological analysis become artifactual when interpreted at the level of individual instances, and the relationships between these entities are not unitary causes but incompletely controlled mixtures of causes that can be interpreted only as associations.

Science aspires to create “structured knowledge” in terms of causally linked real entities. Such scientific structures are of the nature of hypothetical models that can be tested against observation and experiment, or used to draw deductions, or make predictions.

Epidemiology, by contrast, provides a summary of a state of affairs rather than a description of an underlying structure, and therefore cannot use the same inferential procedures as science.

Instead, epidemiology makes generalizations on the basis that a specific study constitutes an estimate of the state of affairs in a larger notional population to which its results may be generalized. This style of inference is inductive and relies on the assumption of uniformity in the phenomena under investigation [8] - the larger notional population should differ only randomly from the epidemiological sample.

The paradigm of epidemiological techniques is therefore the survey.


The ideal survey is a sample containing the full range of instances of a measured variable in their proper proportions, and selected in an unbiased fashion, such that the sample is a microcosm of the larger population.

In practice, this usually implies the need for a large, randomly selected sample (i.e., random with respect to the relevant causal processes). Assuming that bias can be eliminated, the survey can then be summarized purely on the basis of its statistical parameters.

Even epidemiological experiments, such as the megatrial, can be conceptualized as surveys: a megatrial yields estimates of health states in two or more populations that differ only in having experienced different protocols.

Like other forms of inductive reasoning, the validity of epidemiological inference depends on extraneous knowledge from scientific investigation of entities and causes to ensure that populations between which estimates are generalized are similar with regard to relevant causes [8].

If they are not, and populations differ systematically (as is commonly the case with megatrials), then epidemiology depends on science for an understanding of the magnitude of expected bias. Epidemiological studies are interpretable only when performed within a framework of existing knowledge.

It should also be emphasized that because epidemiological inference is of the nature of a statistical summary, its validity is entirely derived from the validity of its constituent data [16]. Epidemiological models, in contrast to scientific models, are not analytic and do not contribute to understanding phenomena, being merely a “representation” in microcosm of a body of information.


The above analysis may be used to underpin Petr Skrabanek’s forceful criticisms of epidemiological practice [17-20].

Epidemiological techniques, which yield merely statistical summaries of states of health, should not be used as methods of exploring the determinants of human health.

Epidemiology is signified by its population level of analysis and consequent limitation to an inductive mode of inference: it is a mistake to attempt to create an autonomous subject of epidemiology from such negative qualifications.

The reification of this non-causal level of analysis to the status of a discipline has been conspicuously unsuccessful in terms of generating “reliable knowledge” of disease [21], although highly successful at generating research funding, which for many people is justification enough.

Furthermore, the problem with using epidemiology as a “gold standard” or criterion reference becomes clear. “Evidence-based medicine” (EBM) explicitly regards epidemiological techniques such as the megatrial and meta-analysis as the ultimate foundation of clinical practice and hierarchically superior in nature to evidence from the natural sciences [14] (indeed, the proponents of EBM mistakenly believe that megatrials and meta-analyses are sciences).

Medical science can manage without epidemiology, but useful epidemiology cannot be done without medical science.


If we appreciate that the investigation of biological phenomena is beset by problems of uncontrolled bias, we can understand why the common epidemiological practice of seeking small real effects amidst large systematic error reliably leads to estimates of high precision but low validity.

For instance, using big case-control studies to investigate marginal relative risks to health is a recipe for false inference [19,20]. Such studies conflate random and systematic error in their quest for a spurious notion of accurate measurement.

The level of uncontrolled and residual unadjusted bias in such studies will result in systematic differences between the case and control populations that quite overwhelm any real or imagined causal effects of modest magnitude. Large case-control studies do not measure what they purport to measure, but they describe their artifacts with exquisite exactitude.

These basic mistakes in the approach of epidemiology are compounded by the search for multiple “risk factors,” the effects of selective publication in favor of positive results, and by an attitude to public policy that takes a one-sided view of risk assessment [17].


The criteria for ascribing causation to a relationship have been progressively weakened over recent decades by mainstream epidemiologists, to the point that any correlation is readily interpreted as an element in a vague “multifactorial” web of contributing “determinants.” The need for a biologically meaningful cause is explicitly rejected, and “black box” association is vaunted as a methodological advance [20].

It is ironic that the massive attention paid to the topic of “causality” in epidemiological texts and journals has served, after all, merely to reduce the rigor with which causation is attributed [22].

Weaknesses in epidemiological methods are compounded by the frequently moralistic or political aims that drive investigation in this field. The dangers are great, given the inherent lack of ability of epidemiological methods to discriminate real causes from biased associations.

Skrabanek documented, exhaustively and with pungent wit, the innumerable ways in which epidemiology is used to give a veneer of quasi-scientific respectability to the recommendations of government officials, managers, and those who have something to sell [1,23].

This activity trades on the prestige of mathematics and statistical analysis, combined with the impressive weight of evidence provided by large databases. Merely because a research study is big, slow, and costly, and involves hard sums, does not confer validity on its conclusions.

The end result of these epidemiological abuses has been to transform risk-factor epidemiology into a highly effective, albeit expensive, mechanism for generating irrefutable health scares based on untestably vague pseudo-hypotheses.

And such is the degree of precision with which putative risk is defined that only further and larger epidemiological studies can investigate the question. The asserted autonomy of epidemiology is apparently confirmed because, once a scare has begun, epidemiological studies follow one on another, generating work for the investigators but failing to move any closer to settling the dispute, for the good reason that epidemiology is systematically incapable of resolving debates concerning causal mechanisms.


So much for the down-side of epidemiology. However, the preceding analysis also allows us to make some suggestions for improvement.

The nature of epidemiology may be defined as that activity concerned with preparing and comparing statistical summaries of health states in populations. The scope of epidemiology is dependent on the parameters within which such summaries may legitimately be applied. Legitimate inference is contingent on the prerequisites for induction being present in a given situation.

Whether or not an estimate from an epidemiological study is a valid estimate of a state of affairs in a target population is therefore conditional on the study population being a microcosm of the target population. And determination of this critical attribute of “representativeness” is (mostly) a scientific matter of understanding systematic biases, not a statistical matter of dealing with random error.

One consequence is that epidemiology should be regarded as subordinate to, and contained by, science; because knowledge of causes is essential for establishing the legitimacy of generalizing from a specific epidemiological study.

The major role for epidemiological studies is therefore to enhance precision of estimates of states of health in situations where the nature and magnitude of systematic errors are known. So representativeness is always vital.

Furthermore, if epidemiology is to be used to inform clinical practice (i.e., if summary statistics derived from populations are to be applied to individual patients) then subjects in the populations studied should also be homogeneous in terms of the relevant causal variables. In this case it will be necessary to establish both representativeness and homogeneity of groups in an epidemiological study.


What are the implications for practice? The proper way to practice epidemiology is to regard it as merely one element in the investigation of pathology: as a set of tools, not as an autonomous discipline.

Epidemiological investigation should be subsumed within larger goals of either a biological or medical kind: seen as part of a repertoire of techniques brought to bear in understanding, explaining, and intervening to ameliorate disease.

It is striking that, with only a few counter-examples, the best epidemiology has usually been done by clinical scientists primarily interested in, and knowledgeable about, specific problems of pathology, rather than by specialist “epidemiologists” whose interest is primarily in statistics and methods. Collaborations between physicians who do not understand statistics and epidemiologists who do not understand disease are just as bad, due to the lack of any cohesive critical, integrative, and guiding intelligence.

Given these considerations, the current trend for recruiting “pure” epidemiologists from those whose skills are numerical, and whose approach and methods are concerned with noise reduction rather than bias elimination, is a mistake. Collecting these epidemiologists into specialist academic groupings (departments, units, schools, etc.) and regarding them as general purpose “guns for hire” whose expertise is impartially applicable to health problems (“Give us the data, and we will tell you what it means”), only compounds the problem and diminishes the likelihood of correcting error.

We cannot expect much more than a narrowly circumscribed facility from investigators who are “hands-off” designers and analyzers of population studies, and, only as an afterthought, try to learn enough biology and medicine for the job in hand. Such specialists are more akin to a technician running a blood analyzer than to the scientists and clinicians who use the machine as a tool for understanding natural phenomena.


The ideal epidemiologist would therefore be a generalist, primarily a scientist or a clinician who has an extra realm of skill in dealing with population health. Epidemiologists who aspire to this ideal would wish to develop a profound interest in, and knowledge of, particular diseases (or health states) and their determinants. This is vital because most epidemiological studies are riddled with systematic errors.

The major problems in interpretation are not statistical (size, power, confidence intervals, etc.), but systematic (representativeness, homogeneity, degree of control, etc.). Adjusting and interpreting the results to compensate for inevitable biases is a matter for the numerate scientist, not the abiological statistician.

The ideal epidemiologist should be de facto a theoretical medical scientist, grounded in the kind of biological and clinical “common sense” that allows the possibility of working on a problem in an interdisciplinary fashion. Fragmentary evidence from a range of disciplines may need to be gathered and combined to produce plausible and testable hypothetical models.

Much of the work would involve participating in the surveying, planning, and critical evaluation of clinical and laboratory studies; but the epidemiologist’s distinctive task would be to fine-tune estimates of quantity and define the scope of their applicability, by establishing the nature and magnitude of adjustments required for generalizing study results to specific target populations.


Epidemiology does not, of itself, increase our understanding of the world, although it may increase the ability to make predictions.

Inductive inference tells us that because the sun has always risen in the morning, we may assume that it will do so tomorrow.

Furthermore, humankind has known the calendrical procession of daybreak for tens of thousands of years.

But the parameters within which we can apply such inductive knowledge can be set only by scientific knowledge of causes, and it is this understanding that enables us to unlock the hidden potential of nature [15].

Until we grasped that the solar system was heliocentric, and discovered the pathways and rotations of the heavenly bodies, we could never have predicted the sunrise on another planet.



1. Skrabanek P, McCormick J. Follies and Fallacies in Medicine. Tarragon, Glasgow, Scotland, 1989.

2. Rose G. The Strategy of Preventive Medicine. Oxford University Press, Oxford, 1992.

3. Charlton BG. A critique of Geoffrey Rose’s ‘population strategy’ for preventive medicine. J R Soc Med 1995; 88: 607-610.

4. Last JM. A dictionary of Epidemiology. Oxford University Press, New York, 1988.

5. Yusuf S, Collins R, Peto R. Why do we need some large, simple controlled trials? Stat Med 1984; 3:409-420.

6. Schwartz D, Lellouch J. Explanatory and pragmatic attitudes in therapeutic trials. J Chron Dis 1967; 20: 637-648.

7. Charlton BG. Understanding randomized controlled trials: Explanatory or pragmatic? Fam Pract 1994; 11: 243-244.

8. Van Valen LM. Why misunderstand the evolutionary half of biology? In: Conceptual Issues in Ecology (Saarinen E, ed.). Reidel, Dordrecht, Holland, 1982.

9. Charlton BG. Mega-trials: Methodological issues and implications for clinical effectiveness. J R Coll Physicians Lond 1995; 29: 96-100.

10. Charlton BG. Management of science. Lancet 1993; 342: 99-100.

11. Feinstein AR. Meta-analysis: Statistical alchemy for the 21st century. J Clin Epidemiol 1995; 48: 71-79.

12. Rosenberg A. Instrumental Biology or The Disunity of Science. University of Chicago Press, Chicago, 1994.

13. Bernard C. An Introduction to the Study of Experimental Medicine. Dover, New York, 1957. (Reprint of 1865 edition.)

14. Rosenberg W, Donald A: Evidence based medicine: An approach to clinical problem solving. Br Med J 1995; 310:1122-1126.

15. Bronowski J. Science and Human Values. Harper Colophon, New York, 1975.

16. Crick F. What Mad Pursuit: A Personal View of Scientific Discovery. Wiedenfeld and Nicolson, London, 1989.

17. Skrabanek P. Risk factor epidemiology: Science or non-science? In: Health, Lifestyle and Environment. Social Affairs Unit, London, 1991, pp. 47-56.

18. Skrabanek P. The poverty of epidemiology. Perspect Biol Med 1992; 35: 182-185.

19. Skrabanek P. The epidemiology of errors. Lancet 1993; 342: 1502.

20. Skrabanek P. The emptiness of the black box. Epidemiology 1994; 5: 553-555.

21. Ziman J. Reliable Knowledge: An Exploration of the Grounds for Belief in Science. Cambridge University Press, Cambridge, 1978.

22. Charlton BG. Attribution of causation in epidemiology: Chain or mosaic? J Clin Epidemiol 1995; 39: 146-149.

23. Skrabanek P. The death of humane medicine and the rise of coercive healthism. Social Affairs Unit, London, 1994.


Saturday, 23 October 2010

My alpha MSH RIA study - The biggest waste of time of my life?...


Arguably, the work leading to the paper published here:

and referenced:

Charlton BG, Ferrier IN, Gibson AM, Biggins JA, Leake A, Wright C, Edwardson JA. A preliminary study of plasma alpha MSH concentrations in depressed patients and normal subjects. Biological Psychiatry. 1987 Oct;22(10):1276-9.

was my biggest waste of time - at least in my early career (in the days when I had a career).


It was a huge waste of time because the alpha MSH study was something I intended to 'knock off' in six weeks, but it ended up consuming 18 months (although not full-time). It led to a brief-report paper (just a couple of pages) with a largely negative result that has received (I think) one citation in 23 years...


But I did have an interesting experience as a result - on the nature of scientific 'evidence' and the effect of  thresholds of skepticism.

I need to be careful not to tell this story in a self-glorifying way, since here I was a skeptic, and (as is usual, over the long term) skepticism was justified - yet of course the essence of science is not being skeptical as such (which would lead nowhere), but being skeptical-enough.


The whole thing hinged on detecting a little peptide hormone called alpha MSH in human blood, and the question was firstly whether or not this hormone was actually present in human blood, and secondly if it was present whether there was a circadian pattern of variation (daily rise and fall).

The MSH was detected using a radio-immunoassay (RIA), and the whole thing hinged on the bottom limit of detection in the assay - since the levels reported in the previous literature were right at the lower end of assay sensitivity. (This is a normal situation in science: the methods are only just good enough for whatever is being attempted, because the easy stuff in science has already been done and what remains is at the limit of do-ability.)

Measuring MSH in a blood sample involved using the assay to create a standard curve: known amounts of MSH were added to buffer and measured with the assay, which detected amounts of radioactivity. By measuring radioactivity in the blood samples, the amount of MSH could then be 'read off' the standard curve by interpolating the unknown level into the curve.
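As a minimal sketch of this 'reading off' procedure - the calibration numbers here are invented for illustration, not data from the study:

```python
# Sketch of reading an unknown concentration off a standard curve by
# simple linear interpolation between calibration points.
# All numbers are hypothetical illustrations, not data from the study.
import numpy as np

# Known MSH concentrations added to buffer (pg/ml) and the radioactivity
# counts each produced (in a competitive RIA, counts fall as the amount
# of unlabelled hormone rises).
standard_conc   = np.array([0.0, 5.0, 10.0, 20.0, 40.0, 80.0])
standard_counts = np.array([9000, 8200, 7100, 5600, 3900, 2500])

def read_off(sample_counts):
    """Interpolate an unknown sample's concentration from its counts.

    np.interp requires the x-values to be increasing, so the (decreasing)
    counts axis is reversed before interpolating."""
    return float(np.interp(sample_counts,
                           standard_counts[::-1],
                           standard_conc[::-1]))

conc = read_off(6300)   # counts from a hypothetical unknown plasma sample
```

Interpolation of this kind is only as trustworthy as the curve itself, which is why everything hinged on the behaviour of the curve near zero.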

But the standard curve has a bottom limit of detection, and could only yield results from the point where a sample containing a certain amount of added MSH gave a radioactivity count significantly higher than adding zero MSH. Everything hinged on distinguishing zero from the point at which the MSH became above-zero.

The problem is that the assay should not be too large - or it would not work (big RIAs did not work - I never really knew why), and also a large assay would use up the blood samples and antibody - both of which were essentially irreplaceable.

To get the best results the assay should be as large as necessary to generate reliable results, but no larger.


What this boiled down to, was how many 'replicates' (identical repetitions of the assay - how many test-tubes of reagents) should be made for each measurement.

Originally the assays had been done by measuring a single sample for each point on the standard curve and for each unknown sample.

Then people began averaging duplicate samples and using the operational criterion that different concentrations would count as different if the duplicate measurements did not overlap. The bottom limit of detection was then the lowest concentration of MSH on the standard curve whose duplicates did not overlap with the duplicates for zero.

But duplicates doubled the amount of material consumed by each experiment!

However, I found that duplicates could be fairly widely spaced - there was a fair bit of 'noise' in the assay. So I began averaging triplicates to get a better estimate. This, however, meant that the amount of precious material consumed by each measurement went up by another fifty percent!

Then I became concerned specifically about the detectability threshold, so I used sextuplicates (six measures) for the zero point, and calculated a mean and standard deviation on these six - the threshold of the assay was then set at the mean of the zero-point radioactivity measurements plus two standard deviations (i.e. any sample registering higher radioactivity than this threshold counted as having alpha MSH present at above-zero levels).
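The mean-plus-two-standard-deviations threshold can be sketched in a few lines - the counts below are invented for illustration:

```python
# Sketch of the detectability threshold: six replicate zero-MSH tubes,
# with the cut-off set at mean + 2 sample standard deviations.
# The counts are invented, not data from the study.
import statistics

zero_counts = [8950, 9100, 8870, 9040, 8990, 9150]  # six zero replicates

mean = statistics.mean(zero_counts)
sd = statistics.stdev(zero_counts)   # sample standard deviation (n - 1)
threshold = mean + 2 * sd

def msh_detected(sample_counts):
    """True if a sample's counts exceed the zero threshold, i.e. it
    counts as having alpha MSH present at above-zero levels."""
    return sample_counts > threshold
```

Note how sensitive the verdict is to the choices embedded here: the number of replicates, the use of the sample rather than population standard deviation, and the conventional (but arbitrary) factor of two.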


While sorting this out, months were passing in the world!

The assay was temperamental, and often did not work at all for reasons which were seldom really clear - but this often happens with biological systems.

Eventually I was able to perform a couple of satisfactory assays which showed (confirmed previous reports, really) that alpha MSH was indeed present in human plasma at just above the limit of detectability - but that it did not vary in any interesting way - at least not that I could detect.

Plasma MSH was, I thought, probably doing nothing functional; just overflowing from MSH produced in the pituitary gland perhaps.

But it was not, by any stretch, an interesting result.


The reason I mention this is to show that science can depend upon extraordinarily tiny decisions - like how many replicates of a measurement can be made.

And that there is no 'right answer' to these questions - just degrees of tolerance, or of skepticism perhaps, or differences in your prior hypotheses - but it is a very personal thing.

Or rather it is a mixture of personal (including differences in personal truthfulness - because in these areas of uncertainty slight differences in the degree of personal honesty or scrupulosity can lead to big differences in apparent-outcome); and also the social - because it was essentially a consensus within the group of people engaged in measuring MSH and similar hormones that led to the practice of how many replicates were necessary.

If you went too far from the consensus, the work would not get published - or if it did get published it would be ignored (which usually happened anyway). 

So within the field there were those who were building elaborate theories of function and disease on the basis of detecting consistent patterns of MSH variation in human plasma; there were those like me who said that MSH was present but didn't seem to be doing anything much; and those who said that MSH was either not present at all - or present at such low concentrations as to have no effect - and that the apparent detection was due to technical imperfections (such as cross-reactivity between antibodies and a variety of hormones).

These were big, and potentially important, differences in conclusions - and at root they hinged on subjective decisions about how many replicates were necessary or desirable, and how to handle the construction of standard curves.


NB: As an interesting further point - I would always, on principle, draw standard curves by hand - doing the curve-fitting by eye rather than using the 'least squares' statistical method of line drawing, which is how curves were generated by statistical programs.

I saw no reason why the least-squares statistic was intrinsically valid - to use it was itself an arbitrary and subjective decision; I preferred to use subjective judgment with respect to line-fitting on each specific curve, rather than making a broad-brush decision in favour of a standardized statistical technique.

I think I was right - and this is after all how scientists proceeded during the golden age - but nobody would be able to do this nowadays. Statistical conventions rule, and the subject is not open to debate.
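For comparison, the 'least squares' line-fitting that the statistical programs applied can be written out in a few lines: for calibration points (x, y), the straight line y = a + b*x minimising the sum of squared residuals has the closed-form solution below. The calibration points are invented for illustration.

```python
# Closed-form least-squares fit of a straight line y = a + b*x.
# Calibration points are invented, not data from the study.

def least_squares_line(xs, ys):
    """Return intercept a and slope b of the least-squares line."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Concentrations (pg/ml) against radioactivity counts: the fit treats
# every point identically, whereas a hand-drawn curve can weight or
# discount individual points at the fitter's discretion.
a, b = least_squares_line([0, 5, 10, 20], [9000, 8200, 7100, 5600])
```

The formula makes the contrast concrete: least squares is one fixed rule applied blindly to every curve, while fitting by eye is a fresh judgment for each one - which is precisely the choice at issue.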


Living with a biography


It was not always thus - but in recent decades (since about the mid 1980s, perhaps) I have often 'lived with' a particular biography or autobiography for periods of weeks or months - dipping in frequently, and trying to get at the heart of the human subject of the book.

These books (which I am about to list) were not necessarily ones which I would (wearing a critical hat) regard as exceptionally well written, nor would I necessarily recommend them, and sometimes I found them rather disappointing (the subject was generally more important to me than the style) - nonetheless these are indeed biographies with which I spent a lot of time.

The following is incomplete - but in broadly chronological order, or rather the period of my life (some weeks or months) dominated by the book:


Lucky Poet by Hugh MacDiarmid

JRR Tolkien and The Inklings by Humphrey Carpenter

MacDiarmid: a critical biography by Alan Bold

Robert Graves by Martin Seymour Smith

Genius: the life and science of Richard Feynman by James Gleick

Emerson: the Mind on Fire, by RD Richardson (Ralph Waldo Emerson).

The Flowering of New England by Van Wyck Brooks (group biography of early 19th century)

Robert Frost by Jay Parini

Joseph Campbell by Steven and Robin Larsen

Autobiography by John Cowper Powys

Charles Williams by Alice Mary Hadfield

Father Seraphim Rose by Heiromonk Damascene


For example, the Emerson biography by Robert D Richardson is associated with a long period around 1996-8 - I was continually going back and re-reading certain parts, until my copy had fallen into pieces. I was then more easily able to carry pieces of the book to read in bed, the garden or cafes (it is, intact, a very thick book).

My interest in this, as in others listed, was that Emerson seemed (at the time, not now) to have lived a kind of life (not specifically his life - but in some specific respects) which I wanted (in some way) to live - and I think I was trying to learn from this. In fact my motivations were not clear to me, then or since, but anyway I kept returning to the book and trying to puzzle-out something.

Or - and this would apply to the next book, The Flowering of New England - I returned to parts of these books because they 'cast a spell' on me. VW Brooks's prose is incantatory, and evokes a delicious (to me) idyllic quality.

There was, here and elsewhere, an element of day-dreaming, wish fulfillment and escape. The book was working as a technology, or a magical device, to create an alternative world in my mind.


Of course, most people get this from novels - which are, after all, designed to do it. But novels don't usually work for me; at least seldom since my late twenties (for example, the novels of Halldor Laxness did this for me, for a while circa 1999-2001, after visiting Iceland).

So biographies have been, for better or worse, a linking thread of life over the past 25 years.

It may have been for worse - in so far as they were a distraction (on the one hand) yet (on the other hand) apparently held out a (slender) promise of ultimate worldly gratification.

The idea that there had been satisfactory lives, and that these might perhaps be emulated - at least in their satisfactoriness - was a delusion, ultimately. 


NB: It is probably not relevant - but I have myself written a (very) mini-biography: a chapter-length account of the life of the writer/painter Alasdair Gray - published in The Arts of Alasdair Gray, 1991 - based on a few months of proper archival research in diaries, letters etc.