Friday, 28 May 2021

Life in the pandemic XXVI: Words and the “death” of postmodernism

I have led a fairly sheltered intellectual and academic existence, just one of the many advantages of working on the science side of a modern university campus. Modern universities don’t really operate as universities, of course. Ideally a university should be a community of scholars with cross-fertilization of ideas across a wide range of disciplines and outlooks. The idea is that even very different disciplines can enlighten and stimulate each other. I can’t be the only scientist to whom good ideas have come while sitting in a seminar whose topic is light years away from some current piece of gristle I’ve been chewing on. However, someone once quipped that academia is the business of getting to know more and more about less and less. On this logic, professors know everything about nothing. Would it be remiss of me to point out that I’m a mere Reader? But it is a fact that we tend to hunker down in ever tighter intellectual cliques and tribes as time and careers progress. Eventually the cell and molecular biologists rarely see those who work on the behaviour of whole organisms, never encounter those (still within the scientific family) who reside in the departments of the physical (as opposed to biological) sciences, and are barely aware of those mythical creatures across the road (actually usually across several roads) who deal in words or thought, sound or pictures. That said, such isolation does have its advantages.

Most of us in the scientific world are probably best described as “modern” in the way we go about our task. This doesn’t sound too bad until you understand that since the 1960s or thereabouts, “modernism” has been seen as dangerous tomfoolery by many of our more arty colleagues, who generally consider themselves post-modernists. Modernism is that post-Enlightenment mode of thinking that elevates human reason as the key tool for obtaining objective knowledge about the world around us, providing a sure way for humanity to progress. It has been both powered and validated by the apparent success of science and technology. However, it has always had its critics. Romanticism in the late 18th and early 19th centuries was an early harbinger of trouble ahead. While the power and success of science seemed hard to deny, the materialism that usually accompanied modernity (and it was sometimes a radical materialism) seemed to leave something important out of the account. And the kind of progress science and technology generated wasn’t always perceived as an unalloyed good. The same industrialisation that provided economic progress for many spawned dark satanic mills for some. Diseases may have been conquered, but poverty killed thousands. And even scientific endeavour had some ugly pseudoscientific offspring in the form of movements like social Darwinism and eugenics.

Bubbling away under the surface were the intellectual forces that eventually led to the “postmodernism” that emerged in the 1960s, sweeping all before it. Or at least it appeared to. Defining postmodernism is a bit like trying to eat soup with a fork; it’s an enterprise doomed to failure. But definitions abound. Britannica defines it as “a late 20th-century movement characterized by broad scepticism, subjectivism, or relativism; a general suspicion of reason; and an acute sensitivity to the role of ideology in asserting and maintaining political and economic power.” Postmodernism came to be seen as a broad attack on the kind of reason and reasoning that we thought we depended upon in science, and even on the idea that words carry meaning and allow sensible discourse about a world “out there”. There was a specifically scientific manifestation of postmodernism in the form of Kuhn’s famous book “The Structure of Scientific Revolutions” (discussed briefly here). This sought to reduce progress in science, in which a new theory or approach displaces an old one, to a type of “conversion” experience; scientific “progress” (so Kuhn’s critics claimed) was being reduced to a series of almost irrational leaps. Not that most of us scientists were that bothered, you understand. Much of this “revolution” passed us by in our isolation from such intellectual fashions.

Perhaps it was because in principle we have to deal with reality as it is (or at least as we perceive it to be). All scientists are in some sense “realists” – there is a real external world, independent of my ideas and feelings about it, that can be prodded and poked. The methods that had stood us in good stead for a couple of centuries seemed still, indeed seem still, to serve us well. So we left our colleagues in the humanities and social sciences to argue the toss over who was oppressing whom by this or that word or sentence, continued to prod and poke, wrote up and published our results, refined and refuted, and generally just got on with things. Admittedly, neither we nor our students thought as hard as we should have done about the thinking we were actually doing (something I lamented here). But, as the pandemic has demonstrated, it’s probably just as well that we did “just get on with it”. Some of the most powerful tools that have led to effective vaccines being delivered in record time stem from just quietly beavering away. And perhaps that’s why, particularly in the pandemic, postmodernism appears to be in big trouble. At least in its more extreme forms it has been unmasked as a diversion, an entertainment and an indulgence that can’t cope with hard realities. The science that is now saving lives has turned out to be more important than academic word games.

Personally, while I am not a complete fan of modernism (reason has always had its limits), some of postmodernism’s contentions have always seemed ridiculous to me. There is a whole strand that prizes obscure language and then seeks to claim that reason must always be subverted by slippery communication with mixed motives. Words cannot be trusted to accurately convey meaning; they are inevitably ambiguous. The problem is that the proponents of these views apparently thought this only applied to other people’s words; their own words were to be taken at face value. This has to be a sort of self-refuting proposition. And it gets worse. It was the postmoderns’ deliberately obscure and convoluted language that turned out to be easily subverted and exploited by parody.

Famously, the physicist Alan Sokal composed a nonsense paper and submitted it to a prominent academic journal (Social Text). The paper went through the normal (rigorous?) review processes of the journal and was accepted for publication in a revised form. It was, in Sokal’s words, “brimming with absurdities and blatant non sequiturs”, but was actually published in a special edition of the journal. The aftermath of the hoax, and the debate which followed, are detailed by Sokal and Bricmont in their book “Intellectual Impostures”. This was not a one-off. In 2018 essentially the same thing was done on a much larger scale. Twenty fake papers were submitted to a number of prominent academic journals, bastions of postmodern thought in various forms. Of the 20 papers, seven were accepted for publication, and most of the others might well have been had not the perpetrators called time on their hoax. Only six of the twenty were thrown out. This was a field in trouble.

It turns out the trouble may have been terminal. Having almost missed the “death” of new atheism, I may actually have missed the death of postmodernism. Before some of us had even begun to grapple with it at our end of the campus, Alan Kirby was writing in “Philosophy Now” that we all really should be post-postmodernists. That was back in 2006. It seems that words do convey meaning, and reason is reasonable again. Some of us never thought anything different.

Saturday, 15 May 2021

Life in the pandemic XXV: The touching faith of atheists…

Atheism, in its various forms, has a very old and in some quarters a cherished history. It’s a history that many modern-day atheists seem to be ignorant of, something I discussed a while ago. As you may have gathered, I am not an atheist. But I’m interested in the views of folk who are. I admit that this is partly out of curiosity. As the views and ideas of most atheists (at least the ones who have thought about it) are different to my way of thinking, it’s hardly surprising that they evoke curiosity. There’s also the possibility that there is something fundamental they’ve noticed that I’ve missed. And I suppose the writer of Ecclesiastes could have been wrong; something “new under the sun” could crop up that finally demonstrates, once and for all, that there can be no God. This seems unlikely (although I would say that), but for the sake of friendly interaction I’m prepared to accept this as a logical possibility.

It was in this spirit that I was interested to read an atheist writing about atheism. John Gray’s “Seven Types of Atheism” is readable, entertaining and short (only 150-odd pages in my 2019 Penguin paperback). I don’t suppose all atheists will agree with either his classification or his analysis, but neither do I think anyone will accuse him of rampant misrepresentation. In particular, he in no way writes as a theist critic. He remains quite content with his own atheist position, which he identifies as being closest to a couple of the categories he describes. It is worth noting at the outset that there is a close resemblance between what Gray writes and the thrust of Tom Holland’s “Dominion” (discussed briefly here). It is terrifically hard to drive out the intellectual and cultural effects of 2000 years of Christian monotheism (and before that Jewish monotheism) and start thinking from (or to) a genuinely different position. It is a big task to find new concepts not dependent on the same foundations as the repudiated system, even if such a thing is possible. This was something that Nietzsche cottoned on to, but apparently not so many others before or since. In his early chapters Gray insists that this leads to a sort of lazy atheism that essentially maintains categories that actually need God, while simply swapping Him for someone or something else. Gray accuses secular humanists of doing this, swapping God for humanity, and then not noticing that the resulting system doesn’t work. Apart from anything else, Gray thinks that this is doomed to fail because humanity doesn’t exist as a single, functional entity; it is a myth inherited from monotheism: “’Humanity’ is not going to turn itself into God, because ‘humanity’ does not exist”. His point is that all we really see is lots of individual human beings with “intractable enmities and divisions”, not a single organism capable of fulfilling God’s role.

But time and again Gray also throws up interesting little insights into the sayings and doings of important atheist thinkers. Many of them seem to be stark examples of what is alluded to in a quotation often attributed to G.K. Chesterton: “When men choose not to believe in God, they do not thereafter believe in nothing. They then become capable of believing in anything.” For example, Gray calls Henry Sidgwick “one of the greatest 19th century minds”. But having lost his faith, he hoped science would supply him with the meaning he now felt he lacked. Bizarrely, he eventually turned to psychical research, and Gray quotes him as telling a friend later in life, “As I look back …. I see little but wasted hours”. Nietzsche was prepared to put his faith in a few exceptional human beings, “supermen” who could “will into being the meaning God had once secured”. Gray’s main point is that even arguing that the redemption of humanity by such “supermen” was required or could be accomplished demonstrated that Nietzsche continued to be held captive by the Christian concepts he so deeply despised and had declared dead. But it’s been a while now since Nietzsche’s scheme. No sign of his “supermen”.

Gray is also fairly severe on the idea of the inevitable human progress so beloved of many scientifically minded atheists over the last couple of centuries. This appears to be one of their supreme acts of faith. But as he points out, no-one can really agree what constitutes progress or what it might mean in the future. And there is precious little evidence of overall net progress for the mass of humanity. You might think that this surely goes too far. After all, in technology, hasn’t the invention and growth of the internet brought tremendous benefits? I can sit on my sofa and book my next holiday or order my dinner. I can find the answer (or at least an answer) to almost any question using my smartphone. But then this same technology has brought new problems and crises not conceived of previously, like the rise of social media persecution (which has already cost lives) and the cyber world as a new venue for crime and warfare. But in medicine, haven’t we eradicated some of humanity’s most serious diseases? The obvious retort is yes, but oh the irony. Here we are in a global pandemic in which the old scourges have been replaced by a new one, with more around the corner aided and abetted by modern human behaviour. Faith in the progress of humanity (even if you think “it” exists) is touching, but hardly evidence-based!

Gray assembles a bewildering cast of characters with no interest in the God of the Bible, and often resolutely dedicated to denigrating and disproving Christianity as anything more than a fable, and quite possibly a dangerous fable at that. Some were aggressive in their denunciations, some more muted and less evangelical. Many, I suspect, would be bemused by Christianity’s continuing ability to attract adherents, and its continuing ability to play any role in thought and intellectual discourse.

Gray quotes Schopenhauer as writing in 1851: “A religion which has at its foundation a single event …. has so feeble a foundation that it cannot possibly survive.” Such faith. Touching. But sorry, Arthur, misplaced.