I have led a fairly sheltered intellectual and academic existence, just one of many advantages of working on the science side of a modern university campus. Modern universities don’t really operate as universities, of course. Ideally a university should be a community of scholars with cross-fertilisation of ideas across a wide range of disciplines and outlooks. The idea is that even very different disciplines can enlighten and stimulate each other. I can’t be the only scientist to whom good ideas have come while sitting in a seminar whose topic is light years away from some current piece of gristle I’ve been chewing on. However, someone once quipped that academia is the business of getting to know more and more about less and less. On this logic, professors know everything about nothing. Would it be remiss of me to point out that I’m a mere Reader? But it is a fact that we tend to hunker down in ever tighter intellectual cliques and tribes as time and careers progress. Eventually the cell and molecular biologists rarely see those who work on the behaviour of whole organisms, never encounter those (still within the scientific family) who reside in the departments of the physical (as opposed to biological) sciences, and are barely aware of those mythical creatures across the road (actually usually across several roads) who deal in words or thought, sound or pictures. That said, such isolation does have its advantages.
Most of us in the scientific world are probably best described as “modern” in the way we go about our task. This doesn’t sound too bad until you understand that, since the 1960s or thereabouts, “modernism” has been seen as dangerous tomfoolery by many of our more arty colleagues, who generally consider themselves postmodernists. Modernism is that post-Enlightenment mode of thinking that elevates human reason as the key tool for obtaining objective knowledge about the world around us, providing a sure way for humanity to progress. It has been both powered and validated by the apparent success of science and technology. However, it has always had its critics. Romanticism in the late 18th and early 19th centuries was an early harbinger of trouble ahead. While the power and success of science seemed hard to deny, the materialism that usually accompanied modernity (and it was sometimes a radical materialism) seemed to leave something important out of the account. And the kind of progress science and technology generated wasn’t always perceived as an unalloyed good. The same industrialisation that provided economic progress for many spawned dark satanic mills for some. Diseases may have been conquered, but poverty killed thousands. And even scientific endeavour had some ugly pseudoscientific offspring in the form of movements like social Darwinism and eugenics.
Bubbling away under the surface were the intellectual forces that eventually led to the “postmodernism” that emerged in the 1960s, sweeping all before it. Or at least it appeared to. Defining postmodernism is a bit like trying to eat soup with a fork; it’s an enterprise doomed to failure. But definitions abound. Britannica defines it as “a late 20th-century movement characterized by broad scepticism, subjectivism, or relativism; a general suspicion of reason; and an acute sensitivity to the role of ideology in asserting and maintaining political and economic power.” Postmodernism came to be seen as a broad attack on the kind of reason and reasoning that we thought we depended upon in science, and even on the idea that words carry meaning and allow sensible discourse about a world “out there”. There was a specifically scientific manifestation of postmodernism in the form of Kuhn’s famous book “The Structure of Scientific Revolutions” (discussed briefly here). This sought to reduce progress in science, in which a new theory or approach displaces an old one, to a type of “conversion” experience; scientific “progress” (so Kuhn’s critics claimed) was being reduced to a series of almost irrational leaps. Not that most of us scientists were that bothered, you understand. Much of this “revolution” passed us by in our isolation from such intellectual fashions.
Perhaps it was because in principle we have to deal with reality as it is (or at least as we perceive it to be). All scientists are in some sense “realists”: there is a real external world, independent of my ideas and feelings about it, that can be prodded and poked. The methods that had stood us in good stead for a couple of centuries seemed still, indeed seem still, to serve us well. So we left our colleagues in the humanities and social sciences to argue the toss over who was oppressing whom by this or that word or sentence, continued to prod and poke, wrote up and published our results, refined and refuted, and generally just got on with things. Admittedly, neither we nor our students thought as hard as we should have done about the thinking we were actually doing (something I lamented here). But, as the pandemic has demonstrated, it’s probably just as well that we did “just get on with it”. Some of the most powerful tools that have led to effective vaccines being delivered in record time stem from just quietly beavering away. And perhaps that’s why, particularly in the pandemic, postmodernism appears to be in big trouble. At least in its more extreme forms, it has been unmasked as a diversion, an entertainment and an indulgence that can’t cope with hard realities. The science that is now saving lives has turned out to be more important than academic word games.
Personally, while I have never been a complete fan of modernism (reason has always had its limits), some of postmodernism’s contentions have always seemed ridiculous to me. There is a whole strand that prizes obscure language and then seeks to claim that reason must always be subverted by slippery communication with mixed motives. Words cannot be trusted to accurately convey meaning; they are inevitably ambiguous. The problem is that the proponents of these views apparently thought this only applied to other people’s words; their own words were to be taken at face value. That has to be a sort of self-refuting proposition. And it gets worse. It was the postmodernists’ deliberately obscure and convoluted language that turned out to be easily subverted and exploited by parody.
Famously, the physicist Alan Sokal composed a nonsense paper and submitted it to a prominent academic journal (Social Text). The paper went through the normal (rigorous?) review processes of the journal and was accepted for publication in a revised form. It was, in Sokal’s words, “brimming with absurdities and blatant non sequiturs”, yet it was actually published in a special edition of the journal. The aftermath of the hoax, and the debate which followed, are detailed by Sokal and Bricmont in their book “Intellectual Impostures”. This was not a one-off. In 2018 essentially the same thing was done on a much larger scale. Twenty fake papers were submitted to a number of prominent academic journals, bastions of postmodern thought in various forms. Of the twenty papers, seven were accepted for publication, and most of the others might well have been had the perpetrators not called time on their hoax. Only six of the twenty were thrown out. This was a field in trouble.
It turns out the trouble may have been terminal. Having almost missed the “death” of new atheism, I may actually have missed the death of postmodernism. Before some of us had even begun to grapple with it at our end of the campus, Alan Kirby was writing in “Philosophy Now” that we all really should be post-postmodernists. That was back in 2006. It seems that words do convey meaning, and reason is reasonable again. Some of us never thought anything different.