
Sunday, 2 November 2025

Barely conscious(ness)

You may not have been conscious of it, but big arguments have been swirling around the issue of consciousness (for the example that prompted this post see here). Science had, then didn’t have, and now again has big problems with consciousness. You know an argument is in trouble when its starting point is the denial of probably the one thing we are all aware of (at least when we are awake and healthy) – our consciousness. Descartes was so sure of his that he based his philosophy on it (the famous “I think [doubt] therefore I am”). But you might be surprised to learn that for a good chunk of the 20th century, a good chunk of scientists were convinced that either it did not exist or, if it did, it didn’t “do” anything. They were the behaviourists, represented by B. F. Skinner (he of the infamous box). Consciousness was “nothing but” (ie reducible to) behaviour (by which they primarily meant movement) or propensities to behave. Don’t feel any need to understand any of this (a notion which clearly assumes some degree of consciousness on your part!), because such views didn’t last long into the second half of the twentieth century. Behaviourist schemes clearly didn’t work, and the starting point was in any case fatally flawed.

But historically there had been an ongoing struggle to accommodate subjective, first-person, mental states (consciousness) within a thoroughly empirical (scientific) approach to our understanding of ourselves. Those devoted to the nineteenth-century theory/myth of the conflict between science (good) and theology (bad) didn’t want to provide any space for the immaterial (whether soul or mind – for current purposes assume that both words name the same thing). But not having a satisfying material explanation for what we are all most aware of was a bit of a problem. If claiming that things like mental states did not exist was not viable, what to do? Well, assuming there was a thoroughly material explanation for our private interior self (potentially another fatally flawed assumption), given the powerful new tools of neuroscience such phenomena had to be explicable in terms of what was going on in the brain (that clearly material lump of stuff inside our heads). So there arose an empirical subdiscipline within neuroscience, that of “consciousness studies”.

Writing ten years ago in the inaugural editorial of the journal “Neuroscience of Consciousness”, the editors credited a 1990 paper by Crick and Koch as marking “the rebirth of consciousness science as a serious exercise” (Seth et al. 2015; Crick and Koch 1990). The publication of the new journal reflected “the maturity of this rigorous and empirically grounded approach to the science of subjective experience”. While they themselves made no claim that this was necessarily the only available approach to subjective experience, such a claim had already appeared in Crick’s book, published two decades earlier (Crick 1994). Crick and Koch claimed in their paper that “Everyone has a rough idea of what is meant by consciousness” and avoided a “precise definition”. This, along with other knotty issues, was left to one side “otherwise much time can be frittered away in fruitless argument” – implicitly a criticism of what had gone before. Philosophy (and certainly theology) had had its day. It was now over to science to explain the previously inexplicable, even consciousness (see Chemero and Silberstein 2008). This particular body swerve would prove to be costly.

Now, thirty-five years on from the “rebirth of consciousness science”, where stands the project that had reached “maturity” ten years ago? Francken and colleagues recently published the results of a survey of consciousness researchers who attended two consecutive annual meetings of the Association for the Scientific Study of Consciousness (established in 1994 and later the sponsor of Neuroscience of Consciousness) to investigate “the theoretical and methodological foundations, common assumptions, and the current state of the field of consciousness research” (Francken et al. 2022). Among their findings were “a lack of consensus regarding the definition and most promising theory of consciousness” and “that many views and opinions currently coexist in the consciousness community. Moreover, individual respondents appear to hold views that are not always completely consistent from a theoretical point of view”. Lest it be felt that this is a rather slim basis on which to form a view as to the current state of the field, Seth and Bayne (2022) reported in a recent extensive review that “in the case of consciousness, it is unclear how current theories relate to each other, or whether they can be empirically distinguished”. They recommended “the iterative development, testing and comparison of theories of consciousness”. Francken et al. (2022) used ten different theoretical constructs in their survey; Seth and Bayne (2022) identified a “selection” of twenty-two “theories of consciousness” (see their Table 1), which they grouped into four broad categories; and Kuhn (limiting himself to “materialism” theories) identified fourteen neurobiological theories, to which he added lists of philosophical (12), electromagnetic (7) and computational/informational (4) theories (Kuhn 2024). Confused? Well, it turns out the field of consciousness studies is.

An attempt to follow Seth and Bayne’s advice, using a “large-scale adversarial collaboration” to experimentally compare predictions made by two of the major competing theories of consciousness (“global neuronal workspace theory”, GNWT, vs “integrated information theory”, IIT), recently reported results in Nature (Ferrante et al. 2025; see also the accompanying Nature Editorial). The evidence that emerged partially supported and partially challenged both theories. However, the aftermath is more revealing. In response to the preprint and media coverage of the paper (the actual Nature paper was submitted for publication in June 2023, accepted in March 2025 and published in April 2025), a long list of researchers (including recognised leaders in neuroscience) put their names to an open letter on the PsyArXiv preprint server condemning the exercise as flawed, calling IIT “pseudoscience” and objecting to its characterisation as a leading candidate theory for explaining consciousness at all (Fleming et al. 2023). Proponents of GNWT also called into question the discussion of the results and the conclusions drawn (Naccache et al. 2025). All of this suggests that what flowed from Crick and Koch’s avoidance of a definition of consciousness was basic conceptual confusion. Many had claimed that this was the problem at the time; it was precisely the charge made against the field by the philosopher Peter Hacker not long after its “rebirth” (Bennett and Hacker 2003, 239–44; see also Hacker 2012). Nobody is sure what it is they’re talking about, and even those who do claim to know what they mean usually agree that they have no way of measuring the “it” they are clear about. So the next time you read a headline about “understanding” consciousness, just be aware – we don’t.

It’s not just the specific scientific sub-field of consciousness research that appears to have problems and confusions. Concerns have emerged from within the wider materialist camp. Some more history is in order. The philosopher Thomas Nagel is perhaps best known for his classic paper “What is it like to be a bat?”; with regard to the problem of consciousness, the philosopher Patricia Churchland called this paper a “watershed articulation” (Nagel 1974; Churchland 1996). The problem Nagel drew attention to was the one left by the demise of the behaviourists: the “subjective character of experience” (the what-is-it-like-to-be-ness) was not captured “by any of the familiar, recently devised reductive analyses of the mental”. Materialist accounts of thinking people left something vital out. So he suggested that new ways of studying the subjective and the mental were needed, a call only partially answered by the subsequent development of consciousness studies described above.

But that was then; what about now? Advances in neuroscience have definitely occurred. With all that we know now (all those lovely coloured brain scans, snapshots of what goes on while people think), surely a thoroughly materialist account of us, which leaves the concept of the immaterial (be it mind or soul) lying redundant in its wake, is possible? Or at least, given such progress, we should be in a position to see clearly how in principle it might be possible. Writing in 2012, Nagel was, if anything, more concerned than he had been in the ’70s. Consciousness remained one of the major sticking points causing his concern: “The fundamental elements and laws of physics and chemistry have been inferred to explain the behaviours of the inanimate world. Something more is needed to explain how there can be conscious, thinking creatures...” (Nagel 2012, 20). And yet his concerns went beyond the existence of (as yet unexplained) consciousness to the wider materialist project: “The inadequacies of the naturalistic and reductionist world picture seem to me to be real” (Nagel 2012, 22). He did not find theism (the “polar opposite” of materialism) “any more credible than materialism as a comprehensive world view”, but he struggled to imagine naturalistic accounts able to accommodate previously excluded elements like consciousness (or purpose, belief, love and the like). He concluded by accepting as conceivable that “the truth is beyond our reach, in virtue of our intrinsic cognitive limitations” (Nagel 2012, 128). The philosopher Mary Midgley took Nagel’s argument (along with those made by others) to provide evidence that the “credo of materialism” was “beginning to fray around the edges” (Midgley 2014, 14). Things haven’t improved since.

Does any of this matter? On one level, not really. You are still you, even although there is no scientific explanation for you in material terms. At least no one is now claiming that because of the lack of that explanation “you” don’t exist. Fundamentally, of course, I would argue that science, with its third-party, observational statements, which necessarily leave out of the account things like purpose, hope, love, agency and the like (ie things that really matter to us), can only ever provide a partial account of what we are as “persons” (something most scientists are clear about – usually). As Midgley and many others have argued, the claim that only science defines or explains important stuff, including what we are as persons, is a monstrous overreach. Such claims are still occasionally made, but this view too is “fraying”.

But there are of course other sources of data, other (complementary) ways of reasoning, other views of who and what we are as persons (something I touched on previously). If the materialist programme is faltering, these need to be heard again. Wonder what (the decidedly immaterial) God thinks?


[PS: I don't normally provide references to the literature in these posts, but as I happened to have them to hand, I thought it would be churlish not to....]

Bennett, Maxwell R., and Peter Michael Stephan Hacker. 2003. Philosophical Foundations of Neuroscience. Blackwell.

Chemero, Anthony, and Michael Silberstein. 2008. “After the Philosophy of Mind: Replacing Scholasticism with Science.” Philosophy of Science 75 (1): 1–27. https://doi.org/10.1086/587820.

Churchland, Patricia S. 1996. “The Hornswoggle Problem.” Journal of Consciousness Studies 3 (5–6): 402–8.

Crick, Francis. 1994. The Astonishing Hypothesis: The Scientific Search for the Soul. Macmillan.

Crick, Francis, and Christof Koch. 1990. “Towards a Neurobiological Theory of Consciousness.” Seminars in the Neurosciences 2: 263–75.

Ferrante, Oscar, Urszula Gorska-Klimowska, Simon Henin, et al. 2025. “Adversarial Testing of Global Neuronal Workspace and Integrated Information Theories of Consciousness.” Nature 642 (8066): 133–42. https://doi.org/10.1038/s41586-025-08888-1.

Fleming, S. M., Chris D. Frith, M. Goodale, et al. 2023. “The Integrated Information Theory of Consciousness as Pseudoscience.” Preprint, PsyArXiv. https://doi.org/10.31234/osf.io/zsr78.

Francken, Jolien C, Lola Beerendonk, Dylan Molenaar, et al. 2022. “An Academic Survey on Theoretical Foundations, Common Assumptions and the Current State of Consciousness Science.” Neuroscience of Consciousness 2022 (1): niac011. https://doi.org/10.1093/nc/niac011.

Hacker, P. M. S. 2012. “The Sad and Sorry History of Consciousness: Being, among Other Things, a Challenge to the ‘Consciousness-Studies Community.’” Royal Institute of Philosophy Supplements 70: 149–68.

Kuhn, Robert Lawrence. 2024. “A Landscape of Consciousness: Toward a Taxonomy of Explanations and Implications.” Progress in Biophysics and Molecular Biology 190 (August): 28–169. https://doi.org/10.1016/j.pbiomolbio.2023.12.003.

Midgley, Mary. 2014. Are You an Illusion? Acumen.

Naccache, Lionel, Claire Sergent, Stanislas Dehaene, Xiao-Jing Wang, Michele Farisco, and Jean-Pierre Changeux. 2025. “GNW Theoretical Framework and the ‘Adversarial Testing of Global Neuronal Workspace and Integrated Information Theories of Consciousness.’” Neuroscience of Consciousness 2025 (1): niaf037. https://doi.org/10.1093/nc/niaf037.

Nagel, Thomas. 1974. “What Is It Like to Be a Bat?” The Philosophical Review 83 (4): 435–50.

Nagel, Thomas. 2012. Mind and Cosmos: Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False. Oxford University Press.

Seth, Anil K., and Tim Bayne. 2022. “Theories of Consciousness.” Nature Reviews Neuroscience 23 (7): 439–52. https://doi.org/10.1038/s41583-022-00587-4.

Seth, Anil K., Biyu J. He, and Jakob Hohwy. 2015. “Editorial.” Neuroscience of Consciousness 2015 (1): niv001. https://doi.org/10.1093/nc/niv001.

Friday, 22 August 2025

On “Losing my religion”….

I am a mandolin player. Or perhaps more accurately I should say that I play the mandolin. On this side of the Anglo-Saxon Atlantic, mandolin playing is mainly limited to folk music, although across the Channel it has long been known as a classical instrument (Vivaldi wrote at least two mandolin concertos). In the US the mandolin has a long and treasured place in country and bluegrass music. But as far as I know there is only one rock/pop mandolin riff that is widely known. Back in the ’90s R.E.M. had a hit with the song “Losing My Religion”, which starts with it. The song and the accompanying video went on to win multiple awards. You might think that the song had something to do with religion. Perhaps a celebratory atheistic anthem of religion’s newly recognised irrelevance, or a wistful retrospective of a now forgotten childhood heritage. But apparently not. R.E.M. frontman Michael Stipe, who wrote the lyric, has said that it was actually about unrequited love: “...what I was pulling from was being the shy wallflower who hangs back at the party or at the dance and doesn’t go up to the person that you’re madly in love with and say ‘I’ve kind of got a crush on you, how do you feel about me?’”. Doesn’t take away from the brilliant mandolin riff of course. In any case it turns out religion isn’t quite what you might think.

That’s interesting because it often isn’t. The meanings given to the word have changed over time, as often happens. And even if there really is a thing being labelled (in the sense that we also give names to non-things like purple spotted unicorns), this is also likely to change through time and over space (ie to be different in different places and spaces). So it is sometimes genuinely difficult to know what is meant when we talk (or even sing) about religion, lost or otherwise. There is nothing new or unique in this; try looking up the etymological history of “nice” – you’ll be surprised. Even broad categories used to identify obvious and necessary boundaries turn out in some important cases to be recent innovations that are neither obvious nor necessary. The rhetorical drawing of contrasts is therefore also tricky. The idea that the categories of “natural” and “supernatural” have always been with us, and that we’ve always been clear about what these categories are, crops up in many debates. Indeed it is the supernatural, as distinct from religion or God, that was Dawkins’ main target in “The God Delusion”. He clearly thought he knew what he meant, and that his readers did too.

But the categories of natural and supernatural are relatively recent. And around them there has been more than a little myth-making, particularly once they transformed into “-isms” claimed to be competing with each other. This particular framing (although not the words themselves) appeared late in the 19th century, promoted by, among others, T.H. Huxley. Huxley and his ilk then read these categories back into history. Promising (in their terms) pre-Socratic philosophers were identified as early stalwarts taking their plucky stance against surrounding supernatural beliefs and religious practices. A line of heroes was then traced through that most influential of ancient philosophers, Aristotle. And so down to contemporary debates where science, rationality and naturalism were pitted against religion, faith and supernaturalism, with the implication that we all know on which side of the line we (and the intellectual greats of the past) must stand. Except it was never thus and is not so now.

The Greek philosophers, of all schools and stages, were clear that the divine was involved with all aspects of human life and thought, whether for good or ill. For them, “natural” inevitably implied, among other things, divine activity. And Greek science (a much wider activity than what is meant in English by the word today) showed little sign of progress or development away from such notions. Arguably it was actually the rise of Christianity which in some of its forms began to remove the divine from many of the areas of life it was formerly thought to inhabit. Many of the innovators who began to give science the form it has today, from Bacon on, made no great distinction between their thinking as scientists (not a word they would have understood in our sense) and theological thinking. Investigating the world with the tools available was an investigation of the works of God. The success of science was, to many, not the success of naturalism in the face of supernatural resistance, but actually progress in illuminating and understanding the works of the Creator. No contest here. But something does thereafter seem to have been lost.

It was a broadly Biblical understanding of everything there was and is that led to (or at least provided the context for) the development of science as we know it today. But a catastrophic narrowing of science seems to have taken place, particularly as it became professionalised and institutionalised. The historian Peter Harrison recently put it like this: “Whereas the sciences are sometimes said to be based in curiosity, from the mid-twentieth century that curiosity rarely extended to fundamental questions about the metaphysical foundations of science or the intelligibility of the natural world” (Some New World, p. 328). As a matter of history those “metaphysical” foundations were thought to be Biblical by the majority of practitioners from the sixteenth to the nineteenth centuries. It was Huxley and others, relatively recently, who set up various false antitheses. And they were then highly successful in evangelising for this particular view of our intellectual and scientific history. Once constructed in their terms, losing the supernatural, indeed losing religion, was not the loss of anything of value. Indeed, it was seen as a necessary and progressive step.

The problem is that we are now living with the consequences of this loss of “who knows what”. And it actually turns out that the most serious consequences are not so much for religion (in the modern sense) as for science, politics and culture. Religion appears to be going from strength to strength all over the world. But particularly in Western Europe and the US, wistful noises are now being made in the oddest of corners for what has been lost. And science itself seems particularly to be suffering.

So if you thought R.E.M. was celebrating the loss of religion in the sense of losing the religious, think again. And even if you had been right, it would probably not be something worth celebrating.


Friday, 24 January 2025

Reading for 2025 (so far...)


How long does it take for a tradition to become a tradition? I have no idea. But I think I’ll stick with one that began only twelve months ago, and commence the blogging year by mentioning some of the books that it is my intention to read in 2025. Some are part of ongoing projects and there are two complete series in view. And no doubt there will be other “one-offs” that I’ve yet to encounter.

At the bottom of the pile (and still foundational in more than the obvious sense) is my Tyndale House Greek New Testament. The acutely observant with longish memories will remember that this was also at the bottom of last year’s pile, but it was there perhaps more in hope than expectation. I had embarked on learning NT Greek with the help of resources from Union. At the time I thought I might eventually embark on further, formal language study. But alas my progress was rather slower than I had hoped (and slower than was necessary to undertake the courses I had in mind). However, by last September I had made sufficient progress to join a local group that met online once a week to read and translate the NT. So, for an hour each Wednesday morning that’s what we’ve been doing. As we’ve read our way through John’s Gospel, there have already been some lightbulb moments. I confess that some are a bit nerdy; a verb in a tense freighted with meaning that is missed in the English. Others have come as a result of feeling the full force of the language John reports Jesus as using (albeit in his translation from Aramaic to Greek). The clarity with which Jesus claims not merely to be a prophet but God Himself was not lost on His original hearers who, in John 8:59, are literally ready to stone Him to death (ie they’ve got to the stone picking-up stage). But while this is clear in English translation, Jesus’ constant taking up of the language of Exodus 3:14 (“I am”) comes through loud and clear in the Greek. In the same section at least one other person uses the same words (once), but the context and repetition on Jesus’ part emphasise His claim.

My strategy for our sessions is to try to do several verses of translation each day over the preceding week, allowing me to spot difficult vocabulary or grammar (of which there’s still a lot) ahead of time. I am still very much in the foothills, but the Tyndale “Reader’s Edition” helpfully lists less familiar words in footnotes at the bottom of each page, meaning that one doesn’t constantly have to refer to a separate lexicon or the interweb, thus saving lots of time. This year I’ve also been trying to read a couple of verses in Greek from my daily Bible reading schedule. And to keep moving forward I thought I’d better try and advance my understanding of the grammar beyond the basics covered last year. To some extent this develops from the reading, for it quickly becomes clear that basic rules are, well, basic. As with any language (and English must be a nightmare in this respect) such rules are often more broken than kept. So on my pile is Mathewson and Emig’s “Intermediate Greek Grammar”. While admittedly not what you would call “a right riveting read”, this is nonetheless useful for understanding some of the rule bending and breaking that actually occurs with the language “in the wild”.

What I did have last year (although I didn’t discuss it in the relevant post) was some serious theological reading – Calvin’s Institutes (edited by McNeill, expertly and entertainingly translated by Ford Lewis Battles). The “Institutes” represented some of the first “proper” theology I read when I began the MTh at Union. I had of course heard of the man before, and had enough reformed friends to have heard of the Institutes. But I had never actually read Calvin (and now I wonder if my friends ever had either). I initially approached the two substantial volumes of the McNeill edition with some trepidation. After all, the Institutes were originally written in the 16th century, within a particular context and with some fairly specific polemical targets. I had already been exposed to some of Barth’s “Church Dogmatics”, which was not an entirely happy experience. I needn’t have worried. The combination of Calvin’s clarity of organisation and thought (and his wit) on the one hand, and Battles’ skill as a translator on the other, made it an intellectual and spiritual treat. Even for those not of a reformed disposition, there is much to learn and admire in Calvin’s efforts. But that was last year. I wanted to continue reading theology, but what next? Providentially I picked N.T. Wright’s five-volume “Christian Origins and the Question of God”. I say providentially because, a bit like Calvin (or was it Battles?), Wright has a way with words. I managed to get started on Vol 1 early, and finished it last week. It is written with verve and wit, but without sacrificing depth and thoroughness (and providing plenty of footnotes and an extensive bibliography). There are occasions when one encounters writing that deals with difficult or potentially dense issues in a way that provides assurance that the author “knows their onions”. Having learned lots about the Judaism that provided a key element of the context for Jesus’ arrival, life, death and resurrection, I’m now enjoying the second volume, which concentrates on Jesus Himself. The plan is to complete all five volumes this year. So far, I have no reason to believe this will be a chore.

To digress from the theology for a moment (but not as far as you might think), I also plan to read Hilary Mantel’s “Wolf Hall” trilogy. Now it is true that she won the 2009 Booker Prize for the first book in the trilogy, and this would normally scare me off. The books that critics deem worthy of awards and the books that I enjoy reading usually fall into two distinct and mutually exclusive categories. Prize-winning prose is usually not my thing. But I was impressed with the BBC’s adaptation of the books, and enjoyed Mark Rylance’s portrayal of the central character, Thomas Cromwell. So I took the plunge and made the trilogy one of my 2024 Christmas asks. Some kind relative duly obliged and this has been my bedtime reading throughout January. Bedtime it may be, but “light” it is not. I’ll spare you the review, but I will be persevering. And the story of Cromwell (if not the man himself) is growing on me. I have two and a bit books to make up my mind.

Towards the top of the pile is reading for another “project”. I completed my PhD at the end of the 1980s, and spent a good part of the ’90s in the Centre for Neuroscience at the University of Edinburgh. These were heady days in what we’ll call the “neurosciences” (really a collection of fields and techniques all aimed at understanding the operations of the brain and nervous system). As a subject it was reaching maturity, and new tools, particularly those for imaging the brain in awake human subjects (ie while they were doing things like thinking), were becoming routinely available. The new techniques and results had not gone unnoticed by philosophers, who were beginning to think that there might be light at the end of the very long, very dark mind/brain tunnel. It was around this time that “eliminative materialism” came into its own, with loud and confident statements asserting that things like beliefs were the product of a soon-to-be-refuted and redundant “folk psychology”. Soon we would all get used to the (correct) idea that beliefs were the phlogiston of the neurosciences, and that they would be properly replaced by talk about brain states. “I” am merely my brain and have no more basis in reality than the immaterial God who has already been routed and driven from polite public discourse. What I didn’t know at the time was that this was (of course) only a very partial view of the state of the philosophical (never mind the theological) world.

So my aim now is to read some of the rejoinders I should have read then. To be fair, I was doing other things at the time, like making my own modest contribution to trying to understand vision and eye movement. This time round I’m also specifically interested in the serious theology as well as the philosophy involved, because it turns out there is quite a lot of it. Including (as can be seen in my pile) Barth. Actually Cortez’s “Embodied Souls, Ensouled Bodies” has been very helpful on that front. Suffice to say that already I’m discovering that time has not been kind to the eliminativists, and that’s even before one begins to take on board what Divine revelation has to say about the constitution of human beings, mental and otherwise.

It turns out God has much to say about us as well as Himself.


Tuesday, 30 November 2021

Why does science matter?

Although it’s really my last post that prompted this one, I am admittedly returning to something I’ve blogged about before. It was a while ago, so I won’t take it personally if you can’t remember what those particular posts were about. I’ll try not to repeat any of the specifics here, as you can obviously go back and read them (eg here and here). But having opined about why theology matters (about which I know relatively little), I thought it only fair to reflect on what I spent most of my adult life working in.

However, there are a couple of issues we have to deal with first. Although it’s common to talk about “science” as though it is a single institution, it really isn’t. There is no single body that polices a rule book, and the reality is that there is no single agreed definition or set of rules. There is also no single agreed scientific method. It used to be thought that a single recipe for doing good science might be either discoverable or definable, and that a single, coherent method could be established. And of course the philosophers got busy trying to cook one up. But with due respect to the likes of Francis Bacon, John Locke, William Whewell, Karl Popper and Thomas Kuhn, none of them really produced anything that you could pull off a shelf, apply to a problem and obtain a “scientific answer”. Indeed the most many of them managed was an attempt at describing what scientists actually did. This is an interesting exercise in its own right. Mind you, it has always seemed to me that they were overly infatuated with physics, from which they drew many of their key examples. If, of course, science is just one thing, and there is a single method, why not start with an area of science that seems to have delivered? Perhaps this explains why “big physics” is often reported in the media and is supported by such massive sums of public money (over the last decade the UK has invested an average of £152M per year in CERN alone). Biology has usually suffered in comparison. The philosophers didn’t seem to like biology that much; maybe it was too wet and messy.

It’s odd, but all this philosophical effort, individually and cumulatively, has had relatively little impact on the activities of scientists themselves. By and large they just got on doing “it”, and apparently quite successfully. It looked like there might be a common core of things that were a good idea, things like collecting evidence, forming tentative explanations, and then testing these rather than just blithely accepting and asserting them. But a single, codified, rigorous method? Not really. Occasionally, individual scientists were influenced by reading about what they were supposed to be doing in the writings of one or more of the aforementioned philosophers or thinkers (many of whom were not themselves scientists). They might try to construe their activities in the sort of terms they had read about. But this all tended to be rather post-hoc. Suspiciously, such accounts tended to crop up in books written at the end of careers, as though they were a relatively recent discovery.

Now this all may be a good or a bad thing. But part of the problem is that relatively few pure science degrees (particularly in the Anglo-Saxon world) provide a rigorous introduction to the intellectual procedures involved in science. There are lots of lectures, lots of learning about great previous experiments, occasional attempts to repeat them and so on. Such degrees are certainly fact-packed (and very often great fun too – mine was!). But as to the principles of how your thinking was supposed to operate, one was rather expected simply to imbibe or intuit this. To be fair, this is a criticism that has so often been made that in many degree programmes today there may be an optional module in the philosophy of science. But it is rarely a key component of the education of a young scientist. And this has the disturbing consequence of a highly skilled but philosophically unsophisticated workforce.

None of this means that science (in its various forms) has been generally unsuccessful; clearly it hasn’t. But one unwelcome effect has been the unfortunate inability of many of us scientists (and I include myself in this) to helpfully articulate why science has been successful, what its product has enabled, and why this all matters. What we often end up with is hubristic, triumphalist babble that can sometimes seem more like paternalistic propaganda. Scientists do all have skin in the game of course, because many of us earn our money from the scientific enterprise. And the source of that money is very often hard-pushed taxpayers, and in the case of the health and clinical sciences, patients. When we try to explain what we’re up to and why it matters, we can sometimes sound rather as though we’re saying that you should simply trust us (and keep paying us) because we know what’s best, and it would be far too complicated to explain to you.

Now there is a sense in which this is true. These days the technical details are often complicated, and a degree of trust is required. But the problem is that because we have not articulated well enough or often enough how science works (in its various forms), trust is now rather lacking. This is illustrated by the range of responses to the undoubted success of the vaccines developed to combat the COVID19 pandemic. The mRNA vaccines that have been so successful are the product of a completely new approach to vaccine development that emerged from years of patient and largely unheralded basic science, working out the details of what goes on in cells at a molecular level. The speed at which this led to highly effective vaccines coming into use and saving lives was unprecedented. And yet, all over the world there is significant resistance to their use and a marked reluctance about their uptake.

Part of the problem is that science doesn’t exist within a bubble. The “modern” world that science both grew up in and helped to shape has now morphed into a very different context. Intellectual authority is now a weakness and trust has been undermined. We now have facts, duly established by tried and tested procedures (technical and intellectual), duelling in the media with alt-facts (opinion, suspicion and assertion dressed up as facts). And the individualism that stemmed from the same revolution that gave rise to modern science means everyone is an expert who has to understand the evidence, even when everyone really isn’t an expert and really can’t weigh the evidence in an appropriate way.

Science really is the best way we have to generate certain types of reliable information of critical importance. It cannot answer any and all questions, but it has and can answer some really important ones. At the edges of course, there is scope for debate as to what is and what is not an appropriate question that can be answered scientifically. Over-claiming, often by prominent scientists, or putting down other approaches in non-scientific domains (like theology among others) has done science no favours. But make no mistake – science has mattered in the past, is making a big impact now, and will be needed in the future. It will continue to matter - bigtime.

Sunday, 31 January 2021

Life in the Pandemic XVIII: Truth in trouble…?

Truth is having a hard time. This statement of the obvious is worth stating for two reasons. Firstly, it implies that there is something called truth, and that, in my view, is something worth implying and indeed asserting. The fact that you probably have a fairly instant and rough idea of what I mean (whether you agree with me or not) suggests that such a statement is neither incoherent nor meaningless. The second reason it’s worth stating is that, while obvious, it alludes to the observation that something interesting is going on. On one level truth has always had a hard time. Defining and debating what “it” is has kept both amateur and professional philosophers busy for thousands of years. And yet, as I’ve noted before, at least as far as public and political life in the West is concerned, we seem to have moved into a new phase of hardship.

In the US, the “big lie” is not yet dead. Nor has it yet been driven from the field by the “big truth”. According to CNN (not an entirely unbiased source of information, I grant you), former president Trump has just had his impeachment legal team quit/fired because of a disagreement over strategy. This disagreement, it is claimed, comes about because Trump wants to maintain the fiction that the election was stolen from him. His lawyers apparently thought that this was not a viable strategy for the trial in the Senate that he now faces. It is unclear (at least to me) whether this is just about the narrow strategic issue, or because the lawyers understand that they cannot assert what they know to be manifestly untrue. However, at a minimum this shows a certain level of dedication to the lie on Trump’s part. Again, this could all be a strategy. But it could also be because he actually believes it. We shall probably never know the truth (as it were). Strength of belief, while often admirable, can’t turn a lie into the truth. Trump does still have his supporters, and they number in the tens of millions. This again is not sufficient to make the lie true. It just means that it’s a widely believed lie. Who knows which way this story is going to end? Is a complete partisan detachment from facts and truth simply going to become one more viable path to power with no accountability? Or will the political culture in the US revert to the more normal pattern of a commitment to at least the semblance of prizing and speaking the truth, with suitable wiggle room provided by the careful use of words? So to this extent, in this particular context, the truth is still in trouble. It remains to be seen whether this approach to life, this particular and brazen abuse of truth, will successfully spread to this side of the Atlantic.

Of course, some would maintain that either it already had, or that it in fact crossed from here to there – the “all politicians are liars” school of thought. But it appears that here there is still an interest in at least seeming to tell the truth. In Scotland, the First Minister may be in big trouble for misleading the Scottish Parliament. The story is complicated and not particularly edifying. But if it turns out she has said x when in fact y is true, she will be greatly diminished, even if not completely finished. And the x’s and y’s in this case are themselves matters of detail. It’s the misleading, if it is proved, that will do the damage, not the content of the misleading itself. On the pandemic front, there is still liberal quoting of science and evidence, because accurate, truthful information matters, and science is still seen as a way of procuring it. So truth may be fighting back. Of course if it turns out that it’s all just carefully crafted propaganda, then things might turn again. The idea that it somehow doesn’t matter has yet to gain much traction.

All of this comes against a background of “truth” not really having had any clear moorings for a while. Plato et al. argued for truth that was universal, ideal and unchanging, belonging to a different realm from the one which we inhabit. These ideas were adopted and rejigged by Augustine and others, so that truth found its foundation in God. And indeed the Bible reveals that the basis of all truth is personal, not primarily rational. It is found in the God Who is both true and truth, and it is intimately linked to His truthful, faithful and true person. The clear answer to Pilate’s question (“what is truth?”) was the person standing in front of him; a person who both claimed to be truth (Jn 14:6), and whom even His enemies recognised as “true” (Matt 22:16).

Things worked fairly well until this foundation was “abolished”. Nietzsche succinctly captured it with his “death of God” ramblings. He called it the most important of recent events “that ‘god is dead’, that the belief in the Christian God has become unworthy of belief...”. The retreat from truth, truth that is true everywhere for all time, gathered pace and in more recent times culminated in some of the more radical proposals of first existentialism and now postmodernism. And how is that all working out? Well apparently it’s not just that we won’t ever know, but that we can’t ever know!

Fortunately, Nietzsche’s (probably syphilitic) ramblings were just that. As the apocryphal graffiti on the walls of countless university philosophy departments attests, it is in fact Nietzsche who is dead (“signed God”). Dostoyevsky has Ivan Karamazov say that “Without God, everything is permitted” (although for some reason this is disputed in some quarters as false news; but see here). But as He is not dead, truth is still with us, and everything is not permitted. Hence the general idea (although again under assault) that truth is good and lies are bad. Even although such notions are inevitably inconvenient for all of us at some point, for most of us this should actually be a comfort. It is not necessary to walk in confusion, knowing nothing for sure and being able to communicate even less. Even in trouble, we can find and know truth. It’s to be found where it always has been, and always will be.

Saturday, 4 July 2020

Life in the Pandemic VII: Don’t panic, there’s still plenty of books to read…

Frank Zappa is quoted as having said “So many books, so little time”. But of course, for a while now, many of us have had considerably more time for reading than we bargained for, thanks to the pandemic and the lockdown. I’ve been going to work in my dining room for the last few months, so as it turns out I haven’t had a lot of extra reading time. But I have enjoyed a few notable (and eclectic) pandemic reads…so far.

As I’ve noted inside its front cover, my first lockdown read was Alister McGrath’s biography of C.S. Lewis (1). To my mind both the author and the subject are interesting characters. McGrath is interesting because he began his academic sojourn in the world I am most familiar with. His initial calling was to science, and he eventually obtained his Oxford DPhil in molecular biophysics. However, around the time he went up to Oxford, he discovered that there were other, complementary ways of investigating and understanding the world around him, including theology. And it was to theology he turned, and in which he has made his mark. Lewis, along with other authors and scholars, helped him to understand his journey, and it is perhaps this that explains his interest in Lewis. McGrath’s approach as a biographer turns out to be quite scientific, because in order to master his subject he immersed himself in the data – in Lewis’ case his published writings, broadcasts and, importantly, his letters. I came to Lewis in my teens, although I confess that I have still to read the Narnia books. My introduction to him was his science fiction trilogy (“Out of the Silent Planet”, “Voyage to Venus”, and “That Hideous Strength”), from which I moved on to books like “Mere Christianity” and “The Screwtape Letters”. What these don’t particularly reveal is much about the man himself. But McGrath does this forensically, although from a sympathetic standpoint. In doing so he reveals a complex character, flawed (as we all are) in many ways, navigating his way through two world wars and the cultural upheavals of the 20th century. It is well worth a read.

Much harder work, but no less rewarding, was Peter Sanlon’s “Simply God” (2). This isn’t bedtime reading, but it addressed something that’s bothered me for a while. As readers of this blog will know, I’m interested in God. Admittedly my interest is more personal than academic, but that doesn’t mean I’m somehow exempt from doing hard thinking about Him. And one of the dangers I’ve become aware of is that I come to see Him as simply a bigger and better version of me. This is in part the age-old issue of creating God in my image, instead of recognising that I’m created in His. Of course, I’m not alone. Arguably this is fallen humanity’s biggest and most devastating mistake, stretching all the way back to Eden and the Fall. And it’s pervasive. The “gods” of the ancient world were just big versions of their Canaanite, Babylonian, Persian, Greek and Roman inventors. The “straw-God” of the new atheists is/was just a big version of what they observed/observe in humanity around them. More worryingly, the God who is the object of some contemporary evangelical prayer and worship often seems to suffer from similar deficits. But Sanlon’s starting point is that this is a total misconception. Yes, we are created in His image, but it is a fundamental mistake to see in this the idea that the difference between us and Him is quantitative. It turns out (and no real surprise here if you’ve been paying attention) that He is a totally different type of being. The gap between Him and us is way bigger than, and of a completely different order to, the gap between a person and a paramecium. This causes an obvious problem. How are we to understand Him if He is so different? Thankfully, it turns out that He has provided help towards exactly this end, because He wants to be known. So He has revealed things about Himself in ways that we can understand. Not being able to understand everything shouldn’t stop us from trying to understand something. Starting with God’s simplicity (which in this context has a particular meaning and significance), Sanlon investigates God as He is revealed. And there is an interesting subtext. I may be reading more into Sanlon’s writing than is there (for which I apologise in advance), but I think he’s fairly angry about the small view of God that many of us carry around in our heads. I think he’s right to be angry about this (if he is). And to the extent that this book helped me to understand that my view of God had been inaccurate, weak and impoverished, I’m more than happy to apologise! Hard work, but a good read.

A third lockdown read that I’ll mention is completely different. It’s John Searle’s “Seeing Things as They Are” (3). This book has nothing to do with theology. Searle is a UC Berkeley philosopher, as far as I know not a believer, with little interest in theology. I discovered him through his “Chinese Room” argument, which appeared in the late ’70s/early ’80s; this seeks to show why brains are not computers and why computers cannot be conscious (or at least not conscious in the way that you and I are). He writes with a compelling and elegant clarity. Not that I would claim always to follow his arguments fully; I’m sure I miss a lot. I am, after all, just a scientist, not a philosopher. But I always get the feeling that there’s something in his arguments, and that it’s worth paying attention. “Seeing...” is an attempt to explain consciousness, in this case the kind of consciousness that is involved in the process of perception. While many regard consciousness as a mystery (and some have argued that it is a mystery that can’t be solved), for Searle there is no mystery once we think about it clearly enough. The mystery results from confusing categories, and from holding on to philosophical baggage and bad arguments from the past. This one’s been keeping me going for a while. So quite handy in a pandemic.

And as if this were not enough already, I’ve just ordered my holiday reads. A similarly eclectic bunch, including Stephen Westaby’s “The Knife’s Edge”, Gregory Zuckerman’s “The Greatest Trade Ever”, and on the theological front Peter Hicks’ “Evangelicals and Truth”. So many books. But then the pandemic isn’t over.

1.       McGrath A (2013) “C.S. Lewis: A Life”. Hodder & Stoughton.

2.       Sanlon P (2014) “Simply God”. IVP.

3.       Searle JR (2015) “Seeing Things As They Are”. Oxford University Press.


Wednesday, 24 June 2020

Life in the Pandemic VI: Bigging up science - but a bit too much?

You may not have heard of Jennifer Doudna. But then there was a time when no one had heard of Charles Darwin or Albert Einstein. And of course, her name may not become as well-known as theirs. But, perhaps it will. Doudna played a key role in working out how to edit the genome using the CRISPR-Cas9 system. While this has opened up a can of ethical worms, it has transformed genetics and molecular biology and could transform medicine and a lot else besides. She recently had some interesting things to say about COVID19 and science. In an invited article in the Economist (June 5th) she wrote that “After covid-19, science will never be the same—and this will be for the better.” Among other things, because of the role science has played in the pandemic, the public’s attitude to it will be transformed (for the better). Science itself will be fundamentally altered (improved), becoming more accessible through modern communications, more collaborative, and more nimble because of it. She likens all of this to a “Kuhnian paradigm shift”, a “new era” that she welcomes.

The problem is that her expertise is much narrower than the issues she tackles. You might think that a prominent scientist with an international reputation is exactly the right person to opine on big issues like the future of science and the public’s relationship with it. Her views are certainly cogent and worth examining. But on these particular issues she has no special expertise and therefore carries no particular authority beyond that of any other intelligent person (with or without scientific expertise). And full disclosure: if you choose to read on, exactly the same applies to me. The first thing that piqued my interest was her use of the idea of a Kuhnian paradigm shift. I’ve discussed before the odd state of affairs that scientists tend to be rather poorly educated in the philosophy of science, and that some have been happy to celebrate their ignorance. Professor Doudna at least discovered and read Thomas Kuhn’s classic essay “The Structure of Scientific Revolutions”. Published originally in 1962, it’s a fascinating read. It’s also something of a product of its time that philosophers of science have to some extent left behind (for a kindly appraisal see here). From Kuhn, Professor Doudna picked up the idea of paradigm shifts. In Kuhn’s account these occur when a period of “normal” science in some mature field of scientific endeavour is disrupted by “revolutionary” science, resulting in a new theory (or paradigm – a set of theories, methods, observations, a way of looking at the world) overtaking an existing one. Paradigms and paradigm shifts quickly moved out of philosophy and went mainstream. The problem was that these notions came to be used in a very lazy way, and that’s how they are used in Professor Doudna’s article.

There is no paradigm shift going on in science currently. Even if you thought science was one thing, a single institution with a single methodology and a single output (and this is far from the case), and believed that this single entity could be changed by a seminal happening like a pandemic, there’s no evidence that it has been. And if science is about anything, it has to be about evidence. But science isn’t one thing. An analogy would be the distinction between sport and rugby. “Sport” describes a collection of things; “rugby” names one of the things in the collection (along with football, tennis, cricket etc). Sport is a collective noun, and so is “science”. There are commonalities between neurophysiology and geology, and key features of methodology that they might share (eg a common commitment to the collection of data of various kinds), but there are big differences too. The idea that there is something going on across all of science, that “science” is changing in some fundamental way, doesn’t stand up to scrutiny. To which the eminent professor might retort that she isn’t claiming that it is. But that’s what she has written.

What is perhaps in a state of flux is the public’s attitude to science. Certainly there’s been a lot of public exposure for certain elements of the scientific enterprise. For months here in the UK we’ve watched daily press conferences where the mantra has been “we’re following the science”. There are things this means, and things it doesn’t mean, or at least shouldn’t. It should certainly mean that scientists of various kinds are feeding into a process that informs government decisions. Science provides a set of tools that can describe what a virus (in this case COVID19) can do and is doing in a population. It provides tools that can predict, to a given level of precision, what the virus is going to do, at least in terms of the numbers infected and the numbers likely to die over a given period of time. It has identified drugs that are useful (like dexamethasone) and drugs that aren’t (hydroxychloroquine – sorry Donald). It’s worth pointing out that the product of the scientific exercise is rarely, if ever, a single, simple number; science, particularly biological science, deals with ranges of numbers with uncertainties baked in. And scientific explanations and answers never come with cast iron guarantees. They are not guesses, but neither are they infallible. At various stages judgements have to be made; these are human judgements and therefore potentially flawed judgements. Through discussion, debate, further testing, replication, or refutation, we hope the flaws are spotted and eliminated. It’s never the clean, clever, clear process that after the fact even scientists (and some philosophers) tend to construct. Don’t misunderstand me, science provides a sound means of finding the answers to certain types of question. But when it comes to the decisions that a government might have to take in a pandemic, often the questions are broader and more complex than science is equipped to deal with. This is not to argue that science is in some way flawed and that a better science has to be developed. Just that it is limited. So what “following the science” cannot mean is that the only information flowing in to government is scientific information. Because many of the decisions that are being taken are big, complex, tricky, properly political decisions.

When and how children are going to return to school has become a major bone of contention over the last few weeks. Scientific advice may be able to provide an estimate of how many children would be infected and how many might become seriously ill (probably a very small number) if all children went back to school right now with no social distancing. This estimate might contrast with an alternative estimate of how many might be affected if only 20% of children of a given age went back now, all wearing masks with 2m between them. A risk can be calculated. But that’s the easy bit. What level of risk is acceptable, what level of sickness is acceptable, given the needs of children for their education and the other social, economic and health benefits that being in school brings? That’s a much trickier question, one that science on its own can’t answer, and shouldn’t be asked to. Other disciplines and experience are needed. Hopefully all kinds of information are being fed into the decision-making process before judgements on these kinds of issue are made. But given the prominence that science has been given in the pandemic, what happens to science when the judgements made become controversial and disputed? Will science get the blame?

Professor Doudna only sees an upside. But the prominence of science in the pandemic, or at least the lip service being paid to it, could create a backlash. That would be dangerous. Vaccination against disease is a real success story: a science that has been worked on for centuries, whose wide-scale adoption in the 20th century saved millions of lives and delivered hundreds of millions from misery. Yet in our time we have to contend with the phenomenon of the anti-vaxxers, who are already gearing up to decry any COVID19 vaccine as some dangerous conspiracy. In the US, Dr Anthony Fauci has recently been lamenting the impact of a growing anti-science bias. Scientists should beware of becoming just another elite, disconnected from the mass of people (who, it turns out, pay for most of the science through their taxes) and talking down to them. A bit of care is needed. And perhaps a bit of humility too. Science does have its limits and scientists do have their flaws. Personally, I’d be careful about bigging science up too much.


Friday, 14 February 2020

Surely you’re joking (Mr Feynman)?


Richard Feynman was a Nobel prize-winning physicist, perhaps best known these days for his role in the Rogers Commission, which investigated the Challenger disaster. It was Feynman who famously worked out what had caused the disaster. He was as far from the stereotypical nerdy, bespectacled, white-coated boffin as it is possible to get. When I was a student (of Physiology, not Physics) his memoir “Surely You’re Joking, Mr Feynman!”[1] was a must-read. One thing you won’t read in it is a quote about the philosophy of science that’s usually attributed to Feynman: “The philosophy of science is as useful to scientists as ornithology is to birds”. No one appears to be able to pin down where and when he said (or wrote) this – hence the brackets in the title of this blog piece. So it is possible that it’s not one of his aphorisms. But it captures fairly accurately his attitude towards philosophy in general and the philosophy of science in particular. It is an attitude probably shared by not a few physicists.

Sir Peter Medawar, also a Nobel prize winner (this time in Physiology or Medicine, for his work on immunity), had a bit more time for philosophy, at least to the extent that he was quite fond of perpetrating it. He pointed out that if you ask a scientist about the scientific method, “…he will adopt an expression that is at once solemn and shifty-eyed: solemn because he feels he ought to declare an opinion; shifty-eyed because he is wondering how to conceal the fact that he has no opinion to declare.”[2] What he was highlighting was that in professional science we have tended not to think about the intellectual procedures we follow, and we rarely teach them explicitly to students either. I was expected to learn my scientific methodology through a combination of observation and osmosis. Of course, what this has meant is that when challenged to articulate how we do what we do, we are apt to come up short. That was Medawar’s point. Given the undoubted success of science in providing explanations for, and control over, all sorts of aspects of the natural world, this apparent vacuum about science itself was bound to be filled with something.

Of course, on one level there have always been philosophers of science. The list includes the likes of Aristotle, who philosophised about science before science as we know it now existed. Bacon, Hume, Mill and Kant all had something to say on the topic. Scientists did from time to time contribute; Newton famously had a dig at hypotheses. But throughout the 19th century a division began to set in between those on the outside talking about science and an increasingly professional cadre of scientists on the inside doing the science. And it appeared that you could do science fairly successfully without actually knowing too much about how you were doing it. Perhaps this is when (some) scientists started getting a bit sniffy about the philosophers. It didn’t help that sometimes the description of science from the outside was not flattering. In the 1960s it was the philosopher Thomas Kuhn who described the conquest and displacement of an older, less powerful set of scientific theories by a new set as a “conversion experience that cannot be forced”[3]. Not entirely rational, on Kuhn’s account. Interestingly, his views were shaped by examples from physics and cosmology; perhaps this explains the antipathy of at least some physicists to philosophers.

But thinking has to be done, concepts have to be clarified, and this is the proper province of philosophers. Yet even today there remains a bit of a prejudice against burdening science students with thinking about what science is and how it works. I used to be in charge of a large health sciences module on research methods. As part of the module I introduced a session on the philosophy of science, so that students would be given a coherent account of scientific methodology (the sort of thing that might avoid the situation described by Medawar). To say that my colleagues thought this was the lowest of low priorities would be an understatement. It didn’t remain a part of their course for very long!

However, there are a number of issues within contemporary science that make it more important than ever that students are trained properly in scientific methodology, and that as a profession we understand what we’re doing and to what standards. There’s no harm at all in understanding research ethics (ethics being a branch of philosophy, no less) and in being introduced to issues in research integrity. There has always been successful and unsuccessful science. Some experiments work, others fail. Some turn out to be misconceived and doomed to failure from the start, at least when viewed with scientific hindsight. That’s all grist for the scientific mill. But success and failure in scientific terms are not the same as good and bad science, or for that matter good and bad scientists. The bad ones are the ones who fabricate data and suchlike – in other words, they lie and cheat. This is of sufficient concern for governments, agencies and institutions to have introduced research integrity codes of practice. Perhaps the best-known body charged with policing such conduct is the Office of Research Integrity (ORI) in the US.

Research misconduct certainly happens (as the ORI website attests). It is not common, and it is (probably) not widespread. Along with proper policing and an open culture, better training might well improve the situation. A clearer understanding of how science works, and of what is and is not acceptable practice, can only be a good thing. But more is required. This is about something beyond science; one might even say that it requires knowledge of something above science that underpins good science. Policies and procedures, and clear thinking (yes, aided by the philosophers), will only get us so far. At root this is about right and wrong; it is about values. And where do we get the right values? That is not a scientific question at all. But science (like every other area of human endeavour) depends on the answer.

Birds don’t need ornithology, but scientists do need lots of resources from beyond science. Intellectually, the help of the philosophers should be welcome. But an underpinning morality is needed too. And where are we going to get that?  

1. R. P. Feynman (1985) "Surely You're Joking, Mr Feynman!".
2. P. Medawar (1982) "Induction and Intuition in Scientific Thought", in "Pluto's Republic".
3. T. S. Kuhn (1962) "The Structure of Scientific Revolutions".