Saturday, April 6, 2013
I describe briefly a new and hitherto undocumented phobia, which I shall name neurophobia; those who display it are neurophobes. It is a relatively recent phobia, perhaps no more than 15 years old, but it shares characteristics with other phobias. It is to be distinguished from the neurophobia that medical students apparently suffer from when studying neurology.
Neurophobia can be defined as a profound dislike, with various degrees of severity, for cognitive neurobiology and especially for neuroesthetics and for what these disciplines promise to show us.
Neurophobes are a motley crowd and, as with so many other phobias, they include people from different backgrounds and walks of life – philosophers of different degrees of eminence, humanists, religionists and even (surprisingly) some neurobiologists. This is not to say that all philosophers and humanists are neurophobes, far from it; many are interested and excited by the discoveries that neurobiology and neuroesthetics have to offer, but neurophobes are more vocal. Nor are all religionists neurophobes: I have had some very interesting discussions with some religionists, who have shown themselves to be hospitable to new ideas. Interestingly, I have not encountered neurophobia among artists (yet), which again does not mean that there aren’t neurophobes among them. Hence, neurophobia, like other phobias, cannot be associated with any particular grouping, either socio-economic, cultural or otherwise.
Among the characteristics of neurophobia, one may list the following:
1. An irrational fear: neurophobes invest neuroesthetics in particular with imaginary powers, treating it almost as a weapon of mass destruction (WMD). How else is one to interpret the statement that neuroesthetics “… will flatten all the complexity of culture, and the beauty of it as well”, and other similar statements?
2. A desire to find a place for the mind outside the brain, not perhaps realizing that cognitive neurobiology and neuroesthetics study neural mechanisms and hence the brain, and that their conclusions are to be seen in that context.
3. The use of emotionally charged and pejorative terms to dismiss neuroesthetics, terms such as “trash”, “buncombe”, “rubbish” and others like them, which have no place, or should have no place, in scholarly (and especially scientific) discourse. Hence neurophobia shares a similarity with other phobias in that it is not easy to rationalize it cognitively, an appeal to emotional and pejorative language being the only way out.
4. The pursuit of ignorance: As with so many other phobias, this amounts to the wish not to know. Hence, neurophobes don’t want any scientific ‘de-mystification’, which they would regard as a “desecration” (note again the emotive language) and prefer to live in ignorance. This is of course similar to other prejudices, where ignorance is the preferred course.
5. Self-insulation from the facts: neurophobes protect themselves against the facts. As I have said before, once they have relegated our discipline to the status of “trash”, they need not bother with it. And there is, in their writings, good evidence that they have not read what we have written.
6. Arrogance of ignorance: neurophobes always assume that they know better, and hence lecture us on what they suppose we are not aware of. They never cease to tell us that art and beauty are not the same, as if we are not aware of that and have not written about it. They never cease to emphasize the importance of culture and learning in aesthetic appreciation, as if this is a new insight that we are not aware of.
7. Attack the methods: where all else fails, there is always recourse to attacking our methodology – principally the imaging techniques. They fault these for their spatial and temporal resolution (sometimes using emotive language) as if we are not aware of these shortcomings and do not take account of them in our interpretations. (I will have more to say about this in a future post.) I imagine most are scared of new technologies that will have greater powers of resolution.
This collection of characteristics is very descriptive of neurophobia, and they are interlinked. Hence, if one detects one of these characteristics in an individual, one may suspect him/her of being a neurophobe who will display the other characteristics on gentle probing. Here I would advise caution; it is best to probe a little further before classifying someone as a neurophobe.
Of course, many of them preface their pejorative remarks with faint praise, such as "Neurology has made important advances" (rather like, some of my best friends are neurologists).
And finally…what one neurophobe says or writes is remarkably similar to what another one says or writes, reminding me of the famous line of President Reagan, “There you go again”. Indeed, so similar are their articles that it becomes reminiscent of another one of Reagan’s famous lines (about redwood trees): “Once you’ve seen one, you’ve seen them all”.
Tuesday, April 2, 2013
In 1995, a Japanese team was awarded the Ig Nobel Prize in Psychology for their work describing how pigeons can be trained to discriminate between the paintings of Picasso and those of Monet. Previous work had shown that pigeons could distinguish between the music of Bach and Stravinsky.
Receiving the Ig Nobel Prize must be a mixed blessing, as its very title implies. Often the implication is that there is something trivial in the research reported and sometimes it is awarded for what many would regard as work that is not scientifically worthy, for example a report to the US Congress that nicotine is not addictive (awarded the Ig Nobel Prize in Medicine in 1996).
Others are frankly funny, such as the Ig Nobel Prize for Peace (2000), awarded to the British Royal Navy for a Monty Python-like command, that its sailors should not use live cannon but instead shout “Bang”, or the one awarded in Biology (2004) for showing that herrings communicate by passing wind (farting).
In fact, many of these Ig Nobel prizes go to worthy and scientifically interesting work. The one about herrings communicating by farting turned out, apparently, to be strategically and financially important: the Swedish Navy, suspecting that Swedish waters were being infiltrated by Soviet submarines, had instigated a widespread but futile hunt for those submarines. After many inconclusive years, it turned out that the noises were probably coming from farting herrings. Had this been known, it is claimed, the Swedes would have saved hundreds of millions of Swedish kronor.
Science is, or should be, fun. And even apparently simple science can be fun BECAUSE it leads to new and interesting clues. The work for which the Japanese scientists got the Ig Nobel prize in 1995 really showed that pigeons, which have a well-developed visual apparatus, could distinguish between the paintings of Picasso and those of Monet because they formed a concept of these paintings. They did not apparently distinguish them because of the presence of sharp edges in the cubist paintings or colour in those of Monet. Hence, in addition to a well-developed visual apparatus, they have brains that are sophisticated enough (if that is the right word) to develop visual concepts about visual stimuli unrelated to their daily lives.
Concept formation, critical for the acquisition of knowledge, is a fascinating subject, but how the brain forms concepts is not known in any detail. That pigeons should be able to form concepts around works designed by humans for consumption by humans, works which have little to do with their world, perhaps has the germs of an insight into how more complex brains form concepts. It would, in fact, be just as interesting to learn how humans form concepts around different schools of paintings.
If the Ig Nobel prize brings such interesting science to wider attention, then it is pursuing a worthy cause.
Saturday, March 23, 2013
There are some who fear neuroesthetics because they fear that it may ‘de-mystify’ what they prefer to remain mysterious. Knowledge about brain mechanisms that may be involved in the experience of beauty or of love and desire would deprive them, so they believe, of the full enjoyment of those experiences. I gather that a prominent professor has said that he regards it as ‘unwelcome’ to learn what happens in his brain when he is experiencing beauty. Presumably, if he were sitting on some research council, he would use his influence to suspend research in these areas. So, it is a relief that those who hate neuroesthetics and fear it are not in a position to halt research in the subject, at least not at present. There was a time when they could have and, in some areas of research, came close to doing so. Galileo was investigated by the Inquisition and ordered to stay silent, which he did, sort of, for a while. In the Soviet Union, a law was passed forbidding dissent from Lysenko’s anti-Mendelian views, which resulted in many losing their jobs and even being imprisoned. The law was rescinded in the 1960s.
I have no complaints against those who do not want, through knowledge, to de-mystify things which they hope will remain mysterious. That is their view, and I respect it, sort of. But it has to be noted that these are not people who are avid to learn more. It is not that they are simply uninterested in certain things but that they are vocal in trying to discourage the rest of us from trying to learn more about important subjects – for I take it that the experience of love, beauty and desire are important and interesting subjects. In this sense, then, their intellect is somewhat limited. Though perfectly entitled to their views, these are not the sort of people whom I would like to have sitting on research councils.
In other ways, their attitude seems strange. Science has been de-mystifying things for millennia but I am not at all sure that the world has been rendered any less marvellous because of it. One could say that landing humans on the moon and bringing them back safely to earth was a step in de-mystifying the heavenly bodies, but it has not rendered the moon any less glorious; one could say that compressing all the secrets of life into two strands of DNA de-mystifies life, but it has made it all the more wondrous to me; one could also say that the role of neurotransmitters in regulating sexual behaviour (and hence determining, at least in the world of rodents, the extent of promiscuity) de-mystifies morality or immorality, at least in the world of rodents, but to me it raises a host of interesting questions about how behaviour is regulated, even when it threatens to invade the world of morality.
Perhaps the much more interesting question is a neurobiological one: why do some people (and there are many of them) prefer mystery to knowledge? What advantage does it bring them and what does it satisfy in them? If one of the functions of the brain is to acquire knowledge, what mechanism is it that suppresses the desire to acquire knowledge in such interesting spheres, when the knowledge does not harm anyone? What disadvantage would such knowledge bring to them?
The answers to such questions, too, might de-mystify things and those hostile to learning more might want to discourage research councils from funding research in these areas as well. But they remain, nevertheless, interesting questions and so I hope that those who want to dictate what kind of knowledge should be pursued and what avoided are never given a seat in the councils that make decisions about funding research.
Sunday, March 10, 2013
The Light Show at the Hayward Gallery, London, is a delight and, quite rightly, oversubscribed. The number entering at any one time is strictly controlled, allowing viewers the space to appreciate the exhibits – quite unlike the disgraceful “cram them in” policy at the Leonardo exhibition at the National Gallery last year. Some of the exhibits, like the Chromosaturation of Carlos Cruz-Diez, or the Model for a Timeless Garden of Olafur Eliasson or Conrad Shawcross’ Slow Arc Inside a Cube IV (a bit of an unnecessary mouthful this one) are ones to enjoy sensorially and to reflect about as much as one would about any work of art.
The weakness of the Hayward exhibition is that it pretends to combine science with art, or rather to give a scientific explanation of the artistic exhibits, when it should really be seen simply as an art show and a delight to the senses, or else should have appended to the exhibits explanations that are scientifically valid. As it is, the show was somewhat spoiled for me by the explanations appended. At the entrance, the viewer reads that “Vision is the least reliable of the senses”. What is the basis for this? Many, probably most, neurobiologists would argue exactly the opposite: it is the most reliable of the senses, a fact perhaps reflected in how much of our brain is devoted to vision.
We are then told that “What we see, or think we see, is not always how things are”. This is a profound misunderstanding of the workings of the brain – for what we see and experience is dictated by the organization of our brains, and is precisely how things are in perceptual reality, however that reality may depart from the “objective” reality. That is why, at my own exhibition at the Pecci Museum of Contemporary Art in Milan (Bianco su bianco: oltre Malevich), the visitor was welcomed with the following statement: “The only reality we experience is brain reality”.
When one looks at the Hering Illusion, the two straight lines, which are parallel, appear perceptually to be somewhat curved. The perceptual reality dominates even when one knows that the two lines are straight and strictly parallel. Or consider the rapid motion in the rings in Isia Leviant’s Enigma; to those who see the movement, there is no doubting its reality, even if there is no actual movement in the rings.
It never ceases to surprise me that we downgrade our true perceptual reality in favour of the “objective reality”; the former is dismissed as mere appearance, while the latter is taken to be always true. This gives the reality we experience a subservient place when in fact the only truths that we are able to experience are brain truths.
I am not saying anything particularly new here. Immanuel Kant said it long ago – that our knowledge of this world is a compound of the objective reality and the operations of the mind; we can therefore never know the thing as it is (das Ding an sich) because our only knowledge of the world is through the operations of the mind (brain). In discussing the philosophical importance of colour vision, Arthur Schopenhauer wrote of its importance for understanding the “Kantian doctrine of the likewise subjective, intellectual forms of all knowledge” – in other words that all knowledge is mediated through the operations of the brain.
This exhibition pretends to explain the visual sensory process through art. Thus, the exciting Chromosaturation of Carlos Cruz-Diez has appended to it the following: “since the retina perceives a wide range of colours simultaneously, experiencing these monochromatic situations causes visual disturbances”.
Almost everything in that statement is incorrect. There are no monochromatic lights in the exhibit (all the lights are broadband, although in some there may be a dominance of one waveband over the others), the retina does not “perceive” colours, and there is no “visual disturbance” but only visual sensory excitement, leaving one wondering where the “misty” environment supposedly induced comes from. The exhibit would have been better without these incorrect explanations. Why not call it an unusual visual experience instead?
Perhaps artists do not read about advances in science – why should they, after all? Perhaps we do not explain our findings properly. Whatever the real reasons, here is a good example of artists and curators trying to explain perceptual processes through artistic achievements and doing so very badly and, worse, inaccurately. It is exactly the reverse of what neuroesthetics has been falsely accused of doing, namely explaining works of art through neuroscience, even though that is not its aim (see this post and this post).
Hence, my advice is – go to this delightful exhibition and enjoy the exhibits as creative works of art. Many might want to do more than that; they might wonder what these exhibits tell us about the brain’s perceptual mechanism. But, please ignore the explanations appended to the exhibits – they say nothing about the visual process, or about the sensory brain or about perception, which is not to say that viewing these works does not raise questions about sensory processes.
Here, then, is an exhibition which inspires thinking about the operations of the brain. It is not what it pretends to be, namely an explanation of overall sensory processes. It is, however, a good illustration of how works of art can inspire neuroesthetic studies.
Sunday, March 3, 2013
A jury was unable to reach a verdict at a recent high-profile trial of the wife of a disgraced ex-politician, who had been accused of obstruction of justice. The jury came in for much ridicule for the questions they asked of the judge while deliberating. The lawyer for the prosecution,
“questioned whether the case could continue. “I don’t ever recollect getting to this stage in any trial – even for more complicated trials than this – and after two days of retirement a list of questions of this very basic kind illustrating at least some jurors don’t appear to have grasped it,” he said.”
I myself do not share the view that these were all silly or irrelevant questions, although one was somewhat funny and got a funny answer in return:
[Jury]: Can you define what is reasonable doubt?
[Judge]: A reasonable doubt is a doubt which is reasonable. These are ordinary English words that the law doesn’t allow me to help you with beyond reasonable written directions.
But my main interest is the jurors’ question that captured the headline in at least one daily newspaper:
[Jury]: Can a juror come to a verdict based on a reason that was not presented in court and has no facts or evidence to support it either from the prosecution or the defence?
[Judge]: The answer to that question is firmly no. That is because it would be completely contrary to the directions I have given.
But in assessing a situation we often rely on evidence that is not “factual” in the literal sense but may be factual in that it speaks to our faculties of judgment. It is absurd to believe that we do not frequently come to doubt whether someone is telling the truth simply by studying their body language, or the hesitation in their voice, or because, when we gaze into their eyes, there is something that jars with the ‘factual’ story being told.
I myself have been a juror on two occasions and can testify that such signals, though they are not facts presented in court and do not constitute evidence presented by the prosecution or the defence, nevertheless play an important role in reaching a decision. I believe that, in addition to the evidence presented in court, my co-jurors used the same or similar signals in reaching our common verdict.
I have also asked judges whether, when presiding over a case, they use visual cues which are not facts presented in court to reach a judgment as to whether the defendant is innocent or guilty. They have always answered that such cues play an important role. Of course in most such cases, the judge can leave the final verdict to the jury but there have been cases where the judge has disagreed with the jury. Somerset Maugham even wrote a very interesting short story about the consequences of such a divergence of views when the judge in a case meets the (acquitted) accused years later at a dinner party (I read it years ago and cannot now recall its name).
The final verdict must depend significantly upon whether a witness or the accused is telling the truth or lying and, in judging that, many factors besides the evidence presented in court come into play – in the form of signals that the brain receives and interprets but which do not constitute part of the body of evidence presented in court.
My point is that the brain is very good at picking up signals that do not constitute ‘evidence’ in the legal sense but are nevertheless vital in reaching judgment.
Nor am I saying something new. Everyone knows that we make judgments based on objectively non-factual but subjectively critical evidence, nearly every day.
Hence, to ridicule the jurors’ question given above is silly. It is indeed as silly as the statement attributed to Picasso, which I alluded to in a previous post, that “when we love a woman we don’t start by measuring her limbs.” The truth is of course otherwise; we actually start by making very detailed, often nearly instantaneous but perhaps partially unconscious, measurements of a great deal before we fall in love.
Posted by S.Z. at 12:35 PM
Thursday, January 10, 2013
Labeling something often suggests a haste to catalogue it and be done with it. It also implies some level of understanding of that which is labeled. But labels, especially pejorative ones, also commonly help to insulate one from the need to enquire further. Why would anyone who has labeled something as “trash”, for example, be bothered to read or learn anything further about it?
Every now and then, someone who is seemingly exasperated by the profusion of neurobiological facts describing a localization of some function or other in the brain, labels the whole enterprise as nothing more than the manifestation of the “new phrenology”. Nor does such labeling come only from those outside the field; sometimes the same dismissive label is used by neurobiologists themselves.
Essentially, (the old) phrenology supposed that mental faculties are localized in the brain and that an especially well developed mental faculty would result in a corresponding bump in the skull. By measurement of the skull and its bumps one would therefore be able to infer something about character, moral qualities and personality. Its originator was Franz Joseph Gall, who took refuge in France after his ideas had been disapproved of in Austria, and who was shocked when the Institut de France, at the instigation of Napoleon, did not elect him to membership.
There were some good reasons for dismissing phrenology and especially the use that was later made of it to promote racist ideas. But there is nothing wrong with its implicit assumption that the brain is the seat of the mind.
Those who label the tendency of modern neurobiological research to find that special cortical areas are associated with distinct functions as nothing more than a manifestation of the “new phrenology” do both the subject and themselves a disservice. That distinct cortical areas are associated with distinct functions does not mean that they can act in isolation; indeed all cortical areas have multiple inputs and outputs, both to other cortical zones as well as to sub-cortical stations and the healthy activity of an entire system is critical for a specialized area to execute its functions. It is trite to suppose, as some (non-scientists) have, that an area that is specialized for a special function, for example colour vision, can be isolated from the rest of the cortex or the brain and still mediate the experience of colour. No biologist has ever made such a claim and those outside biology who make it know nothing of biology or the brain.
It is equally untrue that the whole of the brain is involved in all its functions, as was believed in the 19th century. No one could possibly deny that there is an area of the brain that is specialized for vision or for some attributes of vision, such as visual motion; nor can anyone deny that there are areas of the brain that are specialized for audition. Nor would any reasonable person want to deny that lesions in these different zones of the cerebral cortex have different consequences.
More recently, with the advent of brain imaging studies, neurobiologists have shown that even the experience of subjective mental states does not mobilize the entire brain with equal intensity. Rather the results of such studies commonly show that a set of areas is especially involved in some subjective state or another. But activity in the areas comprising that set does not necessarily correlate only with one subjective state. An area of that set may do “double” or “multiple” duty and be active during the experience of several subjective states, even contradictory ones. But one nevertheless commonly finds that the set of areas especially active during some experiences is different from the set of areas active in another, or in other, subjective experiences, even if they share common areas.
This, of course, is a far cry from the claims of those who, usually anxious to stigmatize the findings of neurobiology, write of neurobiologists as having discovered the “love spot” or the “beauty spot” in the brain, or who dismiss them as nothing more than “modern phrenologists”.
The “unity” of mind
A fertile terrain for questioning the localizationist claim – that cortical areas with characteristic histologies and specific sets of inputs and outputs can be associated with special functions – lies in the so-called “unity of mind” which makes us act holistically.
But let those who ridicule the efforts of neurobiologists consider what has been the greatest success of cortical studies on the one hand and what has been its greatest failure on the other.
The greatest success – which almost links the history of cortical neurobiology in one unbroken thread – is the association of special functions with distinct cortical areas. This theme has run through cortical studies since the day in April 1861 when Broca announced that the third left frontal convolution is critical for the production of articulate language.
The greatest failure has been its inability to account for how these specialized areas “interact to provide the integration evident in thought and behavior” as the American neuropsychologist Karl Lashley put it in the 1930s. He also added, however, that just because the mind is a unit, it does not follow that the brain is a unit.
Those who dismiss all these “localizationist” studies as nothing more than a “modern phrenology” may want to ask why neurobiology has failed so miserably just when it might have been expected to succeed spectacularly in light of its findings.
Perhaps a good first step in this enquiry would be to stand back – even if momentarily – and ask whether the mind is an integrated unit after all. The answer may come as a surprise.
Thursday, January 3, 2013
In his last speech to the House of Lords as Archbishop of Canterbury, Rowan Williams lamented society’s attitude towards older people. He said: "It is assumptions about the basically passive character of the older population that foster attitudes of contempt and exasperation, and ultimately create a climate in which abuse occurs" and referred to estimates that a quarter of the older population is abused one way or another.
This comes against a background of ghastly stories of the mistreatment of older people by their nurses in old people’s homes, often verging on outright cruelty, stories that are repeated annually throughout the country and probably mirrored in many other countries as well.
I believe that the Archbishop showed wisdom and compassion in choosing the theme for his last speech and in speaking up for older people, but he did not go far enough in his analysis.
I have long wondered whether we are not biologically programmed to dislike and even hate older people for being older, just as we seem to be biologically programmed to love vulnerable and defenceless young children just because they are young. The latter merit our attention and care, while the former attract our avoidance and, where occasion permits, our cruelty and mistreatment.
I have no scientific evidence for this belief, though there might be such evidence somewhere. But if my analysis is correct, or turns out to be correct, then it is not that we have “assumptions about the basically passive character” of older people that leads to their mistreatment, as the Archbishop believes, but something biological and therefore much more difficult to control.
Of course, the hatred is probably more easily directed against those older people who are not members of the family, or at least the immediate family. But even in that context, older people are not immune. In the Prologue to his autobiography, Bertrand Russell wrote that one of the things that had made him suffer was the sight of “helpless old people a hated burden to their sons”.
If we are biologically programmed to dislike older people at best and hate them at worst, especially when they are not members of our family, then it is right, as the Archbishop suggested, that they should be given some kind of state protection, for example by appointing a national Older People’s Commissioner.
Society does, after all, police other biological urges that are difficult to control. It is perhaps time to introduce severe punishment for those who heap so much misery on the helpless in our society.
But that of course leaves another aspect which society simply cannot control. The dislike of old people, and their avoidance, are no doubt the source of much misery and alienation for them, and I just don’t know how society can combat that. We cannot, after all, legislate against dislike, though we should be able to do so against its consequences.