By Susan Greenfield
Adapted from her 2008 book, ID: The Quest For Identity In The 21st Century
Human identity, the idea that defines each and every one of us, could be facing an unprecedented crisis. It is a crisis that would threaten long-held notions of who we are, what we do and how we behave. It goes right to the heart—or the head—of us all. This crisis could reshape how we interact with each other, alter what makes us happy, and modify our capacity for reaching our full potential as individuals. And it’s caused by one simple fact: the human brain, that most sensitive of organs, is under threat from the modern world.
Unless we wake up to the damage that the gadget-filled, pharmaceutically enhanced 21st century is doing to our brains, we could be sleepwalking towards a future in which neurochip technology blurs the line between living and non-living machines, and between our bodies and the outside world. It would be a world where such devices could enhance our muscle power, or our senses, beyond the norm, and where we all take a daily cocktail of drugs to control our moods and performance. Already, an electronic chip is being developed that could allow a paralysed patient to move a robotic limb just by thinking about it. As for drug-manipulated moods, they’re already with us—although so far only to a medically prescribed extent. Increasing numbers of people already take Prozac for depression, Paxil as an antidote for shyness, and give Ritalin to children to improve their concentration.
But what if there were still more pills to enhance or “correct” a range of other specific mental functions? What would such aspirations to be “perfect” or “better” do to our notions of identity, and what would they do to those who could not get their hands on the pills? Would some finally have become more equal than others, as George Orwell always feared? Of course, there are benefits from technical progress—but there are great dangers as well, and I believe that we are seeing some of those today. I’m a neuroscientist and my day-to-day research at Oxford University strives for an ever greater understanding of—and therefore maybe, one day, a cure for—Alzheimer’s disease. But one vital fact I have learnt is that the brain is not the unchanging organ that we might imagine. Not only does it go on developing, changing and, in some tragic cases, eventually deteriorating with age; it is also substantially shaped by what we do to it and by the experience of daily life. When I say “shaped”, I’m not talking figuratively or metaphorically; I’m talking literally.
At a microcellular level, the infinitely complex network of nerve cells that makes up the constituent parts of the brain actually changes in response to certain experiences and stimuli. The brain, in other words, is malleable—not just in early childhood but right up to early adulthood, and, in certain instances, beyond. The surrounding environment has a huge impact both on the way our brains develop and on how each brain is transformed into a unique human mind.
Of course, there’s nothing new about that: human brains have been changing, adapting and developing in response to outside stimuli for centuries. What prompted me to write my book is that the pace of change in the outside environment and in the development of new technologies has increased dramatically. This will affect our brains over the next 100 years in ways we might never have imagined. Our brains are under the influence of an ever-expanding world of new technology: multichannel television, video games, MP3 players, the internet, wireless networks, Bluetooth links—the list goes on and on. But our modern brains are also having to adapt to other 21st century intrusions, some of which, such as prescribed drugs like Ritalin and Prozac, are supposed to be of benefit, and some of which, such as widely available illegal drugs like cannabis and heroin, are not. Electronic devices and pharmaceutical drugs alike have an impact on the microcellular structure and complex biochemistry of our brains. And that, in turn, affects our personality, our behaviour and our characteristics. In short, the modern world could well be altering our human identity.
Three hundred years ago, our notions of human identity were vastly simpler: we were defined by the family we were born into and our position within that family. Social advancement was nigh on impossible and the concept of “individuality” took a back seat. That only arrived with the Industrial Revolution, which for the first time offered rewards for initiative, ingenuity and ambition. Suddenly, people had their own life stories—ones which could be shaped by their own thoughts and actions. For the first time, individuals had a real sense of self. But with our brains now under such widespread attack from the modern world, there’s a danger that that cherished sense of self could be diminished or even lost.
Anyone who doubts the malleability of the adult brain should consider a startling piece of research conducted at Harvard Medical School. There, a group of adult volunteers, none of whom could previously play the piano, were split into three groups. The first group were taken into a room with a piano and given intensive piano practice for five days. The second group were taken into an identical room with an identical piano—but had nothing to do with the instrument at all. And the third group were taken into an identical room with an identical piano and were then told that for the next five days they had to just imagine they were practising piano exercises. The resultant brain scans were extraordinary. Not surprisingly, the brains of those who simply sat in the same room as the piano hadn’t changed at all. Equally unsurprising was the fact that those who had performed the piano exercises showed marked structural changes in the area of the brain associated with finger movement. But what was truly astonishing was that the group who had merely imagined doing the piano exercises showed changes in brain structure that were almost as pronounced as those in the group that had actually practised.
“The power of imagination” is not a metaphor, it seems; it’s real, and has a physical basis in your brain. Alas, no neuroscientist can explain how the sort of changes that the Harvard experimenters reported at the microcellular level translate into changes in character, personality or behaviour. But we don’t need to know that to realise that changes in brain structure and our higher thoughts and feelings are incontrovertibly linked.
What worries me is that if something as innocuous as imagining a piano lesson can bring about a visible physical change in brain structure, and therefore presumably some minor change in the way the aspiring player performs, what changes might long stints playing violent computer games bring about? That eternal teenage protest of “it’s only a game” certainly begins to ring alarmingly hollow. Already, it’s pretty clear that the screen-based, two-dimensional world that so many teenagers—and a growing number of adults—choose to inhabit is producing changes in behaviour. Attention spans are shorter, personal communication skills are reduced and there’s a marked reduction in the ability to think abstractly.
This games-driven generation interprets the world through screen-shaped eyes. It’s almost as if something hasn’t really happened until it’s been posted on Facebook or YouTube. Add that to the huge amount of personal information now stored on the internet—births, marriages, telephone numbers, credit ratings, holiday pictures—and it’s sometimes difficult to know where the boundaries of our individuality actually lie.
Only one thing is certain: those boundaries are weakening. And they could weaken further still if, and when, neurochip technology becomes more widely available. These tiny devices will take advantage of the discovery that nerve cells and silicon chips can happily co-exist, allowing an interface between the electronic world and the human body. One of my colleagues recently suggested that someone could be fitted with a cochlear implant (a device that converts sound waves into electronic impulses and enables the deaf to hear) and a skull-mounted microchip that converts brain waves into words (a prototype is under research). Then, if both devices were connected to a wireless network, we really would have arrived at the point which science fiction writers have been getting excited about for years. Mind reading! He was joking, but for how long the gag remains funny is far from clear.
Today’s technology is already producing a marked shift in the way we think and behave, particularly among the young. I mustn’t, however, be too censorious, because what I’m talking about is pleasure. For some, pleasure means wine, women and song; for others, more recently, sex, drugs and rock ’n’ roll; and for millions today, endless hours at the computer console. But whatever your particular variety of pleasure (and energetic sport needs to be added to the list), it has long been accepted that “pure” pleasure—that is to say, activity during which you truly “let yourself go”—is part of the diverse portfolio of normal human life. Until now, that is.
Now, coinciding with the moment when technology and pharmaceutical companies are finding ever more ways to have a direct influence on the human brain, pleasure is becoming the sole be-all and end-all of many lives, especially among the young. We could be raising a hedonistic generation who live only in the thrill of the computer-generated moment, and are in distinct danger of detaching themselves from what the rest of us would consider the real world. This is a trend that worries me profoundly. For as any alcoholic or drug addict will tell you, nobody can be trapped in the moment of pleasure forever. Sooner or later, you have to come down.
I’m certainly not saying all video games are addictive (as yet, there is not enough research to back that up), and I genuinely welcome the new generation of “brain-training” computer games aimed at keeping the little grey cells active for longer. As my Alzheimer’s research has shown me, when it comes to higher brain function, it’s clear that there is some truth in the adage “use it or lose it.” However, playing certain games can mimic addiction, and the heaviest users of these games might soon begin to do a pretty good impersonation of an addict. Throw in circumstantial evidence linking a sharp rise in diagnoses of ADHD (Attention Deficit Hyperactivity Disorder), and the associated three-fold increase in Ritalin prescriptions over the past ten years, to the boom in computer games, and you have an immensely worrying scenario.
But we mustn’t be too pessimistic about the future. It may sound frighteningly Orwellian, but there may be advantages to be gained from our growing understanding of the human brain’s tremendous plasticity. What if we could create an environment that would allow the brain to develop in a way that was seen to be of universal benefit? I’m not convinced that scientists will ever find a way of manipulating the brain to make us all much cleverer (it would probably be cheaper and far more effective to manipulate the education system). Nor do I believe that we can somehow be made much happier—not, at least, without anaesthetising ourselves against the sadness and misery that are part and parcel of the human condition. When someone I love dies, I still want to be able to cry.
But I do, paradoxically, see potential in one particular direction. I think it possible that we might one day be able to harness outside stimuli in such a way that creativity—surely the ultimate expression of individuality—is actually boosted rather than diminished. I am optimistic and excited by what future research will reveal about the workings of the human brain, and the extraordinary process by which it is translated into a uniquely individual mind. But I’m also concerned that we seem to be so oblivious to the dangers that are already upon us.
A debate about those dangers must start now. Identity, the very essence of what it is to be human, is open to change—both good and bad. Our children, and certainly our grandchildren, will not thank us if we put off discussion much longer.