Unweaving The Rainbow, by Richard Dawkins. Scotland on Sunday, November 1, 1998. Review by Andrew Crumey.

NEWTON'S explanation of the rainbow, wrote Keats, sent charm flying at the touch of cold philosophy. Many fear that by solving the universe's mysteries we destroy its wonder; but Dawkins, departing from his usual Darwinian theme, here argues that science is just as wonderful as any poetry. I happen to agree, but I fear his book will merely reinforce the prejudices of those it seeks to convert.

Whereas previously Dawkins has been content to describe the facts then leave us to supply our own gasps of amazement, now he sets out to highlight the "poetry" of it all; and the effect, sadly, is as deadening as that of a comedian intent on explaining why his jokes should make us laugh.

An introductory chapter whirls us through the mind-blowing vastness, emptiness and arbitrariness of the cosmos. In his determination to make our jaws drop, Dawkins compares time to successively thicker books. Human civilisation fills a tome you could sit on; the discovery of fire takes us as high as the Statue of Liberty. Very nice, but unfortunately the bookshelf that stretches to the trilobite is itself too vast to contemplate except by further analogy (involving Manhattan island and the Library of Congress), putting me in mind not so much of Keatsian "dreams of Gods" as of those The Sky at Night programmes in which a melon would be the Sun, the Earth a pea in a corner of the studio, and Alpha Centauri a grubby golf ball sitting mournfully on the hard shoulder of the M25. Realising it all might be getting out of hand, Dawkins resorts to quoting Sir Thomas Browne, and we get the feeling that when it comes to putting the "gee-whizz" factor into words, writers of literary imagination might have their uses after all.
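(To see why the shelf defeats the imagination, try some round figures of my own, not Dawkins's: if the discovery of fire, say half a million years ago, stands at the height of the Statue of Liberty, about 93 metres, then the trilobites, roughly 540 million years back, are

\frac{540}{0.5} \times 93\,\text{m} \approx 1{,}080 \times 93\,\text{m} \approx 100\,\text{km}

of bookshelf away - about the height at which space is usually said to begin, which is precisely the point at which the analogy gives up.)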

A couple of chapters largely devoted to the basic physics of light and sound take us into realms far removed from the evolutionary patch which is Dawkins' natural environment. The story he tells, for the most part, is one you'll find in any school text-book; Dawkins shines no brighter. In fact, his dreary account of the rainbow left me feeling considerable sympathy for Keats; it isn't even particularly accurate. And in trying to explain the behaviour of light, Dawkins admits it's all to do with quantum theory, and beyond him. Refreshingly honest, but not particularly entertaining.

It's only when he returns to biology that we suddenly see him back in his element; we're reminded of the attention to detail, the knack for finding subtleties of which only the expert could be aware, that was such a delight in earlier books from The Selfish Gene to Climbing Mount Improbable. Dawkins can turn the auditory mechanism of a grasshopper into a miniature drama of ingenuity; why then does an intellectual organ such as his, so perfectly adapted to its own particular niche, feel the need to gasp and wallow in regions of popular science where the competition is already so fierce? Reading Dawkins explain probability theory is like listening to Pavarotti play banjo.

But whereas Dawkins on physics is dull, Dawkins on poetry is downright embarrassing. Auden gets the thumbs up for having had the decency to acknowledge that among scientists he felt like a "shabby curate… in a drawing room full of dukes." So he's okay, then; which is more than can be said for Blake with his "Newton, sheath'd in dismal steel." "What a waste of poetic talent," is Dawkins' lofty assessment.

By now we're beginning to get the real message of this book.

Dawkins, among artists, feels a bit of a shabby curate too.

He's weary of the disdain of high-table humanists, he's sick of trendy relativists who want to dismiss science as a cultural myth. He even bothers to quote and rebut Bernard Levin, as if this were either difficult or necessary. Yet Dawkins' hectoring response to his various critics, burdened by the ever-present weight of his own unshakeable conviction, is an own-goal reminiscent of the desperate assertion made by lonely hearts aspirants that they have a great sense of humour.

Even when he gets off the soap box and talks biology, he still allows himself a distinctly personalised attack on Stephen Jay Gould.

In fact, throughout this bewilderingly disorganised book, Dawkins so often adopts a fight-or-flight posture of snarling defensiveness that much of it simply reads like some 'Angry of Oxford' venting spleen. We see the same obsession, the same gratuitous assumption that the reader ought to share his outrage, that characterises the battier margins of 'Letters to the Editor' pages. If only we were all better scientists, he pleads (in other words, more like Dawkins), Keats would have been a better poet, judges would be better judges, we'd all be better people. It's a theme that quickly becomes monotonous.

At least I can understand now why so many people regard scientists as unimaginative, unfeeling and humourless.

Dawkins' wit at times is not so much donnish as schoolmasterish. Ever the pedantic killjoy, he mentions a conversation with a six-year-old for whom he began to calculate how little time Santa Claus could spend on a single chimney, given he has to visit every house in the world before Christmas morning.
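(The arithmetic, with illustrative numbers of my own rather than whatever Dawkins offered the child, goes roughly like this: give Santa the full benefit of the time zones, about 31 hours of darkness, and a couple of billion households to visit, and each chimney gets

\frac{31 \times 3600\,\text{s}}{2 \times 10^{9}} \approx 5.6 \times 10^{-5}\,\text{s},

a little over fifty microseconds - which is, of course, exactly the sort of thing a six-year-old longs to hear.)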

Dawkins presents himself as a model of rationality, and there's nothing at all wrong with that. But he goes much too far. His is a world in which crossing your fingers for luck amounts almost to a crime against reason (he seeks to explain the phenomenon on the basis of supposed reinforcements from times when the trick has worked by chance). Yes, finger-crossing is completely irrational; so too, for that matter, is crying (tears, he says, are an evolutionary mystery as yet unexplained). These things may make no sense; that doesn't imply they have no meaning.

Dawkins, though, just doesn't seem to get it; and this is what makes his book such a depressing parody of the scientific spirit, such a perfect example of the cold philosophy Keats bemoaned, and which is seen at its most disturbing in an account of DNA fingerprinting. Forensic samples, we're told, can be extracted "from a bloodstain on a carpet, from semen inside a rape victim, from a crust of dried nasal mucus on a handkerchief".

One can't help feeling that the second object on this list is equal, for Dawkins, to a bit of carpet or a snotty hankie. I suppose there's no "rational" reason why she shouldn't be.

But that's why we need poets, to help us be the kind of people for whom such things matter.


The Feeling Of What Happens, by Antonio Damasio. Scotland on Sunday, January 23, 2000. Review by Andrew Crumey.

THAT you can read these words is a miracle; even stranger is that you're aware that you're reading, aware that you're you. Consciousness is as much an unsolved problem today as it was in Aristotle's time, but a solution may be close.

Drawing on years of research as a clinical neurologist, Professor Damasio teases apart aspects of mental life to reveal the various constituents of our elusive sense of self. As a student he was taught that consciousness depends on language, but he gives examples of brain-damaged patients totally lacking in verbal comprehension yet still clearly able to think.

Memory, too, is held by some to be vital to personal identity, and so Damasio introduces David, who can no longer remember anything for more than a minute. David has lost all sense of "autobiographical self"; he can't recall anything of his former life. But what remains is his "core self", a friendly and sociable person whose window on the world extends roughly 45 seconds into the past.

Damasio associates "extended" and "core" consciousness with specific parts of the brain, and underlying these he proposes "proto-self", a shifting collection of neural patterns representing the information our senses constantly receive.

We aren't aware of proto-self, yet the processing of its data ultimately gives rise to our sense of being.

The neurological details of these various systems are heavy going for the lay reader, and the emergence of consciousness from them looks a bit like pulling a rabbit from a hat.

Explaining consciousness is rather like explaining emotion; scientists may be able to identify the brain's feel-good sectors, but they can never say why happiness feels the way it does. Damasio is acutely aware of such philosophical issues, however, and his intricate journey through the brain has been praised by fellow specialists as the best account to date of consciousness from a biological perspective.

More attractive to the general reader, though, are the case histories of people whose damaged sense of self, corresponding to damaged areas of the brain, has enabled such insights to be made.

Damasio writes beautifully about his patients with a humanity that goes far beyond good bedside manner, and nowhere is he more moving than in an account of a friend with Alzheimer's. Watching him stare without any sign of recognition at a photograph of his own wife, Damasio reminds us that Alzheimer's not only affects memory, but also progressively pares consciousness to its core.

Damasio demonstrates that remembering something is not the same as knowing that you remember it; having an emotion is not the same as feeling that emotion.

Patients in a persistent vegetative state show signs of recognising familiar faces, yet appear unaware that they're seeing anything. The amnesiac David is suspicious of people who have previously been rude to him, though he doesn't know who they are or why he should feel the way he does. And having identified the regions of the brain responsible for such phenomena, Damasio is able to infer that chimps and perhaps dogs are self-aware, though human babies don't fully attain this state until the age of 18 months.

One comes away from this rewarding, if demanding, book with a deepened awareness of the mysteries of the mind, and an enormous sense of gratitude that for most of us consciousness is something we can take for granted.


It Ain't Necessarily So, by Richard Lewontin. Scotland on Sunday, July 2, 2000. Review by Andrew Crumey.

THIS is "the most important scientific effort mankind has ever mounted, and that includes splitting the atom and going to the moon." So said a spokesman for the international Human Genome Project announcing the completion of the "first draft" of the DNA blueprint, last week.

I first met the most famous initials on the planet some years ago while visiting a biochemist friend in the research laboratory where she worked. Treading gingerly past the gleaming test tubes, high-tech equipment and other paraphernalia that filled the place, I noticed a black plastic bucket on the floor, filled with dirty grey liquid. "What's that?" I asked, wondering if it was something the cleaner had forgotten to empty. "Oh," said my friend airily, "that's human DNA."

The key to life itself! Was it ethical to leave such sacred material sloshing in a bucket? Might it even reproduce? Harvard geneticist Richard Lewontin, in a book that provides a valuable antidote to the current hyperbole surrounding the Genome Project, GM food and human cloning, offers reassurance. "DNA is a dead molecule, among the most nonreactive, chemically inert molecules in the living world."

The rise of this dead molecule from scientific conundrum to cultural icon is the central theme of the essays making up his book. Published in The New York Review over a span of nearly 20 years and covering major topics in biology from Darwin to Dolly the sheep, they present the maverick professor's case against the prevailing view - championed by EO Wilson and Richard Dawkins - that genes determine everything about us, from our appearance to our behaviour.

The core chapter concerns the mapping of the human genome; a "cultural achievement on a par with the works of Shakespeare, or the pictures of Rembrandt, or the music of Wagner," according to an excited European spokesman for the project. Sounds wonderful, but what exactly are the scientists doing? How, for instance, can they find a single DNA blueprint if everybody's genes are supposedly different? Whose genome are they mapping? Lewontin explains: "The final catalog of 'the' human DNA sequence will be a mosaic of some hypothetical average person corresponding to no one."

The grand printout, due in 2003, will be a string of three billion As, Cs, Gs and Ts representing the mingled genetic code of some 10 or 20 anonymous donors. The reason why their DNA should bear any relevance to yours or mine is that everybody's code is almost identical. As Lewontin puts it: "The DNA I got from my mother differs by about one tenth of one per cent, or about 3,000,000 nucleotides, from the DNA I got from my father, and I differ by about that much from any other human being."
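(The numbers in that quotation are at least self-consistent: a tenth of one per cent of a three-billion-letter genome is

3 \times 10^{9} \times 0.001 = 3 \times 10^{6}

nucleotides, the three million differences Lewontin mentions.)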

It's in that crucial tenth of a percent that human variation lies, not in the remaining billions of "junk" letters; but a Diversity Project to study this receives a relatively small budget, and its sampling may be too limited to be really useful.

What about the promised breakthroughs in medicine? Lewontin warns: "Over and over, reports of first isolated successes of some form of DNA therapy appear in popular media, but the prudent reader should await the second report before beginning to invest either psychic or material capital in the proposed treatment."

The Human Genome Project is a magnificent piece of basic science, as awe-inspiring as any photogenic galaxy captured by the Hubble Telescope. It may prove to have about as much relevance to the lives of you and me. Finding the gene responsible for a hereditary disorder is quite different from finding a cure; and while any new therapy will be wonderful news, it is worth remembering that genetic disorders are rare in the population as a whole: "One in 2,300 births for cystic fibrosis, one in 3,000 for Duchenne's muscular dystrophy, one in 10,000 for Huntington's disease." As for the biggest killers, "cardiovascular disease has utterly defied genetic analysis," and while some cancers are known to be genetic, nobody knows how large a proportion they represent, or how the defective genes do their damage.

Given that the difference between your DNA and mine is apparently so slight, you might be wondering how DNA fingerprinting can be so infallible. According to Lewontin, it isn't. The perfect forensic test of identity is as elusive as the perfect smart bomb, and errors arise through contamination or flawed methodology. But while the Human Genome Project argues that its work is useful because everybody's DNA is much the same, the FBI argue that DNA fingerprinting works because everybody's DNA is unique.

In both cases there are big vested interests; law enforcement agencies want to prosecute people, and the bottom line of the Human Genome Project is summed up by Lewontin in a single sentence: "No prominent molecular biologist of my acquaintance is without a financial stake in the biotechnology business."

Cutting up DNA - anybody's DNA - and identifying the constituent genes is worth doing because those genes can be patented, and every one of them is a potential drug in the hypothetical future world of gene therapy. Already genomics companies are seen as the new dotcoms, their share prices boosted by media excitement. Yet: "It may turn out, in the end, that the providers of capital have been as deluded by the hype of the human genome as has anyone else."

Where did all this "hype" start? Lewontin places the origin of molecular biology not in Watson and Crick's 1953 announcement of the double helix, but in a book called What is Life? published nine years earlier by the physicist Erwin Schrodinger; the first real attempt to fulfil a 19th-century dream, expressed by Emile Zola in the preface to his Rougon-Macquart novels, that "heredity has its laws, just like gravitation".

While the power of the atom preoccupied most minds during the Fifties and Sixties, a growing number of physicists and chemists followed Schrodinger's lead in switching to biology. With the publication in 1976 of Richard Dawkins' The Selfish Gene, a new cultural fetish was born. Physics retreated to the exotic and bewildering world of superstrings, leaving the stage clear for biologists to make their own ever-larger claims about unlocking nature's secrets. The logo of the Human Genome Project shows "a human body wrapped Laocoon-like in the serpent coils of DNA and surrounded by the motto 'Engineering, Chemistry, Biology, Physics, Mathematics.' The Genome Project is the nexus of all sciences."

But do genes really determine everything? Lewontin suggests we look at our own hands. The same genes made both, but the fingerprints on the left are unlike those on the right; a result of "developmental noise." Identical twins have identical genes - they're natural clones - but possess completely different sets of fingerprints, not to mention different personalities. This should serve as a warning to anyone tempted to explain fingerprints, or any other human trait, through one of those Panglossian arguments that Lewontin's colleague Stephen Jay Gould calls "Just So" stories; claiming, perhaps, that the ridges have been carefully shaped by evolution to provide us with the best possible grip.

An artificial human clone, if one were ever made, would differ from its donor even more than an identical twin, because of the influence on its development of the egg that hosts it. The ethical problems of human cloning, Lewontin argues, come down to the ethical problems of creating artificial twins. As for GM foods, they hardly get a mention in Lewontin's book because they simply aren't an issue in the US.

But ask yourself how you would react to news that genes from a cat had been transferred to a baboon. This grotesque piece of Frankenstein science was carried out not by white-coated boffins, but rather at some recent point in evolutionary time by an agent that was probably a mosquito. Genes are naturally transposable, and the tree of life is really a tangled bush.

Such complexities do not yield good soundbites, but if we want our lives to be improved rather than threatened by biotechnology then we all have to try and understand the issues involved as best we can, and question the assumptions being made on our behalf by scientists, politicians and pundits. This book may not give you the complete picture, but it's a good place to start.


Driving Mr Albert by Michael Paterniti. Scotland on Sunday, August 13, 2000. Review by Andrew Crumey.

PATERNITI heard it as an urban legend: Einstein's brain was removed for study after he died, then languished in a jar in somebody's garage. Not only is the legend true, but Paterniti wound up chauffeuring the brain and its custodian across America.

Thomas Harvey was the pathologist who performed the autopsy on Einstein in 1955. It was supposed to be a routine search for the cause of death, but Einstein was no ordinary patient, and Harvey, without authorisation, took the brain.

The incident quickly became an embarrassment to Princeton Hospital. Harvey's bosses laid claim to the brain, but Harvey held on to it and was eventually sacked. Apart from one or two scientific papers of dubious value, the promised research never materialised. Instead, the brain became what Paterniti describes as "the ultimate Faberge egg".

The whole organ is not up for sale, though, as Harvey quickly dissected it. One fragment belongs to a Japanese collector who showed his trophy to Paterniti, explaining: "Peace of brain bring harmony." To Paterniti the tiny object looked "less like a brain than a sneeze".

After leaving medicine, Harvey took a job in a plastics factory, went through several relationships and lived on his dubious fame. Soon after first meeting him, Paterniti is offered a brief glimpse of his treasure. "(Harvey) pulled out two large glass cookie jars full of what looked to be chunks of chicken in a golden broth: Einstein's brain chopped into pieces ranging from the size of a turkey neck to a dime."

Harvey needs to attend to some business "out West", Paterniti offers to drive him, and the two set off with the brain in the car boot.

This road-movie element is what gives the book its appeal. Paterniti tells us a little about Einstein's life and work, but what interests him is not so much the man's ideas as his fame. As such, the pickled cargo could be any of the numerous other celebrity brains, such as Walt Whitman's and Mussolini's, that have been harvested in the past.

As for Harvey, Paterniti finds him affable but enigmatic, regarding Einstein as an old friend, though their only meeting in life was when Harvey once took blood from the ailing physicist. Equally fleeting was Harvey's friendship with William Burroughs, whom they visit en route. Like much in this book, the event doesn't quite live up to its billing. More amusing is a visit to the home of a 19th-century American eccentric embalmed for public display; a sight which fills Harvey with delight.

The most compelling encounter is with Einstein's granddaughter, Evelyn, cancer-stricken and ready to make peace with her foe. Her quiet dignity puts Harvey to shame; he makes a swift exit, leaving the brain behind. Perhaps he wants Evelyn to keep it, but after lifting some of the wax-coated pieces in her hand she decides she can't.

Harvey's main engagement "out West" is a talk at a school arranged by a fellow Einstein freak. The kids shuffle impatiently until Harvey finally brings out the goods, prompting one to ask: "Yeah, but like, what's the point?" That's a question many will ask during this strange if diverting book. Harvey eventually relinquished ownership of the brain. Einstein's grey matter is now back in Princeton Hospital, where the chief of pathology muses about one day extracting its DNA.


Stardust, by John Gribbin. Scotland on Sunday, September 17, 2000. Review by Andrew Crumey.

THE universe, as we all know, began with a bang. What came before is a matter for theologians, what happened during the first hundred-thousandth of a second is a matter of current research. But after that incredibly short time, the infant universe was already cool enough to obey known laws of physics, and John Gribbin's book describes how the primordial fireball (an inaccurate but convenient term) went on to become the stuff that makes you and me and everything else.

Think, for instance, of a glass of water. You might wonder how many people it has passed through before you drink it, but the fact is that there could well be molecules in there that have been through millions of people, not to mention countless animals going back to dinosaurs and beyond.

The world is a great recycler, as is the universe. The hydrogen atoms in the water (or more precisely, their nuclei) are only a tiny fraction of a second younger than the universe itself. On the other hand, the larger oxygen atoms partnered with them could not have coalesced during the universe's hot early stages.

How heavier elements arose - including the carbon and nitrogen which are essential for life - was a major problem in 20th century astrophysics.

Some experts doubted that there ever was a Big Bang. The name was actually coined as a term of derision by the theory's arch enemy, Fred Hoyle, in a radio broadcast in 1950.

However, evidence steadily mounted. Edwin Hubble had already shown that distant galaxies are moving rapidly away from us, but the best proof came from observation of low energy microwaves filling the sky, predicted by Gamow and others as the warmth that must remain from the Big Bang.

Gamow also realised that hydrogen and helium would make up nearly all of the universe in proportions he accurately predicted. Like it or not, the Big Bang is one of the best confirmed theories in science.

We only ever see helium in children's balloons; it is too unreactive to play any part in the living world. Yet together with hydrogen, it accounts for 99% of the universe. The remaining 1%, which includes most of what we are made of, was left by Gamow for others to explain.

So where do atoms come from? They were manufactured in stars as a by-product of the nuclear reactions that make them shine; the heaviest of our elements came from exploding supernovae. We are, in what is something of a popular science cliche, made of stardust.

The key work was done in the Fifties and Sixties, though it was not until 1983 that the Nobel Foundation got round to awarding a prize to just one member of a team that had also included Hoyle, who arguably did most of the work.

Hoyle believes he was snubbed for having criticised the Foundation's earlier award for the discovery of pulsars (pulsating stars) not to Jocelyn Bell, who found them, but to her male supervisor. The world of astrophysics is clearly as prone to pettiness as every other walk of life.

Gribbin, whose own career in astrophysics is long and distinguished, proves in this book to be as adept a recycler as the universe itself, given that he has already touched on many of its themes in earlier works.

Nevertheless, what he provides here is a concise, readable account of a truly incredible story: the scientific search for our origins among the stars.


Homage To Gaia, by James Lovelock. Scotland on Sunday, October 1, 2000. Review by Andrew Crumey.

AS A government scientist during the 1950s, James Lovelock froze hamsters alive. His hope was to find a way of preserving human blood supplies, but he was disturbed by the hamsters' suffering while being revived. A humane way to waken them, he thought, would be to warm their hearts gently with radio signals, so he built a contraption to do this out of spare radar parts. Not only did it work, but he found he could use it to bake potatoes. He had made what was possibly the world's first microwave oven.

Most readers of this autobiography will mainly be interested in Gaia, the holistic Earth theory (named by his friend William Golding) for which Lovelock is best known. But before arriving at his controversial idea, Lovelock had a long and distinguished career in "mainstream" research, during which he applied his inventiveness to many problems.

One was the common cold. "Coughs and sneezes spread diseases" had been the wartime slogan, but Lovelock found that touching contaminated surfaces was vastly more effective in transmitting the disease. Remember this next time you shake hands with a stranger, or open a door in a public place.

He also thought he should investigate what everyone thinks: that colds are caused by being chilled. The need to measure air currents led him to invent the electron capture detector (ECD) which, as well as enabling him to show cold does not cause colds (but only makes infected mucus droplets survive longer between people), also proved good at detecting chemicals with a strong capacity for grabbing electrons. These chemicals, he found, were usually poisonous or carcinogenic. He had inadvertently made an ideal device for monitoring pollution.

One of the first discoveries made with it was the ubiquitous presence in food of the pesticide DDT. Widely publicised by Rachel Carson's book Silent Spring, this marked the beginning of the environmental movement. The ECD also alerted the world to the rise of CFCs, responsible for damaging the ozone layer.

Preferring independent research to academic life, Lovelock made his living from lucrative consultancy work. An involvement with NASA's Viking mission made him ponder the tell-tale signs of Martian life. With typical lateral thinking, he realised a particular kind of chemical imbalance in the atmosphere would be the most likely avenue.

This realisation, that a planet and its life forms might influence each other in a self-regulatory way, was the cornerstone of Gaia theory. The Earth itself, according to Lovelock, can be thought of almost as a living organism. Despite obvious New Age appeal, the theory is intended as hard science, dealing with problems such as cloud formation, land erosion and ocean composition. One outstanding mystery, he says, is why the level of salt in the sea remains roughly constant.

Despite support from many equally distinguished scientists, Lovelock has also faced ridicule. His theory's romantic name, and his independent status, have prompted some critics to go beyond the level of petulance considered normal in academic life. Public funding has been sparse because, says Lovelock, the relevant committees are often filled with people who prefer handing out money to their friends.

Even a sceptic, though, will find stimulation in this account of Lovelock's extraordinary scientific life. The personal is also covered, though with a detachment that can be almost unsettling. What comes across is a man who has lived for his passion, at whatever personal cost, and who, now in his eighties, can look back on many remarkable achievements. Whether he will ultimately be remembered as maverick or genius, only time will tell.


E=mc2, by David Bodanis. Scotland on Sunday, October 8, 2000. Review by Andrew Crumey.

SCIENTISTS in Geneva think they may have found a particle called the Higgs Boson. If the findings are confirmed (an announcement is due next month) we can expect the same sort of media fanfare that accompanied the Human Genome Project - and a similar level of public bafflement. The Higgs "mechanism", named in honour of Edinburgh physicist Peter Higgs, is in fact a way of explaining why certain things have mass. As such, it represents the latest descendant of the equation whose "biography" is presented to us in David Bodanis's new book, E=mc2.

Bodanis cites as his inspiration an interview in which actress Cameron Diaz said she wished she knew what Einstein's equation really means. Any Star Trek fan could tell her that energy (E) and mass (m) are equivalent; but Diaz probably knows that already. She asks what it means. An equation, like a novel, is open to interpretation.

Take that innocent word "mass", for example. For many people it means "matter"; however, this was not what Einstein meant by it. Imagine trying to push a car. You heave hard at first, then things get easier as it rolls. The car's reluctance to change speed is its mass. If the car was in outer space it would be weightless, but just as hard to push. It would still have mass.

As you push the car, you give it energy. The more you give, the faster it moves. But Einstein's equation says that energy and mass are equivalent. So as the car speeds up, its mass increases along with its energy. We never notice this because the equivalence is lop-sided; a huge amount of energy equals a tiny mass, and to make a car gain noticeably in mass you would have to push it at several million miles per hour. Raise its energy still further, and eventually no amount of effort will make it go any faster. This ultimate speed limit is c, the speed of light.
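(For the record, the standard formula behind all this - not a gloss of Bodanis's own wording, just the physics - says that a body of rest mass m_0 moving at speed v carries energy

E = \gamma m_0 c^2, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}.

At three million miles per hour, less than half of one per cent of the speed of light, \gamma exceeds 1 by only about one part in a hundred thousand, which is why a speeding car's extra mass goes unnoticed; and as v approaches c, \gamma grows without bound, which is the speed limit itself.)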

Here is Bodanis's version, in which a space shuttle approaches c: "So what happens? Think of frat boys jammed into a phone booth, their faces squashed against the glass walls. Think of a parade balloon, with an air hose pumping into it that can't be turned off. The whole balloon starts swelling, far beyond any size for which it was intended. The same thing would happen to the shuttle."

Oh no it wouldn't. Bodanis's breathless assertion that the shuttle "swells" is simply wrong. Mass increases, not size - but Bodanis repeats the howler, insisting that in particle accelerators: "protons end up 430 times bigger than their original size".
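(The number itself is respectable physics if read the right way - and I am guessing, not quoting Bodanis, about where it comes from: a factor of 430 is a relativistic \gamma, so a proton of rest energy about 938 MeV driven to

E = \gamma m_0 c^2 \approx 430 \times 938\,\text{MeV} \approx 400\,\text{GeV}

has 430 times its original energy, and hence effective mass; its size is another matter entirely.)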

At this point (page 52) I was more than ready to take my copy of E=mc2 to Oxfam. Then I happened to flick to the end, where a long series of "Notes" contains the calm comment: "The term swelling has to be thought of only as a metaphor. The shuttle - or a proton, or any other object - doesn't get fatter..."

So why did Bodanis say it did? With its blatant inaccuracies and deliberate contradictions, the book began to seem more like a postmodernist novel. Unreliable narrators in fiction are one thing - but in popular science?

Intrigued, I checked Bodanis's credentials inside the back cover. Associate Member of St Antony's College, Oxford, and teacher of "An Intellectual Tool-Kit: Selected Topics in Social Enquiry, Aristotle to Chaos Theory." A very clever chap, in other words. Now a picture was emerging. The first part of his book (all that most people will read) merely looks as though it has been written by someone who hasn't a clue about relativity. In the notes we find a wiser alter-ego. Oxfam, I decided, could wait a little.

Bodanis follows the post-Longitude formula of focusing on historical personalities and avoiding anything requiring too much thought on the reader's part. He gives us the history of every term in Einstein's equation, including the equals sign (invented by Robert Recorde), and c (for celeritas). His story of mass centres on Lavoisier, the French chemist who lost his head in the Revolution, and it is a cracking tale.

For energy, we turn to Voltaire and his lover Madame du Châtelet, a brilliant scientist who died in childbirth. Bodanis's description of this remarkable woman is: "Imagine the actress Geena Davis, Mensa member and onetime action-film star, trapped in the early eighteenth century."

Bodanis says that Newton tried to define energy but got it wrong, Leibniz got it right, and Madame du Châtelet told the world about it. The notes unpick the mistakes, though even here there is no word on James Prescott Joule, the 19th-century physicist largely responsible for defining energy as Einstein understood it: as a convertible currency that can mean heat, motion and anything else capable of moving a car, driving a machine or generally doing "work." This omission of key players in favour of romantic vignettes is characteristic of a book in which Bodanis - curiously for someone with an interest in "social enquiry" - repeatedly subscribes to the lone-genius view of science. Einstein, isolated in his Bern patent office, invented an unprecedented theory that nobody could understand. Wrong again.

Mass-energy equivalence goes back at least to 1881, when Thomson showed that a particle's electric field contributes to its overall mass. In 1904 (the year before Einstein published relativity), a now-forgotten Austrian physicist called Hasenöhrl published a special-case version of E=mc2 which was later trumpeted by the Nazis as an Aryan victory. Einstein's real achievement was not so much the equation itself as the new philosophy of space and time that led to it.

Bodanis takes E=mc2 into "adulthood" with the story of the atomic bomb, and here the book gets very entertaining, being a tale of adventure, love and betrayal against a wartime backdrop. Einstein realised in 1905 that radioactivity might be a way of verifying his equation, but neither he nor most physicists foresaw what was to come. Instead it was left to science fiction; HG Wells's 1914 novel The World Set Free describes the discovery - in 1933 - of atomic bombs subsequently used in a war between Britain and Germany. The book was read by Leo Szilard, another key figure in the life of E=mc2 whom Bodanis ignores completely.

Szilard worked with Einstein in Berlin; they took out a patent together on a new kind of refrigerator. In 1934 Szilard sought a patent for a more serious invention - the nuclear chain reaction. He worked on it in America with Enrico Fermi, and in 1939 Szilard and Eugene Wigner visited Einstein to explain the principle of the atomic bomb. Einstein's response was: "I never thought of that."

Szilard, together with Wigner and Edward Teller, penned the famous letters to President Roosevelt which Einstein merely signed. Being politically suspect, Einstein was never allowed any further involvement with the bomb, whose "father", if such a title can be given to anyone, was Leo Szilard.

Why is such a crucial figure left out of the picture? Not even the notes could help me this time.

In the final acknowledgements Bodanis thanks more than 20 people who offered him advice. If Einstein in 1905 had asked so many for an opinion, we might never have heard of relativity. My guess is that what started out as a useful book fell victim to readers (some of them perhaps in literary agencies, publishing houses or the media) who demand only an easy read about interesting personalities, with little regard for science, and with any sales-threatening "ideas" shoved at the end or dumped in cyberspace at www.davidbodanis.com.

Cameron Diaz, and the rest of us, deserve better. Among the book-buying public there is an enormous, unquenchable thirst for hard information about the deepest mysteries of the universe. Why does a car have mass in the first place? Even Einstein never solved that one. Perhaps news from Geneva next month will take us a little nearer to the answer.


Night Falls Fast, by Kay Redfield Jamison. Scotland on Sunday, December 3, 2000. Review by Andrew Crumey.

ACROSS the world, roughly 2,000 people will kill themselves today. Suicide is a subject many would prefer not to talk about, but it is one on which Kay Redfield Jamison can write with authority, since she herself battled with manic-depressive illness and tried to take her own life before going on to become professor of psychiatry at Johns Hopkins University. Anyone who thinks doctors should never get ill might regard Jamison as being dubiously qualified for her profession; but her experiences have surely made her better able than most to empathise with those for whom death seems the only way out.

Jamison's ordeal formed the basis of her memoir An Unquiet Mind; the present book draws less directly on her own life, being instead an investigation into the nature and history of suicide. Medical annals offer a stark catalogue of desperation. People have swallowed knives, boiling water, dynamite. One man managed to stab himself to death with his own spectacles, and in the US there is now a steady rise of 'suicide by cop', with people raising guns at police officers in order to be shot.

These examples can easily seem like a freak show, but Jamison leaves us in no doubt about the extremes of despair that can lead to such behaviour. They also show the dedication with which the suicidal will pursue their goal; a point chillingly illustrated by Sylvia Plath in The Bell Jar. In hospital, Plath says she saw on the dinner table "glasses and paper napkins. I stored the fact that they were real glasses in the corner of my mind the way a squirrel stores a nut".

As Jamison points out, modern science and large-scale surveys have in many cases borne out the wisdom of folk psychology. Suicide does often run in families, and may sometimes stem from genetic predisposition. Creative people are unusually prone to killing themselves, with poets topping the list of high-risk groups. And advocates of St John's Wort as an anti-depressant can draw on the authority of Robert Burton's The Anatomy of Melancholy, published in 1621, which said the herb "mightily helps" - especially if picked on a Friday. This last proviso might seem mere mediaeval irrationalism, but more suicides happen on a Monday than on any other day.

The supposed shame of suicide has added to the misery of those whose lives have been touched by it, and is perpetuated by attempts to rewrite history in a more palatable form. Jamison cites American schoolbook hero Meriwether Lewis, an explorer whose later years of idleness led to fatal depression. Even some modern biographers have sought to remove this 'blot' from his name by imagining his death to have been accident or murder. Jamison prefers to acknowledge both his greatness, and the sadness of his end. Such compassion is typical of her book as a whole; it is both informative and deeply moving. While not meant for those in the midst of crisis, it offers rational help to anyone seeking to understand the mind's darkest corners.


2001: A Space Odyssey, by Arthur C Clarke. Scotland on Sunday, December 31, 2000. Review by Andrew Crumey.

TOMORROW, a year that is inescapably associated with science fiction will finally become fact. Not surprisingly, Arthur C Clarke's novel 2001: A Space Odyssey has been reissued to mark the occasion.

How much of it has come true? Well, space stations exist, but not the graceful rotating wheels depicted in Stanley Kubrick's film. The ill-fated Mir will fall to earth (or more hopefully water) some time in 2001, while opinion is still divided on whether the new International Space Station will ever produce enough scientific knowledge to justify its enormous cost.

Contrary to the visions of Clarke and others, today there are no lunar bases, no manned flights to the planets. Technology in our homes is largely confined to computers for our children, and video recorders which only they can operate. We still fasten our clothes with buttons. We still have wallpaper. What Clarke could never account for was simple human nature. No, we don't walk around with videophones strapped to our wrists. Instead, we clutch tiny microwave ovens to our skulls. And rather than live on the protein pills of prophecy, we buy organic food in ever greater quantities. Science has come to be seen as a threat.

Such fears have never troubled Clarke. A physics graduate, he worked in the late Forties for a science journal, and was at the same time an adviser to the Dan Dare comic. His career as populariser of science fact and fiction was forged at an early stage, and his spirit was always to be that of techno-evangelist.

In December 1948, Clarke wrote a story called 'The Sentinel' which would later become the basis for 2001: A Space Odyssey. At exactly the same time, George Orwell was putting the finishing touches to 1984. Both were responding to a world dominated by political ideologies and nuclear threat. Yet despite Orwell's profound pessimism, today for most people Big Brother and Room 101 are only TV programmes. And though Clarke saw science as our salvation - if stupid politicians could be kept from killing us all - it is no longer space travel that is our icon of progress, but rather the internet.

What Orwell and Clarke failed to foresee was a developed world dominated not by ideas - political or scientific - but by the simple pursuit of entertainment, which now forms the main item of expenditure for the average British family, ahead of food.

Such a world was still a long way off when, in 1964, Clarke and Kubrick began collaborating on a project initially called How the Solar System Was Won. Kubrick saw 'The Sentinel' as a good starting point from which he and Clarke could develop a screenplay; in his preface to the new "special edition", Clarke explains how novel and film grew simultaneously, with the book being published after the movie's 1968 release.

The two differ in plot details and tone, but the opening scene with its grunting ape-men is just as silly in either. Where Kubrick goes for visual effects that are grand if obscure - prompting Rock Hudson to storm out of the premiere pleading, "Can someone tell me what the hell this is all about?" - Clarke explains all, in prose that is authoritative, ponderous, and dull. His ideas may be wonderful, but his heavy cadences belong more to the steam age than the space era, and so do his values.

Catapulted into orbit - via Kubrick's famous bone in the air, or Clarke's simple chapter break - we get one of their more fortunate errors of vision. For them, 2001 is a world in which women's career opportunities are still purely decorative; a "charming little stewardess" entertains a thoughtful expert as he flies to the Moon to investigate an alien monolith beaming signals to the planets. A further expedition to Saturn is crewed by males chosen for being young, fearless and single. No bothersome emotion must be allowed to hold humanity back from the stars.

Accompanying them is the only memorable character in the entire piece; the psychotic computer HAL. Back in the Sixties, few doubted that thinking machines would soon be an everyday reality. In 1956, Marvin Minsky and others founded the science of artificial intelligence. Minsky was an adviser to 2001: A Space Odyssey, responsible for the film's then-cutting-edge computer graphics, and credited in the novel as an architect of the "Heuristically programmed ALgorithmic computer" (no, Clarke reiterates in his preface, HAL is not, as legend has it, IBM with each letter moved one place).

Minsky's optimism was as great as the space scientists', but today's biggest computers are still no smarter than an ant. Beating the world chess champion, it turns out, takes far less intelligence than being able to recognise your mother's voice on the phone. Consciousness, the mysterious human essence HAL may or may not possess, remains science's great unanswered question.

Tackling the subject, a flood of books this year - from authors including Susan Greenfield (The Private Life of the Brain, Allen Lane, GBP 18.99), Gerald Edelman and Giulio Tononi (Consciousness, Allen Lane, GBP 20) and Antonio Damasio (The Feeling of What Happens, William Heinemann, GBP 20) - has illustrated that neurologists rather than computer scientists now have the upper hand in the mind debate.

Similarly, genomics - ruefully assessed in Richard Lewontin's It Ain't Necessarily So (Granta, GBP 14.99) - has helped put biology, not physics, at the forefront of the public's scientific awareness.

Nor could Clarke have predicted how techno-optimism would go online, with the Space Race giving way to the Information Age, and the future of the written word itself being called into question. After defeating the murderous HAL, astronaut Dave Bowman finds solace in his spaceship's computer archives of Shaw, Ibsen and Shakespeare. Perhaps, before long, it will only be in such conditions of extreme desperation that the average consumer reaches for the classics instead of a shopping channel.

Alert to the danger, publishers at this year's Frankfurt Book Fair made 'e-books' their main discussion point. Already we can pay to download new novels to a PC, or to more portable 'e-readers' which, according to some people (mainly those trying to sell them), will be everywhere by the end of the decade. Yet high-profile e-books this year by Stephen King and Frederick Forsyth failed to win over the paper-loving public, and both authors soon lost interest in the idea. And while it was online bookseller Amazon who spearheaded the dotcom revolution, sliding shares have left traditional bookshops with renewed optimism.

Arthur C Clarke should be pleased, however, by the fact the most talked-about book at Frankfurt this year (ie the one worth most money) was a science title, Brian Greene's forthcoming The Fabric of the Cosmos, which its publishers hope will surpass A Brief History of Time. Its subject, superstring theory, was already born when 2001: A Space Odyssey played to its first audiences; so too was the internet. Sometimes the biggest ideas, barely noticed at first, take longest to catch on.

Clarke's astronaut finally makes it to Saturn - Jupiter in the film, since Kubrick couldn't rig up convincing rings. From there he is hurtled to an alien imitation of Earth, where he regresses into a super-foetus and goes home to rule the world. Clarke's childish power fantasy is a lot more clearly stated than Kubrick's enigmatic finale; but after the 2001 theme faded at the end of the Apollo missions - which Clarke covered for CBS - space became largely the preserve of machines.

Clarke's greatest legacy is that his writings inspired many of today's professional scientists, who read him in their youth and inherited his vision. In April an unmanned space probe will be launched, called '2001 Mars Odyssey'. It will be a great tribute; though Nasa is hoping it will fare better than their previous two Martian missions, in 1999. One was felled by a simple electrical fault; the other did a nose-dive because, it turned out, half the construction team had done their calculations in metric, while the rest had used Imperial. One lesson the scientists could never learn from Clarke: to err is human.

© Andrew Crumey
