Marcus Chown, The Universe Next Door. Scotland on Sunday, January 6, 2002. Review by Andrew Crumey.

THROUGHOUT history, people have wondered if worlds exist other than our own. The Greek philosopher Aristarchus apparently thought life arrived on Earth from outer space; David Hume supposed our ordered universe to be merely the final successful version after numerous failed attempts. The theme of multiple worlds runs through most of Chown's "12 mind-blowing ideas from the cutting edge of science".

Take, for instance, "mirror world." This was postulated in the 1950s as the reason why particles called neutrinos always spin to the left. Their right-handed partners would inhabit a world existing alongside ours, but interacting only gravitationally. Since mirror matter has no electrical effect, it cannot be seen or touched. Largely dismissed when first aired, the theory has been revived following experiments showing that some unstable particles survive longer than expected, possibly nipping back and forth between our world and its mirror.

Mind-blowing indeed; and the theory's proponents tell Chown there could be mirror stars, mirror planets and mirror people, helping to account for the "missing mass" believed to make up 90% of the cosmos. Who knows, perhaps the mysterious 1908 meteorite that flattened Siberian forests without leaving a crater was made of mirror stuff. At this point, though, hearing the X-Files theme start to play in my head, I felt inclined to agree with one proponent's own admission that he has "a vivid imagination".

Chown is naturally keen to encourage his interviewees' wilder speculations. Physicist Max Tegmark describes the famous "many worlds" interpretation of quantum theory - the idea that particles simultaneously inhabit parallel universes of possibility. Tegmark believes our own lives branch at every moment into these various worlds; so that even if we die in one, we might continue living in another. And since life is the only thing we experience, he says, we become effectively immortal. A man could play Russian Roulette forever, never feeling himself to lose; though in countless universes his death is witnessed by others. Invited to try this Borgesian experiment for himself, however, Tegmark declines, saying it would not be fair on his wife.

Most of us would regard the Russian Roulette test as mind-blowing only in the suicidal sense; Tegmark appears genuinely perplexed by it. The weirdest theory in the book, however, comes from astronomer Ed Harrison, who seeks to explain where the universe came from in the first place.

For years, physicists have puzzled over an apparent "fine tuning" of natural forces. If gravity were only slightly weaker, for instance, there would have been no planet for us to evolve on. Then they came up with "baby universes." Take a small piece of matter, compress it astronomically (this is the hard part), and it becomes a black hole whose hidden interior could wormhole its way into a brand new universe.

Black holes could therefore be the vehicle for a kind of cosmic natural selection, in which universes are reproductively favoured if they make lots of black-hole offspring. According to Lee Smolin, the carbon they produce makes life an inevitable by-product. Harrison, however, explains our fine-tuned cosmos as an alien experiment. The reason we live in a universe fit for life is that it was hatched in a laboratory in some other universe fit for life.

Any philosopher since Aristotle could point out a certain infinite regress here. Where did the aliens come from? Presumably from more aliens, says Harrison, in a chain that must have started with God. I doubt David Hume would have been convinced; indeed, the increasing willingness of some American physicists to seek God in their equations shows how much things have lapsed over there since the Age of Reason.

It should be clear, then, that Chown's book does not seek to chart the hottest current topics in the field. Refreshingly, what we have instead are, for the most part, speculative ideas from respectable sources, some of which have drawn widespread interest. Chown's style will be familiar to anyone who reads New Scientist, to which he is a regular contributor. In fact, one gets the impression that these pieces are reworked articles from the magazine, though the book does not state this. The evidence comes from frequent repetitions between chapters, and from the slightly clumsy links that run them together.

Chown's delivery is punchy, conversational and well-stocked with reader-friendly analogies keeping galactic concepts at an everyday level. One chapter concerns "fridge-sized black holes"; another is about a Russian scientist's efforts to convince people to look for alien "pepper pots", 4,000 of which he seriously believes have fallen to Earth.

The downside of Chown's chummy prose is that it turns every new theory into a "breakthrough", "bombshell", "revolution" or "stroke of genius" that is sure to "knock your socks off." Chown knows the journalistic value of hyperbole, but people read popular science because they want facts, not hype. It is similarly dismaying to find that when he ventures into a bit of detail (describing wave superposition), Chown quickly adds: "Maybe this sounds technical and dull. However, it has truly earth-shattering implications..." Anyone who found it dull would never pull either this book or New Scientist off the shelf; the plea seems aimed more at Chown's publisher than at his intended audience.

Read this, then, for a wonderful collection of exceedingly strange ideas. Just don't count on being sockless by the end.


Martin Rees, Our Cosmic Habitat. Scotland on Sunday, January 20, 2002. Review by Andrew Crumey.

JUST about anyone who has ever looked up at the stars has wondered if space goes on forever, or if perhaps it ends somewhere. Martin Rees - Britain's Astronomer Royal - offers a guide to such immensities in a book that brings the latest discoveries of cosmology within the grasp of the general reader.

Rees is a gifted communicator as well as an outstanding scientist. Too often we hear the plea from specialists that their exotic ideas really do make sense, just as long as we can follow their equations. Rees, though, feels a professional and moral obligation to put technicalities in plain terms. "Even if we do it badly, the effort is salutary: it reminds us that our efforts are worthwhile only insofar as they help illuminate the big picture." In Rees's case, this picture is very big indeed: he believes in a "multiverse" in which the cosmos we see is only a speck. But before arriving at this stunning conclusion, he considers some other astronomical breakthroughs that have made headline news recently.

The first of these is the growing number of new planets turning up on our celestial doorstep, orbiting nearby stars. Too small and faint to be seen by any telescope, these planets betray their presence through gravity, as Rees explains. "A planet tugs the star around in a small counterorbit, rather like a small dog pulling its owner on a leash." So far, only large, Jupiter-like planets can be found in this way, but as techniques improve, Rees anticipates that whole new planetary systems will emerge, including worlds not unlike our own. "We were all, when young, taught the layout of our own solar system… But, 20 years from now we shall be able to tell our grandchildren far more interesting things on a starry night… We will know the orbits of each star's retinue of planets, and the sizes (and even some topographic details) of the bigger ones."

One wonders what names these planets might be given. In contrast to the florid nomenclature of science fiction, the boringly named 51 Pegasi was the first new sun to be discovered; its planet is only a number. Even the equipment employed in the search is, as Rees concedes, "unimaginatively named." Currently, the best Earth-bound instrument is called, wait for it, the Very Large Telescope. At least its proposed sequel, the 100m diameter OverWhelmingly Large Telescope, will have the appropriately keen-eyed acronym OWL.

This is all very different from 1781, when William Herschel became the first person to find a new planet in our own solar system - an event that brought him world fame, and provided Keats with a beautiful metaphor in his poem about Chapman's Homer. Herschel wanted to name the new planet after the king, but Planet George never caught on. A pity, as this would have added a pleasantly homely touch to the mystic ruminations of modern horoscope compilers. Our own planets, it was decided, had to maintain the classical tradition. This was why, when our ninth planet was found, the proposal to call it Constance - after the widow of Percival Lowell, at whose observatory it was discovered - was over-ruled. The winning name - Pluto - was coined by an English schoolgirl.

Some hint of what sort of planetary lists our grandchildren might learn comes from asteroids, which can be named after real people. Somewhere up there, Asteroid Frank Zappa is pursuing its lonely orbit. More recently, a group of asteroids were named in memory of victims of September 11. So the possibility remains that our descendants' wall posters might show extra-solar planets with names like Madonna, Beckham, or even Planet Blair.

As Rees explains, the technology that is uncovering new planets might even bring about a return to the golden age of Herschel - a musician who took up amateur astronomy after reading a popular book on the subject. Today's astronomers never actually look through an eyepiece: images are relayed directly to computers in warm offices. The Internet makes this data accessible to anyone. Even the search for extra-terrestrial life has become a cottage industry: the SETI (Search for Extraterrestrial Intelligence) institute recruits volunteers who process signals from space on their home computers. "At the time of writing," says Rees, "three million people had taken up this offer - each, no doubt, inspired by the hope of being the first to find ET."

Rees discusses another headline-grabbing discovery, though one that most of us find harder to grasp: the accelerating universe. The first hint of it came in 1998, when astronomers studied light from Type Ia supernovae, which are "in effect, thermonuclear bombs - exploding stars with a standardised yield." The astronomers knew how bright these explosions really were, and by measuring how bright they appeared from Earth, could work out how far away they were. They also measured their redshift - an indication of how quickly the supernovae are speeding away from us. The result was a surprise: the expansion of the universe is not slowing down, as expected, but is speeding up. It seems gravity is not solely a force of attraction - at very large distances, and also maybe at very small ones, gravity repels. This repulsion kicked off the Big Bang, lay dormant while the universe grew and has come into play again.

Einstein predicted this possibility, thanks to a term in his equations called the cosmological constant. He subsequently called this his "biggest blunder", but many astronomers now think otherwise. Rees, though, remains cautious. "The case is not yet overwhelming," he says. "A fog due to intergalactic dust might make distant supernovae appear fainter (hence further) than they actually are." Another cause of acceleration could be a "dark energy" known as quintessence, possibly mediated by "tracker fields." One wonders if the physicists who invented this idea - which was inspired by "inflation" - were actually thinking about their personal pensions.

What will be the fate of an accelerating universe? The future, unfortunately, looks black. Stars will fade, glowing "as dimly as a portable heater" before expiring completely. Our galaxy will dissolve into a mush of decayed atoms; other dead galaxies will accelerate out of sight, redshifted beyond the speed of light, so that anyone left to see the sky would find it studded with faint, frozen images, as if the distant universe had all fallen into a black hole.

We are saved from such gloom in the final, most speculative, but also most fascinating part of Rees's book. Here he describes how black holes might spawn new universes. Our own Big Bang, he says, may have been only one among infinitely many. Other realms might exist beyond the edge of our observable universe, or in extra dimensions we are unable to perceive. The cosmos Rees envisages is so huge that even the most unlikely events are certain to take place eventually. There could be other Earths, containing our other lives.

Reading about these theories, which Rees describes so eloquently, is as dizzying and awe-inspiring as looking at the night sky itself. If it's too cold or cloudy outside for a peek at the stars tonight, you could always lose yourself in this marvellous book instead.


Patricia Fara, An Entertainment for Angels. Scotland on Sunday, February 10, 2002. Review by Andrew Crumey.

IF YOU were feeling off-colour in 18th-century London you could seek comfort in James Graham's Celestial Bed. "Surmounted by an elaborately decorated canopy lined with mirrors, this enormous tiltable edifice was set in a private room through which flowed soft music, spicy aromas and circulating electrical fluids." It was this last element that was the really important one. Scottish-born Graham was the most famous "electrical physician" of the 1780s, and Fara's entertaining book shows how scientific breakthroughs of the Enlightenment era sent sparks flying through European society.

Graham charged a whacking £50 for treatment at his Temple of Health, yet as many as 200 well-heeled patients passed through its doors each day, lured by promises of "that full-toned juvenile virility which speaks so cordially and effectually home to the female heart." Those unwilling to brave the Celestial Bed could make do with Graham's "stimulating lectures on sex and electricity." Heady stuff, though still not enough to prevent the whole grandiose venture from eventually going bust.

Graham was caricatured in an engraving showing him accompanied by a squawking duck, indicating him to be a mere quack; but "electrical therapy" was widely used by even the most respectable physicians. "By the 1780s," says Fara, "several of Britain's major hospitals had installed their own electrical machines." At Shrewsbury Hospital, a young woman whose arm was paralysed with rheumatism was subjected to a week's worth of painful shocks. At the end of it she claimed to feel much better, but perhaps this was only in order to escape further treatment.

Enterprising "electrical performers" forsook the stage and acquired medical licences. One, who called himself "Comus", treated people for epilepsy. Advertisements appeared for "electric tractors"; bits of metal shaped like tuning forks that promised to 'draw out' disease by being stroked along the sick person's body. No doubt the treatment still has its followers in New Age circles, but it fell victim to the first medical use of "blind testing", when doctors at Chester Hospital used "tractors" alongside wooden imitations, and found there to be no apparent benefit from either. Nobody understood how electrical therapies might work, because there was no clear idea of what electricity actually was. Most people regarded it as a "fluid" - which is why we still speak of electric "current" - and debate raged over whether there was one type or two. It was Benjamin Franklin who solved this problem with his theory of positive and negative charge. His famous experiment with a kite in a thunderstorm, we are told, took place with the kite tethered to a post, and Franklin - clearly no fool - standing under cover at a safe distance.

Others took a less cautious approach. One experimenter described to the Royal Society an encounter with a Leyden jar - a primitive capacitor - that left him with a bleeding nose and "a heaviness in my head, as if I had a stone lying upon it." After this, out of scientific thoroughness or else pure spite, he wired up his wife, blasting her off her feet.

There was no clear boundary between the serious scientists on the one hand and those on the other who were merely larking about with new technology. A case in point is William Watson and his "electrical mine." Hidden beneath a carpet, it provided Watson with experimental data on the effects of electricity on unsuspecting subjects. It no doubt also gave him a good laugh.

For the Enlightenment, electricity played the role that subatomic physics does today. This was cutting-edge stuff, with the added bonus of being great for party tricks. At a public lecture in 1730, a young boy was hung from the ceiling with a clothes line, charged up with static electricity, then dangled over a pile of feathers that floated magically up to his body.

The word electrician - first used in print by Franklin - was almost a synonym for magician. Soon, people were speculating that electricity was the true "life force", an idea championed by the Italian scientist Galvani, who hung frogs' legs on his garden fence to see if lightning would make them twitch. The weather stayed calm, and the dismembered frogs were only "galvanised" into action later, in the scientist's laboratory. Even so, the thought of life from a thundercloud thrilled Percy Shelley - himself a keen electrical experimenter - and was immortalised by his wife Mary in her novel Frankenstein.

Her idea of re-animating human tissue with electricity was no mere fantasy; real-life Dr Frankensteins were actively pursuing it. In 1803, 14 years before Mary Shelley's novel appeared, a convicted murderer was brought down dead from the gallows in Newgate, and hooked up to a huge battery. "The dead jaws quivered and one eye leered open… (his) clenched fist rose into the air." Had he been restored to life, he would presumably have been hanged all over again.

Could the process work in reverse? Do living creatures produce their own electricity? Some thought this might explain the power of the torpedo, a fish named for its mysterious ability to numb its prey into a torpor. An English scientist called John Walsh re-christened the fish the "electric ray", and set about trying to discover how it worked. The great physicist Henry Cavendish took up the problem, and solved it in an admirably practical way. Cavendish made an artificial version of the torpedo from "pieces of thick leather, soaked in salt water and cut into a fish-like shape." Pewter plates inside the model fish were connected to a hefty electrical supply, and then the hazardous contraption was put in a bath. The only way to test it was to stick in your hand and touch it, and the resulting shock was enough to convince any sceptic.

The period engravings reproduced in Fara's book reinforce the sense of quaintness that these early experiments evoke. My only regret is that the book is not longer; this is a very brief guide to a subject that sheds light on the role of science in shaping the 18th-century imagination, and helps put in context many views still prevailing today. As such, it skims the surface but provides just the sort of delightful diversion crowds enjoyed two centuries ago, as they watched hapless victims' hair stand on end.


Graham Farmelo (ed.), It Must Be Beautiful. Scotland on Sunday, February 24, 2002. Review by Andrew Crumey.

BEAUTY, as the saying goes, is in the eye of the beholder; and in this book, experts cast their eyes over some of the most famous equations in modern science, attempting to explain their beauty. The line-up of contributors is impressive, including such luminaries as Roger Penrose and Steven Weinberg. The result of their efforts, it must be said, is a mixed bag.

The perils of the venture are made clear in the introductory essay by the book's editor, Graham Farmelo, a physics professor who is head of communication at the London Science Museum. Farmelo quotes Philip Larkin's assertion that a good poem is like an onion (meaning it is many-layered, rather than that it makes you cry for no good reason). "The poetry of science," says Farmelo, "is in some sense embodied in its great equations." The metaphor is a hackneyed one, and not very illuminating; an equation is a tool for making predictions, and Farmelo's analogies with Cezanne or Ella Fitzgerald are less convincing than another with Stanley Kubrick's Full Metal Jacket, in which a character describes the "beauty" of his gun.

Some of the equations here are of dubious beauty, even in scientific terms. The Drake Equation - for estimating the chances of finding extraterrestrial life - is more Faberge egg than Ferrari, being pretty to look at, but almost completely useless. Even so, the chapter on it is one of the book's highlights, since it shuns aesthetics, and instead gives an interesting potted history of the SETI (Search for Extra Terrestrial Intelligence) project.

Pride of place, of course, goes to Einstein, who gets two chapters, the first being on the ubiquitous E=mc2. More has been written about this than any other scientific discovery in history, and the present book has precious little to add. Einstein's rather less catchy equations of general relativity are then dealt with by Roger Penrose, in a chapter that will be hard work for anyone not already primed on popular expositions elsewhere, and which offers much the same account that Penrose gave in his excellent book The Emperor's New Mind.

For lay readers, the greatest interest is likely to come from the biographical asides. Schrödinger, for instance, discovered his famous wave equation while enjoying a Christmas holiday away from his wife, with one of his countless girlfriends. It is not clear exactly which one, since Schrödinger's usually detailed diary of sexual conquests is missing for the crucial days in question. Arthur Miller, in his chapter on Schrödinger, does not pursue the story of the physicist's tangled love life any further, however, which is a pity. Literary biographers, by comparison, decided long ago that an artist's sexuality is relevant to their work; or at the very least is fun to read about. On the other hand, it is entertaining to be reminded here of prickly rival physicist Werner Heisenberg's opinion of Schrödinger's now universally accepted wave theory, summed up by Heisenberg in a word Miller politely translates as "crap".

Physics inevitably gets the lion's share of this book; but there is also chaos theory - described by Robert May, who advised Tom Stoppard on his play Arcadia - and evolutionary biology, discussed by one of the masters of the subject, John Maynard Smith. He describes his work on game theory, saying he once studied the strategy of "scissors-paper-stone", purely out of mathematical interest, only to discover later that there is a species of lizard whose males come in three different colour schemes, and whose fight for the best mate matches the game exactly. This, surely, is the real power and beauty of science: managing to bring together disparate truths in a way no poet ever could.


Janna Levin, How The Universe Got Its Spots. Scotland on Sunday, March 10, 2002. Review by Andrew Crumey.

LEVIN aptly describes her book as a "diary of a finite time in a finite space." On one level, this is a story about her troubled relationship with an "obsessive compulsive" boyfriend who appears to have a nervous breakdown. On another, more important level, the book is about Levin's research in cosmology, and her idea that the universe may be finite in size.

What we have, in other words, is a very interesting departure from the usual popular science formula. Yes, we get all the usual diagrams and metaphors about inflating balloons and crawling ants, in an effort to explain the subtleties of general relativity, cosmic expansion, and higher dimensions. But we also get a degree of emotional engagement that is very rare in the genre.

Take the first sentence: "Some of the great mathematicians killed themselves." Levin mentions Boltzmann and Ehrenfest - two tragic giants of modern physics - and reflects on how the pursuit of abstraction can leave scientists adrift in a realm of beauty which only they can understand. For Levin, this has personal resonance. She is an American-born physicist of evident distinction; her partner in the book - Warren - is a Mancunian musician who left school at 15. The roles of brilliant (male) scientist and uncomprehending other half - so familiar in biography and fiction - are here reversed, as Levin's boyfriend follows her on a chain of lectureships to Brighton, Cambridge, London, and eventual break-up.

Levin frames her book as a series of letters to her mother: a venerable device, from Euler's Letters To A German Princess to Sophie's World, that adds a personal touch to the exposition of difficult ideas. And at least to begin with, the formula promises a great deal, as Levin invites us to share her day-to-day problems, as well as the bigger, more philosophical ones that are her professional bread and butter.

The hard part, however, is striking the right balance between the two extremes - the mundane and the cosmic. As Levin warms to the latter, the book becomes more conventional: anyone who has ever read anything about the rubber-sheet universe bequeathed us by Einstein will be struck here chiefly by a sense of familiarity; readers new to warped space may find themselves getting lost, and wanting to return to Warren's mandolin-making, or the problems of finding a flat. The grandly poetic tone struck so resoundingly by Levin in her opening pages proves an impossibly hard act to follow: we are never told, for example, exactly what happened to Boltzmann and Ehrenfest. Instead, the biographical elements in the book are mostly marginal to the science.

Certainly, though, Levin is to be applauded for the sheer courage of producing something her mostly male colleagues, one guesses, will greet with raised eyebrows. An outsider in her profession simply because of her sex, Levin says gender is a subject never raised, even with female co-workers, except perhaps by means of "mutual eye rolling meant to indicate a sense of 'if only you knew.'" Even the language of physics, she comments, is masculinised, with acronyms such as MACHOs and WIMPs reflecting, perhaps, male insecurity as well as feeble humour.

Levin clearly delights in subverting stereotypes. Her friends are artists and film people; she takes part in a public discussion with the sculptor Marc Quinn, best known for casting his head in his own frozen blood. It is poor Warren, meanwhile, who dresses shabbily, forgets to shave, likes tracing endless triangles with his fingers, and appears somewhat ill at ease with life on Earth.

As for the physics, it gets most interesting in the second half of the book, when Levin starts describing her own work. Think of her hypothetical universe as a room with permeable walls. Go through one wall, and you find yourself simply re-entering from the opposite one. Look down, and you see light from your own head, coming up from the floor. The universe is finite, but we see infinite copies of it; like being in a hall of mirrors. Levin unpacks the technicalities with a skill honed from giving many lectures on the subject, and it is fascinating to read. Far more so, in fact, than the preamble, and the relationship difficulties that by now have come to seem like needless distractions.

Her theory, she admits, is speculative. We do not see copies of our own galaxy in space, so the hall of mirrors must be very large. Careful analysis of the cosmic microwave background should show repeating "spots" which will decide the matter one way or another. Like all physicists, Levin lives with the possibility that her work will simply be proved wrong, and her research papers will go into the dustbin of history. Her book is, in part, a cry against such oblivion. Levin's frankness about the precariousness of her work is one of many admirable features here. She is also good on the insecurity of academic life and the fact that nobody ever dares admit there is something they do not know or understand. For all its flaws, this is a book to be applauded. I hope we shall see more like it.


Jay Rayner, Star Dust Falling. Scotland on Sunday, March 31, 2002. Review by Andrew Crumey.

IN 1947, a converted Lancaster bomber sent a mysterious coded message to a bewildered radio operator in Santiago. Moments later, the plane ploughed into a mountain in the Andes, killing everyone on board. Jay Rayner's entertaining book tells the story of this legendary flight.

The plane had been painted silver after the war, had its gun turrets removed and seats installed, and was one of many 'Lancastrians' used by civilian airlines. From 1946, Britain's nationalised air industry consisted of three separate corporations: BOAC, BEA, and the less well known British South American Airways. The latter was the brainchild of Don Bennett, a figure straight out of a Boys' Own adventure, whose daring spirit was his downfall. It was a BSAA airliner called Star Dust that crashed in the Andes; part of a fleet that made Aeroflot look like a model of safety.

Australian-born Bennett found fame in the Thirties by flying non-stop from Scotland to South Africa, establishing a seaplane distance record that was never beaten. Gung-ho and patriotic to the core, Bennett was a workaholic who even spent his honeymoon writing a book about aircraft navigation. In wartime, he set up an ace flying squad called the Pathfinders, whose task was to pinpoint enemy targets. Losses were staggering, but Bennett - promoted to Air Vice Marshal - continued to fly alongside his men. When peace came, Bennett transplanted the RAF ethos into his fledgling BSAA.

Most of the crew were former bomber pilots. Even one of the air hostesses - or 'Star Girls' as they were known - had flown in the war. They continued to address one another by their former military rank and ran their operations out of the unfinished Heathrow airport, after Bennett simply parked his plane there one day, demanding use of the runway. It could have been the stuff of Ealing comedy, had the results not been so tragic.

Determined to make his airline profitable, Bennett cut every corner. Passengers in the noisy, unpressurised aircraft were given cotton-wool earplugs and an oxygen tube to keep them alive. Seatbelts were provided, but Bennett angrily refused one, saying they only caused more injury. Important navigation equipment was banned from his planes, after Bennett discovered it was not British. His pilots had to find their way using the stars. Nor were there any weather forecasts, since these came from American airlines, at a cost. Instead, pilots read sketchy reports of earlier conditions and left the rest to guesswork.

The outcome was inevitable. Bennett's response to the first crash was: "It could have been worse," since not everyone was killed. A third of his fleet was lost before the air ministry booted him out after four years, during which he managed to run up a crash rate comparable with his wartime Pathfinders. At least they had the excuse of enemy fire.

Two of the planes vanished in the Bermuda Triangle; and high in the Andes, Star Dust bit the dust after sending the mysterious Morse message "STENDEC." The plane's wreckage was found two years ago, and Rayner describes the mission to recover it. As for STENDEC, to this day no one knows what it meant. Websites are devoted to it - there was even a science-fiction magazine named after it. Perhaps the answer still lies somewhere in the mountains.


Richard Lourie, Sakharov. Scotland on Sunday, April 28, 2002. Review by Andrew Crumey.

IT WAS one of the defining images of 'perestroika'. In 1986, a frail, dignified man stepped onto the platform of a Moscow railway station: Andrei Sakharov - inventor of the Soviet hydrogen bomb, Nobel Peace Prize winner and moral conscience of his nation - had returned from exile, freed by Mikhail Gorbachev. When Sakharov died three years later, the Soviet Empire teetered on the verge of a collapse he had done much to bring about.

This rich, authoritative biography is written by an American Russophile who knew Sakharov personally. Lourie translated Sakharov's memoirs, and the story of how they were written neatly encapsulates the mixture of pettiness and tyranny characterising the cat and mouse game played out over three decades between Sakharov and the KGB.

Sakharov began his memoirs soon after he was exiled to the closed city of Gorky. By 1981 the manuscript filled a large satchel, but it was stolen by undercover agents. Sakharov began the book again, and 12 months later had produced 900 pages of handwritten manuscript. These were in his car one day when a man approached, asking for a lift. An aerosol narcotic was administered and the documents were snatched. Despite being "suicidally depressed", Sakharov started his precious book yet again within days. "Andrei has a talent," his wife Elena Bonner said. "I call it his main talent, to finish what he starts."

This was the talent that made Sakharov a national hero, long before he became the Soviets' biggest headache. As a newly graduated physicist during the Second World War, Sakharov devised an improved way of testing artillery shells. Soon he was head-hunted by the innocuously named Ministry of Medium Machine Building: code name for the Soviet nuclear effort.

The project was headed by Lavrenty Beria, Stalin's murderous lackey, whose management strategy was "Medals if you succeed, bullets if you fail." Largely thanks to Sakharov, they succeeded. The first device went off in August 1949, and Beria immediately phoned Stalin with the good news. "I already know," said Stalin, and hung up. Four years later, with Stalin dead and Beria awaiting execution, the first Soviet hydrogen bomb was tested in Kazakhstan. Unfortunately, civilians were hit by the blast, and a young girl was killed. Hitherto, it seems, Sakharov felt few qualms about his grim work - a moral blindness one wishes Lourie could have probed deeper. Sakharov later called it his 'Edward Teller' phase, in comparison with America's real-life Dr Strangelove.

But now began Sakharov's 'Oppenheimer' phase, when, like the father of the US atom bomb, he woke to the reality of his creation. Calculating the potential effects of fallout on civilians, he advocated a worldwide test ban: Khrushchev saw its political value, and complied. But in 1961, when Khrushchev proposed resuming tests, Sakharov passed a note to him at a presidential dinner, advising him to reconsider. When Khrushchev stood up to speak, he brought the note from his pocket and turned on his finest scientist.

Lourie's description of the scene is chilling. A novelist himself, Lourie shows a keen awareness of narrative pace, and an eye for detail. His familiarity with Russian history and culture enriches this book throughout. But Lourie does not attempt to tackle Sakharov's scientific work, instead quoting briefly from others' explanations. While this is entirely understandable, it does leave a significant gap.

Sakharov's awkwardness continued, his prestige and military value forcing his superiors to excuse him as "naive." He began attending the silent vigils at Moscow's Pushkin memorial honouring prisoners of conscience. And during the Prague Spring, he wrote an essay on the future of world politics that turned him into a dissident himself.

Sakharov's essay was indeed naive, in its advocacy of world government and in its unbridled celebration of scientific progress. But when it appeared in the New York Times in July 1968, it suddenly transformed him into the unwitting spokesman for an entire movement. He was dismissed from his research post, then suffered the added blow of the death of his first wife. The KGB began bugging him, giving him the code name 'Ascetic'.

Sakharov found a new love in human rights campaigner Elena Bonner, marrying her in 1972. But his children never liked her, and she became a hate figure for those unable to comprehend how their former hero had turned "traitor." The KGB called her 'Vixen', using petty vandalism to intimidate the couple.

Sakharov went on hunger strike in 1974, in support of political prisoners. Not content with petty misdeeds, the KGB now gave Bonner's infant grandson a poisoned biscuit that nearly killed him. Much of the couple's effort subsequently went into enabling their own family to emigrate to the West. Sakharov, of course, could never be allowed to escape, nor could he be killed; there were too many valuable military secrets in his head. So when the Nobel Prize was awarded in 1975, it was Bonner who went to collect it, while the Soviet press, and much of the public, jeered.

During the detente era, the Soviets were careful not to give the West too much propaganda ammunition. But when the war in Afghanistan began, and East-West relations collapsed, Sakharov no longer mattered. He and Bonner were bundled on to a plane, and found themselves in Gorky with a KGB landlady and 30 intelligence officers stationed across the street. There followed seven years of isolation, intimidation, further hunger strikes, hideous force-feeding, and the autobiography that Sakharov refused to give up.

The change, when it came, was sudden. Soon after Mikhail Gorbachev took power in 1985, Sakharov wrote to him, pointing out the falsity of Gorbachev's claim that the Soviet Union had no political prisoners. Lourie describes the almost surreal consequence. An engineer showed up at the Sakharovs' flat with a telephone, and proceeded to install it. "You'll get a call tomorrow," he told Sakharov. When the phone rang, it was Gorbachev. "You can return to Moscow together," he told them.

Gorbachev's political coup was also his downfall. Elected in 1989 as a People's Deputy, Sakharov called for the abolition of the Communist Party's monopoly on power. Gorbachev was so incensed, he cut off Sakharov's microphone. But the viewing public heard - and it was the growing opposition, led by Boris Yeltsin, that would eventually deliver what Sakharov demanded.

Many felt it was Gorbachev's humiliation of Sakharov that led to his fatal heart attack; others blamed the years of hunger strikes. But Sakharov himself seems to have foreseen the end. The day before he died, he wrote the last words of his memoir. In Lourie's portrait of him as a brilliant and courageous man, Sakharov could not have wished for a more eloquent epitaph.


Patricia Fara, Newton: The Making of a Genius. Scotland on Sunday, May 19, 2002. Review by Andrew Crumey.

WHEN Margaret Thatcher chose a coat of arms to go with her status as Baroness, she opted for a shield flanked by two figures. On one side stands a sailor strongly resembling Tintin's Captain Haddock, meant to symbolise Falklands glory, while on the other there is a chap in a long wig, who is the only other famous person ever to come from Grantham. This, of course, is Isaac Newton, voted Man of the Millennium in a poll that put him above Shakespeare and Martin Luther.

Why, though, is Newton rated so highly? Is his reputation deserved? Patricia Fara sets out to provide an answer; and unfortunately gets lost along the way. Her book, by virtue of its sheer muddle-headedness, raises extremely important issues about the way science is viewed and taught in our society.

The best place to start is the one thing about Newton that everybody knows: he sat under an apple tree and discovered gravity. The most surprising aspect about this tale is that it was Newton himself who first told it, a few months before he died. Where the legend goes wrong, though, is in the notion that Newton then suddenly discovered "gravity." People had known about gravity for a very long time; Newton's achievement was to show how, as well as making objects fall, gravity also keeps the Moon and planets in their orbits.

The tree's descendant can still be seen in Newton's garden at Woolsthorpe, producing apples that in Newton's day were used to make cider. "However," Fara says primly, "the apple's specific variety and the tree's precise location are historically less interesting than the anecdote's historical associations."

The apple is, of course, ripe with symbolism. We have the Garden of Eden, and Milton's Paradise Lost, published in the same year - 1667 - that Newton sat in his garden. That's a cute coincidence, but it leads Fara to make a remarkable suggestion. "Milton and Newton may also have been thinking of Hercules' arrival in the garden of Hesperides." Quite how this piece of retrospective mind-reading is meant to be justified, we are not told. Instead, we are led further along a path of free association, in the sort of academic game played out daily in the humanities departments of universities across the globe.

Fara is a lecturer in the History and Philosophy of Science at Cambridge University. Yet on the few occasions in this book when she deals with Newton's scientific work, she frequently makes glaring errors. She says, for example, that Newton believed light to consist of particles that slow down when passing through glass. In fact he predicted they would speed up.

Fara concerns herself with the way that Newtonian "ideology" was disseminated by "hagiographers", ultimately putting the great man on such pedestals as Thatcher's coat of arms and the now-defunct English pound note. Scientific theories, in other words, are a matter of opinion. Explaining how Newton got where he is today is no different from accounting for the success of New Labour or Harry Potter. For the benefit of Fara's students at Cambridge University, I feel I ought to point out something her book neglects to mention.

In Newton's day, there were two main theories of light. One was his idea that it is made of particles; the other was Huygens' theory that light consists of waves. Both had their strengths and weaknesses, and which one you opted for was indeed largely a matter of taste. But while Newton predicted that light should speed up in water or glass, Huygens reckoned it should slow down. And it does slow down: Newton was wrong. With gravity he did better, and it took more than two centuries before Einstein came up with a more accurate description of the universe. Hence Newton's amply deserved fame.

Scientific theories are not simply "ideologies", nor are their supporters necessarily "propagandists." Certainly, scientists cheat, denigrate their opponents, further their own careers, and do many other nasty things. They're human. But that does not mean that science itself is purely a matter of fashion.

To see the danger of Fara's position, look at the situation under Stalin, when genetics was declared counter-revolutionary, and a Gulag awaited anyone who did not adhere to the "official" biological theories of Lysenko. That is what it really means, to say that one scientific theory is no more "true" than another. Or look at Nazi Germany, where efforts were made to build an "Aryan" science to replace the work of Jews such as Einstein. One popular theory, which apparently appealed to Hitler, was that the world is hollow, with all of us living on the inside. Or look at the United States, where schools teach Darwinian evolution and Creationism as viable alternative hypotheses, and it is now possible to sign up for a degree course in astrology.

Recent books on Newton have increasingly stressed his homosexuality; and a thoroughly unpleasant side to his personality displayed in vicious professional feuds, and a keenness for hanging counterfeiters while in charge of the Royal Mint. Fara takes this sort of hatchet work a step further. Her book supposedly considers the way Newton's "genius" was constructed by his "disciples"; but what really concerns her is fame - the word appears countless times, as a generic qualifier for any historical figure whose name has withstood the test of time. Fame is what has replaced genius, glory or wealth as the ultimate mark of distinction in our celebrity-obsessed culture. Newton won it, and Fara somehow grudges him it. In an epilogue, she extends her debunking. "During the twentieth century, the main competitors for Newton's place were Einstein and Hawking." No physicist on earth - least of all Hawking himself - would claim that A Brief History Of Time ranks with Newton's Principia. Yet Fara ludicrously links them, then snipes at how, during a Newton celebration in Grantham, "Hawking promoted himself as Newton's natural successor by ensuring that he was photographed in the Woolsthorpe garden, sitting beneath a supposed descendant of the original apple tree."

Fara has much to say about the ideological motives of Newton's "self-promoting" followers, but little about her own. Her book is destined to baffle and disappoint those in search of something in the Longitude genre, mixing homely cultural history and simple science. After trawling through this dense miscellany of art criticism, erudition and polemic, I can only feel sorry for her students.


Oliver Morton, Mapping Mars. Scotland on Sunday, June 23, 2002. Review by Andrew Crumey.

IS THERE water on Mars? NASA thinks so - though they said the same about the moon a few years ago, and that turned out to be overly optimistic. Whatever the final outcome, the widespread media coverage of the latest findings from the Mars Odyssey spacecraft illustrates the enduring fascination of the red planet.

It is this public interest that Fourth Estate evidently wishes to tap, with a book aimed squarely at the burgeoning popular science market, complete with eye-catching cover and the imposing subtitle "Science, imagination and the birth of a world." What you get, though, is a book likely to appeal only if you are truly, deeply, interested in Martian minutiae. In fact, if you can follow all of this from cover to cover, you probably work for NASA.

Amid the dorsa and mensae, the andesites and clathrates, a few nuggets of general interest do stick out. For instance, there are the artists who make a living by painting imaginary Martian landscapes. Among the best known is Pat Rawlings, whose evocative First Light shows astronauts exploring a Martian Grand Canyon. In another painting, The Key, a globe lies in the sand - the relic of an ancient civilisation.

As Morton points out, the iconography of these Mars paintings comes from 19th-century American landscape art, and the subtext is the same. This is land to be conquered, with only the odd pueblo ruin in the way. It was Vannevar Bush, in a report to President Truman, who called space an "endless frontier", a term transformed by Kennedy to "new frontiers" and by Star Trek to "the final frontier".

Now there is a Mars Society dedicated to future colonisation, an amateur organisation whose members escape to the desert to rehearse life on another world. Their earth-bound existence, they maintain, is too government-controlled, too constrained, too free from risk and adventure. Morton was invited to join, but declined. "When you hear someone saying of future Mars colonists, 'It's a frontier. People are supposed to die. That's the point', it's hard to resist suggesting he just go and play in the traffic - or work in southern Sudan - until his appetite for risk wears off."

Mars, it seems, has always reflected human preoccupations - from its ancient connection with war (thanks to its flaming redness in the sky) - to the illusory 'canals' that astronomer Percival Lowell claimed to see through his telescope in the late 19th century. These were evidence, it was claimed, of intelligent beings, an idea calmly accepted by a general public who knew there was no chance of ever meeting them. Science fiction writers, of course, thought otherwise, giving us a century of Martian adventures that ended only when visiting spacecraft revealed the planet as a dusty wilderness.

Despite the stories of Wells and others, many believed the Martians to be peace-loving. They had, after all, laced their planet with waterways as a means of sharing their most scarce resource. Lowell's friend Edward Henry Clements, editor of a Boston newspaper, wrote a 400-line poem showing that "Mars is carrying through our heavens the heart-red flag of socialism." The Russian writer Alexander Malinovsky, better known as Bogdanov, did something similar in Red Star, his 1908 novel of a Bolshevik utopia on Mars.

Alas, it was all fantasy; but such details at least enliven a book whose sequences on areography and areology (Martian geography and geology), while admirably precise, are a little too technical and a great deal too abundant for the mass readership Fourth Estate has in mind.


Stephen Jay Gould, I Have Landed. Scotland on Sunday, June 30, 2002. Review by Andrew Crumey.

EVER wondered why white people are called 'Caucasian'? It's all thanks to JF Blumenbach, a German naturalist who in 1795 decided Mount Caucasus in Russia was the place where mankind began. He arrived at this curious conclusion on the basis of a collection of human skulls, among which the most beautiful - in his estimation - was of a Caucasian woman. Since it was the most perfect, he reasoned, it must also be the starting point from which all other races derived. White was therefore the first colour of mankind, since, according to Blumenbach,

It is very easy for that to degenerate into brown, but very much more difficult for dark to become white.

I'm very grateful to Gould's book for this precious nugget, with which I shall no doubt impress or bore my companions at some future dinner party. More seriously, Gould highlights the way in which scientific theories - even fallacious ones - enter into mass consciousness, where they can be used to support whatever prejudice one happens to hold. As another illustration, Gould discusses the ideas of Ernst Haeckel, which still linger in the popular notion that human foetuses have gills because we are descended from fish, when it is more accurate to say that foetuses have gills because humans and fish share a common ancestor. A nit-picking distinction, you might think; but Haeckel's theories formed part of the 'intellectual' justification for Nazi eugenics.

Gould, who died in May, was one of the foremost popularisers of evolutionary theory in our time; and several of the essays in this book remind us of the explanatory flair of his finest work, Wonderful Life. Yet we are also reminded, more sadly, of the degree of self-indulgence to which prestige entitled him.

For more than 20 years, Gould contributed a monthly essay to Natural History magazine, and these occasional pieces have been published in successive volumes, of which I Have Landed is the last. Some of the pieces are fascinating; but not every month was a winner, and in the final essays we see the increasing victory of style over substance, as the depths of biological history are gradually mined until only obscurity or repetition remain.

More than once, Gould cited Montaigne as his stylistic mentor in the essay form; but while the Frenchman is an unimpeachable role model, it seems Montaigne's most essential feature - his humility - is something that rather passed Gould by.

In fact, in this last instalment of despatches, I was reminded not so much of Montaigne as of Alistair Cooke. Like Letter from America, Gould's essays often feel like exercises in the filling of time. We have the oblique opening - something about baseball, perhaps - before a homely lesson on taxonomy or palaeontology, couched in Gould's genial, verbose tone, that flatters us into thinking we are nearly as clever as the good professor himself.

This is not to detract from Gould's best work, which I can heartily recommend to anyone interested in popular science. I Have Landed, quite simply, is not Gould at his best, despite its occasional gems. Although once described as the greatest living scientist, Gould was too smart to believe such hype. Instead, his essays betray vanity of a different kind - the literary sort, which is a lot harder to shake off.


James Hamilton, Faraday: The Life. Scotland on Sunday, July 14, 2002. Review by Andrew Crumey.

THE difference between artists and scientists, we like to think, is clear cut. James Hamilton's exemplary biography of the father of electrical power, however, reminds us of the overlap that has always existed between the "two cultures." His portrait of this elusive, intensely private genius describes Faraday's links with painters and poets, polymaths and mystics.

Michael Faraday's origins were humble. Born in 1791, he was a blacksmith's son who grew up among London's poor, with little formal education. His parents belonged to the Sandemanian sect, founded by Scottish linen-maker Robert Sandeman, who died 20 years before Faraday's birth.

The sect was extreme even by fundamentalist standards: at their day-long services the Bible was read, chapter by chapter. "When they reached the end," says Hamilton, "they started again at the beginning."

Among their rules was a ban on eating any animal that had been strangled; an Edinburgh congregation disobeyed and was immediately cast out. That was another Sandemanian rule: you were not allowed to disagree with any of the rules.

Seeking converts was not part of their way; in fact this, too, was banned, and since they only married within their own church, their eventual extinction was assured. But in early 19th-century Britain, such non-conformist groups were a significant social force. When Faraday became an apprentice bookbinder at the age of 13, it was to an emigre Huguenot who followed another sect, the Swedenborgians.

It was in this London bookshop that Faraday acquired his learning, reading what he bound. Among the countless books he saw, one was a popular science primer by a leading exponent of the genre, Jane Marcet.

Faraday performed the simple experiments Marcet described. His employer recognised his talent and encouraged him to attend public lectures on science.

Among the speakers he heard, the greatest by far was Humphry Davy. Davy is remembered for his safety lamp, but in his lifetime his fame was wider. He had the intellectual aura of a Stephen Hawking and the looks of a matinee idol. Hamilton carefully avoids such anachronous comparisons, but points out that in an era when the most crowd-pulling names on theatrical posters were marked with an asterisk (the origin of our word 'star'), Davy was truly a bill-topping act. His lectures at London's Royal Institution were high-class entertainment, which Davy's friend Samuel Taylor Coleridge attended "to increase his stock of metaphors".

As Hamilton says, Davy - an amateur poet - was a romantic, while Faraday - though only 13 years younger - belonged to the generation that would see art and science pursue increasingly separate courses. Faraday's friends in later life included several painters, particularly William Turner, and Hamilton notes some interesting parallels between the two men: in particular, their lowly origins and their astonishing, obsessive desire to pursue their creative vocations.

Unlike them, Davy was rooted in the fixed social hierarchies of the 18th century. He acquired a fabulously wealthy wife - an Edinburgh widow named Jane Apreece who was a society hostess at the tail-end of the Scottish Enlightenment - and gave up lecturing in order to pursue his twin passions of chemistry and fly-fishing. Nearly blinded by an experiment gone wrong, Davy needed an assistant, and offered the job to Faraday.

In 1813, Davy planned to go to Paris to collect a medal for his scientific work. It was the height of the Napoleonic War, but Davy loftily declared science to be above politics, and arranged passage for his entourage on a 'cartel' - a ship licensed to sail between warring countries. While Davy and his wife sat inside their coach, Faraday rode on top as Davy's valet. Davy acknowledged Faraday as an intellectual but not social equal, while Davy's domineering wife never let Faraday forget his place.

Back in London, Faraday worked as laboratory assistant at the Royal Institution, and began the series of electrical experiments for which he is famous. As early as 1819, visionaries predicted cities would one day be lit by electricity. Two years later, Faraday began to make this a reality, when he discovered how a magnet can set a conducting wire in motion. It was the world's first electric motor.

Faraday swiftly published his landmark findings, and walked into disaster. Anonymous detractors claimed the discovery had been made earlier. Faraday's enemy, it seems, was Davy, indignant that a lowly assistant should claim credit for the work of gentlemen. Davy nevertheless continued to give Faraday scientific projects to carry out: one caused an explosion from which Faraday was lucky to escape with his life. When Faraday announced this result, Davy claimed to have known it already. Why, then, had Davy proposed such a lethal experiment? Hamilton acknowledges there is "no real evidence", but leaves us wondering.

After Davy's death in 1829, it was Faraday who became the Royal Institution's star turn. The children's Christmas lectures he founded are still broadcast each year, and if nowadays they seem a little like a Billy Smart's Circus for swots, they nevertheless preserve Faraday's determination to bring science to the masses. This - as much as his great electrical discoveries - is what made him a Victorian hero whose fame grew with the decades. As Hamilton says: "The only comparable figure to Faraday in public life in the 1850s was Charles Dickens."

George Eliot wrote: "You must know Faraday's lectures are as fashionable an amusement as the opera."

She attended one on oxygen, which Faraday illustrated by blowing soap bubbles.

Faraday's childless marriage - to a fellow Sandemanian, Sarah - was long and apparently happy. But fame brought the same kind of adulation Davy had received, and Lord Byron's daughter, Ada Lovelace, became smitten with Faraday when she was 29 and he was 53.

Famous for her work with Victorian computer pioneer Charles Babbage, Lovelace was a brilliant mathematician who had inherited her father's headstrong romanticism. Writing to enquire about Faraday's religious views, she embarked on a series of letters that became increasingly overt in their aim of making her Faraday's muse - "a position that was vacant", says Hamilton, "as Sarah was not her husband's muse but his carer".

Signing herself "the ladye fairy", Lovelace set up a meeting at which the unworldly Faraday finally twigged to her intentions. True to his Sandemanian principles, he broke off what could have been a remarkable meeting of minds. His beliefs made him shun public honours and privileges just as vigorously, so that when he died in 1867, he seemed an almost saintly example of Victorian self-improvement and self-denial.

Hamilton's aim is to place Faraday in his cultural context. In this, he succeeds brilliantly. A specialist in the visual arts rather than science, Hamilton brings a layman's sense of wonder to Faraday's discoveries, describing the experiments with admirable vividness. If Faraday remains enigmatic, it can only be because this is how he was: a fundamentalist who questioned nature's rulebook; a man of deep passions, who kept his non-scientific ones largely to himself.


Jonathan Katz, The Biggest Bangs. Scotland on Sunday, August 4, 2002. Review by Andrew Crumey.

IN THE 1960s, America feared the Soviet Union might cheat on nuclear test ban treaties by detonating devices behind the Moon. Satellites were launched to look for tell-tale radiation, and accidentally discovered the biggest bangs in the universe. These 'gamma-ray bursts' puzzled astronomers for decades, and were only recently explained.

Popular science television shows like Horizon and Equinox have leapt on the topic, using suitably dramatic special effects. Jonathan Katz cannot rely on colourful explosions and loud noises; instead he tells an intricate story of scientific trial and error, personal feuding, and ultimate triumph.

It was long thought that gamma-ray bursts were caused by comets falling on neutron stars. A dissenting voice belonged to Soviet physicist Vladimir Usov, who thought the bangs came from far beyond our galaxy, and hence were bigger than anyone imagined. He was ignored - perhaps, Katz suggests, because the soft-spoken Usov "never became a member of the informal club of 'leading' astrophysicists who organize meetings and invite each other to lecture at them".

The book is technically challenging, but Katz's highlighting of the social aspects of his scientific field offers a welcome and intriguing undercurrent. He gives a brief portrait of Astronomer Royal Martin Rees, who "quickly came to dominate British astrophysics, casting such a shadow that many younger scientists left for America to escape it." Katz adds that in his bachelor days, the dapper Rees "had something of a reputation as a ladies' man".

Personal gossip forms only a minor element of the book, but a recurring strain is Katz's disgruntlement with the mechanisms of scientific funding and publication. One of Usov's Soviet colleagues was unable to publish in prestigious Western journals because of the hard-currency fee charged to contributors; so this work, too, fell by the wayside. An American researcher proposed a low-cost wide-field telescope that could hunt for light flashes accompanying the invisible gamma rays. The National Science Foundation rejected it in favour of costlier projects from people inside the 'club'. "Had this proposal been approved," says Katz, "the visible counterparts to gamma-ray bursts might well have been discovered 20 years earlier than they actually were." The device that finally succeeded was made from a commercial telephoto lens hooked up to a digital camera.

Contrast this with the prestige projects of NASA - the recipient of Katz's most biting comments. Their gamma-ray detecting satellite was allowed to crash for what Katz calls "political" reasons. "NASA has a long history of launching scientific satellites and then losing interest in them", he says bitterly. The Hubble Space Telescope "has fulfilled most of the dreams of NASA's publicists, and even of astronomers… Its spectacular photographs are the pride of every glossy popular astronomy book."

Perhaps this is too much like carping, from someone who saw his own favoured project fall out of the sky. But the value of Katz's fascinating book is the way it highlights the many wrong turns astronomers took, for reasons that were all too human, before finding the answer.

It's not comets that are falling on neutron stars, but other neutron stars, in distant galaxies. Their collision forms a black hole, and the resulting hypernova could fry the Earth from a range of 100 light-years. One day we might need to prepare for one. "People living on the hemisphere exposed to the burst," Katz says calmly, "could be evacuated to the side of Earth shielded from it." Does he really think, given his experience with ruthless scientists, that the people lucky enough to be on the dark side would be so accommodating?


Edward Marriott, The Plague Race. Scotland on Sunday, August 25, 2002. Review by Andrew Crumey.

BE WARNED: this is a book about rats, fleas, buboes and pus. The gory facts of bubonic plague are part of every schoolchild's repertoire of horror: we all know it's the flea that is the real villain. What fewer know is how the facts were discovered.

Marriott presents the story as a race between two scientists: the illustrious Japanese bacteriologist Professor Shibasaburo Kitasato and the reticent French doctor Alexandre Yersin. Both went to Hong Kong when plague broke out there in 1894, in the notorious district of Taipingshan: 10 acres crammed with nearly 400 filthy and overcrowded houses that were a ready breeding ground for any kind of disease.

Sheer squalor was held to be the cause of the plague outbreak, which soon brought calls for the demolition of Taipingshan, and divided the colony along racial lines. The Chinese had a different form of medicine from the British, and the western-style plague hospitals they quickly set up were rumoured to be factories where Chinese plague victims were ground into medicine for Europeans. As panic spread among the "lower orders", one witness snootily noted that "Europeans began to fear they would find themselves without servants".

Kitasato - famous for his discovery of the tetanus bacillus - was greeted as a saviour by his host James Lowson, the Scottish-born superintendent of Hong Kong's main hospital. Marriott characterises Lowson as comically narrow-minded: obsequious towards Kitasato, yet patronisingly dismissive of the unknown Yersin. You can almost imagine Lowson being played by Fulton Mackay, in a variant of his prison warder character in Porridge.

Such simple characterisation is also used to heighten the contrast between Kitasato - "sleek, proper and establishment" - and the "maverick" Yersin, working alone in what was little more than a shack, while Kitasato's entourage were given all the facilities they required. It makes for a gripping narrative; but as one progresses through the book, it begins to look more like stereotyping.

The assumption being made here is that scientists who find the right answers must be better than those who fail. Kitasato analysed the vital organs of plague victims and isolated a bacillus which he took to be plague. Yersin instead took samples from the buboes, and found a different organism. Kitasato rushed into print with his results, then gradually it emerged that Yersin had found the real plague bug - belatedly called Yersinia pestis, once the controversy died down. According to Marriott, Yersin was the better scientist because of his complete objectivity. But was he really smarter, or just luckier?

By Marriott's own account, Yersin long harboured an erroneous belief that contaminated soil is the pathway of plague. And unfortunately for the book's heroic narrative, Yersin never figured out the role of rats, which he - along with many others - suspected but could not prove. That honour went to Paul-Louis Simond - a man predictably portrayed by Marriott in terms almost identical to those applied to Yersin - "refreshingly unencumbered by the herd instinct." Simond kept dead plague-ridden rats in jars, and observed fleas leaping off in search of any warm host they could find.

As a further complication, Marriott interpolates a fictionalised account of the 1994 plague outbreak in India, which frankly adds nothing, as well as asides on the history of plague in America and elsewhere. The story of Kitasato and Yersin is indeed fascinating, but the book's subtitle - "a tale of fear, science and heroism" - leaves one wishing for less heroism, more science.


Ian Stewart and Jack Cohen, Evolving The Alien. Scotland on Sunday, September 1, 2002. Review by Andrew Crumey.

WHAT does an alien look like? Any six-year-old will tell you: they've got big bald heads and large slanting eyes. The so-called 'greys' of UFO lore have passed into mass culture as the authentic image of extraterrestrial life. The real thing, the authors of this book argue, is probably altogether stranger.

Ian Stewart is well known as a writer of popular science, and has collaborated with biologist Jack Cohen on a number of previous books, such as The Science of Discworld. Collaboration has its advantages and its pitfalls - and in the case of Evolving The Alien it's the latter, unfortunately, that predominate.

The narrator of this odyssey into the unknown is a curious two-headed entity called 'Ian&Jack'. In real life, Cohen was enlisted as scientific consultant by a team of sci-fi authors that included Larry Niven. In Evolving The Alien, it is "we" - Ian&Jack - who go to California. And what a curiously condescending beast Ian&Jack is, offering us lengthy remarks in the opening chapters about the rubbishy quality of "media" sci-fi - meaning Star Trek, Alien and just about every other film or TV series the general public is likely to have heard of.

What Ian&Jack object to is their scientific inaccuracy, and they do have a point. Alien, for instance, asks us to believe in a parasite whose life cycle depends on finding an astronaut as host - hardly a winning evolutionary strategy. But for most people, simple old-fashioned things like plot and characterisation count for more than making the technical details work.

Not, it seems, for Ian&Jack. Their book is interspersed with synopses of sci-fi stories, variously held up as "good" (ie scientifically plausible) or "bad" examples. Among the good scenarios we find some magnetic creatures featured in Wheelers, by none other than Ian&Jack, who are not afraid to sing the praises of their earlier works.

After so much introductory sniping - at sci-fi film-makers and at other writers and scientists - Evolving The Alien begins to feel like an exercise in alienating the reader. But things improve when we finally start getting some science.

In trying to figure out how extraterrestrial life could evolve, Ian&Jack distinguish between "universal" features of Earth life, and "parochial" ones. Eyes have evolved independently in many different ways, so appear to be universal. But having two eyes above a nose and mouth is a parochial feature derived from our fishy ancestors. Aliens are just as likely to have the mouth parts of a crab or butterfly.

The hard part is deciding what is universal. Ian&Jack put Darwinian natural selection top of their list, which seems reasonable. But after that, the rest can only be pure speculation. Advanced intelligence, they plead, must surely be universal - even though only one example (ourselves) has ever evolved on Earth.

On the one hand, Ian&Jack argue that alien life will be so unimaginably different from anything on Earth that we will not even have words to describe it yet. Elsewhere, they conclude there must be lots of planets out there with water, plants and animals, predators and prey, and an ecology very much like our own. So forget all that carping about Star Trek - anything is possible after all; even a duplicate Earth. It's all curiously inconsistent. But that, I suppose, is the price you pay for having two brains.


Steve Jones, Y: The Descent of Men. Scotland on Sunday, September 8, 2002. Review by Andrew Crumey.

THE essence of masculinity is the Y chromosome - a tiny package of DNA found only in males, and passed from fathers to sons like a genetic surname.

As Steve Jones argues, it is a dubious inheritance. Men live shorter, unhealthier, more stress-ridden lives than the opposite sex, whose hearts are kept healthy by oestrogen, and who manage to avoid the numerous nasty side-effects of testosterone.

But maleness is good for the species, if not for males themselves, which is why there are so many of us poor souls around.

Masculinity has, of course, come under siege in recent years, with stories about academic under-performance, loss of social status and declining sperm counts prompting much talk of crisis.

Jones is keen to distance himself from the waffle that characterises much of that debate, but his book - emphasising, as it does, the hard-luck aspect of maleness - nevertheless sits comfortably in a marketing niche that countless armchair pundits have opened up in recent years.

Ostensibly, Jones's book is a homage to Darwin's The Descent of Man, in the same way that his earlier book Almost Like A Whale paid tribute to The Origin of Species. But the agenda here is less clear cut, making Y: The Descent of Men a curiously patchy affair.

The earliest chapters are, unfortunately, the hardest work. Any non-specialist entering the linguistic jungle of cellular biology is immediately struck by how many words end in -ase, and by a degree of complexity matched, in daily life, only by the insides of a television set.

The sheer richness of Jones's prose only adds to the problem. "Males are, in many ways, parasites… Like all vermin, they force their reluctant landladies to adapt or be overwhelmed. As the host evolves, the two parties enter a biological dance."

I make that four metaphors in one paragraph - more than my humble Y-brain can readily cope with.

Most readers will prefer Jones's account of male "hydraulics", as he describes how nature (sometimes with the aid of Viagra) defies gravity in the cause of reproduction. The average length of the male member is given as a worryingly exact 5.877 inches; those opting for artificial extension should beware the droopy-tipped "Concorde Deformity" that sometimes results. Circumcision - favoured by many Americans and also, apparently, by our own Royal Family - is heartily condemned.

Historical anecdotes and wry humour enliven the book - a French nun who was surprised to find herself male; the attempts of the poet Yeats to keep his sex-life active through vasectomy; or the invention of sperm donation by an 18th-century Scotsman.

Equally intriguing is the Y chromosome's usefulness in tracing human evolution. Jones - proud of his Welsh roots - discusses the legend that some of his Celtic forefathers may have colonised America. Genetics has disproved this. On the other hand, a dwindling Russian tribe called the Kets turn out to share the same type of Y chromosome as the majority of modern Native Americans, proving that this group colonised the New World, by way of Alaska, 16,000 years ago.

The book is at its best when it tackles the social aspects of maleness, and in particular the issue of biological determinism. Many today will readily ascribe homosexuality, intelligence or criminality to the human genome.

In earlier times, as Jones points out, attributes such as nationality, class and gender were seen as the determining factors. Our faith in biological destiny may prove just as flawed. Is testosterone the key to manly success? "Builders have more than architects, but architects are their bosses," says Jones.

And bald men are not, after all, more highly sexed than their hirsute fellows. Their scalp simply lacks the chemical that normally mops up the ever-destructive male hormone.


Lisa Jardine, On a Grander Scale. Scotland on Sunday, September 15, 2002. Review by Andrew Crumey.

CHRISTOPHER Wren was a scientist before embarking on the career in architecture - 'rubbish', as he called it - that made him famous. He was a virtuoso mathematician, pioneering physiologist and professor of astronomy.

A fascinating and ideal subject then, for a writer such as Lisa Jardine, who has so successfully brought the byways of history to a mass audience. And yet her ample biography is a disappointment.

The flaw is certainly not a lack of admiration for her subject. In an introduction, she describes the wonder she felt, discovering a forgotten basement to the London Monument which Wren and Robert Hooke designed in such a way that it could be used for gathering data on air pressure and the motion of stars. Jardine positively gushes, calling Wren - a man equally at ease in the arts or sciences - "my kind of hero".

She says her aim is to explore the full range of Wren's mind, and it is a dazzling prospect. In fact, however, we embark on an exploration of political and social territory in which Wren often appears almost incidental.

A better title for this book would be Patronage In Restoration England: A Case Study. Wren's father was Dean of Windsor, an intimate courtier of Charles I. The English Civil War left such high-ranking men suddenly out of favour; the restoration of Charles II brought them centre stage once more. This is undoubtedly crucial to Christopher Wren's story; but is it really necessary to be given such an extensive grounding in the political history of the period?

Few players are allowed to enter without their family background and social connections first being explored. This explains why Jardine bizarrely fills a gap in Wren's chronology by telling us about somebody else instead - a Dutch diplomat whose activities "make suggestive comparison" with what Wren might have been doing. We get some of his poetry, and a portrait of him. Eventually we realise he is significant as the father of Christiaan Huygens, whose astronomical interests overlapped with Wren's.

But when it comes to exploring Wren's scientific mind, we are left short-changed. A long quotation, illustrating the care Wren took in observing and trying to explain Saturn's rings, is highlighted by Jardine only for the way in which Wren gives customary praise to the noble provider of his telescopes. Social pecking order takes precedence over science.

Most readers will want to know about Wren's architecture, but even here, Jardine's fondness for quoting original sources at length offers a test of any reader's patience. There is Wren's suggestion for repairing the spire of Salisbury Cathedral, for instance: "Let a curbe of Iron made of 8 peeces be fixed cleane round on ye outside and Joynted at ye corners..." This goes on for the best part of a page, and unless you're an ironmonger or a linguist, you'll be asleep by the end.

Jardine's aim is laudably ambitious; but by attempting such a wide scope, she loses sight of simpler questions. I was over halfway through before I got any idea of what Wren was actually like as a person (charming and diplomatic). This book taught me much about how he got his professional breaks, but left me no closer to the polymath, for whom the dome of St Paul's was mere icing on the cake.


John Gribbin, Science: a History. Scotland on Sunday, September 29, 2002. Review by Andrew Crumey.

EM FORSTER defined 'classic' literature as whatever we were made to read as children. John Gribbin's hefty account of Western science's high points over the last half-millennium reminds us that Forster's maxim extends further. Science: A History is a return to school; nostalgic, illuminating and frustrating.

The starting point is 1543, when Copernicus died and his revolutionary cosmology was published. An evident turning point, and a reasonable place to draw a line between 'ancient' and 'modern'. The schoolbook view of scientific history, however, goes further. The ancients were woolly-headed philosophers and mystics who did no experiments and were content to cite the authority of illustrious sources like Aristotle.

Then, at the end of the 16th century (specifically with the arrival of Galileo), a great lightbulb of rationality switched itself on, and people became 'modern'. This is fine for schoolbooks, but it is sad to see such received wisdom replicated here. There were careful experimenters before Galileo, and mystics long after.

Yet such arbitrary boundaries underpin the book, the most significant being between who's in and who's out. Gribbin states many times that scientific breakthroughs are largely a matter of luck: if Einstein had been run over by a tram in 1904, somebody else would have discovered relativity in 1905 or thereafter. Discoveries are mountains to be climbed, and we remember the people who got there first. Yet 'right' answers can change over time (Descartes thought a perfect void cannot exist - a century ago that was "right"; now it is not). And how are we to judge the Everests of science from the piddling Pennines?

The magic ingredient, of course, is tradition: the oral history passed from lecturer to student. I do not for one moment doubt the depth and accuracy of Gribbin's research; and I was delighted by the biographical anecdotes, and more particularly the work of unsung heroes, which he describes. Yet I was constantly reminded that what I was reading was very much an accepted, canonical, 'official' history of science, whose surprises are few.

And that, for all my quibbles, is its strength. Yes, the index is appalling; and I question the need to give Count Rumford 13 pages, when Boltzmann gets less than three, and Herschel a mere half dozen lines. Yet I'm sure I shall go back to this book often, for there is so much in it that is simply marvellous to read. Historical narratives are treacherous things: yet who can fail to be seduced?


Atul Gawande, Complications. Scotland on Sunday, November 10, 2002. Review by Andrew Crumey.

EVER wondered how realistic ER is? Then read Gawande's superb book. The truth, you will find, is far more compelling, though the endings are never as neat.

The messiness of real life is what lies at the heart of Gawande's fascinating reflections on his years of surgical training in the US. At work in the operating theatre, he finds that his female patient - an unconscious crash victim - is not breathing properly. He reports this to his assistant - not in the urgent tones of a TV actor, but "in the flattened-out, wake-me-up-when-something-interesting-happens tone that all surgeons have acquired by about three months into residency".

He rummages around in her throat, tries getting a tube in, and then suddenly something "interesting" happens. He must perform an emergency tracheotomy - pierce her throat - to keep her alive.

Gawande makes the scene far more dramatic than television ever can. He is a first-class writer. What this near-fatal incident demonstrates is that when things go wrong, they do so with breathtaking speed. And not because the surgeon is bad, but because - like the patient - he is human.

That is the "complication" of medicine. It's meant to be all about science, hi-tech gear and clinical exactitude. Except that in real life, nothing is ever certain. The patient whom Gawande nearly killed could have been left at the roadside, and might have woken next morning with little more than a headache. As it was, she woke with a small scar in her throat that soon healed, and things ended happily. It could easily have gone the other way.

In concentrating on the human factor, Gawande does here for surgery what Oliver Sacks did for neurology in The Man Who Mistook His Wife For A Hat. There are curious case histories - for example, the pregnant woman whose morning sickness developed into life-threatening hyperemesis, leaving her unable to keep down any food. The only recommendation was abortion; but she soldiered on, finally giving birth to healthy twins. Immediately afterwards, she tucked into a hamburger.

Not many writers could make the subject of vomiting interesting, but Gawande does. In fact, as further testimony to his skills, he left me feeling distinctly queasy by the chapter's end. But it's the pregnant woman he cares about; her stoicism is deeply moving. He writes throughout with an empathy that makes you wish you lived near his hospital.

The theme to which he constantly returns is the imprecision, the fallibility of medicine; something which society at large refuses to address. We want progress without risk, but this is impossible. With any new procedure, there are failures before there are successes. What if it happens to be your own child who is in line for the latest innovation?

Risk can never be eliminated, but it can always be reduced. Before doing the tracheotomy, Gawande at least had some practice - on a goat. In the UK, such training is banned in the interests of animal rights. If you should ever need one, it could well be your surgeon's debut performance.


Mario Livio, The Golden Ratio. Scotland on Sunday, November 17, 2002. Review by Andrew Crumey.

YOU'VE heard of Life of Pi; now meet the Life of Phi. Mario Livio's fascinating book tells the story of the unending number that inspired scientists and artists from Euclid to Le Corbusier.

Try this with a pocket calculator. Think of any two numbers and add them to get a third. Add the second and third numbers to get a fourth, then add the third and fourth, and so on. Keep going, then divide the 20th number by the 19th. No matter what you started with, your answer will be 1.618: the Golden Ratio.
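For anyone who would rather not key it all in by hand, the whole trick fits in a few lines of Python (a sketch of my own, with starting numbers chosen purely for illustration):

```python
# Start with any two positive numbers, keep adding the last two terms,
# then divide the 20th term by the 19th: the ratio settles on 1.618...
def golden_ratio_trick(a, b, terms=20):
    seq = [a, b]
    while len(seq) < terms:
        seq.append(seq[-1] + seq[-2])
    return seq[-1] / seq[-2]

print(golden_ratio_trick(3, 7))    # about 1.6180
print(golden_ratio_trick(2, 95))   # about 1.6180, whatever the starting pair
```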

According to Livio - an astrophysicist who works with the Hubble Telescope - this same magical number governs the spirals of galaxies, the flight patterns of falcons, the shapes of pineapples. It gets its name phi in honour of the Greek sculptor Phidias, who allegedly used it in planning the proportions of the Parthenon.

Phi might also have played a part in the Egyptian pyramids. Their proportions have been analysed by countless scholars and cranks, a craze that seems to have started with a 19th-century Astronomer Royal for Scotland, Charles Piazzi Smyth. He used the pyramids as support for a totally barmy argument against metric units. Measured in inches, he claimed, the pyramids provide some beautifully exact illustrations of phi; but centimetres spoil the effect.

Therefore, Smyth concluded, God uses inches. Smyth subsequently acquired the nickname "pyramidiot", but the search for mystic meaning in the pyramids continues unabated.

Others have found phi in the Mona Lisa, in Mozart's sonatas, in Seurat's paintings. Examining the evidence, Livio dismisses nearly all of this as "number juggling." By way of illustration, he proceeds to find a fair approximation to 1.618 in the proportions of his television set.

A 19th-century psychologist, Gustav Fechner, did much to popularise phi-mania. He showed people lots of rectangles and asked them to choose the most "beautiful." He claimed that people tended to choose rectangles whose sides were in the Golden Ratio, but later studies have cast doubt on this.

Nevertheless, phi enthusiasts still manage to find the number in new and unlikely places - reminiscent of the intellectual thriller movie Pi, in which the infinite digits haunt every aspect of the hero's life. According to economist Ralph Elliott, it is not pi that governs stock market movements, but phi.

Whether or not that is the case, the number is certainly out there in the stars, and its unique properties are a delight for anyone who can handle the modest amount of mathematics Livio provides among the historical anecdotes. Expressed as an infinitely nested series of square roots, for example, phi becomes the fractal equivalent of standing between two mirrors, watching the reflections multiply without end.
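The nested expression in question drops straight out of the ratio's defining property, that phi squared equals phi plus one, so that phi equals the square root of one plus phi and the substitution can be repeated without end:

$$\varphi \;=\; \frac{1+\sqrt{5}}{2} \;=\; \sqrt{1+\varphi} \;=\; \sqrt{1+\sqrt{1+\sqrt{1+\cdots}}}$$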

The Renaissance artist Piero della Francesca was certainly keen on it. When he retired from painting he wrote maths books instead, though much of his work was later plagiarised by a pupil, eclipsing his discoveries. For Piero, mathematics was a path to God, and art was his way of illustrating it. In modern times, Le Corbusier called phi "the Modulor", and made it the basis of his architectural theories.

Livio's book serves as a reminder that artists have always found inspiration in science, and that fear of intellectual effort is a particularly modern phenomenon. As Plato said, "all that is beautiful is difficult." The learning curve in Livio's book might occasionally leave some readers scrambling, but the climb is worth the effort.


Karl Sabbagh, Dr Riemann's Zeros. December 1, 2002. Review by Andrew Crumey.

WHEN Andrew Wiles cracked Fermat's Last Theorem in 1996, it made world news and spawned a whole new publishing sub-genre. Wiles had solved a 300-year-old mathematical puzzle, and a book by Simon Singh about Wiles' esoteric achievement became a surprise bestseller. Several other books have appeared since, retelling the story in various ways.

But mathematicians agree that Fermat's theorem was not particularly significant in its own right. The prize for "most important problem" goes instead to the Riemann Hypothesis, which is the subject of Karl Sabbagh's book. Solve that one, and the million-dollar prize on offer from the Clay Institute will be mere icing on the cake. For anyone who cracks the Riemann Hypothesis, immortality is assured.

With such high stakes, we have the makings of a rattling good yarn. The only problem is that the Riemann Hypothesis is not easy to state in simple terms. Get a handle on sums to infinity and the square root of minus one, and you're ready to make a start. For many, however, such things only evoke a state of mental paralysis and memories of schoolroom panic.

That is a pity, because Sabbagh not only makes a very good job of explaining the formidably abstract concepts, but also serves up one of the best lay accounts I have come across of what professional mathematics is really like. Forget the movie A Beautiful Mind; far more realistic is Sabbagh's portrayal of this book's hero - or rather, anti-hero - the maverick French mathematician Louis de Branges, who thinks he is on the verge of solving the Riemann Hypothesis. The only problem is that he's been saying he's on the verge for as long as anyone can remember.

Sabbagh sits through one of De Branges' lectures, totally lost while the professor covers every blackboard in sight with imponderable equations. De Branges' life is completely dominated by a quest that everyone else believes to be futile.

In fact, colleagues of John Nash - he of Beautiful Mind fame - first became aware their friend was losing it when he announced that he had solved the problem, and proceeded to give a nonsensical lecture about it.

With such a history, it looks as if the million-dollar prize will remain unclaimed for some time. But all the mathematicians Sabbagh interviews agree that the money plays no part in their reason for studying the problem. What attracts them is its sheer difficulty.

Does it matter? Perhaps not; though if we one day make contact with aliens, they will be unlikely to understand our art or politics - but may have an answer to the Riemann Hypothesis.

© Andrew Crumey
