Lee Smolin, Three Roads To Quantum Gravity. Scotland on Sunday, January 28, 2001. Review by Andrew Crumey.
WHAT is space made of? The question may seem bizarre, but the answer, says Lee Smolin in this excellent, challenging book, may come from uniting two great but so far irreconcilable pillars of 20th-century physics - general relativity and quantum mechanics.
Quantum theory explains all but one of the presently known fundamental forces of nature. The exception is gravity, which, according to Einstein, arises from warps and ripples in spacetime. Everything produces gravity - even gravity - and the resulting infinite regress long posed an insurmountable barrier to theorists. Recently, however, three approaches to quantum gravity have begun to look promising, the first being the study of black holes.
Black holes are strange things. They are not completely black but release streams of radiation and elementary particles, too faint to be seen in any telescope. Smolin suggests black holes might cradle "baby universes", their carbon-dependent existence favoured by a kind of cosmic natural selection that has inadvertently made possible our own organic life.
Equally strange is the view of Nobel Laureate Gerard 't Hooft that black holes are like computers, their surfaces encoding a form of quantum information. This prompts the "holographic principle" which says the universe itself is a kind of hologram, storing multidimensional data on its three-dimensional "surface".
Smolin's presentation of these mind-stretching ideas, tinged with an air of humility and wonder, is both lucid and persuasive. Addressing ancient unresolved questions about the nature of reality, his book is closer in spirit to Julian Barbour's The End of Time than to Brian Greene's wonderfully informative yet philosophically evasive guide to superstrings, The Elegant Universe.
String theory - the idea that fundamental particles are really the vibrations of tiny filaments - is another of Smolin's three roads. The conjecture magically eliminates the infinities of quantum gravity, which is why strings are currently widely touted as the "theory of everything"; yet Smolin highlights some problems. There is not one string theory but many. They are all "background dependent", meaning the strings are imagined to move through an unexplained, pre-existing space, contradicting the spirit of general relativity. And strings are only approximations to a more exact model, called M theory, whose details are still vague.
What black holes and strings both imply, says Smolin, is that in nature there must be a smallest possible size. Space is discrete, effectively consisting of "atoms." These are very small - the number contained in a cubic centimetre is a hundred digits long. But what are they? For Smolin they are the nodes of a "spin network", a structure arising from his third path, "loop quantum gravity." Yet he admits this too has its limitations.
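Smolin's "hundred digits" figure is easy to check with a rough back-of-envelope estimate, assuming (as loop quantum gravity suggests) that the atoms of space are roughly Planck-sized; the standard values below are mine, not the book's:

\[
\ell_P \approx 1.6 \times 10^{-35}\,\mathrm{m}, \qquad
\ell_P^{3} \approx 4 \times 10^{-105}\,\mathrm{m}^{3}, \qquad
\frac{1\,\mathrm{cm}^{3}}{\ell_P^{3}} = \frac{10^{-6}\,\mathrm{m}^{3}}{4 \times 10^{-105}\,\mathrm{m}^{3}} \approx 2.5 \times 10^{98},
\]

a number roughly a hundred digits long, as Smolin says.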
The ultimate theory, currently being sought by Smolin and others, will explain space and matter in terms of something more fundamental than either strings or spins. No-one knows how it will look, but Smolin thinks it will include some kind of holographic principle, and it will explain why, when you brake suddenly at a red light, you are thrown forwards in your seat. Newton gave an answer now known to be wrong; Einstein invoked the action of the stars. Quantum gravity's explanation, Smolin predicts at the close of this exhilarating book, will be taught to schoolchildren by the end of the century.
Stephen Jay Gould, Rocks of Ages. Scotland on Sunday, February 4, 2001. Review by Andrew Crumey.
HOW are we meant to feel about a monkey born with the gene of a jellyfish? What are we to make of the arguments about research on human embryos? In Rocks of Ages, Stephen Jay Gould explores the centuries-old divide between science and religion, between cold facts and moral values. Yet strangely, his book offers little guidance through the ethical minefield to which contemporary science and technology have given rise, from HIV-infected mice to human cloning.
Instead, the Harvard professor - whose erudite yet eminently readable books on evolution (such as Wonderful Life) have made him famous - adopts a moral ground so high as to put the messy details of most current debate beyond sight. When he climbs down, it is to consider the history of supposed antagonism between science and religion; and here the details are fascinating.
Consider Newton, for example, who spent more time on theology than on physics. His theory of gravity implied the universe itself should collapse into a fireball, a problem he solved by supposing the stars to be evenly distributed to prevent this happening. The hand that placed the stars so carefully was, of course, the hand of God, made manifest in Newton's equations.
Newton entered a lively correspondence with the Reverend Thomas Burnet, whose book The Sacred Theory of the Earth sought scientific explanations for Biblical stories. Burnet had trouble, though, fitting creation into just six days, and suggested these "days" were merely allegorical. Newton replied that the Earth might initially have rotated very slowly, producing days of huge length. In that case, Burnet objected, what made the Earth speed up? Newton's response, in essence, was that God can do whatever He likes.
Burnet has suffered the ridicule of later scholars; yet, as Gould points out, it was Burnet who was doing science here, while Newton resorted to unprovable concepts from the incompatible domain of religious faith.
Gould has a name for these "domains." He calls them "magisteria", and the principle he advocates is that science and religion, the two "rocks of ages" intended by his title, far from being at war or even converging, should be "non-overlapping magisteria".
Gould abbreviates this principle throughout the book to NOMA, and elevates it to the dangerously unchallengeable status of "common sense", arguing that it has long been the consensus view of scientists and religious leaders alike, and that the debate between the two camps is a "non-problem." Science, says Gould, addresses factual questions, while religion deals with moral values. Yet "facts" are notoriously slippery things, and the origin of morality is equally problematic. These issues slip through the net of Gould's overly simplistic argument, while he offers his delightful miscellany of historical examples.
Take the most famous: Galileo's appearance before the Inquisition. As Gould points out, the story was a complex one, involving court politics and personal pride, and in reality the Catholic church soon accepted the Copernican view of the universe. Similarly, the myth that Columbus's expedition was opposed by church leaders who thought the world was flat (even though Aristotle had known otherwise) is attributable to anti-Catholic books of the 1870s.
Gould's stylistic model in his elegant essays, he says, is Montaigne, the pioneer of the genre. We even find references here to Plutarch, and a taste for aphorism and the obscure. The warmth of Gould's writing makes his occasional moments of pedantry entirely forgivable, and nobody could finish this book without having learned something interesting that they didn't know to begin with. Yet Gould's purpose is to justify a thesis, and in this he does not succeed.
The problem is that his argument is tailored to fit one particular case: the rise of "creationism" in the US. Many people in America believe that the universe was made 10,000 years ago, over a period of six days. And they believe it can be scientifically proved, though the proof amounts to finding fault in any data that appears to contradict them. The creationist lobby won the right to have their ideas taught in US schools, and Gould testified in legal hearings which overturned the ruling.
Like any thesis designed to cover a single example, Gould's NOMA fits the creationist controversy perfectly. If you want to believe in a creator then that's fine, Gould effectively says; just don't start making factual claims about the age of the universe, unless you are willing to be proved wrong.
The famous 1925 trial of John Scopes, for teaching evolution in a Tennessee school, further illustrates Gould's argument. Far from being a case of religion at war with science, the trial was engineered by civil liberties groups hoping to use the willing Scopes as a stalking horse to expose an absurd law. Scopes's nemesis, the populist politician William Jennings Bryan, is shown by Gould to have made a reasonable case in his denunciation of Darwinism as it was then popularly (mis)understood. Bryan was horrified at the teaching in schools of the notion of "survival of the fittest", used to justify racism, the colonial atrocities of Britain and Belgium, and the military expansion of the Kaiser's Germany.
The lesson Gould draws from this is that we can never use science to justify a moral position. Science, he argues, is morally neutral, and only the religious "magisterium" can tell us how to treat our fellow human beings.
What of the many people who have no religious belief? Gould, we can assume, is one himself, and he reassures us that a moral life need not be based on supernatural faith. As a very touching example, he describes the grief of Charles Darwin, and then of Darwin's friend and champion, Thomas Huxley, who both suffered the agony of seeing their children die. Huxley's eloquent, impassioned letter to Charles Kingsley, defending his agnosticism in the face of senseless loss, should, says Gould who quotes it, be "required reading for all courses in English literature and philosophy".
We can be grateful to Gould for such tasty historical morsels, and also for the unusual words he delights in. "Magisterium" is one I can do without; but I was pleased to learn that Gould's language of peaceful reconciliation, the opposite of polemic, is called "irenic." Another one worth noting is "syncretism", the union of disparate tenets like science and religion. With the "war" well and truly over for everyone except creationists, syncretism has become the flavour of the day, though for Gould it is equally objectionable.
In America, the wealthy John Templeton Foundation launched an experiment to measure the power of prayer. At one of their conferences, physicist Russell Stannard suggested the God-in-man duality of Christ could be thought to resemble the wave-particle duality of physics. Gould is right to take this only as evidence of people's capacity for wild metaphor; but it is worth remembering that Niels Bohr originally derived his idea of quantum duality, in part, from Buddhist philosophy, and extended it to all aspects of life.
Gould's brief and dismissive treatment of syncretism is his book's greatest flaw, since it is surely by examining the overlap between science and religion that we can test his claim that no such overlap can meaningfully exist. By depositing morality in the zone of religion, Gould is apparently claiming that our moral values can have no empirical content. Yet his beloved Montaigne would readily retort, as might Aristotle, that the source of all knowledge, and all faith, is human experience.
As our knowledge grows, new questions must be asked. If a deranged man kills someone, is he an evil person deserving punishment, or a sick one in need of help? When airlines plan safety programmes, what probability of passenger fatality is acceptable? Do animals have rights? These questions sit firmly in the overlap Gould ignores. His pure, autonomous spheres of factual and moral reasoning simply do not exist.
Our moral and cultural values will always shape the questions scientists pose, and the answers they derive. Philosophers have been trying for millennia to prove the existence of God; they have also asked about time and space, the birth of matter, human consciousness, and other concepts that once belonged only to "metaphysics", but which are now legitimate questions of scientific "fact." The puzzle of what remains of us after death is surely, from the point of view of Homo sapiens, the biggest factual question of all. Yet the Inquisition said there are some things science must never ask about; and so, it seems, does Gould.
Anthony Grafton, Leon Battista Alberti. Scotland on Sunday, February 11, 2001. Review by Andrew Crumey.
THE YOUNG Leon Battista Alberti was under no illusions about the perils of scholarly life. "To put it in a nutshell," he wrote in 1428, "the learned don't become rich, or if they do become rich from literary pursuits, the sources of their wealth are shameful." Yet despite these reservations, he was to win fame and glory from his prodigious learning, and his insatiable appetite for new ideas. The career of this true "Renaissance man", and the world he not only inhabited but came to influence profoundly, are brought vividly to life in a magnificent tour de force of intellectual biography by one of the world's leading experts on the period.
As Professor Grafton points out, new science and technology were at the forefront of 15th-century Florence's cultural rebirth. Filippo Brunelleschi, designer of the city cathedral's famous dome, was classed not by what he produced, but by the brilliance of his mind. He was called ingeniator, from which we get both "engineer" and "ingenious." The meeting of art and science - in both of which Alberti took an interest - was what drove the Renaissance.
Another factor was patronage; one major benefit of Cosimo de' Medici's rise to power was the return to Florence of many talented people forced out by his predecessors. Alberti's family had gone to Genoa, and illegitimacy and exile were stigmas the young Alberti overcame through a mixture of self-effacing humour and proud defiance.
In Florence, Alberti quickly established himself as a classical scholar and as a writer of pithy dialogues. But he was also a skilled artist, and sculpted a profile portrait of himself resembling a Roman emperor. The idea was emulated by Pisanello, whose bronze portrait medals are treasures of the Renaissance.
Still more influential was Alberti's book On Painting. Alberti did not actually invent perspective, but his book, drawing heavily on ideas from surveying and applied mathematics, was the first clear exposition for painters. Alberti asked artists to imagine a sheet hanging vertically between themselves and their subject, which light rays would meet in different places before reaching the artist's eye. Albrecht Dürer put this technique into practice: a woodcut by him shows an artist drawing a nude woman through an intervening gauze. Paolo Uccello was also heavily influenced by Alberti's theories; his fresco of John Hawkwood in Florence Cathedral seems to have followed Alberti's geometric approach so faithfully that Uccello was required to repaint it, in order to lessen its severe foreshortening.
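In modern terms, Alberti's hanging sheet (his "veil") is a perspective projection onto a picture plane. Here is a minimal sketch of the idea in Python; the eye position, plane distance and sample point are illustrative assumptions, not anything prescribed by Alberti or Grafton:

```python
def project_to_veil(point, eye=(0.0, 0.0, 0.0), veil_z=1.0):
    """Project a 3D point onto a vertical 'veil' (picture plane) at z = veil_z.

    The sight line runs from the point to the eye; we record where it
    crosses the plane, which is Alberti's construction in miniature.
    """
    px, py, pz = point
    ex, ey, ez = eye
    t = (veil_z - ez) / (pz - ez)          # how far along the sight line the veil sits
    return (ex + t * (px - ex), ey + t * (py - ey))

# Example: a point 5 units from the eye, slightly up and to the right.
print(project_to_veil((2.0, 1.0, 5.0)))    # (0.4, 0.2) - drawn nearer the centre, i.e. foreshortened
```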
For Alberti, while the technical aim of painting was to render the solid world in two dimensions, its value as art lay solely in the emotions it could provoke. Alberti's aesthetics, though derived from principles of classical rhetoric, still resonate today. He made "visual devices" consisting of a box with an aperture through which amazed viewers saw realistic landscapes, or images of the sky: a primitive exercise in virtual reality. The perspective diagrams of Uccello and others are often reminiscent, to modern eyes, of computer-drawn designs; and the accurate rendering of reflections was a major problem then as now.
Alberti wrote about sculpture and mathematics, etiquette and horses; yet his most enduring legacy is as an architect, the ornate facade of Santa Maria Novella in Florence being one of his most familiar works. It exemplifies the sense of mathematical proportion, classical elegance and sheer humanity that seems to have characterised the man himself. Alberti became a roving consultant in urban planning, moving between the courts of Ferrara, Mantua, Rimini and Urbino. Needless to say, he put his ideas on the subject into a book.
Alberti's pivotal role in the early Renaissance was recognised in the 19th century by the great Swiss historian Jacob Burckhardt, who drew on Alberti's own autobiography to conjure up the "universal man" who could tame wild horses and leap over people's heads just as easily as he could compose flawless Latin or use an astrolabe. Though subsequent scholarship has made Alberti less superhuman, he has become all the more fascinating.
He won the glory he desired, but in an age before copyright, Alberti made little from his books. In his treatise on painting, he suggested that grateful artists could repay him by adding his portrait to their frescoes. Sadly, no one seems to have taken him up on this. But Grafton's portrait does full justice to a remarkable man.
Pekka Himanen, The Hacker Ethic. Scotland on Sunday, February 18, 2001. Review by Andrew Crumey.
FOR most people, hackers are asocial teenagers or fashion-compromised men with beards who delight in breaking into computer networks in order to vandalise, defraud or spread viruses. This is not what Pekka Himanen means by "hacker." He uses the word in its older and more honourable sense, referring to computer pioneers who, since the 1960s, have had as their prime ethic a desire to share their skills and knowledge for the common good, with financial reward being an occasional side-effect rather than an end in itself.
Bill Gates started out as a hacker; so did Steve Wozniak, inventor of the personal computer. Gates modified some freely available software (whose initials allegedly stood for "Dirty Operating System") and took the very un-hackerish move of licensing it to IBM, with famously profitable results. Wozniak also made a fortune, but sold much of his Apple stock to fellow workers at a discount, retired at the age of 29 and now fills his days by teaching computing to local schoolkids. Gates is, according to Himanen, "the computer hacker's number-one enemy", while "Woz" is a folk hero.
Another hero is Linus Torvalds, who contributes an introductory chapter to Himanen's book. Torvalds's Linux system - unlike Windows which it resembles and, some would say, improves upon - is free. While Microsoft put considerable effort into concealing the workings of their products, Linux - developed in open collaboration on the internet - epitomises the hacker philosophy that good ideas belong to everybody, and acclaim is the only reward any discoverer can rightfully expect.
This is the ethic (traditionally, at least) of the academic world, and of Sixties counter-culture, and it is no accident that the first hackers were products of both. Their outlook contrasts with what Max Weber, in 1904, called "the Protestant work ethic", arguing that the Reformation transformed the rigid discipline of monastic life into a general view of work as moral duty, and idleness as sin. Benjamin Franklin, for instance, regulated his days according to strict timetables, coining the quintessentially American phrase "time is money." Himanen also cites the way every new labour-saving invention serves not to give us more freedom, but instead to make us more "productive." E-mail and mobile phones become devices for creating emergencies that would not otherwise exist.
The hacker, by contrast, values "downtime" as an essential part of life. His personal traits, we learn (and it need hardly be said that he is almost invariably male), include a joy in creativity, and a tendency to work strange hours. In other words, hackers resemble a great many artists, scientists, or obsessive hobbyists. The difference is that the hackers' hobby is re-shaping society itself.
Where all this is heading is something I was unable to glean from Himanen, though he does provide an antidote to the dotcom brainwashing we get from politicians and business people, who have much to say about the commercial possibilities of the world wide web, but nothing about its cultural potential, other than to offer scare stories about pornographers and terrorists. From a hacker's point of view, it is globalisation and intrusion of privacy on the net that need to be controlled, not free speech.
Will the caring-sharing hackers prevail against the oppressive forces of market and state? Himanen does not care to guess; nor does Manuel Castells in an unfathomable postscript which informs us, however, that "the network society is a social structure made of information networks powered by the information technologies characteristic of the informationalist paradigm." Even computer scientists, with their taste for recursive definitions, might find that bit of code a little hard to parse.
Michael White, Rivals. Scotland on Sunday, March 4, 2001. Review by Andrew Crumey.
ARTISTS often take years over their work: refining details, perfecting some inner vision and allowing their ideas to take on a definitive form. Scientists cannot allow themselves such luxury. While novelists might worry whether the market will move on before their next book is finished, they need not fear that someone else will publish it first. Yet this is the constant worry of scientists, for whom new discoveries are peaks waiting to be climbed, and for whom competition is an inescapable part of life. This is the theme that links the episodes in Rivals, taking us from Isaac Newton to Bill Gates.
A particularly stark illustration is provided by the controversy over AC and DC electricity. Thomas Edison offered immigrant inventor Nikola Tesla $50,000 if he could devise an AC generator, which seemed at the time an impossibility. When Tesla succeeded, Edison shamelessly claimed the offer was only a joke. Tesla took his invention to George Westinghouse, and Edison's supporters launched a dirty-tricks campaign to try to ensure that domestic electricity would follow Edison's DC system. To make the public think AC was more dangerous, it was used in the grotesque public electrocution of animals. Humans duly followed, when Edison's camp designed the first electric chair using a surreptitiously obtained Westinghouse generator. Edison lost the battle, but the foulness of his methods lingered long afterwards.
Money was the motive, as it is too in the case of Bill Gates and his rival Larry Ellison, whose attempt to supplant the PC with simpler machines has so far failed to catch on. Michael White draws on an interview he had with Gates in 1995, in which the author clearly saw, beneath Gates's mask of modesty, a "monumental ego… dazzling in its intensity".
White's book is at its best when dealing with such relatively recent controversies, whose social context is a world we still recognise. The atom bomb, the structure of DNA, and the exploration of space are all clear examples of science as a race, with no prizes for coming second. White's well-researched portraits of the personalities involved are informative and engaging. Where his book fares less well, however, is in its handling of more distant episodes.
White appears to believe, for example, that feuds and disagreements were not a part of intellectual life before the age of Leonardo, whom he calls "the first scientist." For centuries, he asserts, scholars parroted the absurd ideas of Aristotle, never bothering to test them, until at last Gassendi and others rediscovered the "visionary" ideas of Democritus. This traditional view is repeated here with the same unquestioning attitude that White ascribes to the monks in their mediaeval darkness; though Aristotle was a wonderfully astute observer of nature who dismissed Democritus's mystical notion for sound reasons, and whose theories were challenged and modified by a succession of brilliant thinkers in the centuries before Galileo.
White's mistake is his assumption that because an idea has been discarded, it must have been bad. Quite the opposite is the case: good theories are ones that can be tested, and Aristotle's were mostly good, even though mostly wrong. Rivalries and feuds are indeed a part of science, just as they are part of all human activity, but they are not a defining feature or an essential one. Among scientists - as among artists - there will always be those driven by naked ambition, whose lives make such fascinating reading. Yet there will also be others who see in art or science a way of transcending our universal frailties.
Martin Brookes, Fly: An Experimental Life. Scotland on Sunday, March 18, 2001. Review by Andrew Crumey.
WHEN Friedrich Miescher discovered DNA in 1871, there was no press conference or media fanfare. Nobody knew if the new chemical mattered. Science, contrary to its usual portrayal, is made not of eureka-moments of sudden insight, but of long hard slog.
Martin Brookes knows this, having done a PhD on the distribution of moths. As he says himself in this delightful book, his was the kind of research work, useful in its own way but probably read by half a dozen people, that is the scientific norm. And when Thomas Hunt Morgan examined a batch of fruit flies in his laboratory one morning in 1910, he certainly did not suspect that what he saw would change the world.
Morgan was using Drosophila flies as a way of testing the theory of evolution. Darwin, like most people, believed that a plant or animal's characteristics are a blend of its parents'. But this "averaging out" was a problem for Darwin's theory. How, for example, could eyes ever have evolved? In 1901, Hugo de Vries offered the "mutation theory", suggesting new traits arise as freaks of nature, produced by environmental forces.
This was what brought Morgan and Drosophila together. The flies were easy to come by - a banana on an open windowsill was enough to draw a batch, and they breed quickly. Morgan's "Fly Room" was no sterile laboratory. According to Brookes, "If the idea was to make the fly feel at home by recreating the atmosphere of its natural environment - the dustbin - then the Fly Room worked a treat."
In a bid to prove De Vries's mutation theory, Morgan subjected his flies to "a barrage of abuse", injecting them with acid, spinning them in centrifuges, and sticking them in fridges and ovens. Yet they refused to mutate. Then, one morning in 1910, Morgan noticed a newly hatched fly with white eyes. It was a spontaneous mutation. By breeding so many flies, Morgan had won the lottery; he was witness to a rare natural freak.
The mutant fly's offspring were normal, but among its grandchildren, roughly one in four were white-eyed. Morgan had confirmed the principle of heredity, discovered by Gregor Mendel decades earlier. In 1909, a Danish biologist had proposed that Mendel's "particles of inheritance" be called genes. Now Morgan and his co-workers would take the first steps in mapping them.
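The "one in four" is the ordinary Mendelian arithmetic for a recessive trait reappearing in the second generation. A toy enumeration in Python illustrates the counting; it treats white eyes as a simple recessive factor (Morgan's real white-eye gene is X-linked, so the detailed ratios differ, but the principle is the same):

```python
from itertools import product

# Cross two first-generation carriers: each carries one dominant 'W' (red)
# and one recessive 'w' (white) copy of the eye-colour factor.
parent1 = ("W", "w")
parent2 = ("W", "w")

offspring = [a + b for a, b in product(parent1, parent2)]   # WW, Ww, wW, ww
white_eyed = [g for g in offspring if g == "ww"]            # only double-recessives show white eyes

print(offspring)                                # ['WW', 'Ww', 'wW', 'ww']
print(len(white_eyed), "in", len(offspring))    # 1 in 4
```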
Morgan's experiments convinced him that genes are arranged on chromosomes - strands of DNA - like beads on a string. His student Alfred Sturtevant located five of them - the world's first genetic map. By 1915, a hundred individual genes had been isolated on the fruit fly's four chromosomes.
This information came from the natural mutants that occasionally arose among the millions of insects buzzing around Morgan's Fly Room in New York. Every summer, the whole operation was moved to the Massachusetts coast, though "a few bottles of flies were always left behind as insurance, just in case something unpleasant happened to those in transit."
The pace of research accelerated thanks to the work of Hermann Muller. "To many people," says Brookes, "Muller was a maverick and a visionary. To the fly, however, he was a cruel tyrant." In 1926 Muller discovered that bombarding the flies with X-rays made them more likely to produce mutant offspring. This eventually won him a Nobel Prize, and it was the start of a long craze for producing deformed versions of Drosophila whose official scientific names are often tinged with black humour. Groucho has bushy antennae; chico has small body cells. "Popular with Japanese bonsai lovers," Brookes wryly observes. Bubble, spook and bladderwing are left to the reader's imagination, but we are told that genghis khan is not, after all, a fly intent on conquering the world, but simply one with strong leg muscles.
Van gogh has patterned wings supposedly resembling the master's paintings; drop dead is an unfortunate soul that does exactly that, without warning. But that's better than the fate suffered by bicaudal, a fly born with a second anus where its head ought to be. "Lacking brain, eyes or any form of locomotory appendages, bicaudal had little choice but to arse about for the two or three hours of its short life on earth."
Brookes highlights the issue of animal rights; though when it comes to pleading for the dignity of fruit flies, anyone who has ever swatted a wasp at a picnic can't exactly take the moral high ground. And Muller's experiments did some good for humanity, by revealing the dangers of radiation. Muller campaigned to raise public awareness of the issue, and even advocated storing human sperm as a safeguard against a future age of widespread mutants. He "espoused a kind of socialist eugenics", though Brookes suspects personal bias in Muller's concern for saving the genes of intelligent people rather than, say, beautiful ones. As Brookes says, "Muller himself was short, portly, and bald in a Bobby Charlton kind of way."
The humble fruit fly continues, unwittingly, to provide humans with new scientific insights. The flies' sleep patterns have been studied, revealing the way in which brain chemicals govern tiredness and wakefulness. The poor mutant of relevance here is timeless, an eternal insomniac lacking the gene that will let it take a nap. And fruit flies, apparently, are affected by alcohol in a remarkably similar way to humans. Fly drunkenness has three stages: first, a "euphoric, boisterous phase… This is when the fly would begin to lose its inhibitions, if it actually had any to start with." Next is the "unco-ordinated" phase, followed by "coma." So at least laboratory flies have some comfort in their lives.
They also have a great deal of sex, which must be quite a sight in the case of Drosophila bifurca, whose sperm cells are ten times the length of its own body. And drugs have reached the fly world; researchers have found that the insects can become addicted to crack cocaine, which causes them to "indulge in manic grooming".
Brookes describes these findings with an engaging mix of irony and wonder. As he says, most research is worthy but largely useless; some of it, quite unexpectedly, marks a new turning point. Morgan's white-eyed fly was a case in point, and we are still trying to come to terms with its legacy.
Benjamin Woolley, The Queen's Conjuror. Scotland on Sunday, March 25, 2001. Review by Andrew Crumey.
ASTRONOMERS get snooty when people mistakenly refer to them as "astrologers"; chemists have a similar disregard for "alchemy." A few centuries ago, such distinctions did not exist, however. Kepler drew up horoscopes for rich patrons; Newton found the search for the philosopher's stone as seductive as his quest for gravity. We remember their scientific breakthroughs, and quietly overlook the mysticism that inspired them.
With Dr John Dee, it is the other way round. Dee made no great discoveries, but his dabblings in the occult have made him an enduring figure of fascination. When his private papers were published in 1659, 50 years after his death, it was to warn people against the dangers of summoning up the forces of darkness. Dee's first biographer thought him insane, but in the 19th century he was hailed as an English Nostradamus, and as founder of the Rosicrucian movement. Dee's sinister "skryer", or medium, the manipulative charlatan Edward Kelley, was admired by Aleister Crowley, who claimed to be Kelley's reincarnation. Peter Ackroyd's 1994 novel, The House of Doctor Dee, attests to his continuing appeal.
How ironic, then, that Dee's early ambition was to become the English Mercator, and that his measurements of the heavens brought him close to making discoveries of value. Woolley considers both the science and the magic in this engaging account of Dee's life.
Dee met Mercator in Louvain in 1548, and they became good friends. Dee learned the new theories of Copernicus, gave lectures on mathematics, received numerous offers of royal patronage, and became a passionate collector of rare books. When he returned to England, he opened his library to the public, though his books were shelved according to size rather than author or subject. Visitors were amazed by Dee's ability to find whatever piece of information they wanted, from a seemingly random collection that included books on rhetoric, saints, surveying, tides, veterinary science, weather, women and zoology.
One book he particularly treasured was Johannes Trithemius's Steganographia. The first part describes how messages can be telepathically sent using magical incantations that read like gibberish. The rest of the book consists of obscure tables whose meaning was finally unlocked in the 1990s. They provide a code for deciphering the earlier imaginary language, revealing the book to be a pioneering work of cryptography.
It is unlikely that Dee worked this out, but his enthusiasm for the occult brought him into contact with Edward Kelley, the man who was to be his downfall. The "spirits" who spoke through Kelley informed Dee he was to receive a book of wisdom written in the original language of Adam. The nonsense-alphabet of this supposed language has defied modern mathematicians, but while the two men's quest for buried treasure was harmless enough, Kelley's mind games eventually became more sinister. Together with their families, the men set themselves up in Bohemia, where Kelley announced that his spirits had ordered a bit of wife swapping. The others were aghast, but complied, and Dee's wife gave birth to a child as a result.
Dee parted company with Kelley, but the affair gave rise to his posthumous notoriety. He should have stuck to astronomy. His observations of the nova of 1572 convinced him that stars move freely through space; a revolutionary idea, but one that gained little attention in England. Tycho Brahe saw the new star too, and measured it just like Dee. Tycho is hailed as a great scientist, Dee is mocked as a magician. The dividing line, Woolley reminds us, is not as clear as we might think.
Lawrence M Krauss, Atom. Scotland on Sunday, April 22, 2001. Review by Andrew Crumey.
PRIMO Levi concludes his 1975 classic The Periodic Table with the biography of a carbon atom that has lain buried for millennia in a piece of limestone. "It already has a very long cosmic history behind it," Levi writes, "but we shall ignore it." Now Lawrence Krauss proposes to tell the cosmic part of the story, with the atom's identity switched from carbon to another life-giver, oxygen. "The atom of oxygen you are breathing in as you read this," he says, "could have been part of Julius Caesar's last breath."
Between The Periodic Table and Atom, however, there is little in common. Levi was a Jewish chemist who survived the horrors of Auschwitz, then used his scientific and literary knowledge as a way of illuminating the accidental path of his own life, which ended in suicide in 1987. Krauss is an American cosmologist, author of an earlier book called The Physics of Star Trek, whose new work offers us the scientific facts, but with a disregard for those facts' discoverers that is unusual even by textbook standards. Atom is about atoms; humans hardly get a look-in.
Do not read this book, then, if you want a history of atomic theory, or an insight into the lives of scientists. Nor, alas, can I recommend it to anyone seeking a pain-free tour of the cosmos. Without a single diagram or illustration, everything is conveyed to us in a prose style that is at times almost impenetrable. For instance: "If the X-antiparticles were to decay into the antiparticles of the particles produced by the decays of the X-particles at precisely the same rates, then as many antiquarks would be produced as quarks." Groucho Marx nicely parodied this sort of gobbledygook in A Night At The Opera; Krauss's offering left me in need of a sanity clause.
Who, I wondered, is this book really aimed at? Anyone with a less than passionate devotion to numbers (especially large ones) will soon give up in confusion. For amateur astronomy enthusiasts, there are lots of interesting facts; but later sections on biology may leave them cold. Devotees of popular science can find the material in more digestible form elsewhere, for instance in John Gribbin's Stardust, whose title alludes to the fact that our atoms were largely forged in stellar explosions; though the phrase "we are stardust" (which Krauss repeats) is nowadays such a hackneyed cliche of the pop-sci repertoire that the estate of Carl Sagan - who was saying it back in the Seventies - should start claiming royalties. I prefer the equally accurate version due, I think, to Martin Rees: "We are nuclear waste."
As the "biography of an oxygen atom", Krauss's book has the perversity of Tristram Shandy. By page 60 we still haven't met the atom in question, nor does its identity ever become clear. Levi's wandering carbon atom finally comes to rest in the full stop that ends The Periodic Table; a delightful twist, burdened with the poignancy of all the human experience that has preceded it. For Krauss, Levi's existential full-stop is a wasted "golden opportunity" to branch out and explore all the atom's other possible lives. Lawrence Sterne might have managed something like this; personally I prefer Levi's specificity. It is Krauss who has missed an opportunity. As someone fascinated by science, I expected to love his book. Reading it felt more like cramming for an exam I would rather not take.
Primo Levi, The Search for Roots. Scotland on Sunday, June 24, 2001. Review by Andrew Crumey.
THE "two cultures" debate is very much a British affair. The notion that scientists are as well equipped as anybody else to write imaginatively, or that artists can engage seriously with science, certainly did not seem strange to Aristotle, nor to the Roman author Lucretius, whose long poem On The Nature of the Universe - an ancient exposition of atomism - is included in Primo Levi's personal anthology of "essential reading".
Levi, an industrial chemist, was a Jew from Turin who was deported to Auschwitz and survived to write of his experience in If This is a Man. Other books followed, most famously The Periodic Table in 1975; a book combining fiction and memoir, taking the atomic elements as its guiding thread. Five years later, Levi was asked to compile this anthology, whose translation by Peter Forbes at last brings all of Levi's books into English, 14 years after the author's death.
Naturally, the selections embrace both science and literature. Charles Darwin and Arthur C Clarke mingle with Melville and TS Eliot. In a preface, Levi explains that in his family, "reading was an innocent and traditional vice." Levi's father had jackets specially made with pockets large enough to hold the three books he was always simultaneously reading.
Levi's own reading brought him, at 16, to William Bragg's account of modern atomic theory, Concerning the Nature of Things. "I was captivated by the clear and simple things that it said," Levi writes, "and I decided I would become a chemist." Comparing the Bragg extract with that from Lucretius, one is struck by the clarity of both, as reasons are sought for the solidity of objects, the transparency of a piece of horn, or the viscosity of oil. The same purity of thought pervades Levi's own writing; this was to be the legacy of his scientific training.
Some of the literary selections are unsurprising: the Biblical story of Job, Joseph Conrad's Youth, and Thomas Mann's The Tales of Jacob fit easily with our image of Levi as a writer of density, seriousness, and slowish tempo. A horrific story of the Russo-Polish war by the Jewish writer Isaac Babel - which leads Levi to wonder "to what degree is it legitimate to exploit violence in literature?" - deals with themes that clearly were close to Levi's heart, even if the treatment is not of a kind that Levi sought to emulate.
More surprising is the presence here of Rabelais. Milan Kundera has described Gargantua and Pantagruel as his favourite among all books; it is easy to see what Kundera admires in its wit, humour and epic silliness. Yet Levi was similarly drawn, "faithful for forty years without in the least resembling him or knowing why." Rabelais' spirit lingers in a delightful extract from the 19th-century Ukrainian novelist Sholem Aleichem, whom Levi describes as "Jewish in his talmudic mania for quotation." Another common thread, Levi observes, is a preference for stories of "the world turned upside down"; the weak triumphing over the slow-witted strong. Given Levi's personal history, one can see the theme's appeal.
Some of his chosen writers are obscure, such as Joseph-Henri Rosny, who in 1925 coined the word "astronaut." Roger Vercel's Tug-Boat was the first book Levi read after his release from Auschwitz, where, he says with characteristic understatement, "besides the hunger for food, I suffered a hunger for printed matter."
Levi's commentaries are the book's most valuable feature. Novel extracts are as satisfying as a nibble from someone else's plate; the poets Levi includes are mostly Italian, and with the exception of Giuseppe Belli's pungent sonnets, lose a bit too much in translation. Most revealing of all are the omissions. Borges is debarred by Levi in the preface (though Swift and Marco Polo are in); nor was Levi a fan of Proust. However he includes Paul Celan's Death Fugue, a Holocaust poem he wore "like a graft." The style is far from Levi's, but the authors' fates were sadly similar; for like Celan, Levi ultimately took his own life. Among all the writers here, Levi himself is one of the most essential; an enduring reminder of the intellectual, spiritual and human values that the dictatorships of the 20th century tried and thankfully failed to extinguish.
Edward J Larson, Evolution's Workshop. Scotland on Sunday, July 1, 2001. Review by Andrew Crumey.
HERMAN Melville wrote in 1854 about his visit a decade earlier to a place once known in Spanish as the Encantadas, or Enchanted Isles. Contrary to the enticing name, what Melville found in the remote South American archipelago was a harsh volcanic terrain, little fresh water and hardly any land mammals; a group of islands so poor and barren that "ruin itself can work little more upon them".
We know these islands as the Galapagos, and in Evolution's Workshop - an account of the islands' role in history, from scientific curiosity to eco-tourist playground - Edward J Larson tells us that while Melville's negative account reflected what was then still the prevailing view, that the Galapagos were hardly worth a stop-over, another visitor of the same era found a very different place.
Charles Darwin spent five weeks on the Galapagos in 1835. He was the only civilian passenger aboard the Beagle, admitted at the express wish of the ship's neurotic captain. The previous commander had gone mad and shot himself in Tierra del Fuego; his successor Robert Fitzroy feared a similar fate, and begged the admiralty to provide him with a gentleman companion to stave off melancholy. Darwin, a promising young naturalist picked from a quick trawl through Cambridge's old boy network, was not first choice, but proved a good one from Fitzroy's point of view, since it was only years after their voyage together that the poor captain succumbed to his demons and slit his own throat, just as his unfortunate uncle Lord Castlereagh had once done. For posterity, of course, the inclusion of Darwin in the voyage was fortunate because it was on the Galapagos Islands that Darwin began pondering the natural puzzles that would eventually lead him, more than 20 years later, to his theory of evolution.
Darwin quickly wrote about his visit in a book whose unpromising title, Journal of Researches, did not prevent it becoming highly popular; "something of a classic in its genre" according to Larson. The book was in the shipboard library of the vessel that took Herman Melville home to the United States after his apprenticeship aboard the whaling ship Acushnet. Melville called that ship his Harvard and his Yale; his experiences led to the early adventure novels that made him famous, and to the epic Moby-Dick which marked such a radical departure in style that it sank his literary career. Melville visited the Galapagos just six years after Darwin, but their responses were worlds apart. Melville's anthropocentric view of nature - a feature that makes Moby-Dick so arresting - was quite at odds with the revolution in human thought that Darwin was about to inaugurate. While Darwin helped forge the future, Melville was a man rooted, like his magnificent literary style, in the past.
The islands were first discovered in 1535 by a party of Spaniards sent to investigate conquistador atrocities in Peru, who were blown wildly off course and carried 500 miles from the mainland to a place no human had ever seen. Lost and thirsty, the sailors were intrigued by the iguanas and giant tortoises (galapagos in Spanish), but despaired at the lack of drinking water, and were understandably keen to move on. Subsequent expeditions confirmed that the islands were effectively uninhabitable, with no soil in which crops could grow, and with poor vegetation dominated by a thing called the dildo-tree (a kind of cactus), which was only good for burning. For would-be colonists, the islands offered nothing, except for a lot of odd-looking animals.
Modern visitors are delighted by the tameness of these creatures, and by birds that will readily land on an outstretched hand, but for seafarers in search of useful resources, the wildlife were merely "stupid", and hence all the more despised, for letting themselves get caught so easily. Even Darwin spoke at first in such pejorative terms, though he realised the animals' behaviour was due to a lack of predators.
What most struck Darwin was the sheer variety of species. The Lamarckian theory of evolution - that species adapt in response to their environment and pass on these changes to their offspring - could not explain why a few small islands could host so many different kinds of tortoise or finch. Yet if, on the other hand, God had filled the world with species that could never change, why should the remote Galapagos be granted such a rich stock? And how did all the animals ever get there in the first place? Birds could fly, iguanas could swim (Darwin threw one in the water to check its ability); but tortoises did not get very far, alas, before drowning.
It seems that drifting "sea mats" of vegetation may have carried some of the first plants and animals to the Galapagos, but the abundance of species was a mystery that Darwin was left to ponder on the long journey home. It was a puzzle that would take him years to unravel, and although his Galapagos observations played a crucial role in forming his theory, Darwin came to realise that island populations are an extreme and very atypical example of what natural selection can achieve. When he published On The Origin of Species in 1859, he mentioned the islands only six times, devoting far more pages to pigeon breeding than to his famous finches.
Nevertheless, Darwin used the Galapagos wildlife to highlight crucial predictions of his theory, so that the islands came to be seen as a testing ground for evolution. Soon, scientific expeditions were being sent with the explicit aim of verifying or challenging Darwin's assertions.
Prominent among the detractors was Harvard's Louis Agassiz, a dominant figure in 19th-century American science whom Larson calls a "gifted huckster." Agassiz's rival theory of "multiple creations" failed to catch on, but what naturalists on all sides of the debate shared was a chilling passion for gathering specimens. A single expedition by the California Academy of Sciences collected well over 10,000 bird skins and eggs. Seeing that species were vanishing, partly because of introduced rats and goats, collectors raced to "save" the last specimens for science.
In the early 20th century, the islands attracted a growing number of settlers, including a free-living bunch of nudists whose antics ended in "a sensational series of mysterious deaths." I would have liked to have known more; but Larson prefers to stick to the serious stuff of science, history and geography. This is a pity, since the general reader may be left feeling about this book much the same way that the first visitors to the Galapagos felt; namely that it is a little too dry. While much space is given to the valuable efforts of modern conservation groups such as the World Wildlife Fund, Darwin's starring role seems all too brief. Even so, for anyone lucky enough to be planning a trip to the Galapagos, or for anyone who wants to read about what's there before another oil spill does any more damage, Larson's detailed, thoroughly researched account is certainly the ideal book.
J Richard Gott, Time Travel in Einstein's Universe. Scotland on Sunday, August 12, 2001. Review by Andrew Crumey.
A MAN invents a time machine and goes back to 1904. There he meets Albert Einstein and explains to him the theory of relativity. Einstein publishes it and becomes famous. Very neat; but where did the theory first come from? J Richard Gott is untroubled by such logical loops; in fact he revels in them. A distinguished astrophysicist at Princeton University, he has been publishing research papers on time travel since the Seventies, and in one of this year's most outstanding science books, he suggests that a loop in time may even explain the existence of the universe itself.
Such dizzying ideas have of course long attracted the authors of science fiction stories. Gott himself has appeared in one; his work was used to support the plot of Gregory Benford's 1980 novel Timescape. Yet Gott points out that science fiction has also, on many occasions, provided real-life research with new stimulus.
While writing the 1985 novel Contact, astronomer Carl Sagan wanted his heroine to fall into a black hole on Earth and emerge in another star system. Sagan sought advice from colleague Kip Thorne, who began investigating hypothetical "wormholes" in space, and found that if they existed, they could send travellers across time as well. More recently, Miguel Alcubierre has shown how Star Trek's "warp drive" could theoretically be put into effect, without violating any known laws of physics.
Talk of travelling to the past, though, and immediately you confront the familiar "grandmother paradox." What happens if you alter the past in such a way that your own present could never happen? Gott outlines two possible answers. One is that if you tried to shoot your own grandmother when she was still a child, you would invariably find yourself thwarted. The Arnold Schwarzenegger movie Terminator provides a sombre illustration of this fatalistic view, in which our present is determined not only by our past, but by our pre-ordained future as well.
Another possibility is the "many worlds" theory, in which the past we visit would in fact be only one among infinitely many, allowing us the chance to change it. This is what happens in Terminator II; though if the only past we can recapture is an alternative one, you might wonder why it would be worth changing. For anyone seeking to go back to save a loved one, Gott says, "there is already a parallel universe in which your loved one is okay now. That's because all the possible universes exist. Unfortunately, you are just in the wrong one".
Gott prefers the fatalism exemplified by the original Terminator, whose universe obeys what Kip Thorne and Igor Novikov call "the principle of self-consistency." We need not fear being visited by a real-life Terminator, however, since in Gott's scenario, no one can ever go back before the creation of their own time machine. Even so, another disturbing possibility remains. The moment anyone builds a time machine, they are likely to see their older self emerging from it. This weird situation is called a "closed timelike loop", and although many physicists detest it, Einstein's general theory of relativity allows it.
This was first discovered more than 50 years ago, when Kurt Gödel - who, like Einstein and Gott, worked in Princeton - showed that if the whole universe were rotating, light beams would be sent on curved arcs, like pebbles thrown from the centre of a roundabout. This would enable clever space travellers to overtake them; and beating light beams is the key to time travel. Astronomers are now convinced that the universe is not rotating, but other solutions for Einstein's equations have been found, which seem to permit journeys to the past.
Thorne's wormholes are one example, but these employ "exotic matter", which, Gott explains, is "in lay terms, stuff that weighs less than nothing." This makes wormholes an improbable mode of transport, and Stephen Hawking has suggested that quantum effects would wreck any chances of them ever working in practice. Hawking even proposed a "chronology protection conjecture", saying that the laws of physics must forbid travel to the past. If relativity seems to allow it, then the fault lies with Einstein's theory, which does not include the effects of quantum mechanics.
Gott's remarkable discovery, made four years ago with his student Li-Xin Li, is a time-travel solution that still works even when quantum effects are taken into consideration. Gott first looked at "cosmic strings", hypothetical filaments of energy which may have been left over from the Big Bang. When two strings pass each other at speed, they form a natural time machine. An astronaut travelling round the strings as they pass close together would meet herself on her return.
Gott and Li then investigated a model universe which Gott calls "Groundhog Day", after the film in which Bill Murray relives the same day again and again. If you lived for 80 years in this world, says Gott, "as you aged… you would encounter 29,219 other copies of yourself, ranging from babies to senior citizens." A world almost as bizarre was in fact anticipated in Robert Heinlein's 1959 story All You Zombies, which Gott calls "one of the most remarkable time-travel stories ever written." There, the protagonist's many copies take different guises, including those of his own parents, his lover and his child. The riddle, for this multiple hero, is to explain where everybody else came from.
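The 29,219 figure is presumably just the number of days in an 80-year life, counted to the nearest whole day:

\[
80 \times 365.2425 \approx 29{,}219,
\]

one copy of yourself for each day relived (the exact count depends on where the leap days fall).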
The universe that meets itself is Gott's tour de force, presented in a 1998 paper with Li called 'Can the Universe Create Itself?'. Their aim was to remove the riddle of where the Big Bang came from, using the theory of "chaotic inflation." As explained by its inventor Andrei Linde, this is the idea that the universe is analogous to a tree, endlessly sprouting branches and twigs which are each "baby universes." Gott's twist is to suppose that one of those branches loops back to meet - and form - the tree's trunk. In a breathtaking extension of Heinlein's tale, "general relativity may allow the universe to be its own mother".
Should we therefore expect our universe one day to return to its past, in a future Big Bang? Gott thinks otherwise; the loop in time occupied only the first instant of creation. In other words, if you could run the cosmic clock back far enough, you would reach a stage when the clock hits a primordial Groundhog Day (actually only a split second), whose avenue of escape sprouted the rest of history.
Gott provides a diagram of this universe that a colleague likened to a new musical instrument. One, Gott adds, that plays itself. As another analogy, Gott offers Escher's famous print of two hands drawing each other. "As a religious person," says Gott, "I would not pretend that a self-creating universe is not a troubling notion - but perhaps we should find the universe troubling." Gott's considerable achievement is to make such mind-bending ideas accessible to the general reader, casting a spell on the imagination that lingers long after the pages are closed.
Stephen Hawking, The Universe In A Nutshell. Scotland on Sunday, November 25, 2001. Review by Andrew Crumey.
THIS book is touted as a simplified version of A Brief History of Time that an intelligent child could understand. It will certainly look good on a coffee-table; but any kid who understands it will be the next Einstein.
Hawking's fame will guarantee a huge audience that will be just as baffled now as last time. But it's still worth the effort - and moreover, we are treated here to the new 'theory of everything', M-Theory, that has blossomed since A Brief History of Time was first published.
A part of the earlier book which stumped even well-informed readers was Hawking's use of 'imaginary time' as a way of removing the philosophically awkward Big Bang. His 'no-boundary condition' regards the universe as a sphere, with the Big Bang at its pole. The Universe in a Nutshell is only slightly more illuminating about what 'imaginary time' really means to Hawking - at issue is whether he considers it something we could actually experience, or something that would require a special, artificial kind of clock to measure. I'm still not sure, but I think he means the latter - leaving us still puzzling over what came 'before' the Big Bang.
His even briefer history of time achieves its aim by cramming relativity, quantum theory, strings and 'branes' into an even smaller space than before, filling much of the remainder with glossy pictures and caption boxes that give run-downs on pulsars, dark matter and the like. The result is a book that risks leaving regular readers of popular science wishing for rather more detail.
Human interest is limited to a potted biography of Einstein, and one or two personal anecdotes. Hawking's impish humour comes through in a lewd snicker about the "black holes have no hair" theorem, and in the comment that Newton formerly occupied his professorial chair, "though in those days it was not motorised." He mentions that one of his breakthrough ideas came one night as he was getting into bed. I would have liked more touches such as this.
The drama of the book, though, lies in its ever-more bizarre cosmic visions, ultimately so exotic and esoteric that all that seems required of us, as readers, is a slack-jawed sense of uncomprehending wonder. Most people will come away with the notion that the universe is a bubble or balloon - an old metaphor whose continued re-use in different contexts leaves it blurred beyond usefulness. Perhaps the bubble is just one in a boiling cauldron; perhaps our universe was born when two collided. But when Hawking says the history of the universe is 'pear shaped', the accompanying colour diagram shows he means this quite literally.
As he says several times, his own philosophy is 'positivist'. As long as the equations give the right answer, he says, we needn't bother about what they really 'mean'. Perhaps the vacuum is filled with 'virtual' particles appearing from nowhere and vanishing again; or perhaps these are 'really' tiny loops in time. Either view is right, if the numbers add up. What counts is the unseen equation, which only the high priests of the subject can understand.
It may be a valid philosophy. Or else perhaps science has now delved so deep, so far beyond experience, that even its most brilliant practitioners are no longer entirely sure what it all means.
Mark Honigsbaum, The Fever Trail. Scotland on Sunday, December 16, 2001. Review by Andrew Crumey.
IT IS the third biggest killer in the world, and in sub-Saharan Africa it will kill 3,000 people today alone, most of them children under five. Only tuberculosis and dysentery can top the lethal effects of malaria.
It was not until the late 19th century that the disease's link with mosquitoes was scientifically proved, earning a Nobel Prize in 1902 for Ronald Ross, a reluctant medic whose first love, according to journalist Mark Honigsbaum in this informative book, was poetry. Yet centuries earlier, the insect's devilish nature was fully appreciated. "The Babylonians portrayed their pestilence-causing god Nergal as a two-winged fly, while the Canaanites called him Beelzebub, 'prince of flies'."
Folk medicine often seeks cures near the source of the illness to be treated; a wisdom still lingering in the use of dock leaves for nettle stings. Yet the cure for malaria - quinine - came from a part of the world that was completely free of the disease until the arrival of Europeans. The story of how the wonder-drug came to be discovered is an enduring South American legend.
The Spanish Countess of Chinchon fell ill in Lima in 1638, and was treated with a native remedy known to Jesuit missionaries: quinquina or "bark of barks." The countess recovered, and the cure became instantly famous throughout Europe: "It was as if, overnight, someone had discovered a cure for cancer."
This hardly seems an understatement when one considers how widespread malaria was in those days. In Britain it was a seasonal hazard, with outbreaks occurring every summer, occasionally as far north as Inverness. Oliver Cromwell caught it in his youth, and had recurring fever attacks throughout his life. These were attributed to the "bad air" from which the word malaria comes; though the more usual name for the disease was "ague", with the resulting swollen spleen being unappetisingly called "ague cake." The disease retreated thanks largely to the introduction of the turnip, which produced healthier herds that no longer needed to be wintered near humans. The mosquitoes followed the animals to their stables, and farm workers could sleep in safety.
Global warming could change things, though. Chief medical officer Liam Donaldson warned in February that a rise in average temperatures of a few degrees might make our native mosquitoes change their nesting habits. "All it would take is for one or two feverish backpackers returning from Asia to be bitten by a mosquito and malaria could begin to take hold again."
Though they did not know just how dangerous mosquitoes can be, explorers of earlier times were in no doubt that they were insects to be avoided. The author reports that while in Venezuela, one of the beasts "had no trouble biting through the underside of my canvas hammock." Leather and Gore-Tex are no obstacle to a hungry female Anopheles darlingi (the harmless males content themselves with nectar). It is this no doubt searingly painful bite that injects the malaria parasite, ingested by the insect during a meal on a sufferer elsewhere, enabling it to continue a life cycle that sees it feed on the human liver and turn blood cells into poisoned, iron-stained granules.
For the victim, a series of fevers ensues, variously hot and cold, wet and dry. Quinine was the only effective treatment, and this came only from a few species of precious South American trees - a genus named Cinchona by Linnaeus, after the first European to be cured by it. Linnaeus's spelling mistake in the countess's name has never been corrected, despite a later international committee convened to consider the matter.
As Spain's hold on the South American continent crumbled, the newly born nations fully understood the value of their national asset. And Europeans realised that if their own empire-building was to continue, they would need a readier supply of their own.
The story of how Cinchona came to be smuggled out of Bolivia and Peru forms the main theme of Honigsbaum's book. Among the key players were a botanist aptly named Richard Spruce; an explorer called Clements Markham who later, as head of the Royal Geographical Society, sent Captain Scott on his fateful trip to the Antarctic; and an entrepreneur named Charles Ledger, whose failure to successfully raise alpaca in Australia made him turn to Cinchona instead.
Of the three, Spruce is the most engaging. A lifelong hypochondriac, he managed to catch malaria after losing his quinine supply, was then treated by a native nurse who did her best to kill him, and was subsequently plotted against by his porters, whose murderous plan he fortunately overheard. Somehow, this accident-prone pioneer succeeded in getting out of South America alive, complete with precious seeds, and lived to a ripe old age. Like Markham's, his motives were largely philanthropic, making their theft of a national asset more palatable. Ledger, on the other hand, is less appealing. A South American Indian who bravely helped him secure the best seeds paid for his loyalty with his own life.
The adventurers' stories - the bulk of the book - make a fine tale, but one told here in a little too much detail. The last chapters race us into the present day, with the new synthetic drugs that have relegated quinine to our gin and tonics, but which have also encouraged the wily parasite to evolve resistant forms. Honigsbaum meets the man tipped to find a vaccine - a Colombian who is his country's best-known personality, after Gabriel Garcia Marquez. In a country riddled with violence, Manuel Patarroyo need never fear assassination. Everywhere he goes, children seek his autograph and mothers ask him to bless their babies. He was kidnapped once by Marxists, who released him five hours later, begging him to remember to give them some of the vaccine whenever he finds it.
While the world waits for that crucial breakthrough, many lives could be easily saved. Mosquito nets are the best solution. But at 2.50 pounds apiece, most Third World parents simply cannot afford one for their sleeping children.
Oliver Sacks, Uncle Tungsten. Scotland on Sunday, December 23, 2001. Review by Andrew Crumey.
ANYONE familiar with Oliver Sacks' work might guess he had an unusual upbringing. Just how unusual, though, is revealed in this extraordinary memoir.
The neurologist, who found international fame with his classic book of case studies The Man Who Mistook His Wife For A Hat and was maniacally portrayed by Robin Williams in the film Awakenings, was born in 1933 into a Jewish family in London that was dominated by scientists and richly stocked with colourful characters. An uncle who ran a light bulb factory helped instil in the young Oliver a fascination for chemistry; and Sacks's memoir is evenly split between reminiscences of his large family and accounts of the history of chemistry that he learned in his youth.
Sacks says relatively little about his parents, both of whom were GPs (the large family home was where they ran their practice), and who appear to have been emotionally distant, often physically absent figures. Sacks's father was a talented amateur musician who loved medicine for its human dimension; his mother, on the other hand, was drawn to the purely scientific side, to a quite chilling degree. "She would occasionally bring back malformed foetuses to the house - anencephalic… or spina bifida ones." She then insisted that Oliver dissect them. He was 11 years old at the time. Three years later she arranged for him to study anatomy at a local hospital, where he spent a month dissecting the leg of a dead girl the same age as himself.
Sacks only hints at the emotional effect this ordeal must have had on him, and the episode is buried inconspicuously towards the end of the book. Yet throughout, we get a sense of a boy who took refuge from emotional instability in the certainties of science. Evacuated during the war, he and his brother Michael were sent to a Midlands boarding school whose institutionalised sadism came straight out of Nicholas Nickleby. The headmaster once managed to break a cane on Oliver's backside and billed him for it. Solitary reading and experimenting gave Sacks an escape from this, but Michael later became, says Sacks, "psychotic." Sacks's reaction to this crisis was to set up a laboratory.
Sacks's account of his solitary induction into the mysteries and delights of chemistry is infused with wonder and genial humour, as he describes the escalating ambitions of his experiments. He and his much older brothers David and Marcus made "volcanoes" of ammonium dichromate and poured concentrated sulphuric acid onto sugar to make it sizzle. Sacks became fascinated with the colours of chemicals and with their smells. Trimethylamine, he discovered, is what makes rotting fish so obnoxious. The "worst smell in the world", though, was hydrogen selenide, "an indescribably horrible, disgusting smell that caused me to choke and tear, and made me think of putrefying radishes and cabbage".
The response of Sacks's parents to the clouds of toxic gas emanating from his room was to make the boy install a proper fume cupboard. Among his relatives there were several professional scientists who could help him out, but he also obtained chemicals and equipment from a London supplier that happily handed over to him things which nowadays could not be bought by any adult without a proper licence. He even got hold of hydrofluoric acid, a substance so corrosive it dissolves glass, which he kept in a gutta-percha bottle he dared not open.
Sacks explains how his imagination was sparked by these chemicals and by the history of their discovery and nomenclature. Clearly, he developed an extraordinarily detailed knowledge of laboratory techniques through his labours - whose obsessiveness was apparent to those around him. "He will go far, if he doesn't go too far," was one schoolteacher's assessment of him.
Between reminiscences of his own experiments, Sacks discusses the work of great chemists such as Lavoisier, Dalton and Mendeleyev. Some of this will be familiar to any reader with a basic grounding in chemistry, but Sacks's retelling is highly engaging and always imbued with the sense of warmth and empathy that is such a feature of all his writing - surprisingly so, it seems, given the strange childhood he describes.
An obvious point of comparison with Sacks's book is Primo Levi's The Periodic Table, which shows a similarly intimate familiarity with the hands-on practicalities of the chemical world and sets this against the broader perspective of personal experience. Levi, like Sacks, came from a Jewish background and the cabbalistic mysteries of alchemy appear to have exerted a fascination on both. Sacks' book does not rise to the same level of poetic vision, but at times is equally affecting, particularly as the understated emotional deprivations of the author's childhood become more and more apparent, alongside his extraordinarily precocious talent.
In fact, given that Sacks appears to have been able to master virtually the whole of chemistry by the age of 14, the greatest wonder is that he did not go on to become a Nobel Prize winner in the subject. One senses that a certain amount of know-how has been cast retrospectively on to the young Oliver's shoulders here, making him seem something of a wunderkind. Perhaps not everything in this memoir is entirely to be trusted. But taken with a little sodium chloride, it makes for intriguing reading.
Richard Stone, Mammoth. Scotland on Sunday, December 30, 2001. Review by Andrew Crumey.
AS familiar as Tyrannosaurus or the sabre-tooth cat, the mammoth is a stock figure of the prehistoric menagerie. Science journalist Richard Stone tells the history of the beast and describes the modern-day exploits of scientists intent on cloning one in order, perhaps, to stock a real-life "Pleistocene Park." Given such intriguing material, it is a pity the resulting book fails to bring the creature even metaphorically to life.
Among the illustrations, for instance, there is none that shows what a mammoth actually looked like. You may think - as I did - that you already knew, but Stone describes how the animal's trunk had a special grip on its end and how its small ears were more like a human's than an elephant's. The photograph of a stuffed mammoth badly reconstructed by Soviet scientists decades ago does little to enlighten us.
Stone shows a good professional knack for conveying technicalities in easily digestible terms, such as the task faced by would-be mammoth cloners. But the real story, it seems, is one he was only allowed to glimpse. An international team, sponsored by the Discovery Channel, went to the Russian Arctic in an attempt to retrieve an undecayed mammoth. The resulting TV programme drew a bigger audience than The Sopranos, but Stone was allowed only limited access to the expedition thanks to the TV channel's cloak of secrecy.
As a result, the most interesting aspect of this book is the light it sheds - perhaps unintentionally - on the way science is increasingly done these days, in an age of reduced state funding and perpetual calls for academics to seek corporate sponsorship. The expedition leader, Bernard Buigues, is a media-savvy Frenchman who tells journalists to ask the others about science: "You can ask me about women and love." His more sober assistant is a Dutch customs officer who once came across an ivory smuggler claiming his goods were made of mammoth tusk and hence perfectly legal, since you can't endanger a mammoth. The Dutchman recognised the tusk to be modern, and nicked the offender.
A proviso of the Discovery Channel, however, was that these two had to be joined by an American: Larry Agenbroad, a veteran whose business card says: "Have Mammoth??? Will travel!!!!" All are undoubtedly serious scientists, but the showbiz tactics are worrying. Will we one day be relying on TV deals to bring us the next-generation space telescope or a cure for cancer?
Buigues' sense of good television further emerges when we learn of his inviting Agenbroad's wife to come too. "She was the only petunia in our onion patch out there," Agenbroad says proudly, before Stone embarks on a stomach-churning anecdote about their courtship years earlier.
Also on the trip is Edinburgh-born Canadian Ross MacPhee, looking for evidence of the killer disease he believes caused the mammoths' demise, and a dour Russian somehow reminiscent of the traitor in Ice Station Zebra. When the mammoth is eventually extracted, he's sure it will turn out to be crap.
Stone's part in all this is, to a large extent, an ordeal involving perpetually unreliable Russian flights and trips to God-forsaken Arctic mining towns that sound like a permafrosted Cumbernauld. But at least he witnesses the grand finale when the scientists melt the ice for the cameras, wearing grey body suits that make them look - as MacPhee sadly admits - like the men from the Pentium commercial.
This book tells us everything we could possibly want to know about mammoths, but something is evidently missing.
Presumably it's all those great shots that kept the Discovery Channel's viewers so utterly enthralled.