Esther Leslie, Synthetic Worlds. Scotland on Sunday, January 8, 2006. Review by Andrew Crumey.

IN 1856, an 18-year-old student named William Perkin was experimenting with coal tar when he produced a purple substance. It was mauve, the world's first artificial dye, and it led to a huge industrial effort to create new synthetic colours. Esther Leslie's book, subtitled "Nature, art and the chemical industry", aims to examine how science and art shaped people's ideas of nature over the following century and a half.

Leslie's hero is not Perkin, however - who is hardly mentioned in this book - but Friedlieb Ferdinand Runge. He separated coloured chemical mixtures by dripping them on to filter paper, thus discovering the technique of chromatography, and published a beautifully illustrated book showing the complex patterns which could emerge. The reason for this unusual shift of emphasis is that Leslie's real concern is not so much with chemistry or the history of technology as with a broader thesis whose framework, she says, is that of the literary theorists Walter Benjamin and Theodor Adorno. So any industrial chemists thinking of buying this book should be warned that it is really intended for the cultural studies readership.

Leslie aspires to Benjamin's all-embracing grasp, but the weight of theory lies so heavily on this book that much of it feels closer to the less readable Adorno. And since Benjamin was deeply concerned with Romantic literature, this is what Leslie takes as her starting point. Runge was a friend of Goethe, whose theory of colour was part of German Romantic "natural philosophy", seeking to unify objective phenomena with subjective impressions. By seeing Runge as part of German Romanticism, Leslie positions the whole synthetic chemical industry there too.

There is certainly some validity in this. Germany was the heart of the industrial effort, and the philosophy of Goethe, Hegel and Schelling was deeply embedded in the minds of its researchers. Leslie quotes fascinating passages in which Runge saw his coloured pictures as indicative of the self-organising principle of life itself. She unearths other writers of equal interest to support her thesis.

As we move forward in time, though, there are annoying digressions, such as a chapter on visions of space, leading to an analysis of 'Twinkle, Twinkle, Little Star' that is almost a parody of the way academics can extract meaning out of the most insubstantial material.

More convincing is Leslie's account of the Nazi years, when chemical synthesis became a way of making cheap substitutes for unobtainable goods. The earlier artificial dyes had been a triumph of science, widening the range of colours in people's lives, but as German factories produced synthetic rubber, synthetic coffee and synthetic paper, the word 'ersatz' began to be used pejoratively. People yearned for the real thing, while outside Germany, the nation's ingenuity was a cause of fear. A British wartime propaganda book, Germany: Land Of Substitutes, claimed Nazis were making clothes from offal and beefsteak from coal. Leslie notes that while hair really was taken from concentration camp victims and recycled, the story of bodies being turned into soap has never been verified, and seems to reflect the growing sense of chemical synthesis as something monstrous.

The changes in attitude which she charts are interesting and important. It is worth wading through all the digressions to find them.


The Dalai Lama, The Universe In A Single Atom. Scotland on Sunday, February 5, 2006. Review by Andrew Crumey.

THE Dalai Lama's aim in this fascinating book is to highlight parallels between traditional Buddhist teachings and the findings of western science. Many books have been written on the subject, but what singles this one out is not only the evident authority with which the Dalai Lama can discuss classic Buddhist texts, but also the real understanding he has of science, gleaned through many meetings and conferences, and from extensive reading. While quick to point out that he is no expert, the author nevertheless shows a serious engagement with scientific ideas and a great respect for the people who pursue them.

This interest goes back, it seems, to childhood. He describes how, at the age of six, he found himself enthroned in the Tibetan capital, Lhasa. While his education was intense, it was nevertheless exclusively in philosophy. "I remember most vividly my first lesson on epistemology as a child, when I had to memorise the dictum, 'The definition of the mental is that which is luminous and knowing'." He would have been eight or nine at the time.

But while his formal studies were theoretical, his winter residence, the huge Potala Palace, offered more practical instruction. Its numerous rooms (too many for him to explore in their entirety) were filled with items collected by former Dalai Lamas: they included globes, clocks and a telescope, which he used to observe the stars in the pristine skies above his mountain palace. There were also two film projectors and three cars, enabling the young Dalai Lama to acquire a taste for mechanical tinkering - and for Charlie Chaplin.

When he made his first visit to the West in 1973 and gave a talk at Cambridge University's faculty of divinity, he was asked if there was anything he particularly wanted to do while he was there. He chose to see the huge radio telescope in the department of astronomy.

One gets the feeling that if the Dalai Lama had been born in the West then he might well have become an engineer or scientist. Instead he is able to offer a truly insightful comparison between the intellectual traditions of East and West which will surprise many readers. While cultural relativists like to hold up creation myths from the world's religions as "alternative theories" to the Big Bang, the Dalai Lama insists empirical evidence is always paramount.

As a child he was taught Abhidharma cosmology, which pictures a flat earth floating on air. He frankly admits that this picture cannot be taken literally and that "Buddhism must abandon many aspects" of it.

Equally, though, western science can learn much from the experience of meditators, some of whom are able to control phenomena such as body temperature. He explains how he has been instrumental in persuading Tibetan meditators to allow researchers to study their brain patterns in an attempt to understand how such things are possible.

Buddhism, the Dalai Lama emphasises, is not theistic, and does not see the universe as consciously designed by an intelligent creator, but rather as an interconnected whole governed by discoverable laws. Many scientists would agree; but Buddhism further insists that there is a final purpose to the whole of creation. True or not, it is an inspiring and profoundly beautiful vision.


James Lovelock, The Revenge Of Gaia. Scotland on Sunday, February 12, 2006. Review by Andrew Crumey.

IN 1969, scientist James Lovelock was walking with his friend and neighbour William Golding. The talk was of life on other planets, and Lovelock explained his idea that the Earth could be considered a single living system, regulating itself so as to make its continued existence possible. Golding was impressed by Lovelock's theory and immediately proposed a name for it: Gaia.

Lovelock's holistic, animistic ideas have always been treated with scepticism by the scientific establishment while being eagerly endorsed by New Age aficionados, but many of his views on the interconnectedness of our planet's biological, meteorological and chemical systems are now an accepted part of environmental science. Yet this book about the threat of global warming, and our possible responses to it, will not please green activists. The only way out, says Lovelock, is to go nuclear.

Experts agree that Earth is warming, and that carbon emissions are at the very least exacerbating the problem, if not causing it outright. The question is whether the trend can be reversed without bringing every factory, aeroplane and car to a halt. Lovelock's prognosis is gloomy, because his self-regulating planet does not take kindly to interference. Instead the Earth will, he predicts, arrive at a new balance point which will suit Gaia but not us: mother nature's revenge.

The kind of runaway effect he foresees is illustrated by snow. As Earth warms, glaciers melt. But snow is very good at reflecting heat, and when it disappears, the newly exposed ground warms even more quickly, causing yet more melting. Something similar happens in the oceans, where cooler water, according to Lovelock, is richer in organisms which ultimately help keep the temperature down.
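
To make the feedback loop concrete, here is a deliberately crude toy model of my own devising (not anything from Lovelock's book): each year, warming removes some snow, and the newly bare ground absorbs extra heat, which accelerates the melting.

```python
# A crude toy of the snow feedback described above (illustrative only,
# not a model from Lovelock's book): warming melts snow, bare ground
# absorbs more sunlight, and the extra warmth melts still more snow.
temperature = 0.5   # warming so far, in arbitrary units
snow_cover = 1.0    # fraction of ground still covered by snow

for year in range(1, 6):
    snow_cover = max(0.0, snow_cover - 0.15 * temperature)  # warmth melts snow
    temperature += 0.4 * (1.0 - snow_cover)                  # bare ground traps heat
    print(f"year {year}: snow cover {snow_cover:.2f}, warming {temperature:.2f}")
# Each year the loss of snow is larger than the year before - a runaway effect.
```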

The standard green solution is to create an economy based on wind or wave power for generating electricity. Lovelock rejects this as unfeasible, saying we would need to cover the country in windmills, which would themselves affect the weather.

Hydrogen-powered cars are similarly dismissed, but hydrogen gets the thumbs-up as fuel for nuclear fusion power stations. Lovelock's optimism about fusion technology is undermined by the fact that decades of work have still failed to come up with a workable system that can produce significantly more energy than it consumes. That leaves traditional nuclear power stations as the one option available today which could provide the nation's energy requirements while keeping a lid on carbon emissions.

Recent pronouncements by Tony Blair and George Bush have shown that this is fast becoming the favoured option, but Lovelock's advocacy will not be particularly welcome. Dismissing the risk of radiation leaks, Lovelock says they are actually a good thing. Wildlife abounds at the deserted Chernobyl site, he says, and perhaps the best way to preserve rainforests would be to dump radioactive waste in them, so that developers would keep away.

Such barminess aside, Lovelock more seriously fails to take into account the decommissioning costs of nuclear plants. But while he may be wrong about the affordability of the nuclear option, he could be right in saying it is the only way to go. The question is whether we are prepared to see countries like Iran take the same path.


Matthew Stewart, The Courtier And The Heretic. Scotland on Sunday, February 19, 2006. Review by Andrew Crumey.

FEW people outside of university philosophy departments read the works of Leibniz or Spinoza nowadays. Yet both men changed history, and Matthew Stewart's enjoyable, if at times contentious, book does a fine job of making their ideas accessible to the general reader.

He tells the contrasting stories of their lives, and his book hinges on the fateful meeting between the two men which took place at Spinoza's home in The Hague in 1676, when Leibniz was 30 and still at the beginning of his philosophical career, while the 44-year-old Spinoza had only months left to live.

Of the two, Leibniz was the more colourful. An inveterate networker, he spent his life trying to get the sort of court patronage that would give him financial security and social status. Eventually he landed a senior position at the court of Hannover - where, decades later, he would be joined by the young George Frideric Handel - but it was nowhere near as illustrious as it might sound. Hannover was a small town where cattle wandered down the unpaved streets, and there was not so much as a single coffee shop in which to pass the hours in intelligent conversation. Leibniz hated the place, and for the rest of his life he did everything he could to escape.

In the end, Stewart notes, he was a little too successful - he was absent without leave when the Elector of Hannover became George I and moved his court to England. Leibniz came back to Hannover to find the palace all but deserted, everyone else having gone.

Spinoza was more reclusive and unworldly. He came from a family of Jewish merchants but it was clear from early on that intellectual pursuits were more to his liking. What brought about a crisis for Spinoza and his family, however, was the philosopher's conviction that scripture could not be read as historical fact. Organised religion, he declared, was a collection of myths, and Spinoza's refusal to recant led to his being excommunicated in the most fiery terms by Amsterdam's chief rabbi. Ostracised from the city's Jewish community, Spinoza lived off the generosity of friends and earned his keep as a lens grinder.

"Spinozism", says Stewart, became another name for atheism, and had Spinoza lived in a less tolerant country he would have been burned at the stake. But Spinoza was really a "pantheist", believing God and nature are one.

Leibniz, meanwhile, believed every atom in the universe to have a soul, the universe being a projection through them of God's will, like a cosmic hologram. God could choose to make things however he liked, but selected the "best of all possible worlds".

Stewart's big idea is that the whole of Leibniz's philosophy was really a reaction to Spinoza, and to the meeting the two men had in 1676. Unfortunately Stewart takes this to improbable lengths, asking us to believe that when Leibniz took issue with other people such as Descartes - or even Louis XIV - he was really targeting Spinoza, the secret bogeyman underpinning all his thought. That is going too far; but the book still offers a readable guide to two contrasting philosophers and their ideas.


Mario Livio, The Equation That Couldn't Be Solved. Scotland on Sunday, March 19, 2006. Review by Andrew Crumey.

MARIO Livio is a renowned astrophysicist who has headed the science division at the institute that operates the Hubble Space Telescope, and in this book he offers an interesting account of the science of symmetry.

For most of us, symmetry means a visual balance between two halves of an object, but for mathematicians this "bilateral symmetry" is only a special example of something more general which Livio calls "immunity to a possible change".

Swap the white and black squares on a chess board and you still end up with a chess board. Turn a left-handed glove inside out, look at it in a mirror, and you will see a left-handed glove. Words can have symmetries - palindromes are an obvious example - and so too can mathematical equations.
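
Livio's definition can be put in concrete terms. In the sketch below (my own illustration, not code from the book), an object counts as symmetric under a transformation when the transformation leaves it indistinguishable from the original - exactly the "immunity to a possible change" he describes.

```python
# Symmetry as "immunity to a possible change": an object is symmetric
# under a transformation if the transformation leaves it unchanged.
# (Illustrative sketch only - not taken from Livio's book.)

def is_symmetric(obj, transform):
    """True if the transformation maps the object onto itself."""
    return transform(obj) == obj

word = "rotator"                               # a palindrome
print(is_symmetric(word, lambda w: w[::-1]))   # True: reversal changes nothing

def swap_and_shift(b):
    """Swap black and white squares, then shift the pattern one square along."""
    return [[1 - b[r][(c + 1) % 8] for c in range(8)] for r in range(8)]

# A chessboard colouring is unchanged by swapping colours and shifting one square.
board = [[(row + col) % 2 for col in range(8)] for row in range(8)]
print(is_symmetric(board, swap_and_shift))     # True
```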

It is when Livio delves into these mathematical symmetries that his book becomes a little technical, but he also describes the lives of the brilliant pioneers who unravelled these ideas over the centuries. One was Gerolamo Cardano, a famous polymath who was lured from Italy to Scotland in 1552 by the promise of a huge fee if he could cure the Archbishop of St Andrews of breathing difficulties. Cardano diagnosed an allergic reaction, cured by a change of bedding, and he headed home a rich man. He later penned an astonishingly detailed and frank autobiography which, as well as trumpeting his achievements, also informs us about such things as his shoe size and impotence - for which he also found a cure.

Cardano's great mathematical discovery was to do with polynomials - equations in powers of x. Schoolchildren nowadays learn how to solve quadratic equations; Cardano found how to solve cubics, and later mathematicians cracked quartics. Next in line was quintics, involving fifth powers of x, but in the early 1800s a young Norwegian called Niels Abel showed that no general formula built from roots - no solution "in radicals" - can exist for them. Tragically, Abel died at just 26, a victim of disease brought on by poverty.
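
To make "solving" concrete (a sketch of my own, not from Livio's book): the quadratic formula below is exactly the kind of recipe in radicals that Cardano's cubic formula generalises, and that Abel proved cannot exist for the general quintic.

```python
import cmath

def solve_quadratic(a, b, c):
    """Roots of a*x**2 + b*x + c = 0 from the closed-form formula
    x = (-b +/- sqrt(b**2 - 4*a*c)) / (2*a). Cardano found a longer
    formula of the same kind for cubics; Abel proved that no such
    formula in radicals can exist for general quintics."""
    root = cmath.sqrt(b * b - 4 * a * c)
    return (-b + root) / (2 * a), (-b - root) / (2 * a)

print(solve_quadratic(1, -3, 2))   # x**2 - 3x + 2 = 0 has roots 2 and 1
```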

It was another tragic young genius, Evariste Galois, who showed how all of this could be understood as a kind of symmetry now called "group theory". Galois was a hot-headed young Frenchman who made no secret of his republican sympathies during the reign of the unpopular King Louis-Philippe I, but the duel that cost him his life at the age of 20 was more probably an affair of honour over a woman. The circumstances, despite Livio's detective work, are still clouded in mystery, but it appears that Galois and his opponent agreed to settle matters in a kind of Russian roulette, each taking a pistol, only one of which was loaded. Galois chose the wrong one.

Livio's exploration of symmetry goes beyond mathematics to take in art and music, and some of this will be familiar to readers acquainted with venerable classics such as Budden's Fascination Of Groups or Hofstadter's Gödel, Escher, Bach.

Most intriguing of all is his discussion of human evolution, and our species' preference for symmetrical faces. Evenly matched features always score highly in tests of attractiveness, and Livio further reports that the most symmetrical-looking people tend to be healthiest and happiest, with asymmetricals being more prone to illness and depression - presumably because everyone finds them unattractive.


Seth Lloyd, Programming The Universe. Scotland on Sunday, April 9, 2006. Review by Andrew Crumey.

OVER the past half century, the power of computers has doubled every year and a half. This exponential rise - known as Moore's law after the Intel co-founder Gordon Moore, who first spotted it in the 1960s - has been great news for the computer industry, because it means that every top-of-the-range PC they sell is guaranteed to be obsolete within 18 months.
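
A doubling every 18 months compounds quickly, as this back-of-envelope sketch (my own arithmetic, not Lloyd's) shows:

```python
# Compound doubling: performance grows by a factor of 2 ** (years / 1.5).
# A back-of-envelope illustration of the 18-month doubling quoted above.
for years in (3, 15, 30):
    factor = 2 ** (years / 1.5)
    print(f"after {years} years: about {factor:,.0f} times the power")
# after 3 years: about 4 times the power
# after 15 years: about 1,024 times the power
# after 30 years: about 1,048,576 times the power
```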

But how long can the trend continue? Computer chips can never get smaller than the atoms they are made from, and signals can travel no faster than light.

However, in the 1980s, Richard Feynman proposed a way to beat the limit: a "quantum computer". Feynman never built one, nor did he spell out exactly how it would work - but other people have been working on the answer for the last 20 years.

Seth Lloyd is one of the leading figures in the field. His excellent book explains the science in terms as simple as one could hope for, but also goes further, suggesting that a quantum computer already exists and we are living in it: it is the universe itself.

An ordinary computer is effectively a lot of switches which can be on or off: the ones and zeroes of digital information processing. You don't need to have a screen or keyboard, or even microchips: the essence is in the switches, and theorists have shown that the collision of billiard balls on a table could be considered as a kind of information processing just like the sort that goes on inside a PC.

Quantum computers introduce a new twist: the switches need not be either on or off, but can be both at once. This is a version of the "Schrödinger's cat" paradox, in which a moggy in a box is theoretically both alive and dead - there is no way to tell until the lid is opened. Lloyd's switches are not cats, but single atoms which he and his collaborators are able to link magnetically. Their multiple states enable calculations to be done far more quickly, breaking the speed limit of "classical" computers.
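
A minimal sketch of the difference (my own illustration, borrowing Lloyd's switch metaphor rather than any code of his): a classical register of n switches holds just one pattern of ons and offs at a time, whereas a quantum register carries an amplitude for every possible pattern simultaneously.

```python
import itertools

n = 3
classical_register = (1, 0, 1)     # n switches: exactly one pattern at a time

# Quantum register: an amplitude for every one of the 2**n patterns at once -
# the switches "both on and off" described above. Here, an equal superposition.
amplitude = 1 / (2 ** n) ** 0.5
quantum_state = {bits: amplitude for bits in itertools.product((0, 1), repeat=n)}

print(len(quantum_state))                                      # 8 patterns tracked at once
print(round(sum(a * a for a in quantum_state.values()), 6))    # probabilities sum to 1.0
```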

Everything around us is constructed of atoms hooked together, so to that extent we live in a quantum computer, and what it calculates is the future.

Lloyd is alert to the philosophical and spiritual implications: he movingly recounts how he came to terms with the death of a colleague whose "piece of the universal computation goes on".

He also describes a meeting with the great Argentine writer Jorge Luis Borges, whom he asked if his stories were influenced by modern physics and its multiple realities. Borges replied that he was gratified that physicists were starting to catch up with literary insights.

Even assuming that the tremendous pace of Moore's law holds out, quantum computers are still not expected to appear on our desktops for another 60 years: Lloyd ruefully reports that a major breakthrough came recently when a device was made that could work out that 15 equals five times three.


David Leavitt, The Man Who Knew Too Much. Scotland on Sunday, July 2, 2006. Review by Andrew Crumey.

MUCH has been written about Bletchley Park, the secret Second World War code-breaking centre where the German Enigma machine was deciphered, and about Alan Turing, the chief mathematician there who helped found modern computing theory.

This latest offering presents an account of Turing's tragic life - which ended in suicide when he was 41 - and his main ideas, and is distinguished by being the work of an acclaimed novelist, able to bring a certain stylistic flair to the telling.

He begins by comparing Turing to the character played by Alec Guinness in The Man In The White Suit: the modest, absent-minded inventor of a revolutionary idea, hounded by forces who see him as a threat.

Leavitt, well known as a gay writer, reads the film as a metaphor for homophobic persecution - and in Turing's case the persecution was real. Prosecuted for indecency after the war, Turing was put on a course of hormone treatment to 'cure' his homosexuality, and it drove him to despair. He took his own life by eating an apple he had laced with cyanide.

Leavitt is surely right to see sexuality as a central feature of Turing's life, but he strains the theme through constant references to EM Forster, a gay writer considerably older than Turing. The two were both in Cambridge at the same time - Turing as a shy student who rarely socialised, Forster as a fêted figure - and Forster's novel Maurice, with its theme of awakening homosexual identity in a stiflingly repressive society, serves as a recurring leitmotiv, illustrating the atmosphere of the time.

But Turing never met Forster - he mingled only with scientists and philosophers, most notably Ludwig Wittgenstein, and he never read Maurice (which was published posthumously in 1971). Far more relevant is the short story Turing wrote while undergoing psychoanalysis following his arrest, which Leavitt quotes only in tantalising snippets, describing encounters in parks in 1950s Britain - a world away from Forster's turn-of-the-century milieu.

Although we get a good sense of Turing's middle-class childhood and his lifelong devotion to his mother, the biographical focus slackens with advancing years, and the emphasis is placed more on his work. Turing showed that any mathematical calculation or logical deduction can be carried out by an idealised 'machine' - an abstraction made actual in today's PCs. Leavitt ably explains Turing's reasoning, but perhaps in rather more detail than most general readers would wish: the minutiae are rather dull unless you happen to be a computer scientist, and a quicker synopsis would have sufficed.
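
Turing's abstract machine is simple enough to sketch in a few lines of modern code. The following toy is my own illustration, not anything taken from Leavitt's book: a tape of symbols, a read/write head, and a table of rules saying what to write, where to move and which state to enter next.

```python
def run_turing_machine(tape, rules, state="start", pos=0, max_steps=1000):
    """A toy Turing machine: 'rules' maps (state, symbol) to
    (symbol_to_write, head_move, next_state); the state 'halt' stops it."""
    cells = dict(enumerate(tape))              # a sparse, unbounded tape
    while state != "halt" and max_steps > 0:
        symbol = cells.get(pos, " ")           # " " is the blank symbol
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += move
        max_steps -= 1
    return "".join(cells[i] for i in sorted(cells)).strip()

# A rule table that flips every bit of a binary string, then halts at the blank.
flip_bits = {("start", "0"): ("1", +1, "start"),
             ("start", "1"): ("0", +1, "start"),
             ("start", " "): (" ", 0, "halt")}

print(run_turing_machine("1011", flip_bits))   # prints "0100"
```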

If Leavitt never really brings us face to face with Turing then it is probably because the man himself was so reserved, with "more than a touch of Mr Spock" in him.

Only at the end, as Turing's career collapsed in ruins and he began writing strangely camp letters of self-mockery to his friends, do we see a complex personality unfold beneath the protective mask of unworldliness.

Most poignant of all is Leavitt's analysis of the fatal apple - almost certainly inspired by one of Turing's favourite films, Snow White And The Seven Dwarfs. In the fairy tale, Leavitt points out, the apple is not fatal: the sleeping victim is woken by a kiss. Turing's prince never came.


David Standish, Hollow Earth. Scotland on Sunday, September 3, 2006. Review by Andrew Crumey.

JULES Verne was not the first person to imagine journeying to the centre of the Earth. For centuries people believed our planet to be hollow, containing creatures and even whole civilisations hidden from surface dwellers. What began as a serious scientific theory descended through mystical speculation into the stuff of fantasy, so that Standish's fascinating account spans geology, literary history and new-age religion.

It began with Edmond Halley, famous for his comet, who attempted to explain variations in Earth's magnetic field by supposing our planet to contain a series of concentric spheres, perhaps with people living on them.

Saving them from a life of darkness was the task of the next great hollow-Earther, John Cleves Symmes, who toured 19th-century America explaining that our globe has giant holes at both poles which allow light inside. Symmes was ridiculed as a crank but spent his life unsuccessfully trying to get funds for a polar expedition that would prove his theory.

Things got a new twist from Cyrus Teed, who created a utopian community dedicated to his own religion, Koreshanity, claiming that not only is our Earth hollow but we live on the inside, the sun and sky being a clever illusion. It makes for the most entertaining section of Standish's book.

Another convert to Symmes' theory was Edgar Allan Poe, who may have read Symmes' self-publicising novel Symzonia describing a heroic mission down under. Poe seems to allude to the hollow Earth theory in his only novel, The Narrative Of Arthur Gordon Pym.

From there, Standish explains, the hollow Earth went truly global. Poe died poor and little appreciated in America, but he had an ardent fan in Charles Baudelaire, who set about translating the master into French, and the young Jules Verne was so smitten by Poe's stories that he imitated them - including writing a sequel to Poe's novel.

A substantial part of Standish's book is devoted to the many hollow Earth novels that appeared in the 19th century, most of them totally forgotten now except by scholars, and in many cases their synopses are as dull as the novels themselves. Highlights however include Dorothy and the Wizard in Oz, the fourth book in L Frank Baum's series, which took the characters below ground, and six hollow Earth novels by Edgar Rice Burroughs, including Tarzan at the Earth's Core. Modern pulp fiction pays occasional homage to the genre, a recent addition being Indiana Jones and the Hollow Earth.

Polar exploration finally killed the theory when no holes were found, though devotees still crowd the internet, and Standish has done an admirable job unpicking fact from fantasy by tracking down primary sources, enabling him to debunk many second-hand myths of hollow Earth lore. One that sadly eludes him, though, is the story that Hitler believed in Cyrus Teed's inside-out hollow Earth and ordered a scientific expedition to test it. The tale has been discussed by many fringe-science writers (notably Pauwels and Bergier) and it would be good to know exactly where it started, but Standish fails to follow it up. It is one small hole in an otherwise excellent book.


Joel Primack and Nancy Ellen Abrams, The View From The Centre Of The Universe. Scotland on Sunday, October 15, 2006. Review by Andrew Crumey.

THE astronomer Carl Sagan said "we are stardust", since the atoms in our bodies were forged in stellar explosions. Hence there is a joke among cosmologists that "romantics are made of stardust, but cynics are made of the nuclear waste of worn-out stars". Husband-and-wife team Joel Primack and Nancy Ellen Abrams are evidently romantics, given the number of times they mention stardust in their guide to modern cosmology and its spiritual ramifications, but their evangelising left me feeling like a hardened cynic.

Their book tries to do two things. First it gives an outline of modern scientific ideas about the formation, structure and fate of our universe - and since Primack is one of the world's finest theoretical physicists, this part is naturally very good.

The larger aim, however, is to consider our spiritual relationship with the universe, setting it in the context of ancient creation myths such as those of the Egyptians or the Bible. This is where the problems arise.

The authors define myths as "the stories that people of any culture, at any time including today, communally believe". Cultural relativists would say that science is itself a myth, but the authors do not. Science, they say, "is closing in on the class of myths that could actually be true". What we need, therefore, is a new myth; a story that will enable us to understand modern physics in metaphorical terms.

As an exercise in public education that is all very well, and pretty much what science popularisers have been doing for decades. Yet Primack and Abrams claim science can make life seem meaningless; we must find value by tapping into the "metaphor-ocean of the cognitive unconscious". Countless people from Epicurus to Richard Dawkins would disagree.

What the authors forget is that their metaphor-ocean is a product of history and tradition, hence the images they find are freighted with a cultural baggage to which they appear oblivious. They include "the cosmic density pyramid", the balance of dark matter to ordinary matter, which they depict using the Masonic pyramid shown on US dollar bills - a myth appealing to some but offensive to others.

More neutral is the "cosmic uroboros", a self-eating snake illustrating the relative sizes of objects in the universe, showing humans smack in the middle between smallest and largest. This centrality is a major theme of the book, and will be familiar to anyone who has seen the classic animated film Powers Of Ten, zooming from a person's hand down to the smallest atom, and out to the largest galaxy.

It is indeed a modern myth, though it reflects a far more ancient view, found in Dante, of mankind as the exact midpoint between the terrestrial and the heavenly. The problem is that in order to prove humans to be the centre of creation, the authors need to distort their own figures. The true cosmic middle scale is a tenth of a millimetre - it is not man that is the measure of all things, but dust.
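
That figure is easy to check (a back-of-envelope calculation of my own, assuming the smallest scale is the Planck length and the largest the observable universe): the geometric midpoint of the two extremes does indeed come out at around a tenth of a millimetre.

```python
import math

planck_length = 1.6e-35        # metres - assumed smallest scale
observable_universe = 8.8e26   # metres - assumed largest scale

midpoint = math.sqrt(planck_length * observable_universe)   # geometric mean
print(f"{midpoint:.1e} m")     # about 1.2e-04 m: roughly a tenth of a millimetre
```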

The authors take a more optimistic view. Their personal mythology includes New Universe Day, celebrated with a "cosmic dessert" whose recipe they provide. It sounds sweet, tempting and gooey - much like their philosophy.


Jason Bardi, The Calculus Wars. Scotland on Sunday, November 19, 2006. Review by Andrew Crumey.

ONE of the most famous spats in scientific history was the war of words between Isaac Newton and the German philosopher-scientist Gottfried Leibniz over which of them had been first to discover the mathematical theory of calculus.

Jason Bardi, a biologist by training, explains the scenario in the introduction to this, his first book. Calculus, he explains, is "useful for investigating everything from geometrical shapes to the orbits of planets in motion around the sun". Newton was technically the first to discover it, but failed to publish his results. Leibniz subsequently went into print, and was accused of having stolen Newton's ideas. He was no cheat, however, and it now seems that both men arrived at the same results independently.

It has the potential to be a fascinating tale, but Bardi's opening synopsis lays bare the problems he faces - problems he is unfortunately unable to overcome. First there is the fact that we know from the outset how the story ends, with the posthumous declaration of an honourable draw. So we cannot expect much in the way of suspense. Secondly, and perhaps more seriously, there is the problem of understanding exactly what it was that Newton and Leibniz were fighting over. What is calculus? Again, though, Bardi goes little further than the bare sketch given in the introduction.

Calculus began as a way of understanding speed. Can we speak of the speed of a car, say, at a particular moment in time? In a single instant, a car travels no distance - yet Newton and Leibniz both found how to define instantaneous speed - the thing we now measure with a speedometer. This is "differential" calculus. If you draw a graph of the car's speed as it varies over time, you get a wavy curve, and the area under the curve is the distance the car has travelled - this is called "integration".
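
Those two ideas can be sketched numerically in a few lines (an illustration of my own, not Bardi's): differentiate a position curve to get the instantaneous speed, then add up the area under the speed curve to recover the distance.

```python
def position(t):
    return t ** 2        # a car whose distance grows as the square of time

def speed_at(t, h=1e-6):
    """Differential calculus: instantaneous speed as the change in position
    over a vanishingly small interval of time."""
    return (position(t + h) - position(t)) / h

def distance_travelled(speed, t_end, steps=100_000):
    """Integral calculus: distance as the area under the speed curve,
    approximated by summing thin rectangular strips."""
    dt = t_end / steps
    return sum(speed(i * dt) * dt for i in range(steps))

print(speed_at(3.0))                       # about 6.0 - the derivative of t**2 is 2t
print(distance_travelled(speed_at, 3.0))   # about 9.0 - back to position(3) = 9
```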

It only takes a few lines to explain, but that is more than Bardi can spare in his tale of intellectual warfare. And since the battle was fought entirely by letters, nearly all of them written by intermediaries supporting one or other side, the tale is not exactly riveting.

Still, there are a few nuggets here that will appeal to maths teachers looking to pad their lessons with historical background. I was interested, for example, to note that the first ever book on calculus to be published in Britain was by a Scotsman called John Craig, "something of a forgotten player in the invention of calculus". People who like to think that the Scots invented everything can therefore throw a new spanner in the works by proposing Craig as the third father of calculus. Craig appears to have got his ideas from Leibniz, since he used all the same terminology, but he was still credited for a while - in Germany at least - as having beaten Newton.

It was, moreover, another Scot, John Keill, who set off the whole spat by accusing Leibniz of plagiarism. There followed years of international arguing, which ended only with Leibniz's death, leaving the elderly Newton satisfied but still indignant. If Bardi fails to make these larger-than-life characters come to life, his supporting cast may nevertheless offer sufficient interest to readers already well enough acquainted with the ideas they fought over.

© Andrew Crumey
