Guido Tonelli, Genesis (Wall Street Journal)

Three years ago, Italian physicist Guido Tonelli was in Sicily to give a talk about the origin of the universe. Also on the bill was a Jesuit theologian who spoke about the book of Genesis. It was this event that inspired Professor Tonelli's own take on the creation story - a scientific summary of the last 13.8 billion years of cosmic history.

The theologian pointed out that the biblical account "is in fact two books, written in different eras and by many different hands." Commentators saw inconsistencies - how could there be "days" before there was a Sun? Many preferred to interpret the days flexibly, and so does Professor Tonelli, whose seven days take us in very unequal time-steps from the Big Bang to the appearance of planets able to support life.

An obvious first question is, "What came before the Big Bang?" There are three answers that physicists generally choose from. One, prominently advocated by the late Stephen Hawking, is that the question is meaningless: there was no time before. Another is to suggest our universe might be a temporary phase in an infinite process - a bubble in eternal chaos, or the mirror image of an endlessly receding past. The third, most honest and least heard, is "nobody knows." Professor Tonelli opts for number one. "In the beginning was the void," he affirms, though this void was not empty. Einstein's relativity theory tells us that space can have energy, and Heisenberg's uncertainty principle allows for the spontaneous appearance of particles. We may not know where the void came from, but scientists can describe how it grew.

The mathematical theory of an expanding universe was worked out in the first half of the twentieth century by a number of people. Professor Tonelli focuses on the Belgian priest Georges Lemaître, though his pyrotechnic "primordial atom" was rather different from the element-forming model of George Gamow and others, which led to the name "Big Bang" and, more importantly, was able to explain why stars are made almost entirely from hydrogen and helium. The appearance of those two lightest atomic elements was the culmination of a process filling the first three days of Professor Tonelli's cosmic week - an actual time of mere seconds.

That brief but eventful era is of greatest professional interest to the author, who played a key role in the experimental work that led to the discovery of the Higgs boson in 2012. The particle had been theorised decades earlier as a way of explaining what went on during the first moments after the Big Bang. Initially, it is assumed, there was a single fundamental force, which very quickly split into the four known today: gravity, electromagnetism, and the "strong" and "weak" forces of nuclear physics. When those forces were unified, they would have been carried by a single type of particle, but as the temperature fell in the expanding universe, different kinds arose. The Higgs boson played a crucial role in that process, effectively binding to some particles, making them heavy, but not to others. The particles that carry the weak force are heavy, only working over subatomic distances, while electromagnetism is carried by photons - the particles that light is made of - and can extend over unlimited distances.

Professor Tonelli writes lyrically about the "delicate touch" of the Higgs boson, whose entry "one hundredth of a billionth of a second after the Big Bang" inaugurated the second of his metaphorical days. What happened during the first was inflation, an incredibly rapid expansion whose evidence is seen in microwave radiation reaching us from the universe's furthest visible limit. The uniform temperature of that radiation must have been caused by very fast growth, but the particle that drove the process, termed the inflaton, has yet to be discovered. The most powerful particle accelerator in the world today - the Large Hadron Collider at the multi-national CERN laboratory on the Swiss-French border - can create, for a brief moment and in a very small space, the conditions of "day two", able to stir the Higgs boson from "that equilibrium in which it has rested now for billions of years". To reach even further back will require something substantially bigger.

Professor Tonelli writes, "If we want a new machine to be up and running in 2035-40, we need to act right now." Plans are underway for "a collider 100 kilometres in length" to be built at CERN. The initial proposal alone involved "over 1,300 physicists and engineers, belonging to some 150 universities". Digging the tunnel will cost 9 billion euros; making the equipment will add another 15. There will be salaries to pay - not to mention a hefty electricity bill. For anyone steering such gargantuan projects - and needing to convince governments to pay up - skilful PR is as important as clever theorising or precision engineering.

Genesis shows this public-oriented approach to advantage. The style, tone and difficulty level of the book are what one might expect in a lecture aimed at a general audience. All that's missing are the PowerPoint slides. There are interesting digressions into other cultural areas - Greek myth, Renaissance art, and so on - sometimes to illustrate important technical ideas such as symmetry, or merely as pleasant asides. Word origins are another humanistic sweetener, though I didn't feel I needed to be told that the discoverer of cosmic inflation coined the word "from the Latin inflare, to inflate, which was already commonplace in economics to describe a steep rise in prices." Still, if a large part of one's professional life involves explaining the importance of quantum field theory to politicians who can barely count, any connection with the inside of their wallets is likely to appeal as a way of bolstering real-world relevance.

Cramming billions of years into a couple of hundred pages inevitably means that much has to be left out, especially when half of the book covers only a few seconds. Readers wishing to know more can, of course, find very many books that go deeper. Genesis is no better or worse than most of those: it summarises the current standard models of fundamental particles and cosmic evolution pleasantly and elegantly. For science itself, the bar is higher. As Professor Tonelli writes, "it is not enough that [a theory] happens to be elegant and enjoys considerable popularity." There also needs to be evidence - and the most exciting kind shows theoretical models to be wrong rather than right. What researchers hope for is the unexpected. What funders prefer is to know the outcome in advance. It's a paradox that top-level scientists like Professor Tonelli negotiate with aplomb.


Stephen Walker, Beyond (Literary Review)

On the morning of April 12, 1961, an orange dot appeared in the sky over the Russian village of Smelovka, growing as it fell. A government official reported what he witnessed: "A spaceship-sputnik landed with cosmonaut Gagarin Yuri Alekseyevich." The official was lying - in fact he was never there. Villagers arriving at the scene found no one inside the scorched metal sphere. Instead Gagarin floated down by parachute, two kilometres away. He made history, but not quite in the way the Soviet Union wanted the world to know. The reason for the fib is just one among many intriguing details in Stephen Walker's extensive, blow-by-blow account of the race to put a human in space.

American efforts lagged far behind Russia's. The paradoxical reason, Walker explains, was U.S. superiority in nuclear weapon design. American bombs were smaller and lighter, requiring sleeker missiles, while the Soviet Union's needed massive thrust to reach intended targets. For both sides, space exploration was a by-product of military technology, with warheads being replaced by payloads of comparable proportions and inestimable propaganda value.

While Soviet failures could be concealed from the public, American mishaps were often live television events. The first U.S. attempt at a satellite launch, in December 1957, was dubbed "Flopnik" by the press after blowing up on the launch pad. The technology gap narrowed in the following three years, so that when Russian space-dogs Belka and Strelka became the first creatures to return alive from Earth orbit, America soon answered with a chimpanzee. The differing choice of species reflected a further contrast in the two sides' approaches. Chimps were anatomically close to humans, intelligent but potentially unruly. Dogs were passive and could be trained to stay still. U.S. astronauts were intended to be pilots in control of their vehicles; the first Soviet cosmonauts would be passengers whose main duty was to survive.

American hopes were pinned on the Mercury Seven, an elite group of test pilots selected from over five hundred initial candidates. Presented to the public at a press conference, they instantly became celebrities, and were given sports cars to endorse by a shrewd Chevrolet dealer. They also received the attentions of "air hostesses... cocktail waitresses or female contenders for the yearly Miss Orbit award." John Glenn, most moralistic of the seven, warned others to "keep their ‘pants zipped' and not cede moral leadership to the ‘godless communists'."

Their Russian counterparts, later dubbed the Vanguard Six, enjoyed no such glamour, not even being allowed to tell their families what they were training for. Like the Americans, they were chosen for proven achievements as aviators, able to endure extreme physical conditions and mental stress. An additional consideration was size - all had to be small enough to fit inside the capsule. At five foot five, Yuri Gagarin was the shortest of the group, though it was his personal qualities that made him front-runner. As a child he had endured Nazi occupation of his rural village, forced to shiver in a makeshift dugout while soldiers took over the family's humble wooden house. Gagarin saw his younger brother tortured, his older siblings taken for slave labour, and he emerged from the war a stoical patriot, toughened yet apparently unscarred. From the start of his air force career he impressed everyone with his "infectious vitality, his gift for making a pleasing impression, his leadership skills and intelligence, his immaculateness and martial punctuality." It also helped that he was good-looking and an ardent communist from Russian peasant stock - an ideal poster-boy for the Soviet Union.

The first part of Beyond flips back and forth between the American and Russian programmes, digressing into other events preoccupying the minds of leaders on both sides. On the same day that Belka and Strelka took off, CIA pilot Gary Powers was jailed by a Moscow court, his U-2 spy plane having been shot down after photographing the launch site. In the following spring, President Kennedy was scheduling an intervention in Cuba at the same time as final preparations were being made in Russia for the first manned space flight. The Bay of Pigs fiasco began five days after Gagarin's triumph.

While the bigger geopolitical picture is clearly relevant, it does pull us away from the main theme, and into well-trodden territory. The Mercury Seven story is also rather familiar, for instance from Tom Wolfe's The Right Stuff. These, however, fall from view as Gagarin's launch date approaches, and the narrative becomes a day-by-day, hour-by-hour, then minute-by-minute account of the historic flight. At times the painstaking detail can feel laborious, but it sheds much light on Soviet attitudes.

Russian capsules typically contained a self-destruct device, to be detonated if the craft went off course and landed on enemy territory. Gagarin's lacked one, but only because the capsule was found to be too heavy otherwise. Navigation was supposed to be automatic, but there were emergency manual controls, locked by a three-digit code that could be radioed from Earth - a precaution against any cosmonaut trying to defect. In the event, Gagarin was supplied with the code by technicians aware of how impractical the idea was. Also automatic was the ejector seat that shot Gagarin from the plummeting capsule so he could parachute safely to the ground. Why the lie about him being inside the whole time? The official who reported it was the Soviet sports commissar, wishing to claim a new world altitude record. International rules said the "pilot" had to land in the same "vehicle" he went up in. There was no rule about military rank, though, and when Gagarin met his rescuers he learned he had become a major, as well as a world hero. There was to be no sports car as reward, but instead other luxuries almost as rare: a larger apartment with a refrigerator and washing machine. He didn't have long to enjoy them, though - he was killed in a training-flight accident in 1968. Beyond tells the full story of the finest two hours of his tragically short life.


Carl Zimmer, Life's Edge (Literary Review)

What is life? The question sounds simple enough, but for centuries, scientists and philosophers have struggled to find a precise answer. Carl Zimmer's engaging and informative book surveys a wide range of suggestions across the ages.

For ancient Romans, a baby's life began with its first breath - so herbally induced abortion wasn't considered infanticide. Christian theologians maintained that foetuses acquired a soul while still in the womb, but couldn't agree when. Sixteenth-century Italian legislators decided it was forty days after conception; an eighteenth-century British judge said it was when the unborn child first moved.

Some would say that life begins at the moment of conception, but Zimmer argues that conception is a process, not a moment. He notes that a single fertilized egg can develop into a pair of identical twins. "If we must believe that a fertilized egg immediately becomes a person, then we're left to wonder where that person went when it became two people." More rarely, a pair of non-identical embryos can merge to produce a chimera - a person with two distinct genomes. "If every fertilized egg is a single person with all the rights that a single person is entitled to, does a chimera get to have two votes?"

The end of life is blurry too. Cessation of heartbeat was the standard sign until the 1960s, when brain death began to be accepted - but how can you tell exactly when a brain has died? Zimmer relates the heart-rending story of Jahi McMath, a 13-year-old Californian girl who collapsed after botched surgery, was put on a ventilator, and was subsequently declared brain dead. Her family refused to allow the ventilator to be switched off, and following a court battle, flew Jahi to a hospital in New Jersey where she was tube-fed, gained weight, and was eventually discharged to an apartment with round-the-clock care. There she continued to grow, even entering puberty. It seemed that part of her brain was still active, though neurological tests remained negative. Jahi persisted in this state for three more years until her liver failed and she was declared dead for a second and final time.

We might define a living organism as something with the ability to reproduce - but what about computer viruses, or even real ones? They need a host in which to replicate, so we might judge them not really alive as independent entities. Yet something similar can be said about the Amazon molly, a type of fish that is exclusively female and exists as a "sexual parasite", enlisting males of a different species in order to reproduce.

Like us, and unlike computer viruses, Amazon mollies have DNA - so is that a workable definition? Again there's a problem: DNA life had to start somehow, most likely from microbes with the more primitive RNA molecule. And where did those come from? Zimmer meets veteran biologist David Deamer, who has long advocated volcanic pools as the chemical soup where self-replicating cells first formed. Deamer's rival, Michael Russell, condemns the theory as "fundamentally flawed", favouring hydrothermal vents on the ocean floor. "Both could not be right," says Zimmer - though who's to say that life didn't start off in more than one way?

Any search for life on other planets requires sufficient understanding of what one might be looking for, so a group of NASA scientists came up with a definition. "Life is a self-sustained chemical system capable of undergoing Darwinian evolution." Zimmer raises an obvious objection: is Darwinian evolution the only kind? He lists other definitions, such as, "Life is a metabolic network within a boundary," or one that geneticist Edward Trifonov distilled from over a hundred proposals. "Life is self-reproduction with variations." None of these has gained general acceptance among scientists. "There are as many definitions of life as there are people trying to define it."

Rather than seek a general rule, we could concentrate on particular cases, and a large part of Zimmer's book does exactly that, describing visits to scientists studying snakes, bats and other creatures. Those chapters feel like independently written articles, and something of a digression, but they're interesting in their own right. Arguably the greatest natural wonder Zimmer presents is the slime mould, a creeping, web-like organism that is uncannily good at finding the shortest path to tasty food such as oatmeal. "In one experiment, scientists created a map of the United States, with oatmeal standing in for the biggest cities. The slime moulds built what looked remarkably like the American interstate highway system." Researcher Simon Garnier sees it as a kind of intelligence. "We need an overstuffed brain to do better than random, but cellular waves rolling across a network of tentacles may also suffice. The slime mould is the thing that has pushed this principle as far as possible." The real masters of the universe, perhaps?


Lucy Jane Santos, Half Lives; Liz Heinecke, Radiant; Jeffrey Orens, The Soul of Genius (Wall Street Journal)

Marie Curie holds a special place in Nobel Prize history - not only the first woman to win the prize, but also one of very few people to have been awarded a second. Both were connected with the element radium that she discovered. Three new books tell the story from different angles, and in very different ways. Half Lives is a cultural history of radium, explaining how the radioactive metal captured the imagination - and damaged the health - of unwitting consumers and workers in the early twentieth century. Radiant tells of Curie's friendship with another remarkable woman, the dancer Loïe Fuller, who drew artistic inspiration from radium's eerie glow. The Soul of Genius focuses on the scientific conference where Curie - the only female participant - met with Einstein, while the press ran lurid stories about her love life.

She was born Maria Sklodowska in Russian-controlled Poland in 1867. Unable to get a higher education in her native country, she studied in Paris and met fellow scientist Pierre Curie, whom she married in 1895. Pierre and his brother had discovered piezoelectricity - the phenomenon that would one day make quartz watches possible - and found it could be used to make extremely fine electrical measurements. This proved useful following an accidental discovery by physicist Henri Becquerel, who noticed that uranium salts fogged a photographic plate, as if emitting invisible rays. Marie and Pierre investigated the phenomenon and gave it a name - radioactivity.

The "rays" were mostly particles, spat out from unstable atomic nuclei, but the terminology stuck, and continues to cause anxiety when people confuse microwave radiation with the stuff in nuclear reactors. There were no such fears in the 1890s, only excitement. Through painstaking analysis, the Curies found that uranium-rich minerals harboured tiny quantities of two new, strongly radioactive elements they named polonium (after Marie's homeland) and radium. By 1903, when Becquerel and the Curies were jointly awarded the Nobel Prize for physics, a couple more radioactive elements had been found, but it was radium that caught greatest attention.

According to Lucy Jane Santos' Half Lives, "It is not hard to pinpoint exactly why it was this substance (which Marie Sklodowska Curie referred to as ‘my beautiful radium') that eventually became the focus of public fascination and of entrepreneurial zeal." Radium's half-life of 1,600 years meant it could produce energy at a steady rate, effectively forever. "It was rare and hard to produce but that only seemed to add to its desirability."
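The "effectively forever" claim follows directly from the exponential-decay law: with a 1,600-year half-life, the fraction of a radium sample remaining after t years is 0.5^(t/1600), so over any human timescale its output barely changes. A minimal sketch of the arithmetic (the 80-year timespan is my own illustrative choice, not a figure from the book):

```python
# Exponential decay: fraction of a radioactive sample remaining
# after t years, given its half-life.
def fraction_remaining(t_years, half_life_years):
    return 0.5 ** (t_years / half_life_years)

# Radium-226's half-life is roughly 1,600 years.
left = fraction_remaining(80, 1600)   # one human lifetime
print(f"Fraction left after 80 years: {left:.4f}")  # ~0.966
```

Less than 4 percent of a sample decays in eighty years - to early-twentieth-century admirers, radium really did seem to glow forever.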

The Curies hadn't isolated pure radium, only compounds containing it. Eight tons of waste from a uranium mine at Saint Joachimsthal in Bohemia, and four years of work, yielded one tenth of a gram of radium chloride. What was especially fascinating, Ms Santos writes, was that "while radium chloride looked like common salt in daytime, it actually glowed in the dark." Its radiation ionized the surrounding air, producing a blue halo. Marie described bottles in her darkened laboratory as being "like faint fairy lights." According to Half Lives, "It was light, more than any other aspect, that would become associated with the Curies." A drawing in Vanity Fair portrayed the famous couple inspecting a shining test tube.

Henri Becquerel put a small sample of radium salt in his pocket to take to a scientific meeting. It burned a hole through his waistcoat, and ten days later, a mark appeared on his skin. Pierre Curie had a similar experience, which the two men announced in a joint paper in 1901. Paradoxically, it launched a new form of therapy based, Ms Santos writes, on "the premise that if radium could burn or kill skin it could destroy tumours." Like X-rays, radium was considered a boon for medical science, with the advantage that it required no special machinery. Spring water at Saint Joachimsthal was found to be mildly radioactive, and locals took advantage, producing "radium soaps, radium pastries and radium cigars", which contained no radium at all. The town became a health spa where visitors could enjoy luxurious treatment at the Radium Palace Hotel.

Half Lives digresses a little too far into the history of spa culture, but brings us back to the main theme with the various quack cures and opportunistic tie-ins that radium lent its name to, from hair tonic to toothpaste, and even a radioactive golf ball you could find with a Geiger counter. Most notorious was the addition of radium salts to phosphorescent paint to make it glow longer. In the 1920s it was applied to watch dials by women instructed to lick their brushes to a fine point. Many died or suffered lasting health problems, leading to a lengthy legal battle by some who became known as "Radium girls", recently the subject of an eponymous movie.

The danger had been clear enough to the Curies in 1902, when the American-born dancer Loïe Fuller wrote, wondering if radium could be used to make a glowing costume. The resulting friendship - a minor detail in Ms Santos' entertaining survey of radium's impact - is the focal point of Liz Heinecke's Radiant, a "creative nonfiction" that sits somewhere between dual biography and historical novel. Fuller, all but forgotten nowadays, was as famous in her time as Marie Curie. W.B. Yeats mentioned her in the poem "Nineteen Hundred and Nineteen", confident that readers could visualise her silk wings, lit by coloured lamps from multiple directions. The mesmerising "serpentine dance" Fuller invented can be seen on YouTube, captured on hand-tinted celluloid.

As the Curies were quick to point out, radium was too scarce, expensive and dangerous to use; but Marie and Loïe stayed in touch, pioneers in their respective fields. Fuller easily deserves a book to herself - a star of the Folies Bergère in Paris, she was sketched by Toulouse Lautrec, and was a friend and promoter of the sculptor Rodin. Her experiments in theatrical lighting led her to create her own laboratory; she consulted Edison in America and the astronomer Flammarion in France. She lived in an openly lesbian relationship with a younger woman who dressed exclusively in male clothing, and surrounded herself with a coterie of female dancers she trained. She also helped launch the career of Isadora Duncan, who has remained in public consciousness while Fuller, sadly, has faded.

Given the promising material, it's a shame that Radiant doesn't manage to do it justice. The creative approach is bold, using imaginary dialogue and inner thoughts to flesh out the women's stories, but the narrative tone flips uneasily between novelistic dramatization and biographical info-dumping. Too often, major life events are told as back-story, rather than shown as lived experiences. In a dramatized documentary film, we know when we're seeing an actor and when we're hearing a commentator. In Radiant it's hard to tell, and that makes the book hard to engage with.

A chapter called "The Accident" relates the crisis that occurred in 1906. "Pierre is dead? Dead? Absolutely dead? As Marie said the words out loud, something inside her, closer to her belly than her heart, turned to ice. It couldn't be true. They'd argued a bit that morning, but the day had seemed so perfectly normal." Pierre's death - stumbling under the wheel of a horse-drawn cart - left Marie to raise their two daughters and continue the scientific work. In 1908 she became a professor at the Sorbonne, and in collaboration with André Debierne managed to extract pure radium metal from its salt. Her first Nobel Prize had been for the physics of radioactivity; her second would be for the chemistry of radium. It coincided with a further crisis that forms a major part of Jeffrey Orens' The Soul of Genius.

A vacant seat had arisen on the illustrious French Academy of Sciences, and Marie Curie offered herself as candidate. Right-wing observers were outraged - not only would she be the first woman elected, she was also a Slav, hence in their eyes racially inferior. In a vitriolic press campaign they supported her opponent, a French-born professor at the Catholic University of Paris, who won narrowly in January 1911.

Meanwhile preparations were underway for the historic conference that is the central theme of The Soul of Genius. The idea came from Walther Nernst, a physicist who, Mr Orens suggests, was largely motivated by hopes of a Nobel Prize. What Nernst wanted was, in modern terms, a workshop, where a small number of leading researchers could thrash out problems raised by the new quantum and relativity theories. Nernst approached an elderly, hugely wealthy industrialist - Ernest Solvay - who was spending his twilight years developing a grand if somewhat vague unified theory of physics and human life. Solvay readily agreed to sponsor the conference - his reward would be a captive audience of the world's finest minds, while they would get a handsome fee and a week at a luxury hotel in Brussels.

Among the eighteen scientists who arrived in October 1911, Marie Curie was arguably the most famous. The youngest was 32-year-old Albert Einstein, who wrote afterwards, "Nothing positive has come out of it… I heard nothing that I had not known before." Nevertheless, the Solvay Conference has gone down in history as a landmark event, the start of a series that continues to the present day.

The scientific discussions were technical, and are not Mr Orens' main concern. What caught public attention was the revelation that Madame Curie was involved in a love affair with a married colleague at the conference, Paul Langevin, whose enraged wife had gone to the press. The story broke just as the conference was ending, then only a few days later came the announcement that Marie was to be awarded the Nobel Prize for chemistry. She was at the eye of a media storm. The newspaper L'Oeuvre printed extracts of stolen love letters, and Langevin challenged the editor to a duel. It took place in a Paris velodrome in November 1911, both men wisely firing into the air.

Curie was due to receive her award the following month, and the Nobel committee were nervous. A leading member wrote saying he'd asked his colleagues what should be done following the "ridiculous" duel, and all had told him it would be best if she stayed away. Curie replied, "I consider that there is no relation between my scientific work and the facts of private life." She went to Stockholm with her sister Bronya and eldest daughter, fourteen-year-old Irène, and received her award on December 10. By the end of the month she was seriously ill in hospital. "She had paid a tremendous cost for being a woman of principle," Mr Orens writes. "It was a price that would never be fully recovered."

The Soul of Genius tells three stories - those of Curie, Einstein and the Solvay Conference - and like the dual-narrative Radiant, feels a little forced in the connections it makes. Curie and Einstein became friends, but were not particularly close - Einstein described her as a cold fish (Häringseele, literally "herring soul"). Both Mr Orens and Ms Heinecke describe how, following the collapse of the affair with Langevin, Curie endured months of depression and ill health, then went to England in the summer of 1912 to stay with the physicist Hertha Ayrton - another outstanding woman with a story of her own. During the First World War, Marie led a program to develop mobile X-ray units - petites Curies - and once again became a hero in the public's eyes. She resumed research on radioactivity after the war, joined by daughter Irène who went on to win a Nobel Prize herself. Both died prematurely - Marie at 66, Irène at 58. The other daughter, Ève, had a safer career as a writer, and lived to 102. Yet according to Lucy Jane Santos' Half Lives, it was not radium that killed Marie Curie. In 1995, disinterred for reburial at the Panthéon, her lead-lined casket was probed and found to have relatively low contamination. "Instead, the facts suggested that her radiation ailments were likely the result of X-ray exposure from her war work." She was a martyr in so many ways.


Alan Lightman, Probable Impossibilities (Wall Street Journal)

Alan Lightman came to prominence in the 1990s with Einstein's Dreams, a novel that drew on his expertise as a theoretical physicist, and showed his gift for elegant prose. Since then, he has produced a steady stream of fiction and non-fiction bridging science, philosophy and the humanities. This latest collection of essays maintains that syncretic spirit, tackling big questions like the origin of the universe and the nature of consciousness, always in an entertaining and easily digestible way.

Professor Lightman certainly has a head for figures. One piece describes a man seeing a woman's smile: "Light reflected from her body instantly enters the pupils of his eyes, at the rate of ten trillion particles of light per second." The number is no throwaway guess - an endnote explains the author's calculation, with assumptions about the brightness of the light, the distance between the people, and so on.
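The flavour of such an endnote is easy to reproduce. Here is a hedged back-of-the-envelope version in Python - the irradiance, pupil size and wavelength are my own illustrative assumptions, not Professor Lightman's actual figures - showing how a "ten trillion per second" number falls out of ordinary physical constants:

```python
import math

# Order-of-magnitude estimate: visible photons per second entering the eyes.
h, c = 6.626e-34, 3.0e8              # Planck's constant (J*s), speed of light (m/s)
wavelength = 550e-9                  # green light, mid-visible band (m)
photon_energy = h * c / wavelength   # ~3.6e-19 joules per photon

irradiance = 0.1          # assumed reflected light reaching the eyes (W/m^2)
pupil_radius = 2e-3       # ~2 mm pupil (m)
pupil_area = 2 * math.pi * pupil_radius**2   # both eyes combined

photons_per_second = irradiance * pupil_area / photon_energy
print(f"{photons_per_second:.1e}")   # ~7e12, i.e. trillions per second
```

Change the assumed brightness or distance and the figure shifts, but the order of magnitude - trillions of photons a second - is robust.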

He is not always so precise. "Of course, one would expect most quantum cosmologists to be atheists, like the majority of scientists," he claims, offering no data to back it up. He does however report, "A 2013 Harris poll found that 74 percent of Americans surveyed believe in God, and 72 percent believe in miracles." The implication seems to be that most Americans aren't scientifically minded, though Professor Lightman did find a prominent scientist, Owen Gingerich, who belongs to the 72 percent. It prompts the author to make another arbitrary guess. "I would estimate that something like 3 to 5 percent of all scientists share Gingerich's view." Quantifying belief is harder than counting light particles.

Professor Lightman can see the dilemma. He describes himself as "a scientist and a materialist," then later adds, "All of that said, I still consider myself a spiritual person." We may be no more than a collection of atoms destined eventually to disperse, but our awareness of existing is something that still defies explanation. The author recalls meeting Robert Desimone, a brain scientist investigating how we are able to pay attention to certain things we see or hear while letting others sink unnoticed. The answer seems to lie in "the synchronized firing of a group of neurons, whose rhythmic electrical activity rises above the background chatter of the vast neuronal crowd." But what makes one group win rather than another? "We don't understand the answer to that yet," Professor Desimone says. The author then raises an even bigger question. "How does a gooey mass of blood, bones, and gelatinous tissue become a sentient being?... How does it develop a self, an ego, an ‘I'?" Professor Desimone responds dismissively, saying further research on "the detailed mechanisms in the brain" will make the problem of consciousness "fade away into irrelevancy and abstraction."

A similar evasiveness crops up when Professor Lightman asks several cosmologists what caused the Big Bang. He quotes Sean Carroll, a notable science populariser in his own right, who says, "In everyday life we talk about cause and effect. But there is no reason to apply that thinking to the universe as a whole. I do not feel in any way unsatisfied by just saying ‘that's the way it is.' "

Most colourful among the cosmologists is Russian-born Andrei Linde of Stanford University, famous for developing the theory of "eternal chaotic inflation", where multiple universes swell like bubbles in an endless foam. He is now in his early seventies, the same age as Professor Lightman, who observes, "Linde does not have a small opinion of himself." His latest passion is photography. "I can produce things that are better than what I see in museums. You see, I am now talking like an arrogant American." Or should that be, arrogant quantum cosmologist? Professor Lightman asks him if he believes the universe is truly infinite. "Do you think dinosaurs truly existed?" Professor Linde replies. Well, fossils are one thing; infinity is harder to grasp, unless you see the universe as a mathematical theory rather than actual stuff.

Professor Lightman shows greater humility than some of his interlocutors, being prepared to admit that certain problems may forever be beyond human understanding. He cites the philosopher Colin McGinn's argument that "it is impossible to understand the phenomenon of consciousness because we cannot get outside of our minds to discuss it." Nor can we get outside the universe; and with only one example to study, it's hard to see how the existence of other, intrinsically unreachable universes within a wider multiverse could ever be proved.

On the other hand, humanity has only had a few thousand years to mull over these questions, often simply coming up with new ways of saying the same old things. Professor Lightman writes approvingly of the Roman poet Lucretius, who believed the universe consisted of atoms perpetually rearranging themselves within an infinite void, following natural laws without the intervention of gods. The same combinations would inevitably recur, along with subtly altered ones, making Lucretius' contemporary, the statesman Cicero, ask if there were countless worlds with "persons in exactly similar spots with our names, our honours, our achievements, our minds, our shapes, our ages, discussing the very same subject?"

One of the best pieces in Professor Lightman's consistently thought-provoking book describes a journey to Memphis following the death of his last surviving parent, "to see one final time the house where we all lived." After a gap of two years, there are still plenty of familiar sights on the way to the house that has already been sold. But when he gets there, something is wrong. "There's a hole in space where the house used to be." It's been demolished, and workers ignore him as they landscape the bare ground. "I look at the men and imagine that I can see right through them as I see through the slab of air where the house used to be." It's a moving meditation on mortality, the passage of time and the inevitability of change. "These guys have no conception of what was once here... They have their own cabinets of memories."

This is the humanist side to balance the scientific one. The materialist Professor Lightman sees no contradiction in also being spiritual, though the term needs clarification. "By spirituality, I mean belief in things that are larger than myself, appreciation of beauty, commitment to certain rules of moral behavior, such as the Golden Rule. Spirituality does not require belief in miracles." What he does believe in is "what one might call the Central Doctrine of Science: All properties and events in the physical universe are governed by laws, and those laws hold true at every time and every place in the physical universe... Philosophers debate about whether the ‘laws of nature' are mere descriptions of nature or necessities of nature, the latter being rules that nature must obey without exception. The Central Doctrine of Science, and the view of most scientists, is that the laws are necessities." I don't know if Professor Lightman did a poll on that one, though it's probably true. And it does leave us wondering why the universe is so admirably law-abiding.


Johnjoe McFadden, Life is Simple (Wall Street Journal)

If a friend tells you they've seen a UFO, what would you think? It might have been an alien spacecraft - or perhaps the friend was mistaken. The first possibility requires numerous unproven assumptions about extra-terrestrial technology; the second is consistent with what we know about human fallibility. The fourteenth-century Franciscan friar William of Occam was never troubled by flying saucers, but he did see the importance of eliminating unnecessary assumptions - the principle known as Occam's Razor. It forms the central theme of Johnjoe McFadden's tour through two millennia of scientific discovery.

Mr McFadden is professor of molecular genetics at the University of Surrey in England, and his interest in Occam was sparked by a daily commute which took him past the village of Ockham. This, he learned, was where William was born, probably around the year 1287. Little is known about William's early life, but by his thirties his writings on philosophy and theology were widely read and highly controversial.

Occam challenged the prevailing belief that objects were manifestations of archetypal forms that were themselves real. As Mr McFadden explains, "Cherries were cherries because they shared in the universal of ‘cherryness'." Occam instead proposed what became known as nominalism. "[He] argued that universals are merely the terms that we use to refer to groups of objects." Occam shaved universals out of existence, saying, "It is vain to do with more what can be done with less."

The conventional view was that an object's qualities - such as colour, texture or weight - had a separate existence from the object itself. This was how theologians explained the Eucharist, when bread and wine became the body and blood of Christ. God could transform substance while leaving appearance unchanged. Occam's nominalism cast doubt on that, and in 1324 he was summoned to a papal court in Avignon. The proceedings dragged on for four years until Occam managed to escape to Germany, where he died nineteen years later. The Munich church where he was interred no longer exists, but Mr McFadden reports that near the site there is "a Hotel Occam and a very convivial deli and wine bar called Occam Deli."

Mr McFadden's book is itself a multi-course meal with a variety of ingredients. After the story of Occam's life has been told, the friar's two big ideas - nominalism and the razor - remain as heroes, portrayed as the driving spirit that enabled science to blossom. The author takes us from the epicycles of Ptolemaic astronomy to the multiverse of modern-day superstring theory, via Isaac Newton, Charles Darwin, Albert Einstein and a great deal more, including the influence of Occam's thought on painters, poets, musicians and political philosophers across the centuries. The encyclopaedic range will certainly appeal to readers who want as much information as possible crammed between two covers, though at times during this all-you-can-eat buffet of human knowledge I was reminded of what was said by Mies van der Rohe, Robert Browning, and long before either of them, in different terms, William of Occam: "Less is more."

Occam's nominalism may seem like plain common sense, in contrast to the metaphysical universals he opposed. But rather than cherries, what about science? Is it merely a name we give to a set of activities by people collectively labelled scientists, or a universal principle with absolute standards? Mr McFadden takes the latter view. As he puts it, "Science is simplicity." The Sun-centred cosmology of Copernicus was simpler than Ptolemy's geocentric one, Kepler's three laws of planetary motion were simpler than either, and Newton showed how all three could be explained by a single law of gravitation.

Yet isn't simplicity a matter of opinion? Newtonian physics gave way to the theories of relativity and quantum mechanics. A physicist would say those are simpler because they explain a wider range of phenomena, based on fewer assumptions, and with greater accuracy. A student struggling to understand them might say otherwise. Mr McFadden offers his own way of quantifying simplicity, which he calls Occam's pocket razor. "It counts the number of significant words (excluding articles, conjunctions etc.) required for rival explanations or models and punishes the lengthier by halving its probability for every significant word… According to my pocket calculator, our pocket razor suggests that the heliocentric system is two to the power of seventy or about a million billion fold more likely than the geocentric model."
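The halving rule lends itself to a quick back-of-the-envelope check. Here is a minimal sketch in Python; the function name and the word counts are my own illustrative assumptions, not figures from the book:

```python
# A sketch of the "pocket razor" described above: every extra significant
# word halves a model's probability relative to its rival.
# (Function name and word counts are illustrative assumptions, not the book's.)

def pocket_razor_odds(words_a: int, words_b: int) -> float:
    """Relative likelihood of model A over model B under the halving rule."""
    return 2.0 ** (words_b - words_a)

# A description 70 significant words shorter is favoured by a factor of 2^70.
print(f"{pocket_razor_odds(30, 100):.3e}")  # prints "1.181e+21"
```

Two to the power of seventy is roughly 1.2 x 10^21, which gives a feel for how savagely the rule punishes verbosity.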

It is when dealing with biology - his own specialist field - that Mr McFadden is at his most interesting and illuminating. In part it's because the subject is so full of special cases, such as the torpedo fish, named centuries ago for its mysterious ability to induce torpor in its prey. The fish's trick was electric shock, unlike the modern weapon named after it.

By contrast, the coverage of physics and chemistry in Life is Simple feels much like the standard account found in numerous other books. There's only room for so much: Galileo's astronomical discoveries are well covered, but his later downfall is summarised as "a story that has been told many times." That's an unfortunate omission, since Galileo's complicated dealings with the Catholic Church illustrate both the influence of Occam's philosophy and the limitations of his razor.

In 1616, six years after first announcing the astronomical discoveries, Galileo swore an oath before a Vatican official, agreeing not to advocate the Copernican model. Seven years (and two Popes) later he published The Assayer, a book notionally about comets but also containing many other ideas. One was his famous remark about the universe being written in the language of mathematics, another was a theory of atoms. Like his explanation of comets - which he thought were an optical effect - Galileo's idea of "fire-corpuscles" was wrong, but he drew an important conclusion. "To excite in us tastes, odors, and sounds I believe that nothing is required in external bodies except shapes, numbers, and slow or rapid movements. I think that if ears, tongues, and noses were removed, shapes and numbers and motions would remain, but not odors or tastes or sounds. The latter, I believe, are nothing more than names."

This was nominalism, as one of Galileo's opponents was quick to inform the Inquisition, who nevertheless took no further action. They had, after all, approved the book for publication, just as they later approved Galileo's Dialogue on the Two Chief World Systems, published in 1632. Galileo wriggled round his earlier oath by putting proof of Earth's motion into the mouth of a fictional character. Never mind that the "proof" - Earth's tides - was wrong; Galileo also put the views of the Pope in the mouth of an Aristotelian stooge named Simplicio. Factions that had long been gunning for Galileo finally had their chance to bring him down. He spent his remaining years under comfortable house arrest where he refined his theory of atoms, which he thought were of no size, held together by suction. A simple theory - and simply wrong.

We would expect a biography of Napoleon to mention defeats as well as victories. Take the same approach with great figures of science, and Occam's Razor no longer looks so sharp, lacking any real predictive power. Life may be simple, but history is complex.

© Andrew Crumey
