Marina Benjamin, ROCKET DREAMS. Scotland on Sunday, January 5, 2003. Review by Andrew Crumey.
LIKE me, Marina Benjamin was a child of the 1960s. Like me, she watched the Apollo moon missions and imagined a tantalisingly near future of graceful space stations, Martian colonies and holidays in orbit. And like so many of us would-be space cadets, she wonders where it all went wrong.
It is now 30 years since the last human walked on the lunar surface - Gene Cernan became the 12th member of the world's most exclusive club, of whom only eight are still alive. The most famous, Neil Armstrong, is a virtual recluse. Others got religion or turned to booze as a way of salving their incurable ache for space. One returning astronaut spent weeks and weeks just looking at the sky.
We all know why the space race stopped. Having won the goal set by John F Kennedy, America did not need to pump any more dollars into an effort whose public appeal had largely evaporated even before the first lunar module could blast itself homewards. What we are left with is a sense of nostalgia. Instead of the space age, we now have the cyber age. This cultural shift is what Benjamin investigates here, and the result is both thought-provoking and highly entertaining.
Benjamin is a former New Statesman arts editor, and appears admirably unfazed by the science she presents here, whether it is the logistics of interplanetary travel, or the subtleties of SETI@home - the cottage-industry approach to the search for extra-terrestrial life in which she takes part.
Her travels take her to Roswell - a forlorn piece of small-town America that has been deserted by the army base that was its original reason for existing, and now functions as a tourist attraction for ufologists eager to see the site of the famous "incident", when either a weather balloon or a flying saucer piloted by aliens crashed to Earth in 1947.
Best of all is Benjamin's visit to a convention of celebrity memorabilia hunters in North Hollywood where, for a small fee, forgotten actors offer autographs to anyone who actually knows who they are.
Among this where-are-they-now cast of C-list has-beens in a plasticky Holiday Inn, a sign points the way to 'Walt Cunningham, Dick Gordon, Edgar Mitchell'. Never heard of them? Benjamin has - and she finds them sitting behind a desk while a long line of fans walk reverentially past, paying as much as a thousand dollars for their autographs. These three are moon men, and they still command respect among the initiated.
That mystical awe is what Benjamin investigates so well in this book. She connects it with deep cultural myths about adventure, spiritual growth, escape. Benjamin's previous book, Living At The End Of The World, was an excellent account of millenarian cults, and the same interest in mass psychology informs this latest one.
We have never made it to Mars because, quite simply, there is little reason for going there in person. Life in space, says one commentator, would mean cramped quarters, living in your own sweat, and worst of all, "seeing the same goddamn faces every day." You don't need a billion-dollar rocket for that - the Big Brother house offers it far more cheaply.
Perhaps, then, our future in the cosmos will be a strictly virtual one. For a surprisingly small fee, you can send an e-mail to space, or even your DNA. As Benjamin observes in this astute, at times beautifully written little book, the quest for outer space has mostly been an attempt to find something that in reality lies within ourselves.
Ben Marsden, WATT'S PERFECT ENGINE. Scotland on Sunday, January 26, 2003. Review by Andrew Crumey.
WE ALL know the story. A young James Watt gazed at a simmering kettle, then grew up and replaced the rattling lid with a piston, heralding the age of steam. But how much of this familiar legend is true? Ben Marsden's entertaining and informative little book provides the answer, which as you might guess is: not a lot.
Greenock's most famous son did not invent steam power; he perfected it. Long before Watt's birth in 1736, Thomas Newcomen's 'fire engines' were being used in mines all over Britain, sucking floodwater out of tunnels to keep them operational.
As a young inventor and instrument maker (musical as well as scientific), Watt found himself employed at Glasgow University where his task was to provide the bangs and effects that would draw students - and their fees - to the lectures of luminaries such as Joseph Black, whose theory of latent heat was one of the scientific breakthroughs of the age. As part of this job, Watt was called upon to repair a model Newcomen engine that refused to work. It was this gadget - preserved in the Hunterian Museum - that was to play a bigger role in the coming revolution than any childhood thoughts of kettles.
Another factor was a suggestion made to Watt in 1759 by a talented student, John Robison. Could steam be used to turn the wheels of a carriage? Between them, Watt, Black and Robison would eventually turn such dreams into a reality. Watt did all the most important work and gave due credit to the other two; but Marsden makes it clear Watt's achievement was by no means a flash of solitary inspiration.
Trying to get the model engine to work immersed Watt in truly pioneering research and his careful experiments vindicated Black's theories. The wastefulness of the Newcomen engine was its need to repeatedly cool and re-boil the water. Far better, Watt realised, to use two vessels: one hot, the other cool.
A memorial on Glasgow Green marks the spot where Watt's eureka moment allegedly happened in April 1765. But practical steam power was still a long way off, and the necessary industrial backing came from a Birmingham mill owner with a taste for innovation, who saw how Watt's invention offered an alternative to the unreliable waterwheel driving his factory.
Watt's greatest stroke of genius, perhaps, was the patent he took out in 1769. As Marsden explains, it not only described the prototype steam engine, but also scooped up intellectual property rights on applications Watt was yet to achieve, such as driving a horseless carriage. Watt never solved that problem - hills always beat him, which is why the eventual solution was to bypass existing roads and lay rails instead. But Watt's crafty patent ensured that for decades he alone could develop the new technology. Rival engineers were pursued through the courts, and the punitive royalties they paid made Watt fabulously wealthy. After his death in 1819, Watt became a Victorian icon of power and success. In one sense, he was the Bill Gates of his age: shrewdly cornering a market that turned out to be vast beyond anyone's dreams. Even today, says Marsden, with steam power a distant memory, Watt's name is still on every light bulb.
Joao Magueijo, FASTER THAN THE SPEED OF LIGHT. Scotland on Sunday, March 2, 2003. Review by Andrew Crumey.
BEING a controversial scientist used to mean coming up with a theory that went against conventional wisdom. Joao Magueijo certainly fits that bill, since he reckons Einstein's theory of relativity is wrong. But Magueijo's taste for controversy goes beyond challenging scientific orthodoxy. His fascinating, amusing and at times scurrilous book vilifies colleagues as cheats and "old farts", and describes university administrators in terms that have had them consulting libel lawyers. This 37-year-old Portuguese theoretical physicist is clearly an angry young man. He also emerges as a highly likeable one. And his theory might even be right.
The cornerstone of Einstein's relativity is that the speed of light is constant. On Earth or in another galaxy, at rest or flying through space, light always travels at 186,000 miles per second. That's why you cannot overtake a lightbeam: it never seems to slow down, no matter how hard you try and catch up. Magueijo explains this by means of a dream the young Einstein allegedly had about cows and an electric fence which I have never previously come across: I suspect it belongs only to hearsay. The historical aspects of this book are not its greatest strength. But once we get past such introductory material, the plot thickens intriguingly.
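The rule behind never being able to overtake a lightbeam can be written in a single line - a standard textbook summary, not anything quoted from Magueijo. Two velocities u and v do not simply add; they combine as

\[ w = \frac{u + v}{1 + uv/c^2}, \]

and setting v = c gives w = c whatever your own speed u, so the beam always recedes at exactly the same rate.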
Magueijo accepts Einstein's basic concept, but with a twist. Over billions of years, he suggests, the speed of light has been gradually slowing down. When the universe was born, light's speed was virtually infinite. In the far future, it might dwindle to a snail's pace.
What led Magueijo to this idea is cosmic microwave radiation - the afterglow of the big bang famously photographed by the COBE satellite a decade ago, and now by a successor called MAP whose detailed image was unveiled last month. Both pictures show a universe remarkably uniform in temperature: 2.7 degrees above absolute zero, no matter where in space you look. Twenty years ago, this was a major problem of astrophysics. How could space have cooled so evenly?
An answer was provided by inflation: the idea that the universe was born in a state of tension, forcing it to expand enormously fast. A tiny patch of that primordial universe became everything we see today: hence its uniformity. In America this is the accepted version of events - COBE and MAP are seen as vindication of the hypothesis. But Magueijo - whose book interestingly highlights the social aspects of science - points out that on this side of the Atlantic, inflation has always had its doubters. Not only does it require a new kind of particle - the inflaton - that has yet to be found; but perhaps more importantly, the theory was invented by our "younger cousins across the pond".
Magueijo came upon his own alternative while doing research at Cambridge. He expresses loathing for its stuffy, class-ridden collegiate atmosphere, and incomprehension for the very British phenomenon of inverted snobbery. Escaping by means of solitary walks, Magueijo realised that if light was much faster in the past it could explain the universe's present uniformity. Further work revealed - quite unexpectedly - that it would also account for the universe being "flat", meaning that light beams do not circle the cosmos and return to their starting point. This important bonus gave added credence to a theory Magueijo christened VSL, for "varying speed of light".
Faster Than the Speed of Light is a departure from the style familiar in other science books like Stephen Hawking's A Brief History of Time, Alan Guth's The Inflationary Universe, or Brian Greene's The Elegant Universe. Magueijo is light on technical details, and occasionally dodgy on astronomical ones (contrary to what he states, Edwin Hubble was not the first person to photograph galaxies). Like Janna Levin's How the Universe Got its Spots, Magueijo's book is about a highly speculative theory, written by a young researcher who evidently knows what it feels like to be out on a limb, cold-shouldered by the scientific establishment. Magueijo discusses his personal life (his partner is a fellow physicist, subjected to blatant sexism over one of her research proposals), and he describes flashes of inspiration to the accompaniment of trance music at beach parties in Goa. He is as cool, hip and media-friendly as Hawking isn't.
All this could make his book insufferable - by his own admission he is a "smart ass" and a "big mouth." But there is enough humour to keep the reader on his side.
Magueijo has little time for the talk of elegance and beauty favoured by devotees of superstrings, which he calls "cosmic pubes." He also derides M-theory - the idea that we live in multidimensional space. Nobody agrees what the M stands for - possibly membrane or even mother. Magueijo suggests "masturbation" as more appropriate.
His greatest barbs, though, are aimed at the administrators of London's Imperial College, where he currently works (though for how much longer is anybody's guess). "Scientific pimps" is the mildest accolade he offers. He describes how he was once required to itemise his work activities every minute for a week as part of the kind of statistical exercise beloved of bureaucrats everywhere. Magueijo produced a Nicholson Baker-style account, including lengthy descriptions of his trips to the toilet, and took the lack of subsequent comment as confirmation that none of it was read.
Having done my own PhD in the theoretical physics department he describes, I find it amusing to see how little has changed in nearly 20 years. The well-loved departmental patriarch is still, it seems, just as prone to falling asleep half-way through conversations.
As for VSL, it is gradually acquiring a following; and the reasons shed light on how theories progress from fringe to mainstream. When starlight is split into a spectrum, dark lines indicate chemicals in the star's atmosphere. Examined closely, these lines have a fine structure governed by a quantity related to the speed of light and the strength of electrical forces. Observations of distant galaxies suggest this "fine structure constant" might be changing over time. The evidence is still controversial, but has made VSL respectable. John Barrow - well known for his popular science writing - is one among many physicists now taking the idea seriously.
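For readers who want the quantity spelled out: the fine structure constant, in its standard textbook form (my gloss, not Magueijo's notation), is

\[ \alpha = \frac{e^2}{4\pi\varepsilon_0\hbar c} \approx \frac{1}{137}, \]

which ties together the electric charge, Planck's constant and the speed of light c - so any genuine drift in c over cosmic time should leave its fingerprint in those spectral lines.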
In the end, VSL might go the same way as countless other speculative ideas in physics. There is no shortage of theories which are revolutionary "if true": but it's that last caveat that really counts, and only improved astronomical data will provide the answer. That will require more money and research grants, dished out by those sedentary old farts Magueijo despises. Let's hope he gets his answer before he turns into one too.
George Johnson, A SHORTCUT THROUGH TIME. Scotland on Sunday, April 13, 2003. Review by Andrew Crumey.
NAPOLEON said the French Revolution began not with the storming of the Bastille in 1789, but with the premiere of Beaumarchais' Figaro five years earlier. The greatest revolutions start quietly, with an idea.
George Johnson's fascinating and highly accessible book describes the earliest ramifications of what some think may prove to be the most significant idea of the 21st century: the quantum computer. The experimental successes are still modest, but startling. Single atoms have been floated on laser beams, made to flip like tiny switches. Particles of light have been frozen in their tracks, or even teleported. Most mind-boggling of all, an atom of beryllium has been made to stand in for Schroedinger's legendary cat: "The atom and its doppelganger… momentarily seemed to be in two places at once."
Star Trek is still a long way off; but in the shorter term, quantum computers threaten to destroy internet security. Anyone smart enough to build one could instantly break into every financial institution or government agency in the world. This, more than the promise of matter transporters or holodecks, is what makes it a hot issue with researchers worldwide.
Unpicking the complexities of the subject is not easy, but Johnson has done a fine job of it. A science writer for the New York Times, renowned for previous books including Strange Beauty and Fire in the Mind, Johnson explains from the outset that A Shortcut Through Time will avoid many of the usual conventions of the popular science genre. No vignettes here of colourful personalities and their eureka moments; no potted biographies of scientists in history. The book's human interest comes largely from Johnson's memories of his childhood experiments in electronics, and from his frank admission that he is only a step or two ahead of the reader.
His first task is to explain how conventional computers work, and he does this on a strictly "need to know" basis. The main point is that every computer in the world - whether it is a PC or Pentagon supercomputer - works according to the same principles, set out by Alan Turing more than half a century ago. Turing's imaginary design involved a printer with a long paper tape. Given the right settings and enough tape, his computer could perform any calculation.
Every computer is a "Turing machine", and in most cases the tape and printer are replaced by a silicon chip. The essence of any "classical" computer is its ability to store information: the ones and zeroes that are the binary digits - "bits" for short - of our digital age. Roughly eight million of these make a megabyte: the average MP3 file contains an awful lot of ones and zeroes.
On a CD, the digits are dots that can be read by a laser. Each denotes either a one or a zero; anything else is a mistake. With quantum computers, there is room for a third option: both zero and one at the same time. Instead of bits we have qubits. The idea stems from Schroedinger's cat paradox: the feline in the box, the story goes, is simultaneously dead and alive until the box is opened. It is one, zero, and both: a qubit.
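In the standard physicists' notation - a gloss of mine, not a quotation from Johnson - a qubit's state is written

\[ |\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1, \]

a genuine blend of zero and one in which the squared coefficients give the odds of reading out each answer when the qubit is finally measured.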
Using cats to store information is impossible, however, because every sane person in the world (except perhaps a few poststructuralist philosophers) believes that even before the box is opened, Schroedinger's cat is definitely either dead or alive, not a mixture of both. It is not the mind of an observer that decides the issue, but all the billions of particles that constantly touch the cat's fur and connect it with the rest of the universe.
For individual atoms, though, the situation is different. If they can truly be isolated from external influences, they can be two different things at once. This is the experiment that was successfully done seven years ago with a beryllium atom.
Since then scientists have been trying to persuade larger collections of atoms to serve as strings of qubits. So far they have made it to seven. If the numbers get into the hundreds or thousands, quantum computers may be able to take on what Johnson calls some of the hardest problems in the universe.
One of these is a problem we all learn in primary school: factorisation. The number 21 can be factorised as 3 times 7; 55 is 5 times 11. Factorising very large numbers is time consuming, because the only way to do it is to guess possible answers and check to see if they work. Even a computer cannot do much better than this, which is why the world's safest codes are based on factorising numbers a hundred or more digits in length. If you do not know one of the factors to begin with, all the computing power in the world will never be able to come up with the answer.
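The "guess and check" method really is as simple as it sounds. A toy version in Python - my own illustration, nothing taken from Johnson's book - shows both how easy the small cases are and why hundred-digit numbers are hopeless: the loop would have to run for far longer than the age of the universe.

```python
# Toy factoriser by trial division: guess possible divisors and check.
# Fine for 21 or 55; useless for the hundred-digit numbers behind real codes,
# since the loop needs roughly sqrt(n) steps.

def smallest_factor(n):
    """Return the smallest factor of n greater than 1 (n itself if n is prime)."""
    d = 2
    while d * d <= n:      # no smallest factor can exceed the square root of n
        if n % d == 0:
            return d
        d += 1
    return n

for n in (21, 55, 2003):
    f = smallest_factor(n)
    if f == n:
        print(f"{n} is prime")
    else:
        print(f"{n} = {f} x {n // f}")
```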
With quantum computers the situation is different. A hundred atomic qubits could effectively run through all the possible factors simultaneously, delivering an answer in no time. The safest codes would be easily breakable. Financial security would still be possible, it turns out, thanks to "quantum money"; though the catch is that you could only use the stuff for a single transaction. Observing it would invalidate it.
In a subject like this, science fact and fiction rapidly blur. Johnson is always admirably clear about separating one from the other. The promise of quantum computing is enormous: but Johnson acknowledges that predicting the future is impossible. He reminds us of a prophecy made in 1949. "Computers in the future may have only 1,000 vacuum tubes and perhaps weigh only half a ton." That was overly cautious; but at the other extreme, claims made at the same time about fusion power, space exploration and artificial intelligence proved to be wildly optimistic.
If it all works, then parents of the future will have an even harder time explaining everyday technology to their kids. "Well you see, honey, we live in an infinite multiverse of alternative realities, and every time you click the mouse you select one of them."
No, I don't understand it either: but Johnson's excellent book is highly recommended. And when the revolution comes, don't say no one warned you.
Leonard Mlodinow, SOME TIME WITH FEYNMAN. Scotland on Sunday, June 29, 2003. Review by Andrew Crumey.
PHYSICIST Richard Feynman became a household name in 1986 after the Challenger disaster. Dipping a rubber O-ring into some iced water during a televised press conference, Feynman dramatically showed how cold weather had caused the seal failure that destroyed the space shuttle.
It illustrated both his down-to-earth manner and his taste for theatre. Feynman's scientific work - on the Manhattan Project and quantum field theory - and his many interests (bongo-playing, lock-picking, Tuvan music) have been described in several biographies. What Leonard Mlodinow offers is a personal and very touching account of his working relationship with Feynman during 1981, when the Nobel Prize-winning scientist was already suffering from the cancer that would kill him eight years later.
Mlodinow arrived as a promising young researcher at the California Institute of Technology, and found himself working alongside not only Feynman, but also another star of physics, Murray Gell-Mann, who invented the theory of quarks. The contrast between the two giants is one of the most entertaining aspects of Mlodinow's highly enjoyable book.
Whereas Feynman was very much a 'regular guy', Gell-Mann, according to Mlodinow, always needed to prove he was the smartest person in the room. An avid linguist, Gell-Mann was once addressed in Mayan by a student curious to find out if even this obscure language was one that he could understand. Gell-Mann said no: he only knew Upper Mayan, and the student had spoken to him in Lower Mayan.
Mlodinow's contrasting characterisation of Feynman and Gell-Mann is all the more effective because of its sketchiness. A few light touches are enough to make the gulf between them very clear. A visiting lecturer tells Feynman and Mlodinow he has been working on the same theory for 12 years. It took a long time for Einstein to be accepted, he says, and one day he, too, will be recognised. Feynman tells him he is wasting his time. "He wanted recognition - I gave it to him," he says to Mlodinow after the lecturer has gone off in a huff. "I recognised him as a pompous ass."
Pricking pomposity was evidently a favourite pastime of Feynman's, but for the arrogant Gell-Mann he had real respect and even affection - which Mlodinow shares - since Gell-Mann at least had something to be arrogant about.
Physics is sketched as lightly here as the personalities, and again this is a plus. The book is more concerned with the process of creation and discovery than with the intricacies of particle theory, making it ideal reading for anyone wanting to savour the scientific lifestyle and mind-set without getting bogged down in technicalities.
Mlodinow's housemate - a Hispanic garbage collector with an interest in drugs and philosophy - adds a welcome outside view of proceedings (and quickly hits it off with Feynman).
Mlodinow also highlights his own dilemmas at the time. His search for a new research project reads very much like a description of writer's block. Feynman serves as counsellor and mentor. And when Mlodinow is told that he also has cancer, the two men have more than physics in common.
This is a warm, delightful glimpse into a fascinating world. However, in the end, Mlodinow said goodbye to it. He went to Hollywood and became a scriptwriter for Star Trek.
John Cornwell, HITLER'S SCIENTISTS. Scotland on Sunday, September 14, 2003. Review by Andrew Crumey.
OF ALL the atrocities committed by the Nazis, perhaps the most chilling are the grotesque 'experiments' carried out on concentration camp victims. The crimes of people such as Josef Mengele - infecting prisoners with malaria, sterilising them with X-rays, injecting dye into their eyeballs - perverted everything that medical science was meant to stand for. Together with the efforts of people such as rocket pioneer Wernher von Braun and nuclear physicist Werner Heisenberg, they gave rise to a post-war image of Nazi Germany as a hotbed of demented scientific genius.
John Cornwell's masterly survey of science under Hitler paints a very different picture: one of inefficiency and ineptitude fostered by supreme arrogance. Germany before 1933 was a world leader in all scientific fields: the first 20 years of the Nobel awards saw Germans take half the prizes. Hitler put an end to that.
Within weeks of taking power, Hitler ordered all Jewish civil servants to be dismissed from their posts. Since academics were classed as civil servants, this meant that universities, at a stroke, lost a large proportion of their finest minds. The physicist Max Planck made an appeal to the Fuehrer, pleading that science without Jews was impossible. Hitler told Planck that if German science could not do without Jews, then Germany would just have to do without science.
Hitler, who fancied himself as an artist, had no interest in science. In fact he was suspicious of any technological advance that might weaken the "master race." He disliked machine guns, since they discouraged hand-to-hand combat. His ideal mode of warfare, one senses, would have been swordplay between knights on horseback.
A scientific illiterate who favoured "intuition" over hard facts, Hitler was an ardent devotee of pseudo-scientific fads. A vegetarian and teetotaller on "medical" rather than moral grounds, he was also a firm believer in astrology, and got his scientific advice not from experts, but from anyone in his entourage who happened to have read a newspaper. From his photographer, Hitler learned of fears that if atoms were ever split, they might ignite the entire Earth. This was enough to put him off atom bombs, though research still went ahead.
Jet technology was equally undervalued. Hitler ordered the new engines to be fitted only to bombers, because his doctor reckoned high-speed flight would render pilots unconscious. In the latter stages of the war, von Braun's V-weapons were lavishly funded, but largely out of desperation. In fact they were highly inefficient people-killers: roughly two Britons died for each hugely expensive V2 launched on the country. Far from being weapons that could have won the war, they were ones that further weakened Germany's over-stretched resources.
We can be grateful that the Nazis had as leader someone so singularly inept in understanding the importance of technology. Churchill, by contrast, was a man who knew his scientific limitations, and delegated matters accordingly.
The Nazi state has been likened to a wheel with Hitler at its hub. Its spokes were the various branches of the military, industry, civil service and Nazi party, all of which had competing interests and had to curry favour with the central power in order to get anything done. The result was needless duplication of effort, and a total lack of large-scale planning. The Manhattan Project - America's atom bomb plan - could never have happened in Germany.
Instead, Werner Heisenberg led a small team in a quest to construct a nuclear reactor that never worked. The Allies greatly over-estimated Heisenberg's progress: a main motivation for the Manhattan Project was determination to beat the Nazis in a race that never really happened. Heisenberg's role has been the subject of endless debate: after the war, he portrayed himself as a hero who could have built a bomb had he wanted to, but who instead made sure the Nazis never got what they wanted. Michael Frayn's play Copenhagen makes much of Heisenberg's famous uncertainty principle, and of the corresponding uncertainty about his wartime motivations.
Cornwell reproduces extracts from secretly recorded conversations between Heisenberg and other German bomb scientists after their capture, which show that while Frayn's even-handedness has obvious dramatic attractions, the evidence points much more towards sheer opportunism on Heisenberg's part. Heisenberg's greatest regret, one senses, was that unlike him, scientists in Britain and America got their sums right.
Arrogance is a recurring theme in Cornwell's meticulously researched account of Nazi science. For example, part of the reason that Bletchley Park code-breakers were able to decipher messages encrypted with the famous Enigma machine was simply that the Germans assumed their code to be unbreakable. In the same way, a Luftwaffe official turned down one proposal for aircraft improvements on the grounds that German aircraft could not be improved.
Hitler largely ignored science, convinced that sheer might and racial superiority were enough to guarantee victory. In the social sciences, notions of "racial hygiene" were used to bolster the myth of Aryan superiority, providing the ideological framework for genocide. Schoolchildren were set the arithmetical problem of working out the cost of feeding and supporting the mentally handicapped or others with "lives not worth living." The solution was carried out with the aid of the best German technology: the manufacturers of death-camp crematoria laboured to produce machines capable of handling 1,000 bodies per day, haggling with the SS over unpaid bills while people were slaughtered.
The lesson would appear to be that dictatorships are not good places for science. Cornwell is drawn to this view, but rightly points out the counter-example of the Soviet Union. The concluding part of his book, in which he tries to find an overall definition of "Nazi science", is in fact the least convincing. So wide is the survey he offers that no overall theme can emerge other than the time and place where it all happened.
Were the Nazis unique in their disregard for medical ethics? Not at all, says Cornwell: one could cite, among many other examples, the use of American black convicts in post-war drug trials without consent. Cornwell considers racial ideology to be a uniquely Nazi feature, but again this is surely wrong. Colonial genocides of the 19th century were excused by an appeal to Darwinism; Tasmanians killed by British colonists became museum exhibits illustrating white supremacy.
A great many books have been written about all aspects of Nazi science, but Cornwell's great achievement here is to give such a rich, concise overview of an enormous subject. As well as the famous stories, there are many obscure ones, such as the attempts to create "Nazi mathematics", or an Aryan "quantum theory of society".
If there is a single lesson to be drawn from this eminently readable work, it is that great scientists are not necessarily great human beings. When the Jews were dismissed, others quickly took their jobs. Those who made a stand did so indirectly, like the physicist who always carried parcels under his arm so as to avoid having to give the Hitler salute. It was a nice touch, but nowhere near enough.
Basil Mahon, THE MAN WHO CHANGED EVERYTHING. Scotland on Sunday, September 21, 2003. Review by Andrew Crumey.
IN THE early 1900s, a firm of Aberdeen lawyers had the task of paying out dividends to shareholders of the city's Music Hall. Unable to trace one, they put an advert in the newspaper asking the whereabouts of a Mr James Clerk Maxwell. A school inspector contacted them, asking incredulously if they had honestly never heard of the most famous man ever to have walked the streets of Aberdeen. No, they said, they had not. Nor have most people today. Maxwell, the greatest Scottish scientist ever, is hardly known except to specialists. Basil Mahon's short and far from perfect biography at least goes some way towards restoring Maxwell's reputation.
In a lifetime of only 48 years, Maxwell achieved an astonishing amount. He invented colour photography and explained the rings of Saturn. He proved that the air we breathe is made of rapidly moving molecules. He figured out how light works, and why it travels at 300,000 kilometres per second. Most importantly, he predicted the existence of electromagnetic waves. From television and radio to X-ray machines and microwave ovens, the modern world depends on Maxwell's discoveries.
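The "why" can in fact be written in one line - my gloss rather than Mahon's: Maxwell's equations force electromagnetic waves to travel at a speed fixed entirely by two measurable constants of electricity and magnetism,

\[ c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}} \approx 300{,}000 \text{ km per second}, \]

and when that number turned out to match the measured speed of light, Maxwell concluded that light itself must be an electromagnetic wave.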
Why is he so neglected? Mahon cites Maxwell's modesty as one factor; he was described by all who knew him as calm, gentle and good-natured. Not the sort of self-promoting academic who nowadays ends up fronting TV series, and in the 19th century became - like Humphry Davy - a star of the public lecture circuit.
The main reason for Maxwell's relative obscurity, Mahon suggests, is that he was simply too far ahead of his time. When he died in 1879, few understood the importance of his work. One who eventually would was Albert Einstein, who used Maxwell's theory as the main pillar of relativity. Yet whatever the reason, Maxwell's story provides a depressing illustration of Scotland's ability to let home-grown talent go unrecognised.
Maxwell was born in Edinburgh in 1831, but grew up at his father's estate in Galloway. An only child, he lost his mother at an early age and was sent to Edinburgh Academy, where his rural accent and eccentric clothing (designed by his father) made him the butt of jokes. Yet Maxwell quickly showed his talent. At the age of 14 he became interested in curves he could draw using a pen held by thread looped round pins. His teacher contacted a mathematician at Edinburgh University who realised Maxwell's work was worth publishing. It became his first scientific paper.
Maxwell entered the university the following year with the plan to study law and follow in his father's professional footsteps; but it was clear that maths and physics were his true vocation. He went on to Cambridge, coming top in the final exams, and from then on his academic career was assured.
The question, though, was where he should pursue it. A professorship became vacant at Aberdeen's Marischal College, and Maxwell was appointed. Four years later, the college was merged with King's to form the new University of Aberdeen. Then, as now, mergers meant job losses, and Maxwell's position was in jeopardy.
He was already a proven scientist of international renown. Yet he was made redundant in favour of his opposite number from King's, who was, says Mahon, "an astute negotiator who had earned the nickname 'Crafty'." When it came to academic politics, Maxwell was no expert. Nor did his ground-breaking research count for much with the university administrators. "Only a few people had any idea of its importance and none of them lived in Aberdeen."
There were not many in Edinburgh either. Maxwell applied for a professorship there and was turned down. In later years he would apply to St Andrews with the same result. So Maxwell did his greatest work in England. At King's College, London, and then at Cambridge, he made discoveries hailed by physicists as on a par with those of Newton and Einstein.
It is in discussing this work that Mahon - himself a scientist - is at his best, providing an accessible account of Maxwell's experimental work on colour vision, and his crowning achievement, the unification of electricity and magnetism in the so-called "Maxwell's equations", which laid the ground for modern attempts to find a unifying "theory of everything." When it comes to Maxwell's life, though, Mahon is inevitably hampered by a frustrating lack of material. What are we to make, for example, of Maxwell's childless marriage to an older woman who became a semi-permanent invalid due to an undiagnosable nervous disorder, and who was disliked by most of Maxwell's friends and family?
Mahon candidly expresses the frustration. Information on Katherine Maxwell is slight; and the waspish opinions expressed by others may have been mere spite. Maxwell doted on her, at one stage sleeping in a chair at her bedside for many nights on end so as not to leave her unattended.
Our portrait of the couple, however, comes largely from a biography written posthumously by a friend, who was keen to portray the Maxwells as good Victorians. After Maxwell's sudden death from cancer in 1879, the highest tribute his doctor paid him was that he had been "a most perfect example of a Christian Gentleman".
The same sanitised aura hangs over parts of Mahon's account, particularly in dealing with Maxwell's childhood, where the cosy tone is cloyingly reminiscent of a "lad o' pairts" tale from Samuel Smiles.
Yet despite these reservations, any biography of Maxwell is worthwhile as a way of keeping alive the memory of someone whose story deserves to be retold to successive generations. The Scots are terribly good at making a tremendous fuss over local heroes whose place in the greater scheme of things is actually very small. Maxwell was the reverse. While his own country ignored him, he received honours from all over Europe and America. After his death, the waves he predicted were experimentally demonstrated by Heinrich Hertz in Germany. They were put to practical use by Marconi, who - unlike Maxwell - knew all about the benefits of PR.
Maxwell is buried with his wife and parents at Parton Church in Galloway, a quiet and beautiful spot. Only a simple plaque at the graveyard entrance hints at what a remarkable man he was. To see his true memorial, look at any TV set. Without his discoveries, it could not have been invented.
Adrian Fort, PROF: THE LIFE OF FREDERICK LINDEMANN. Scotland on Sunday, November 2, 2003. Review by Andrew Crumey.
PRIME ministerial reliance on unelected advisers is nothing new. Churchill had Frederick Lindemann, a scientific genius whose brusque manner, German origins and sheer power made him an object of suspicion and envy. A Machiavellian figure in the background of countless books about the war effort, he takes centre stage in Adrian Fort's intriguing biography.
His German father worked in England for Siemens and became a millionaire through a lucky investment. His mother was an heiress, so Frederick was born in 1886 to a life of affluence. Educated mostly in Germany, he attended Max Planck's physics lectures in Berlin, and by his early twenties was at the forefront of research, numbering Einstein among his friends.
The First World War necessitated a hasty retreat from Germany - much to Lindemann's annoyance, as he had just reached the finals of a tennis tournament. He devoted himself to aircraft research, learned to fly and undertook potentially suicidal experiments aimed at understanding why aircraft sometimes spun out of control.
His heroism caught the attention of Churchill, who was aviation minister. Churchill was spellbound by Lindemann's "beautiful mind" and was to draw on it constantly. As soon as Churchill entered Chamberlain's war Cabinet, Lindemann found himself at the centre of power. As Lord Cherwell, he became Paymaster General in Churchill's own Cabinet.
Churchill admired Lindemann's genius; others saw him as being impossible to work with. Sarcastic and withering in his handling of anyone he considered his intellectual inferior, he also harboured personal grudges and animosities for years. Such traits are often evidence of an inferiority complex; but what could Lindemann possibly feel inferior about?
Fort offers two answers: his German origins, and the fact that he was a scientist in a world of snooty classicists. Lindemann felt the need to be more English than the English, and more cultured than the historians and linguists who decried science at the high table of his Oxford college.
Lindemann's deeper personality, though, remains unfathomable. He never married, having as his companion a loyal, Jeeves-like valet who had to run the bath to a depth of exactly six inches. The people who called Lindemann "Prof" were friends only in the sense that they were not his enemies.
Lindemann has been blamed for the strategy that razed Hamburg and Dresden. Fort exonerates him, but concedes that Lindemann's undervaluing of the V-weapon threat was a mistake. He amply illustrates the huge influence Lindemann exerted - accompanying Churchill to Potsdam and Yalta, even in the absence of the Foreign Secretary; almost usurping Rab Butler's authority as post-war Chancellor; and finally setting up the Atomic Energy Authority before his death in 1957. Yet the man remains an enigma; a first-rate mind and a thoroughly disagreeable personality.
David Foster Wallace, EVERYTHING AND MORE. Scotland on Sunday, December 14, 2003. Review by Andrew Crumey.
DAVID Foster Wallace is an American novelist and short story writer who has channelled his creative energy into producing the worst book on mathematics I have ever read.
His theme is infinity - a fascinating subject covered in countless popular books, a recent example being Brian Clegg's Brief History Of Infinity. The story as usually told begins with Zeno's paradoxes and moves on to Georg Cantor's discovery that some infinities are bigger than others.
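For anyone who has never met it, Cantor's argument fits into a couple of sentences - my summary, not Wallace's. Suppose the decimals between 0 and 1 could be listed as x_1, x_2, x_3, and so on; now build a new decimal whose nth digit differs from the nth digit of x_n. That number appears nowhere on the list, so no such list can ever be complete:

\[ |\mathbb{N}| < |\mathbb{R}|. \]

The counting numbers and the decimals are both infinite, but the second infinity is strictly bigger.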
David Foster Wallace tells a different story. His begins more or less in the middle of one of the countless maths books he has spent the past year or two trying to understand - a task in which he appears to have succeeded, though only to the extent that it all made sense to him at the time. In retelling it, what comes out is a mess. I have a PhD in mathematical physics, and I was confused by this book.
Wallace is best known for his novel Infinite Jest, whose title appears to be his main qualification for having written this latest tome. In his novel, and in short story collections such as Girl with Curious Hair, Wallace has displayed great originality, with an off-beat style pervaded by knowingness and intelligence. He is smart, cool, hip.
It is these very qualities that are so disastrous in Everything And More. The book is novelistic to the extent that Wallace adopts, at least superficially, the narrative voice of a professional mathematician, casually spattering his pages with technical symbolism that is left largely unexplained. This in itself would be enough to make the text incomprehensible to anyone not already armed with university-level calculus.
But to make matters worse, Wallace structures his book as an endless series of digressions and footnotes. Fine if you want to write a latter-day Tristram Shandy, but not recommended for a reworking of Euclid's Elements. Mathematicians have spent thousands of years trying to make their subject accessible. Wallace prefers obscurity.
The overall tone is a head-clutching zany-neurotic bewilderment that could almost be a nightmare penned by the figure in Edvard Munch's Scream - which is how I felt, too. The nearest parallel I can think of is Edgar Allan Poe's Eureka, in which the great writer delved into cosmology with brain-pounding results. But at least Poe was trying to state an original theory. Wallace's story has been written by others already: why the need for this useless addition?
No prizes for guessing the answer. Wallace is a 'name', and his book will sell on the strength of it - which just goes to show that finding a good book about mathematics these days is like trying to find a decent plumber: there are a lot of cowboys out there. Wallace is the maths teacher from hell, making everybody around him feel really dumb as he effortlessly displays all those fancy equations he loves to quote. If he does a promotional book tour, I demand he be made to rewrite some of them from memory, and explain what they are supposed to mean.