Friday, October 17, 2008

Was Darwin Wrong?

By David Quammen

Evolution by natural selection, the central concept of the life's work of Charles Darwin, is a theory. It's a theory about the origin of adaptation, complexity, and diversity among Earth's living creatures. If you are skeptical by nature, unfamiliar with the terminology of science, and unaware of the overwhelming evidence, you might even be tempted to say that it's "just" a theory. In the same sense, relativity as described by Albert Einstein is "just" a theory. The notion that Earth orbits around the sun rather than vice versa, offered by Copernicus in 1543, is a theory. Continental drift is a theory. The existence, structure, and dynamics of atoms? Atomic theory. Even electricity is a theoretical construct, involving electrons, which are tiny units of charged mass that no one has ever seen. Each of these theories is an explanation that has been confirmed to such a degree, by observation and experiment, that knowledgeable experts accept it as fact. That's what scientists mean when they talk about a theory: not a dreamy and unreliable speculation, but an explanatory statement that fits the evidence. They embrace such an explanation confidently but provisionally—taking it as their best available view of reality, at least until some severely conflicting data or some better explanation might come along.
 
The rest of us generally agree. We plug our televisions into little wall sockets, measure a year by the length of Earth's orbit, and in many other ways live our lives based on the trusted reality of those theories.
 
Evolutionary theory, though, is a bit different. It's such a dangerously wonderful and far-reaching view of life that some people find it unacceptable, despite the vast body of supporting evidence. As applied to our own species, Homo sapiens, it can seem more threatening still. Many fundamentalist Christians and ultraorthodox Jews take alarm at the thought that human descent from earlier primates contradicts a strict reading of the Book of Genesis. Their discomfort is paralleled by Islamic creationists such as Harun Yahya, author of a recent volume titled The Evolution Deceit, who points to the six-day creation story in the Koran as literal truth and calls the theory of evolution "nothing but a deception imposed on us by the dominators of the world system." The late Srila Prabhupada, of the Hare Krishna movement, explained that God created "the 8,400,000 species of life from the very beginning," in order to establish multiple tiers of reincarnation for rising souls. Although souls ascend, the species themselves don't change, he insisted, dismissing "Darwin's nonsensical theory."
 
Other people too, not just scriptural literalists, remain unpersuaded about evolution. According to a Gallup poll drawn from more than a thousand telephone interviews conducted in February 2001, no less than 45 percent of responding U.S. adults agreed that "God created human beings pretty much in their present form at one time within the last 10,000 years or so." Evolution, by their lights, played no role in shaping us.
 
Only 37 percent of the polled Americans were satisfied with allowing room for both God and Darwin—that is, divine initiative to get things started, evolution as the creative means. (This view, according to more than one papal pronouncement, is compatible with Roman Catholic dogma.) Still fewer Americans, only 12 percent, believed that humans evolved from other life-forms without any involvement of a god.
 
The most startling thing about these poll numbers is not that so many Americans reject evolution, but that the statistical breakdown hasn't changed much in two decades. Gallup interviewers posed exactly the same choices in 1982, 1993, 1997, and 1999. The creationist conviction—that God alone, and not evolution, produced humans—has never drawn less than 44 percent. In other words, nearly half the American populace prefers to believe that Charles Darwin was wrong where it mattered most.
 
Why are there so many antievolutionists? Scriptural literalism can only be part of the answer. The American public certainly includes a large segment of scriptural literalists—but not that large, not 44 percent. Creationist proselytizers and political activists, working hard to interfere with the teaching of evolutionary biology in public schools, are another part. Honest confusion and ignorance, among millions of adult Americans, must be still another. Many people have never taken a biology course that dealt with evolution nor read a book in which the theory was lucidly explained. Sure, we've all heard of Charles Darwin, and of a vague, somber notion about struggle and survival that sometimes goes by the catchall label "Darwinism." But the main sources of information from which most Americans have drawn their awareness of this subject, it seems, are haphazard ones at best: cultural osmosis, newspaper and magazine references, half-baked nature documentaries on the tube, and hearsay.
 
Evolution is both a beautiful concept and an important one, more crucial nowadays to human welfare, to medical science, and to our understanding of the world than ever before. It's also deeply persuasive—a theory you can take to the bank. The essential points are slightly more complicated than most people assume, but not so complicated that they can't be comprehended by any attentive person. Furthermore, the supporting evidence is abundant, various, ever increasing, solidly interconnected, and easily available in museums, popular books, textbooks, and a mountainous accumulation of peer-reviewed scientific studies. No one needs to, and no one should, accept evolution merely as a matter of faith.
 
Two big ideas, not just one, are at issue: the evolution of all species, as a historical phenomenon, and natural selection, as the main mechanism causing that phenomenon. The first is a question of what happened. The second is a question of how. The idea that all species are descended from common ancestors had been suggested by other thinkers, including Jean-Baptiste Lamarck, long before Darwin published The Origin of Species in 1859. What made Darwin's book so remarkable when it appeared, and so influential in the long run, was that it offered a rational explanation of how evolution must occur. The same insight came independently to Alfred Russel Wallace, a young naturalist doing fieldwork in the Malay Archipelago during the late 1850s. In historical annals, if not in the popular awareness, Wallace and Darwin share the kudos for having discovered natural selection.

The gist of the concept is that small, random, heritable differences among individuals result in different chances of survival and reproduction—success for some, death without offspring for others—and that this natural culling leads to significant changes in shape, size, strength, armament, color, biochemistry, and behavior among the descendants. Excess population growth drives the competitive struggle. Because less successful competitors produce fewer surviving offspring, the useless or negative variations tend to disappear, whereas the useful variations tend to be perpetuated and gradually magnified throughout a population.
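The logic of that culling is mechanical enough to caricature in a few lines of code. What follows is a minimal sketch, not a biological model: the population size, survival curve, and mutation size are all invented for illustration. Each individual is reduced to a single heritable trait value; survival odds rise with the trait; survivors' offspring inherit the trait plus a small random tweak. Run it, and the population's mean trait creeps upward, generation by generation.

```python
import random

# A toy sketch of natural selection; every parameter here is invented.
POPULATION_SIZE = 200   # excess reproduction refills the population each generation
MUTATION_SD = 0.05      # small, random, heritable variation
GENERATIONS = 101

def survives(trait):
    """Survival chance rises with the trait value (capped at 95%)."""
    return random.random() < min(0.95, 0.2 + trait)

def next_generation(population):
    """Cull by survival, then refill with offspring that inherit
    a parent's trait plus a small random mutation."""
    survivors = [t for t in population if survives(t)]
    if not survivors:   # extinction is possible, though vanishingly unlikely here
        return []
    return [random.choice(survivors) + random.gauss(0, MUTATION_SD)
            for _ in range(POPULATION_SIZE)]

population = [0.1] * POPULATION_SIZE
for gen in range(GENERATIONS):
    if not population:
        break
    if gen % 20 == 0:
        print(f"generation {gen:3d}: mean trait = "
              f"{sum(population) / len(population):.3f}")
    population = next_generation(population)
```

Nothing in the sketch aims at a goal; the upward drift falls out of differential survival plus inheritance, which is exactly Darwin's point.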
 
So much for one part of the evolutionary process, known as anagenesis, during which a single species is transformed. But there's also a second part, known as speciation. Genetic changes sometimes accumulate within an isolated segment of a species, but not throughout the whole, as that isolated population adapts to its local conditions. Gradually it goes its own way, seizing a new ecological niche. At a certain point it becomes irreversibly distinct—that is, so different that its members can't interbreed with the rest. Two species now exist where formerly there was one. Darwin called that splitting-and-specializing phenomenon the "principle of divergence." It was an important part of his theory, explaining the overall diversity of life as well as the adaptation of individual species.
 
This thrilling and radical assemblage of concepts came from an unlikely source. Charles Darwin was shy and meticulous, a wealthy landowner with close friends among the Anglican clergy. He had a gentle, unassuming manner, a strong need for privacy, and an extraordinary commitment to intellectual honesty. As an undergraduate at Cambridge, he had studied halfheartedly toward becoming a clergyman himself, before he discovered his real vocation as a scientist. Later, having established a good but conventional reputation in natural history, he spent 22 years secretly gathering evidence and pondering arguments—both for and against his theory—because he didn't want to flame out in a burst of unpersuasive notoriety. He may have delayed, too, because of his anxiety about announcing a theory that seemed to challenge conventional religious beliefs—in particular, the Christian beliefs of his wife, Emma. Darwin himself quietly renounced Christianity during his middle age, and later described himself as an agnostic. He continued to believe in a distant, impersonal deity of some sort, a greater entity that had set the universe and its laws into motion, but not in a personal God who had chosen humanity as a specially favored species. Darwin avoided flaunting his lack of religious faith, at least partly in deference to Emma. And she prayed for his soul.

In 1859 he finally delivered his revolutionary book. Although it was hefty and substantive at 490 pages, he considered The Origin of Species just a quick-and-dirty "abstract" of the huge volume he had been working on until interrupted by an alarming event. (In fact, he'd wanted to title it An Abstract of an Essay on the Origin of Species and Varieties Through Natural Selection, but his publisher found that insufficiently catchy.) The alarming event was his receiving a letter and an enclosed manuscript from Alfred Wallace, whom he knew only as a distant pen pal. Wallace's manuscript sketched out the same great idea—evolution by natural selection—that Darwin considered his own. Wallace had scribbled this paper and (unaware of Darwin's own evolutionary thinking, which so far had been kept private) mailed it to him from the Malay Archipelago, along with a request for reaction and help. Darwin was horrified. After two decades of painstaking effort, now he'd be scooped. Or maybe not quite. He forwarded Wallace's paper toward publication, though managing also to assert his own prior claim by releasing two excerpts from his unpublished work. Then he dashed off The Origin, his "abstract" on the subject. Unlike Wallace, who was younger and less meticulous, Darwin recognized the importance of providing an edifice of supporting evidence and logic.

The evidence, as he presented it, mostly fell within four categories: biogeography, paleontology, embryology, and morphology. Biogeography is the study of the geographical distribution of living creatures—that is, which species inhabit which parts of the planet and why. Paleontology investigates extinct life-forms, as revealed in the fossil record. Embryology examines the revealing stages of development (echoing earlier stages of evolutionary history) that embryos pass through before birth or hatching; at a stretch, embryology also concerns the immature forms of animals that metamorphose, such as the larvae of insects. Morphology is the science of anatomical shape and design. Darwin devoted sizable sections of The Origin of Species to these categories.
 
Biogeography, for instance, offered a great pageant of peculiar facts and patterns. Anyone who considers the biogeographical data, Darwin wrote, must be struck by the mysterious clustering pattern among what he called "closely allied" species—that is, similar creatures sharing roughly the same body plan. Such closely allied species tend to be found on the same continent (several species of zebras in Africa) or within the same group of oceanic islands (dozens of species of honeycreepers in Hawaii, 13 species of Galápagos finch), despite their species-by-species preferences for different habitats, food sources, or conditions of climate. Adjacent areas of South America, Darwin noted, are occupied by two similar species of large, flightless birds (the rheas, Rhea americana and Pterocnemia pennata), not by ostriches as in Africa or emus as in Australia. South America also has agoutis and viscachas (small rodents) in terrestrial habitats, plus coypus and capybaras in the wetlands, not—as Darwin wrote—hares and rabbits in terrestrial habitats or beavers and muskrats in the wetlands. During his own youthful visit to the Galápagos, aboard the survey ship Beagle, Darwin himself had discovered three very similar forms of mockingbird, each on a different island.
 
Why should "closely allied" species inhabit neighboring patches of habitat? And why should similar habitat on different continents be occupied by species that aren't so closely allied? "We see in these facts some deep organic bond, prevailing throughout space and time," Darwin wrote. "This bond, on my theory, is simply inheritance." Similar species occur nearby in space because they have descended from common ancestors.
 
Paleontology reveals a similar clustering pattern in the dimension of time. The vertical column of geologic strata, laid down by sedimentary processes over the eons, lightly peppered with fossils, represents a tangible record showing which species lived when. Less ancient layers of rock lie atop more ancient ones (except where geologic forces have tipped or shuffled them), and likewise with the animal and plant fossils that the strata contain. What Darwin noticed about this record is that closely allied species tend to be found adjacent to one another in successive strata. One species endures for millions of years and then makes its last appearance in, say, the middle Eocene epoch; just above, a similar but not identical species replaces it. In North America, for example, a vaguely horselike creature known as Hyracotherium was succeeded by Orohippus, then Epihippus, then Mesohippus, which in turn were succeeded by a variety of horsey American critters. Some of them even galloped across the Bering land bridge into Asia, then onward to Europe and Africa. By five million years ago they had nearly all disappeared, leaving behind Dinohippus, which was succeeded by Equus, the modern genus of horse. Not all these fossil links had been unearthed in Darwin's day, but he captured the essence of the matter anyway. Again, were such sequences just coincidental? No, Darwin argued. Closely allied species succeed one another in time, as well as living nearby in space, because they're related through evolutionary descent.
 
Embryology too involved patterns that couldn't be explained by coincidence. Why does the embryo of a mammal pass through stages resembling stages of the embryo of a reptile? Why is one of the larval forms of a barnacle, before metamorphosis, so similar to the larval form of a shrimp? Why do the larvae of moths, flies, and beetles resemble one another more than any of them resemble their respective adults? Because, Darwin wrote, "the embryo is the animal in its less modified state" and that state "reveals the structure of its progenitor."
 
Morphology, his fourth category of evidence, was the "very soul" of natural history, according to Darwin. Even today it's on display in the layout and organization of any zoo. Here are the monkeys, there are the big cats, and in that building are the alligators and crocodiles. Birds in the aviary, fish in the aquarium. Living creatures can be easily sorted into a hierarchy of categories—not just species but genera, families, orders, whole kingdoms—based on which anatomical characters they share and which they don't.
 
All vertebrate animals have backbones. Among vertebrates, birds have feathers, whereas reptiles have scales. Mammals have fur and mammary glands, not feathers or scales. Among mammals, some have pouches in which they nurse their tiny young. Among these species, the marsupials, some have huge rear legs and strong tails by which they go hopping across miles of arid outback; we call them kangaroos. Bring in modern microscopic and molecular evidence, and you can trace the similarities still further back. All plants and fungi, as well as animals, have nuclei within their cells. All living organisms contain DNA and RNA (except some viruses with RNA only), two related forms of information-coding molecules.

Such a pattern of tiered resemblances—groups of similar species nested within broader groupings, and all descending from a single source—isn't naturally present among other collections of items. You won't find anything equivalent if you try to categorize rocks, or musical instruments, or jewelry. Why not? Because rock types and styles of jewelry don't reflect unbroken descent from common ancestors. Biological diversity does. The number of shared characteristics between any one species and another indicates how recently those two species have diverged from a shared lineage.
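That last claim, that the count of shared characters tracks how recently two lineages diverged, can be made concrete with a toy tabulation. The character matrix below is invented and drastically simplified, echoing the traits named above rather than any real data set:

```python
# Invented, simplified character data; illustration only.
traits = {
    "human":    {"backbone", "fur", "mammary glands"},
    "kangaroo": {"backbone", "fur", "mammary glands", "pouch"},
    "sparrow":  {"backbone", "feathers"},
    "lizard":   {"backbone", "scales"},
}

def shared(a, b):
    """Count the characters two species have in common."""
    return len(traits[a] & traits[b])

species = sorted(traits)
for i, a in enumerate(species):
    for b in species[i + 1:]:
        print(f"{a:8} / {b:8}: {shared(a, b)} shared character(s)")
```

The counts fall into tiers: the two mammals share three characters, while any mammal paired with the sparrow or the lizard shares only the backbone. Read the tiers as recency of common ancestry and the nested hierarchy drops out by itself; real phylogenetic methods are far more sophisticated, but they rest on this same signal.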

That insight gave new meaning to the task of taxonomic classification, which had been founded in its modern form back in 1735 by the Swedish naturalist Carolus Linnaeus. Linnaeus showed how species could be systematically classified, according to their shared similarities, but he worked from creationist assumptions that offered no material explanation for the nested pattern he found. In the early and middle 19th century, morphologists such as Georges Cuvier and Étienne Geoffroy Saint-Hilaire in France and Richard Owen in England improved classification with their meticulous studies of internal as well as external anatomies, and tried to make sense of what the ultimate source of these patterned similarities could be. Not even Owen, a contemporary and onetime friend of Darwin's (later in life they had a bitter falling out), took the full step to an evolutionary vision before The Origin of Species was published. Owen made a major contribution, though, by advancing the concept of homologues—that is, superficially different but fundamentally similar versions of a single organ or trait, shared by dissimilar species.
 
For instance, the five-digit skeletal structure of the vertebrate hand appears not just in humans and apes and raccoons and bears but also, variously modified, in cats and bats and porpoises and lizards and turtles. The paired bones of our lower leg, the tibia and the fibula, are also represented by homologous bones in other mammals and in reptiles, and even in the long-extinct bird-reptile Archaeopteryx. What's the reason behind such varied recurrence of a few basic designs? Darwin, with a nod to Owen's "most interesting work," supplied the answer: common descent, as shaped by natural selection, modifying the inherited basics for different circumstances.

Vestigial characteristics are still another form of morphological evidence, illuminating to contemplate because they show that the living world is full of small, tolerable imperfections. Why do male mammals (including human males) have nipples? Why do some snakes (notably boa constrictors) carry the rudiments of a pelvis and tiny legs buried inside their sleek profiles? Why do certain species of flightless beetle have wings, sealed beneath wing covers that never open? Darwin raised all these questions, and answered them, in The Origin of Species. Vestigial structures stand as remnants of the evolutionary history of a lineage.
 
Today the same four branches of biological science from which Darwin drew—biogeography, paleontology, embryology, morphology—embrace an ever growing body of supporting data. In addition to those categories we now have others: population genetics, biochemistry, molecular biology, and, most recently, the whiz-bang field of machine-driven genetic sequencing known as genomics. These new forms of knowledge overlap one another seamlessly and intersect with the older forms, strengthening the whole edifice, contributing further to the certainty that Darwin was right.
 
He was right about evolution, that is. He wasn't right about everything. Being a restless explainer, Darwin floated a number of theoretical notions during his long working life, some of which were mistaken and illusory. He was wrong about what causes variation within a species. He was wrong about a famous geologic mystery, the parallel shelves along a Scottish valley called Glen Roy. Most notably, his theory of inheritance—which he labeled pangenesis and cherished despite its poor reception among his biologist colleagues—turned out to be dead wrong. Fortunately for Darwin, the correctness of his most famous good idea stood independent of that particular bad idea. Evolution by natural selection represented Darwin at his best—which is to say, scientific observation and careful thinking at its best.
 
Douglas Futuyma is a highly respected evolutionary biologist, author of textbooks as well as influential research papers. His office, at the University of Michigan, is a long narrow room in the natural sciences building, well stocked with journals and books, including volumes about the conflict between creationism and evolution. I arrived carrying a well-thumbed copy of his own book on that subject, Science on Trial: The Case for Evolution. Killing time in the corridor before our appointment, I noticed a blue flyer on a departmental bulletin board, seeming oddly placed there amid the announcements of career opportunities for graduate students. "Creation vs. evolution," it said. "A series of messages challenging popular thought with Biblical truth and scientific evidences." A traveling lecturer from something called the Origins Research Association would deliver these messages at a local Baptist church. Beside the lecturer's photo was a drawing of a dinosaur. "Free pizza following the evening service," said a small line at the bottom. Dinosaurs, biblical truth, and pizza: something for everybody.
 
In response to my questions about evidence, Dr. Futuyma moved quickly through the traditional categories—paleontology, biogeography—and talked mostly about modern genetics. He pulled out his heavily marked copy of the journal Nature for February 15, 2001, a historic issue, fat with articles reporting and analyzing the results of the Human Genome Project. Beside it he slapped down a more recent issue of Nature, this one devoted to the sequenced genome of the house mouse, Mus musculus. The headline of the lead editorial announced: "HUMAN BIOLOGY BY PROXY." The mouse genome effort, according to Nature's editors, had revealed "about 30,000 genes, with 99% having direct counterparts in humans."

The resemblance between our 30,000 human genes and those 30,000 mousy counterparts, Futuyma explained, represents another form of homology, like the resemblance between a five-fingered hand and a five-toed paw. Such genetic homology is what gives meaning to biomedical research using mice and other animals, including chimpanzees, which (to their sad misfortune) are our closest living relatives.
 
No aspect of biomedical research seems more urgent today than the study of microbial diseases. And the dynamics of those microbes within human bodies, within human populations, can only be understood in terms of evolution.

Nightmarish illnesses caused by microbes include both the infectious sort (AIDS, Ebola, SARS) that spread directly from person to person and the sort (malaria, West Nile fever) delivered to us by biting insects or other intermediaries. The capacity for quick change among disease-causing microbes is what makes them so dangerous to large numbers of people and so difficult and expensive to treat. They leap from wildlife or domestic animals into humans, adapting to new circumstances as they go. Their inherent variability allows them to find new ways of evading and defeating human immune systems. By natural selection they acquire resistance to drugs that should kill them. They evolve. There's no better or more immediate evidence supporting the Darwinian theory than this process of forced transformation among our inimical germs.
 
Take the common bacterium Staphylococcus aureus, which lurks in hospitals and causes serious infections, especially among surgery patients. Penicillin, becoming available in 1943, proved almost miraculously effective in fighting staphylococcus infections. Its deployment marked a new phase in the old war between humans and disease microbes, a phase in which humans invent new killer drugs and microbes find new ways to be unkillable. The supreme potency of penicillin didn't last long. The first resistant strains of Staphylococcus aureus were reported in 1947. A newer staph-killing drug, methicillin, came into use during the 1960s, but methicillin-resistant strains appeared soon, and by the 1980s those strains were widespread. Vancomycin became the next great weapon against staph, and the first vancomycin-resistant strain emerged in 2002. These antibiotic-resistant strains represent an evolutionary series, not much different in principle from the fossil series tracing horse evolution from Hyracotherium to Equus. They make evolution a very practical problem by adding expense, as well as misery and danger, to the challenge of coping with staph.

The biologist Stephen Palumbi has calculated the cost of treating penicillin-resistant and methicillin-resistant staph infections, just in the United States, at 30 billion dollars a year. "Antibiotics exert a powerful evolutionary force," he wrote last year, "driving infectious bacteria to evolve powerful defenses against all but the most recently invented drugs." As reflected in their DNA, which uses the same genetic code found in humans and horses and hagfish and honeysuckle, bacteria are part of the continuum of life, all shaped and diversified by evolutionary forces.
 
Even viruses belong to that continuum. Some viruses evolve quickly, some slowly. Among the fastest is HIV, because its method of replicating itself involves a high rate of mutation, and those mutations allow the virus to assume new forms. After just a few years of infection and drug treatment, each HIV patient carries a unique version of the virus. Isolation within one infected person, plus differing conditions and the struggle to survive, forces each version of HIV to evolve independently. It's nothing but a speeded-up and microscopic case of what Darwin saw in the Galápagos—except that each human body is an island, and the newly evolved forms aren't so charming as finches or mockingbirds.

Understanding how quickly HIV acquires resistance to antiviral drugs, such as AZT, has been crucial to improving treatment by way of multiple drug cocktails. "This approach has reduced deaths due to HIV by severalfold since 1996," according to Palumbi, "and it has greatly slowed the evolution of this disease within patients."
 
Insects and weeds acquire resistance to our insecticides and herbicides through the same process. As we humans try to poison them, evolution by natural selection transforms the population of a mosquito or thistle into a new sort of creature, less vulnerable to that particular poison. So we invent another poison, then another. It's a futile effort. Even DDT, with its ferocious and long-lasting effects throughout ecosystems, produced resistant house flies within a decade of its discovery in 1939. By 1990 more than 500 species (including 114 kinds of mosquitoes) had acquired resistance to at least one pesticide. Based on these undesired results, Stephen Palumbi has commented glumly, "humans may be the world's dominant evolutionary force."

Among most forms of living creatures, evolution proceeds slowly—too slowly to be observed by a single scientist within a research lifetime. But science functions by inference, not just by direct observation, and the inferential sorts of evidence such as paleontology and biogeography are no less cogent simply because they're indirect. Still, skeptics of evolutionary theory ask: Can we see evolution in action? Can it be observed in the wild? Can it be measured in the laboratory?
 
The answer is yes. Peter and Rosemary Grant, two British-born researchers who have spent decades where Charles Darwin spent weeks, have captured a glimpse of evolution with their long-term studies of beak size among Galápagos finches. William R. Rice and George W. Salt achieved something similar in their lab, through an experiment involving 35 generations of the fruit fly Drosophila melanogaster. Richard E. Lenski and his colleagues at Michigan State University have done it too, tracking 20,000 generations of evolution in the bacterium Escherichia coli. Such field studies and lab experiments document anagenesis—that is, slow evolutionary change within a single, unsplit lineage. With patience it can be seen, like the movement of a minute hand on a clock.

Speciation, when a lineage splits into two species, is the other major phase of evolutionary change, making possible the divergence between lineages about which Darwin wrote. It's rarer and more elusive even than anagenesis. Many individual mutations must accumulate (in most cases, anyway, with certain exceptions among plants) before two populations become irrevocably separated. The process is spread across thousands of generations, yet it may finish abruptly—like a door going slam!—when the last critical changes occur. Therefore it's much harder to witness. Despite the difficulties, Rice and Salt seem to have recorded a speciation event, or very nearly so, in their extended experiment on fruit flies. From a small stock of mated females they eventually produced two distinct fly populations adapted to different habitat conditions, which the researchers judged "incipient species."
 
After my visit with Douglas Futuyma in Ann Arbor, I spent two hours at the university museum there with Philip D. Gingerich, a paleontologist well-known for his work on the ancestry of whales. As we talked, Gingerich guided me through an exhibit of ancient cetaceans on the museum's second floor. Amid weird skeletal shapes that seemed almost chimerical (some hanging overhead, some in glass cases) he pointed out significant features and described the progress of thinking about whale evolution. A burly man with a broad open face and the gentle manner of a scoutmaster, Gingerich combines intellectual passion and solid expertise with one other trait that's valuable in a scientist: a willingness to admit when he's wrong.
 
Since the late 1970s Gingerich has collected fossil specimens of early whales from remote digs in Egypt and Pakistan. Working with Pakistani colleagues, he discovered Pakicetus, a terrestrial mammal dating from 50 million years ago, whose ear bones reflect its membership in the whale lineage but whose skull looks almost doglike. A former student of Gingerich's, Hans Thewissen, found a slightly more recent form with webbed feet, legs suitable for either walking or swimming, and a long toothy snout. Thewissen called it Ambulocetus natans, or the "walking-and-swimming whale." Gingerich and his team turned up several more, including Rodhocetus balochistanensis, which was fully a sea creature, its legs more like flippers, its nostrils shifted backward on the snout, halfway to the blowhole position on a modern whale. The sequence of known forms was becoming more and more complete. And all along, Gingerich told me, he leaned toward believing that whales had descended from a group of carnivorous Eocene mammals known as mesonychids, with cheek teeth useful for chewing meat and bone. Just a bit more evidence, he thought, would confirm that relationship. By the end of the 1990s most paleontologists agreed.

Meanwhile, molecular biologists had explored the same question and arrived at a different answer. No, the match to those Eocene carnivores might be close, but not close enough. DNA hybridization and other tests suggested that whales had descended from artiodactyls (that is, even-toed herbivores, such as antelopes and hippos), not from meat-eating mesonychids.

In the year 2000 Gingerich chose a new field site in Pakistan, where one of his students found a single fossil fragment that changed the prevailing view in paleontology. It was half of a pulley-shaped anklebone, known as an astragalus, belonging to another new species of whale.
 
A Pakistani colleague found the fragment's other half. When Gingerich fitted the two pieces together, he had a moment of humbling recognition: The molecular biologists were right. Here was an anklebone, from a four-legged whale dating back 47 million years, that closely resembled the homologous anklebone in an artiodactyl. Suddenly he realized how closely whales are related to antelopes.

This is how science is supposed to work. Ideas come and go, but the fittest survive. Downstairs in his office Phil Gingerich opened a specimen drawer, showing me some of the actual fossils from which the display skeletons upstairs were modeled. He put a small lump of petrified bone, no larger than a lug nut, into my hand. It was the famous astragalus, from the species he had eventually named Artiocetus clavis. It felt solid and heavy as truth.

Seeing me to the door, Gingerich volunteered something personal: "I grew up in a conservative church in the Midwest and was not taught anything about evolution. The subject was clearly skirted. That helps me understand the people who are skeptical about it. Because I come from that tradition myself." He shares the same skeptical instinct. Tell him that there's an ancestral connection between land animals and whales, and his reaction is: Fine, maybe. But show me the intermediate stages. Like Charles Darwin, the onetime divinity student, who joined that round-the-world voyage aboard the Beagle instead of becoming a country parson, and whose grand view of life on Earth was shaped by attention to small facts, Phil Gingerich is a reverent empiricist. He's not satisfied until he sees solid data. That's what excites him so much about pulling whale fossils out of the ground. In 30 years he has seen enough to be satisfied. For him, Gingerich said, it's "a spiritual experience."

"The evidence is there," he added. "It's buried in the rocks of ages." 

Republished from National Geographic Magazine

Thursday, October 16, 2008

Fish With First Neck Evolved Into Land Animal -- Slowly

James Owen
for National Geographic News
October 15, 2008


The skull of a 375-million-year-old walking fish reveals new clues to how our fish ancestors evolved into land dwellers.

The fossil fish—called Tiktaalik roseae—was discovered in the Canadian Arctic in 2004 and provides the 'missing link' between fish and land vertebrates, according to scientists. It's also the proud owner of the world's first known neck.

read more at National Geographic News

Tuesday, October 14, 2008

Amnesty condemns Saudi executions

By Christian Fraser

BBC Middle East correspondent

Executions in Saudi Arabia are being carried out at an average rate of more than two a week, according to a new report by Amnesty International.

The human rights group says the rate of executions in the Kingdom has increased markedly in recent years.

In their report, they say foreign nationals bear the brunt of executions.

Saudi Arabia is also one of the few remaining countries to execute people for crimes they committed while under the age of 18.

On Friday in downtown Riyadh the crowds gather at Justice Square, outside the grand mosque. It is a place Westerners have dubbed "Chop Chop Square".

On the stage, awaiting the blade of the scimitar, stands the condemned. The death penalty in Saudi Arabia is beheading under the law of the sharia.

'Secret trials'

Although the Kingdom refuses to provide official statistics on how many people it kills each year, Amnesty International has recorded at least 1,695 executions between 1985 and May 2008.

Of these, 830 were foreign nationals - a highly disproportionate figure since foreigners only make up about a quarter of the country's population.

The rate of execution has increased, says the charity in its report "Affront to justice: Death penalty in Saudi Arabia", following an extension of the death penalty to cover drugs offences and corruption.

According to the report, trials are often held in secret. Foreign defendants, routinely denied access to a lawyer, are often unable to understand the proceedings and, in some cases, have no idea they have even been convicted.

Six Somalis beheaded this year were only told they were to be killed on the morning of their execution.

Confessions are usually extracted through torture, ranging from cigarette burns and electric shocks to nail-pulling, beatings and threats against family members, Amnesty says.

It adds that, while pardons are sometimes granted, Saudi nationals are eight times more likely to escape execution through the payment of a diya or "blood money".

Republished from BBC News

Sunday, October 12, 2008

Does science make belief in God obsolete? – Conversation 3

A series of conversations among leading scientists and scholars.

Conversation 3 - Pervez Amirali Hoodbhoy


 

Not necessarily.

But you must find a science-friendly, science-compatible God. First, try the pantheon of available Creators. Inspect thoroughly. If none fits the bill, invent one.

The God of your choice must be a stickler for divine principles. Science does not take kindly to a deity who, if piqued or euphoric, sets aside seismological or cosmological principles and causes the moon to shiver, the earth to split asunder, or the universe to suddenly reverse its expansion. This God must, among other things, be stoically indifferent to supplications for changing local meteorological conditions, the task having already been assigned to the discipline of fluid dynamics. Therefore, indigenous peoples, even if they dance with great energy around totem poles, shall not cause even a drop of rain to fall on parched soil. Your rule-abiding and science-respecting God equally well dispenses with tearful Christians singing the Book of Job, pious Hindus feverishly reciting the havan yajna, or earnest Muslims performing the salat-i-istisqa as they face the Holy Ka'aba. The equations of fluid flow, not the number of earnest supplicants or quality of their prayers, determine weather outcomes. This is slightly unfortunate because one could imagine joining the faithful of all religions in a huge simultaneous global prayer that wipes away the pernicious effects of anthropogenic global climate change.

Your chosen God cannot entertain private petitions for good health and longevity, prevent an air crash, or send woe upon demand to the enemy. Mindful of microbiology and physiology, She cannot cure leprosy by dipping the afflicted in rivers or have humans remain in unscathed condition after being devoured by a huge fish. Faster-than-light travel is also out of the question, even for prophets and special messengers. Instead, She must run the world lawfully and unto the letter, closely following the Book of Nature.

A scientific Creator should certainly know an awful lot of science. To differentiate between the countless universes offered by superstring theory is a headache. Fine-tuning chemistry to generate complex proteins, and then initiating a cascade of mutations that turn microbe to man, is also no trivial matter. But bear in mind that there are definite limits to divine knowledge: God can know only the knowable. Omniscience and science do not go well with each other.

The difficulty with omniscience—even with regard to a particle as humble as the electron—has been recognized as an issue since the 1920s. Subatomic particles show a vexing, subtle elusiveness that defeats even the most sophisticated effort to measure certain of their properties. Unpredictability is intrinsic to quantum mechanics, the branch of physics which all particles are empirically seen to obey. This discovery so disturbed Albert Einstein that he rejected quantum mechanics, pronouncing that God could not "play dice with the universe." But it turned out that Einstein's objections were flawed—uncertainty is deeply fundamental. Thus, any science-abiding deity we choose may be incompletely informed on at least some aspects of nature.
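That elusiveness has a precise form. Heisenberg's uncertainty relation says that, for any quantum state whatsoever, the spreads in a particle's position and momentum cannot both be made arbitrarily small; in standard notation:

```latex
% Heisenberg's uncertainty relation: the standard deviations of
% position (x) and momentum (p) for any quantum state satisfy
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2},
\qquad \hbar = \frac{h}{2\pi} \approx 1.05 \times 10^{-34}\ \mathrm{J\,s}
```

The bound is a property of the state itself, not of any measuring apparatus, which is the sense in which even a deity bound by physics could know only the knowable.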

Is one being excessively audacious, perhaps impertinent, in setting down terms of reference for a Divine entity? Not really. Humans have always chosen their objects of worship. Smarter humans go for smarter Gods. Anthropomorphic representations—such as a God with octopus arms—are a bit out of fashion today but were enormously popular just a few centuries ago. As well, some people might object to binding God and human to the same rules of logic, or perhaps even sharing the same space-time manifold. But if we drop this essential demand then little shall remain. Reason and evidence would lose meaning and be replaced by tradition, authority, and revelation. It would then be wrong for us to have 2 + 2 = 5, but okay for God. Centuries of human progress would come to naught.

Let's face it: the day of the Sky God is long gone. In the Age of Science, religion has been downsized, and the medieval God of classical religions has lost repute and territory. Today people pay lip service to trusting that God but still swallow antibiotics when sick. Muslim-run airlines start a plane journey with prayers but ask passengers to buckle up anyway, and most suspect that people who appear to rise miraculously from the dead were probably not quite dead to begin with. These days if you hear a voice telling you to sacrifice your only son, you would probably report it to the authorities instead of taking the poor lad up a mountain. The old trust is disappearing.

Nevertheless, there remains the tantalizing prospect of a divine power somewhere "out there" who runs a mysterious, but scrupulously miracle-free, universe. In this universe, God may choose to act in ingenious ways that seem miraculous. Yet these "miracles" need not violate physical laws. Extraordinary, but legitimate, interventions in the physical world could permit quantum tunneling through cosmic wormholes or allow certain symmetries to snap spontaneously. It would be perfectly fair for a science-savvy God to use nonlinear dynamics so that tiny fluctuations quickly build up to earthshaking results—the famous "butterfly effect" of deterministic chaos theory.

Nietzsche and the theothanatologists were plain wrong—God is neither dead nor about to die. Even as the divine habitat shrinks before the aggressive encroachment of science, the quantum foam of space-time creates spare universes aplenty, offering space both for a science-friendly God as well as for self-described "deeply religious non-believers" like Einstein. Many eminent practitioners of science have successfully persuaded themselves that there is no logical contradiction between faith and science by finding a suitable God, or by clothing a traditional God appropriately. Unsure of why they happen to exist, humans are likely to scour the heavens forever in search of meaning.

Pervez Amirali Hoodbhoy is chairman of the department of physics at Quaid-e-Azam University in Islamabad, Pakistan, and is the author of Islam and Science: Religious Orthodoxy and the Battle for Rationality.

Saturday, October 11, 2008

In death's shadow

Jul 24th 2008
From The Economist print edition


“CAN a person who is Muslim choose a religion other than Islam?” When Egypt’s grand mufti, Ali Gomaa, pondered that dilemma in an article published last year, many of his co-religionists were shocked that the question could even be asked.

And they were even more scandalised by his conclusion. The answer, he wrote, was yes, they can, in the light of three verses in the Koran: first, “Unto you your religion, and unto me my religion”; second, “Whosoever will, let him believe, and whosoever will, let him disbelieve”; and, most famously, “There is no compulsion in religion.”

The sheikh’s pronouncement was certainly not that of a wet liberal; he agrees that anyone who deserts Islam is committing a sin and will pay a price in the hereafter, and also that in some historical circumstances (presumably war between Muslims and non-Muslims) an individual’s sin may also amount to “sedition against one’s society”. But his opinion caused a sensation because it went against the political and judicial trends in many parts of the Muslim world, and also against the mood in places where Muslims feel defensive.

In the West, many prominent Muslims would agree with the mufti’s scripturally-based view that leaving Islam is a matter between the believer and God, not for the state. But awkwardly, the main traditions of scholarship and jurisprudence in Islam—both the Shia school and the four main Sunni ones—draw on Hadiths (words and deeds ascribed with varying credibility to Muhammad) to argue in support of death for apostates. And in recent years sentiment in the Muslim world has been hardening. In every big “apostasy” case, the authorities have faced pressure from sections of public opinion, and from Islamist factions, to take the toughest possible stance.

In Malaysia, people who try to desert Islam can face compulsory “re-education”. Under the far harsher regime of Afghanistan, death for apostasy is still on the statute book, despite the country’s American-backed “liberation” from the tyranny of the Taliban. The Western world realised this when Abdul Rahman, an Afghan who had lived in Germany, was sentenced to die after police found him with a Bible. After pressure from Western governments, he was allowed to go to Italy. What especially startled Westerners was the fact that Afghanistan’s parliament, a product of the democracy for which NATO soldiers are dying, tried to bar Mr Rahman’s exit, and that street protests called for his execution.

The fact that he fled to Italy is one of the factors that have made the issue of Muslim-Christian conversion a hot topic in that country. There are several others. During this year’s Easter celebrations, Magdi Allam, an Egyptian-born journalist who is now a columnist in Italy, was publicly baptised as a Catholic by Pope Benedict; the convert hailed his “liberation” from Islam, and has used his column to celebrate other cases of Muslims becoming Christian. To the delight of some Catholics and the dismay of others, he has defended the right of Christians to proselytise among Muslims, and denounced liberal churchmen who are “soft” on Islam.

Muslims in Italy and elsewhere have called Mr Allam a provocateur and chided Pope Benedict for abetting him. But given that many of Italy’s Muslims are converts (and beneficiaries of Europe’s tolerance), Mr Allam says his critics are hypocrites, denying him a liberty which they themselves have enjoyed.

If there is any issue on which Islam’s diaspora—experiencing the relative calmness of inter-faith relations in the West—might be able to give a clearer moral lead, it is surely this one. But even in the West, speaking out for the legal and civil right to “apostasise” can carry a cost. Usama Hasan, an influential young British imam, recently made the case for the right to change religions—only to find himself furiously denounced and threatened on Islamist websites, many of them produced in the West.

Friday, October 10, 2008

Moroccan theologian: Muslim girls can wed at nine

Sheikh Maghraoui reiterates that his claims are based on the Prophet Mohammed's sayings.


RABAT - A Moroccan theologian repeated his claims that Muslim girls could marry as early as nine years old, arguing it was sanctioned by the Prophet Mohammed.

"The marriage of nine-year-old girls is not forbidden because according to the Hadith (the Prophet Mohammed's sayings), Mohammed married Aisha when she was only seven-years-old and he consummated his union when she was nine," wrote Sheikh Mohamed Ben Abderrahman Al-Maghraoui on his website (Maghrawi.net).

"I am a confirmed theologian and I have not made this up. It is the prophet who said it before me," said the Marrakesh-based founder of a religious association.

"Those who criticise me, like the press or Moroccan television as well as the lawyer who filed the complaint (against me), are part of a secular attack against the Islamic nation and its theologians," he added.

Earlier this month, Rabat-based lawyer Mourad Bekkouri filed a complaint against Sheikh Maghraoui and his fatwa, which he said damages children's human rights, and the family and criminal code by increasing the risk of rape.

He said the theologian is undermining Islam and its followers and that he had requested the state prosecutor to speed up the case.

His views were backed by the left-wing newspaper Al Ittihad Al Ichtiraki, which claimed that "vicious theologians are today capable of putting religion in the service of paedophilia."

The views of Maghraoui, which contradict the teachings of mainstream Islam and its interpretation of the life of the Prophet Mohammed, have sparked anger and criticism among pious Muslims, who accused him of deliberately attempting to distort Islam.

Republished from: Middle East Online

Saturday, October 4, 2008

Saudi cleric favours one-eye veil

A Muslim cleric in Saudi Arabia has called on women to wear a full veil, or niqab, that reveals only one eye.

Sheikh Muhammad al-Habadan said showing both eyes encouraged women to use eye make-up to look seductive.

The question of how much of her face a woman should cover is a controversial topic in many Muslim societies.

The niqab is more common in Saudi Arabia and the Gulf, but women in much of the Muslim Middle East wear a headscarf which covers only their hair.

Sheikh Habadan, an ultra-conservative cleric who is said to have wide influence among religious Saudis, was answering questions on the Muslim satellite channel al-Majd.

Republished from: http://news.bbc.co.uk/2/hi/middle_east/7651231.stm

Rushdie unrepentant about Satanic Verses

1st October 2008

Twenty years after the publication of the book that almost cost him his life, Sir Salman Rushdie is still glad that he wrote The Satanic Verses.

In the second of a series of interviews with leading cultural figures filmed exclusively for The Times, he tells Clive James that he “wouldn’t not have wanted” to be the writer asking the big questions about religion and civilisation posed by the book.

His remarks are uncomfortably pertinent, coming at a time when Muslim extremists have again driven a literary figure into hiding. This time the victim is Martin Rynja, a Dutch-born London publisher who had agreed to release The Jewel of Medina, a controversial novel by Sherry Jones about the Prophet Muhammad’s relationship with his nine-year-old bride, Aisha.

Mr Rynja’s home in Islington was firebombed in the early hours of Saturday. Undercover police had tipped him off hours earlier and arrested three men from East London.

Rushdie criticised Random House, his own publisher, in August for refusing to publish the book in the United States, calling it “censorship by fear.”

The interview stretches beyond the fatwa against Rushdie. It ranges from the partition of India to how he played air-guitar Elvis on a squash racket as a child in Bombay.

Rushdie says he is an atheist who finds dead religions “much more attractive” but says he has nothing against true believers until their faith spills over into the public sphere and becomes “my business”.

The Times first reviewed The Satanic Verses 20 years ago today. The review carried no hint of the controversy to come but praised the book as “better than Midnight’s Children”, Rushdie’s Booker Prize-winning second novel.

The first sign of serious trouble came four days later when India banned the book after complaints that it was offensive to Muslims and protests began in Muslim communities around the world.

In February 1989 Ayatollah Khomeini, then supreme leader of Iran, issued a fatwa calling on all Muslims to murder Rushdie, and the writer went into hiding for the best part of ten years.

Rushdie says: “The question I’m always asking myself is: are we masters or victims? Do we make history or does history make us? Do we shape the world or are we just shaped by it? The question of do we have agency in our lives or whether we are just passive victims of events is, I think, a great question and one that I have always tried to ask. In that sense I wouldn’t not have wanted to be the writer that asked it.”

During his time in hiding there were explosions at bookshops in London, York and High Wycombe, the book’s Japanese translator was stabbed to death, its Italian translator survived a stabbing, its Norwegian publisher narrowly escaped an attempt on his life and 37 people died after a gang set fire to a Turkish hotel where the Turkish translator was staying (he survived).

The writer is more relaxed about his security today but the fatwa cannot be annulled, and when he was knighted last year protests in Pakistan and Malaysia called for his death. No wonder Rushdie prefers “dead religions”.

Republished from: http://entertainment.timesonline.co.uk/tol/arts_and_entertainment/books/clive_james/article4856150.ece

Thursday, October 2, 2008

Saudi Arabia: Shia Minority Treated as Second-Class Citizens

Wahhabi Authorities Discriminate Against Ismaili Citizens

(London, September 22, 2008) – The Saudi government should end its systematic discrimination against its Ismaili religious minority, Human Rights Watch said in a report released today. Human Rights Watch called upon the government to set up a national institution empowered to recommend remedies for discriminatory policies and to respond to individual claims.

The 90-page report, “The Ismailis of Najran: Second-Class Saudi Citizens,” based on more than 150 interviews and reviews of official records, documents a pattern of discrimination against the Ismailis in the areas of government employment, education, religious freedom, and the justice system.

“The Saudi government preaches religious tolerance abroad, but it has consistently penalized its Ismaili citizens for their religious beliefs,” said Joe Stork, deputy Middle East director at Human Rights Watch. “The government should stop treating Ismailis as second-class in employment, the justice system, and education.”

At least several hundred thousand, and perhaps as many as 1 million, Ismailis live in Saudi Arabia, part of the Shia minority in the Sunni-dominated country of 28 million. Most Ismailis live in Najran province, on Saudi Arabia’s southwestern border with Yemen, where tensions have been growing in recent years.

Saudi Arabia conquered Najran following a brief war with Yemen in 1934, incorporating into the kingdom the local Sulaimani Ismailis, one strand of Ismaili belief. Najran has been home to the highest Sulaimani Ismaili cleric, the Absolute Guide, since the 17th century.

Despite more than 70 years of shared history, Saudi authorities at the highest levels continue to propagate hate speech against this religious minority. In April 2007, the Council of Senior Religious Scholars, the body tasked with officially interpreting Islamic faith, ritual, and law, termed Ismailis “corrupt infidels, debauched atheists.” In August 2006, Saudi Arabia’s highest judge, Shaikh Salih al-Luhaidan, declared to an audience of hundreds that Ismailis “outwardly appear Islamic, but inwardly, they are infidels.” Other Saudi officials did not rebut or disown those statements.

Growing tension since the mid-1990s between Ismailis and Najran’s governor, Prince Mish’al bin Sa’ud bin Abd al-‘Aziz, led to clashes in April 2000, after the authorities arrested an Ismaili cleric they accused of “sorcery.” Security forces arrested hundreds of Ismailis, and tortured and secretly tried dozens of others. The authorities then purged some 400 Ismailis from the local bureaucracy.

Since then, local officials sent to Najran from other parts of the country, reflecting the country’s dominant conservative Wahhabi Muslim ideology, have continued to discriminate against Ismailis in employment, education, and the justice system, and have interfered with their ability to practice their religion.

Only one of the 35 department heads of the Najran provincial government is an Ismaili. Almost no Ismailis work as senior security personnel or as religion teachers. Saudi textbooks teach that the Ismaili faith is a sin of “major polytheism,” tantamount to excommunication. Wahhabi teachers in Najran insult Ismaili pupils’ faith and try to convert them to Sunni Islam, even using threats of class failure and flogging.

Ismailis are not free to pass their religious teachings on to new generations. The authorities have at times exiled the Absolute Guide from Najran or placed him under house arrest. Saudi authorities also ban the import or production of Ismaili religious literature. Ismailis face obstacles in obtaining permits to build new mosques or expand existing ones, whereas the state funds and builds Sunni mosques in Najran, even in areas without a Sunni population.

The country’s Sharia judges, following Wahhabi beliefs, routinely discriminate against Ismailis on the basis of their faith. In March 2006, a judge annulled the marriage of an Ismaili man to a Sunni woman, saying that the man lacked religious qualification. In May 2006, another judge barred an Ismaili lawyer from representing his Sunni client.

“State-sponsored and officially tolerated discrimination against the Ismailis of Najran seriously threatens their identity and denies them basic rights,” Stork said. “The authorities are shutting them out from education, government employment, and professions.”

In July 2008, King Abdullah opened a well-publicized interfaith conference in Spain initiated by Saudi Arabia and attended by Muslim, Jewish, Christian, Hindu, and Buddhist religious leaders.

“The measure of Saudi religious tolerance will be its practice at home, not only what it preaches abroad,” Stork said.

Republished from:
http://www.hrw.org/english/docs/2008/09/22/saudia19804.htm

Wednesday, October 1, 2008

Pakistan's rising 'Taliban' hits women's health

Christina Lamb and Mohammed Shehzad, Peshawar

MALE doctors and technicians have been banned from carrying out ultrasound examinations and using electrocardiographs (ECG) on female patients by the Islamist government of Pakistan’s North West Frontier province in its latest step towards “Talibanisation”.

The ban effectively excludes all women from undergoing such crucial medical examinations as the province has only one female ECG technician and none trained in ultrasound.

“We think that men could derive sexual pleasure from women’s bodies while conducting ECG or ultrasound,” explained Maulana Gul Naseeb Khan, the provincial general secretary of the Muttahida Majlis-e-Amal (MMA), the six-party religious alliance which now governs the North West Frontier.

“Similarly some women could lure men under the pretext of ECG or ultrasound. Therefore to uphold the supreme values of Islam, the MMA has decided to impose the ban in line with the May 8 resolution of the province’s assembly that nothing repugnant to Islam will be allowed.”

The ban is the latest in a series of Taliban-style measures imposed by the MMA since its surprising victory last October after General Pervez Musharraf, Pakistan’s president, came under western pressure to allow elections.

The clerics have already banned public dancing and music, kite flying and satellite television. They have closed cinemas, photographic shops and beauty parlours, and have torn down billboards displaying female images.

The bans are disturbingly similar to those imposed by the Taliban regime in Afghanistan, which was intent on turning the clock back to the time of the prophet Muhammad in the 7th century.

Several MMA leaders were strong backers of the Taliban, running madrassas, or religious schools, that provided fighters to help Mullah Muhammad Omar, the Taliban leader.

In June the frontier government imposed sharia, or Islamic law, with penalties such as stoning to death, in defiance of Musharraf who denounced the so-called Talibanisation of his country and attacked MMA attempts to make women wear head-to-toe veils.

“This reflects shallow-mindedness,” he said in an interview. “We do not want a backward and intolerant Islam. If we follow the intolerant version of Islam we cannot progress.”

However, it was the Pakistan military that for years encouraged the growth of Islamic militant groups, using them to fight in Kashmir, or to provoke sectarian clashes, which military rulers used as an excuse to prolong their stay in power.

It is widely believed that the MMA was given a helping hand in last year’s elections, which backfired when the strong anti-Americanism in the province resulted in the religious parties doing far better than expected.

The province borders Afghanistan and there has been widespread outrage at Pakistan allowing American forces onto its soil to search for Osama Bin Laden and members of the Al-Qaeda network.

The bazaars of the old city of Peshawar, the provincial capital, sell posters of Bin Laden and sweets and dates in boxes bearing his picture.

The ban on men carrying out ultrasound and ECGs means that women are now forced to travel outside the province to Rawalpindi, a long and costly journey that requires their husbands to miss at least a day of work to accompany them.

“The so-called apostles of Islam are unaware of the facts and figures related to the dismal state of health in our country,” complained one surgeon in Peshawar.

He pointed out that not only are there no female ultrasound technicians in the North West Frontier province but also no female anaesthetists, dispensers or theatre technicians.

“Women are operated on in the presence of male staff. They are anaesthetised and undressed for surgery by men. Even if female surgeons operate on them, male staff are indispensable during the operation. If we started following the MMA’s orders then all the women will die in the operating theatre,” said the surgeon.

Human rights activists are concerned that the ban will result in more stillborn babies and deaths in pregnancy. Pakistan already has one of the world’s highest rates of deaths in pregnancy, with an estimated 30,000 women dying in childbirth each year.

Republished from: http://www.timesonline.co.uk/tol/news/world/article1060071.ece