
The Doomsday Handbook

by Alok Jha


  “Dust from a radiological weapon would remain trapped for extended periods in cracks and crevices on the surfaces of buildings, sidewalks and streets, and some would have been swept into the interiors of buildings,” said Levi and Kelly. “Certain materials that could be used in a radiological attack, such as cesium-137, chemically bind to glass, concrete and asphalt. More than 15 years after the 1986 Chernobyl disaster, in which a Soviet nuclear power plant underwent a meltdown, cesium is still affixed to the sidewalks of many Scandinavian cities that were downwind of the disaster.”

  Getting hold of radiological material would not be difficult today. In hospitals, radium and caesium are used as sources of radiation in cancer treatments. And weapons-grade plutonium or spent nuclear fuel has become increasingly available on the black market since the end of the Cold War led to the break-up of the Soviet Union.

  A biological dirty bomb would aim to spread viruses or bacteria into a population. Explosives are unlikely to be involved here, as any blast would kill the bug the terrorist wanted to spread. Instead, the agent could be released as an invisible cloud of particles on a rush-hour train, or added as a contaminant at some point in the food chain. The technology here is not new: in 1923, scientists affiliated to the French Naval Chemical Research Laboratory detonated pathogen bombs over animals in a field at Sevran-Livry, just outside Paris, killing many of their test subjects.

  The trick for would-be biological terrorists is to weaponize their bug efficiently, making it into a form that is easily spread. Anthrax spores, for example, are well suited to this, since they can be dried and milled to precise sizes that will lodge in people’s lungs long enough to be deadly.

  * * *

  1995 SUBWAY SARIN INCIDENT

  12 dead

  5,000 injured

  * * *

  Chemical terrorism involves releasing into the air toxic materials that can kill fast. In 1995, the Japanese cult Aum Shinrikyo released a nerve gas called sarin on to the Tokyo subway—12 people died and 5,000 had to be treated in hospital.

  Another potent toxin is ricin, which forms as a by-product when castor beans are processed into castor oil. A dose the size of a grain of salt is enough to kill an adult. Poisoning results in fever, nausea and abdominal pain, and victims can die of multiple organ failure within a few days of exposure.

  The severity of an attack depends on the sophistication of the people undertaking it—it only takes a competent chemist to create vats of a toxic nerve agent, and there are plenty of competent chemists in the world. In the case of the Aum Shinrikyo attack on Tokyo, the terrorists seem to have planned the operation in a hurry—their batch of sarin was impure and their method of dispersal was to puncture a bag with the tip of an umbrella. Had they been better organized, the attack would probably have killed several thousand more people.

  Health protection officials in biohazard suits attend to the Tokyo subway in the wake of the release of the nerve gas sarin by the Aum Shinrikyo terrorist group.

  How hard is it to make a dirty bomb?

  It depends on what the terrorist wants to use. Many natural diseases could be employed as biological weapons, including plague, botulism and tularaemia (a plague-like disease).

  The one people worry about is anthrax, a disease caused by Bacillus anthracis, a bacterium that mainly affects cattle and sheep but can also infect people. The external form of the disease causes sores, but the pneumonic form can kill 90 percent of those it infects if left untreated. Victims develop the pneumonic form by breathing in fewer than 10,000 anthrax spores between one and five micrometers in size. These spores pass through the lining of the lungs and travel to the lymph nodes, releasing poisonous chemicals as they go. Symptoms include vomiting and fever, and without antibiotic treatment, victims will die of hemorrhage, respiratory failure or toxic shock within a few days.

  * * *

  All they would need to do to infect millions of people would be to drop a few hundred kilograms of the stuff on to a city from a low-flying aircraft.

  * * *

  If terrorists could get hold of anthrax spores and grind them to the right size, all they would need to do to infect millions of people would be to drop a few hundred kilograms of the stuff on to a city from a low-flying aircraft.

  Putting contaminants into the food supply chain is even easier. Lawrence Wein and Yifan Liu from Stanford University, California, calculated the effects of someone pouring botulinum toxin into a milk tanker on its way to a holding tank. “Among bioterror attacks not involving genetic engineering, the three scenarios that arguably pose the greatest threats to humans are a smallpox attack, an airborne anthrax attack, and a release of botulinum toxin in cold drinks,” they wrote in the Proceedings of the National Academy of Sciences in 2005.

  Botulinum toxin is a strong poison that affects nerve function (and is used in tiny doses in cosmetic treatments to smooth wrinkles on the skin). Despite pasteurization of the contaminated milk, the scientists found that the would-be terrorist in their scenario would give around 568,000 people a fatal dose of the toxin within a few days.

  Is it likely?

  According to the Council on Foreign Relations (CFR), a non-partisan American think tank, the Bush administration was convinced back in 2002 that al-Qaeda had radioactive materials, such as strontium-90 and caesium-137, with which it could build bombs. “In January 2003, British officials found documents in the Afghan city of Herat indicating that al-Qaeda had successfully built a small dirty bomb. In late December 2003, homeland security officials worried that al-Qaeda would detonate a dirty bomb during New Year’s Eve celebrations or college football bowl games, according to The Washington Post,” the CFR said on its website.

  The think tank added that Iraq had already tested a one-ton radiological bomb in 1987, but had given up on the idea because the radiation levels it generated were insufficient. “In 1995 Chechen rebels planted, but failed to detonate a dirty bomb consisting of dynamite and cesium 137 in Moscow’s Ismailovsky Park. In 2002 the United States arrested an alleged al-Qaeda operative, Jose Padilla, for plotting to build and detonate a dirty bomb in an American city. In 2003 British intelligence agents and weapons researchers found detailed diagrams and documents in Afghanistan suggesting that al-Qaeda may have succeeded in building a dirty bomb. Al-Qaeda detainees in American custody claim such a dirty bomb exists, but none have been discovered.”

  Levi and Kelly also pointed to the International Atomic Energy Agency, which in 2001 stated that almost every nation in the world had the radioactive materials needed to build a dirty bomb, and that more than 100 countries lacked the controls to prevent the theft of these materials.

  If the question is whether a dirty bomb will one day go off somewhere, spreading radiological, chemical or biological material, then the answer has to be yes. It only takes a patient, skilled scientist to prepare the equipment and the active ingredient. But would such an event bring civilization close to the edge? That is less likely. A dirty bomb would no doubt have disproportionate effects, emptying cities and causing devastating psychological and economic damage. But a would-be terrorist would have to detonate hundreds, at the same time, all over the world, for his actions to have a cumulative effect. End of the world from dirty bombs? Unlikely.

  Death by Euphoria

  * * *

  In Aldous Huxley’s Brave New World, people are born into specific roles and spend their lives fulfilling preordained tasks for the World State. Social mobility is simply not an option in this extreme portrayal of society. Yet the loss of freedom does not lead to revolution, all thanks to a government-controlled drug called soma.

  * * *

  There is no war in the World State, there are no divided allegiances, and everyone knows that society comes before individuals. If things become bad in the carefully controlled world, soma is there to provide a holiday from the facts, to get things back to normal. Mustapha Mond, the Resident World Controller for Western Europe, claims that soma is there to “calm your anger, to reconcile you to your enemies, to make you patient and long-suffering. In the past you could only accomplish these things by making a great effort and after years of hard moral training. Now, you swallow two or three half-gramme tablets, and there you are. Anybody can be virtuous now. You can carry at least half your morality about in a bottle. Christianity without tears—that’s what soma is.”

  Back in the real world, soma does not exist. At least, not yet. We have an ever-increasing repertoire of drugs to treat the unwell, but soon there will also be drugs for the healthy, to improve mental skills and alertness, to de-stress after hellish days at work and to induce euphoria at the weekends, all without side effects. How long before we use so many drugs so willingly that we are no longer in control? Perhaps the end of society as we know it will not come with a bang, but will fade away in a self-administered medical haze.

  * * *

  How long before we use so many drugs so willingly that we are no longer in control?

  * * *

  Medicines now, medicines to come

  Psychoactive drugs have been used by human societies since prehistoric times. Archaeological evidence from beneath houses in northwest Peru, dating back more than 10,000 years, shows that residents used to chew coca leaves. The alkaloids in the leaves are known to be stimulating, and can mitigate the effects of high-altitude, low-oxygen living.

  In the past century, and in particular the last few decades, our improved understanding of human physiology has brought an explosion of drugs (medicinal and illicit) that can help lift mood, improve alertness or just keep you awake for days on end.

  Most of us take drugs every day—how many people do you know who cannot start the day without the mind-sharpening effects of caffeine or nicotine?

  But healthy people are also taking plenty of prescription drugs. Methylphenidate (trade name Ritalin) is given to children with attention deficit hyperactivity disorder, but it is also used by healthy people to enhance their mental performance. Modafinil, a drug developed to treat narcolepsy, has been shown to reduce impulsiveness and help people focus on problems, because it can improve working memory as well as planning. It has already been used by the US military to keep soldiers awake and alert, and some scientists are considering its usefulness in helping shift workers deal with erratic working hours. Propranolol, a beta-blocker, is used to treat high blood pressure, angina and abnormal heart rhythms—it is also sometimes used by snooker players to calm their nerves before a game.

  Almost seven percent of students in US universities have used prescription stimulants for non-medical purposes, and on some campuses, up to a quarter of students have used them. “These students are early adopters of a trend that is likely to grow, and indications suggest that they’re not alone,” says Henry Greely, a professor at Stanford Law School.

  In a 2008 commentary for Nature, co-authored with a slew of experts in ethics, neuroscience, psychology and medicine, Greely explains that a modest degree of memory enhancement has been found with ADHD drugs such as methylphenidate, as well as the drug donepezil, developed for the treatment of Alzheimer’s disease. “It is too early to know whether any of these new drugs will be proven safe and effective, but if one is, it will surely be sought by healthy middle-aged and elderly people contending with normal age-related memory decline, as well as by people of all ages preparing for academic or licensure examinations.”

  A 2005 study by Foresight, the UK government’s science think tank, looked at the future of mind-enhancing drugs. In the resulting report, leading scientists in the fields of psychology and neuroscience argued that there really would be a pill for every ill very soon, and that all of it would be possible without addiction. In a world that is increasingly non-stop and competitive, an individual’s use of drugs may move from the fringe to the norm.

  Trevor Robbins, an experimental psychologist at Cambridge University and one of the authors of the Foresight report, pointed to a future in which drugs would be used to vaccinate people against substances such as nicotine, alcohol and cocaine. Such vaccines would prompt the immune system to produce antibodies against the drug being abused—these would neutralize the drug when taken and prevent it from having any effect on the brain.

  Other drugs might delete painful memories, which could be useful for people suffering from post-traumatic stress disorder. “We are now looking 20–25 years ahead,” said Robbins at the launch of the report. “Very basic science is showing that it is possible to call up a memory, knock it on the head and produce selective amnesia.”

  And all of this could be done so that you get the benefits of a drug with none of the pain. The harmful side effects of today’s drugs could in future simply be engineered out.

  Unexpected consequences

  It is hard not to get excited at the possibilities of pharmaceuticals that could solve all our problems, make life better, get rid of painful memories or just put a happy (but very safe) glow on our everyday existence.

  The inevitable difficulty arises from how little we really understand about what any drug actually does to our bodies and minds. Modafinil, for example, seems to be involved in modulating several chemical messengers in the brain, but no one is really certain how it works. With the expanded use of such stimulants comes the concern that, in the long run, the drugs might take a toll on the brain.

  Several studies in animals have shown that stimulants could alter the structure and function of the brain in ways that could depress mood, boost anxiety and, contrary to their short-term effects, lead to cognitive deficits. In 2006, scientists at the US Food and Drug Administration collated information from studies looking at children and teenagers who took antidepressants for depression, anxiety disorders and ADHD—they found that those children had twice the risk (four percent versus two percent) of contemplating or attempting suicide compared with those on placebos.

  The unknowns are not limited to carefully designed pharmaceuticals. Research by Robin Murray, a professor of psychiatry at the Maudsley Hospital in south London and one of Britain’s leading experts on mental health, shows that cannabis almost always exacerbates the symptoms of psychosis in people who are already suffering from (or have a family history of) mental health problems. A study published in the British Medical Journal by his colleague Louise Arseneault showed that people who took cannabis at the age of 18 were around 60 percent more likely to develop psychosis in later life. “But if you started by the time you were 15, then the risk was much greater, around 450 percent,” says Murray.

  This does not mean that everyone who smokes cannabis will develop psychosis, but the drug does seem to exacerbate the problem in those who have a strong predisposition to it.

  Would people do it?

  The takeover of our entire society by drugs would require us all to willingly take them in large quantities, on a regular basis. This is not as outlandish an idea as you might think.

  For one thing, we are all living longer, thanks to the benefits of decades of medical advances. But no one knows what living to 150 or 200 might do to our minds. Perhaps it will just bring us even more years in which to feel lonely or to develop depression, something already hinted at in figures from the World Health Organization and others, which show big rises in psychological disorders in aging populations. If we can live to 200 or more, who is to say that so many years of accumulated memories and feelings might not overwhelm our primate brains, which only evolved to survive for several decades at most? One solution might be to develop even more drugs to counteract and soothe the effects of age-related depression or decline—people would have to take these to keep themselves sane.

  A less extreme scenario involves peer pressure. You may object to using mind-altering drugs to help you work harder or longer. But in a world where all your colleagues (and competitors) are doing it, can you afford to be left behind? If you are a 70-year-old office worker halfway through your life, and you have to compete for work with a well-educated 23-year-old in a developing country, would you not take every advantage you could lay your hands on?

  Perhaps you want to keep your life as natural and additive-free as possible. Too late, argues Greely. “The lives of almost all living humans are deeply unnatural; our homes, our clothes and our food—to say nothing of the medical care we enjoy—bear little relation to our species’ ‘natural’ state. Given the many cognitive-enhancing tools we accept already, from writing to laptop computers, why draw the line here and say, thus far but no further?”

  * * *

  You may object to using mind-altering drugs to help you work harder or longer. But in a world where all your colleagues (and competitors) are doing it, can you afford to be left behind?

  * * *

  Human ingenuity, he goes on to say, has given us means of enhancing our brains through inventions such as written language, printing and the Internet. Drugs should be viewed in the same general category as education, good health habits, and information technology: simply as ways in which our uniquely innovative species tries to improve itself.

  “Safe and effective cognitive enhancers will benefit both the individual and society,” concludes Greely. “But it would also be foolish to ignore problems that such use of drugs could create or exacerbate. With this, as with other technologies, we need to think and work hard to maximize its benefits and minimize its harms.”

 
