Quirky

by Melissa A. Schilling


  Work was Marie Curie’s primary shelter and solace, and she now turned to seeking ways that radioactivity could be harnessed for medical benefits. Noting that radiation killed tumor-forming cells faster than healthy cells, and having unique knowledge and skill in isolating radioactive isotopes, she began to conduct the world’s first studies using radioactive isotopes to treat tumors. Curie had become a profound role model for women, and by 1910 twenty women scientists were working as unpaid volunteers in her laboratory. As noted in Chapter 4, 1911 would turn out to be a tumultuous year; the public became aware that Marie was having an affair with her friend and colleague Paul Langevin at exactly the same time that she was awarded her second Nobel Prize, this time in chemistry, “in recognition of her services to the advancement of chemistry by the discovery of the elements radium and polonium, by the isolation of radium and the study of the nature and compounds of this remarkable element.”19 Rather than being bathed in adulation for accomplishing the seemingly impossible as the first person to ever receive two Nobel prizes and for many years the only person to win Nobel prizes in two different fields, she faced vicious public attacks because of the scandal. Curie once again retreated from public life, losing herself in her work.

  When World War I broke out, Curie heard that wounded soldiers were having their limbs amputated because field hospitals did not have X-ray equipment to find the location of bullets and shrapnel. She thus developed mobile X-ray units that she drove to field hospitals on the battlefront, and she set the units up with the help of her daughter Irène, who was now seventeen. She is credited with saving the lives of over one million soldiers.20 By the age of nineteen, Irène would be training women X-ray technicians at the Edith Cavell Hospital. Irène would develop into a profoundly talented scientist like her mother, discovering (among other things) a way to create artificial radioactivity and winning a Nobel Prize of her own.

  The constant exposure to X-rays and radium took its toll, and in 1934 Marie Curie died at the age of sixty-six from “aplastic pernicious anemia” caused by radiation exposure. As Eve Curie notes, “When her mission was accomplished she died exhausted, having refused wealth and endured her honors with indifference.”21

  Marie Curie exhibits many of the themes discussed in previous chapters. She was unconventional, was uninterested in social life, and lived in self-imposed isolation. She dedicated her life to a grand purpose and doggedly pursued her objectives. When obstacles and challenges arose, she did not falter—she just dug in her heels and worked even harder. The crucial role of Pierre and his electrometer in Marie’s discoveries also aptly demonstrates how important access to resources can be, as will be discussed in the next chapter. However, perhaps more than anything else, Curie’s story illustrates how large a role can be played by both the opportunities and challenges of an era.

  On the one hand, Curie’s story poignantly conveys the challenges of being a woman in science prior to the late twentieth and early twenty-first centuries. Women were not supposed to be in scientific and business domains, and many universities in Europe did not admit women. It was only through Curie’s extraordinary effort and resourcefulness that she obtained higher education. Furthermore, even after her brilliance was acknowledged and her accomplishments were irrefutable, she was denied access to the Academy of Sciences and nearly denied a Nobel Prize—all because of her gender. Her success also required making a choice that would have been very difficult for many women: she relinquished nearly all of her caregiving duties for her children to others. This one story of a female breakthrough innovator does much to reveal to us why there is a paucity of women on lists of famous innovators.

  On the other hand, Curie’s story also exquisitely demonstrates the opportunities and positive effects that an era can offer. First, her birth into a time in which Poland was defending its cultural heritage by secretly educating its people meant that not only did Marie have access to education; she also thought of it as her patriotic duty. Her involvement in the Flying University meant that she interacted with some of the most intelligent and fearless women of Warsaw, giving her access to an intellectual community and to an ethos in which women should pursue intellectual and scientific advancement with courage and fervor. The rise of Polish positivism during this time reinforced both of these factors: Polish positivism rejected the idea that women should have lesser access to education and emphasized that diligent and pragmatic work was the key to Poland’s future.

  The events unfolding in Poland were not the only significant opportunities of the era affecting Curie. In Europe at the beginning of the 1800s, women had virtually no access to university education. By the mid-1800s, however, changes were under way. Several colleges in the United States had begun admitting women in the early 1800s, and by the middle of the century European women were pressing their case for access to university education. In 1865 the University of Zurich became the first European university to admit women, followed quickly by the University of Paris (the Sorbonne) in 1867. By the late 1860s and early 1870s, many other European universities were following suit. In 1878 the University of London became the first university in the United Kingdom to permit women to earn degrees (Oxford and Cambridge allowed women to take classes at this time but did not permit women to earn degrees until 1920 and 1947, respectively). Marie Curie started at the Sorbonne in 1891; if she had been born thirty years earlier, it is likely that no amount of resourcefulness or tenacity would have enabled her to pursue a career in science.

  The rise of Polish positivism and the rise of women’s access to education in Europe coincided to inspire Marie and Bronia to craft their plan to take turns going to the Sorbonne. It was a bold, unlikely, and resourceful scheme for two Polish girls without financial resources. The fact that it worked and both Marie and Bronia obtained university educations is quite remarkable. There were moments when, as a governess, Marie did not think they would have enough money to see each other through and succumbed to discouragement. Her description of an alternative future provides a sharp contrast to the one that actually transpired: “My plans for the future are modest indeed: my dream, for the moment, is to have a corner of my own where I can live with my father.… I shall install myself in Warsaw, take a post as a teacher in a girls’ school and make up the rest of the money I need by giving lessons. It is all I want. Life does not deserve to be worried over.”22

  Finally, it must also be noted that Curie’s discovery of radium and radioactivity occurred because of the unique confluence of her work studying the magnetic properties of steel, Pierre Curie’s development of the electrometer, and the timing of the discovery of mysterious invisible energy rays by Röntgen and Becquerel. Given Curie’s intellect and drive, we can speculate that she might have achieved amazing things even without this convergence of time and place, but we do not know what those achievements might have been or whether she would be remembered as one of the most important innovators of all time.

  WHEN THE WORLD IS jolted by technological or economic shocks, there are often corresponding “blooms” of innovation and innovators as individuals respond to the changing needs or resources created by the shock. The rise of the personal computer and the development of the Internet are excellent examples. These shocks in information technology led to feverish innovation activity as people and organizations raced to exploit the new opportunities these technologies offered.23 These shocks didn’t just influence innovation in the computer and software industries; they also spurred innovation across almost every industry by enabling new types of products and production methods. Drug compounds could be automatically screened by computers, automated processes could be incorporated into industrial machinery, textbooks could be digitized and made modular and customizable, and products could be marketed and sold over the Internet. It is difficult to identify an industry that was not affected by the rise of information technology. Many famous innovators emerged during the period from the late 1970s (when the personal computer first emerged) to the mid-1990s (when the Internet became available to the general public), including Steve Jobs; Bill Gates, who cofounded Microsoft with Paul Allen; Linus Torvalds, who initiated the development of Linux; Tim Berners-Lee, credited with inventing the World Wide Web by writing the Hypertext Markup Language (HTML) and the Hypertext Transfer Protocol (HTTP); Jeff Bezos, who founded Amazon; Marc Andreessen, who cofounded Netscape; and Larry Page and Sergey Brin, who cofounded Google.

  Shocks that are not directly related to technology can also spur innovation by changing regulatory constraints, our access to resources, and our economic priorities. For example, in 1973 the members of the Organization of Petroleum Exporting Countries (OPEC) declared an oil embargo, causing oil prices to quadruple and creating an oil crisis. Politicians called for gas rationing, and President Nixon asked gas stations to voluntarily refrain from selling gasoline on Saturdays and Sundays, resulting in long lines at the pumps. Many states even asked citizens not to put up Christmas lights, with Oregon actually banning them! Although the embargo was lifted a few months later, the crisis had jolted governments, industries, and consumers, leading to a long-term effect on their behavior. For example, US auto manufacturers began to focus on developing cars that were more fuel efficient, and consumers began to take fuel efficiency seriously when choosing a new car. Both companies and individuals dramatically increased their efforts to develop renewable energy alternatives such as solar and hydroelectric power. Rapid innovation drove the cost of solar power down from $100 per watt to about $20 per watt, and suddenly solar cells began to be used in a wide range of applications, from digital watches to residential power in remote locations. In a telling bit of irony, oil companies even started using solar cells to power the warning lights on offshore rigs.

  Social movements also play a role in stimulating innovation by changing social priorities or norms of behavior. Consider, for example, Steve Jobs, whose life and career were deeply affected both by the technological shocks occurring in computing technology and by his strong identification with the sixties counterculture movement. In the 1960s, opposition to the Vietnam War and growing tension over sexual and racial discrimination led to a series of antiwar protests and intensification of social movements such as the civil rights movement, the free speech movement, and the women’s liberation movement. These mostly peaceful movements were fueled by unprecedented levels of student activism and rejected the authority of “the establishment” in favor of a world characterized by peace, equality, and freedom of personal expression. Because the movements sought to overthrow social norms of the past, they collectively became known as the “countercultural movement.”

  The ethos of the counterculture movement strongly shaped Jobs’s beliefs about resisting authority and the constraints of social norms, and helped to nurture his vision of the personal computer as a tool of personal expression and liberation. The computer, in Jobs’s mind, was not a mere tool for productivity; it was also a means of social revolution. The musician Bono, a friend of Jobs, explained why the hippies of the countercultural movement played such a big role in the creation of the personal computer industry: “The people who invented the twenty-first century were pot-smoking, sandal-wearing hippies from the West Coast like Steve, because they saw differently… the hierarchical systems of the East Coast, England, Germany, and Japan do not encourage this different thinking. The sixties produced an anarchic mind-set that is great for imagining a world not yet in existence.”24

  WAR ALSO HAS A substantial impact on innovation, although the effect is double-edged. On the one hand, it removes people from their work in science and industry and puts them in uniform and on the battlefield, disrupting their pursuit of goals. It also leads to the widespread destruction of resources, including both physical assets and the intellectual and creative talent that is crucial for innovation. On the other hand, war can also spur innovation by inspiring a sense of urgency and idealism that leads individuals to pursue bigger projects. People break out of roles and routines that may have been preventing them from realizing their full innovative potential. War also tends to stir the social pot, bringing together people who might not normally come into contact because of differences in gender, race, or walk of life. Consider, for example, the role of women in industry during World War II. From 1940 to 1945, the percentage of women in the workforce rose dramatically—reaching 37 percent in 1945—as women filled the gaps left by men who went off to war. A US government campaign aimed at recruiting women to the munitions industry featured the muscle-flexing “Rosie the Riveter,” who became a symbol of female strength, independence, and patriotism. Women were accepted into positions formerly closed to them, in the aviation industry, for example. In 1943 women made up 65 percent of the workforce in aviation, up from a mere 1 percent prior to the war.25 Hundreds of women were also enlisted as “computers” who used desk calculators to work through the long lists of equations used to target artillery on the battlefield. Women mathematicians were also involved in the development and programming of the ENIAC (Electronic Numerical Integrator and Computer), widely considered to be the first general-purpose computer. As a result, women gained unprecedented access to science and technology, and science and technology gained unprecedented access to women, vastly widening the pool of intellectual and creative talent employed in technological innovation.

  Perhaps no one is a better example of radical new opportunity than Grace Hopper, born in 1906 in New York City. Her mother was an accomplished mathematician, and her father was an executive at a life insurance company.26 Grace shared her mother’s love of math and in 1934 earned a doctorate in mathematics from Yale, an exceptionally rare achievement for a woman at the time. As Hopper notes, “I wanted to be an engineer.… My dad always made things, and I’ve always been fascinated with how things work. But there was no place at all for women in engineering when I graduated in 1928.”27 She thus accepted a position as a professor at Vassar and by 1940 was both a popular teacher and a highly respected member of the Vassar faculty.

  Being a professor was one of the few professional roles considered appropriate for women at that time, and she had a comfortable life. All of that changed in December 1941, when the Japanese bombed Pearl Harbor. Within the next six months Grace Hopper’s husband, brother, cousins, and many of her friends had enlisted in the military. As soon as President Roosevelt signed the Navy Women’s Reserve Act in 1942, authorizing women to enter noncombat positions, Hopper decided she would enlist as well. At thirty-six years old and 105 pounds, she was considered too old and too small for naval service; however, she was able to obtain a waiver, as mathematics professors were highly sought after for the war effort.28 Because of her math background, she was assigned to a project to develop a machine that could rapidly make difficult calculations for tasks such as laying minefields. After the war, many of the major technology firms (including IBM, Honeywell, and the Eckert-Mauchly Computer Corporation) wanted to interview her. Although women were not generally accepted in either science or business at that time, her military rank (she would eventually attain the rank of rear admiral) and protocol neutralized the gender discrimination she would otherwise have faced, enabling her to be more influential. She went on to develop the first compiler, paving the way for modern programming languages, and was, ironically, voted “Man of the Year” by the Data Processing Management Association. In her honor, in 1997 the US Navy named its newest guided-missile destroyer the USS Hopper.

  During wartime, governments and industries often vastly increase their efforts at developing communication, transportation, and munitions technologies, and these investments create pools of assets and expertise that continue to give rise to innovations even after the wartime needs recede. Although the fundamental science underlying the development of radar goes back at least to work on electromagnetics in the late 1800s, the big advances that made radar useful occurred during World War II. The governments of Germany, Japan, the United States, Britain, the Soviet Union, the Netherlands, France, and Italy all invested vigorously in trying to develop radar systems that would enable the detection and tracking of aircraft. Britain made major scientific advances in radar but lacked the money and other industrial resources needed to develop the technology to its full potential during the war. Thus in June 1940, with France having already fallen to the Nazis and fearing that Britain would be next, Winston Churchill decided to seek the help of the United States. Churchill brokered a deal to share Britain’s radar technology with the United States in exchange for help with production and financing. A British cavity magnetron, a thousand times more powerful than the best US transmitter of the time, was sent by ocean liner to the United States in a secret operation led by Henry Tizard and known as the “Tizard mission.”29 Bell Telephone Laboratories immediately went to work putting the new transmitter into production.

 
