A quick accounting of the story of Samuel Slater can illuminate the ripple effect of gender biases in patenting. Born in England and apprenticed to a cotton miller, he departed for New York in 1789, after committing locally patented cotton-spinning secrets to memory with the intention of selling them in the States. For such plundering, he was called the “Father of the Industrial Revolution” by Andrew Jackson, although locals in his hometown referred to him as “Slater the Traitor.” It’s unclear what he was called by the laborers who worked in the factories he is said to have revolutionized, because as female members of the underclass and/or recent immigrants, their gender, race, and class rendered their specific contributions unworthy of public note. (This, of course, applied equally to their innovations on his designs, and explains their absence from his patent applications, although they likely offered improvements fairly consistently.) Indeed, Slater may have been known as the Patriarch of the American Factory System, but what he really innovated—through theft—was textile manufacturing. Today, around one in seven women who work outside the home labors in some aspect of the textile industry, a workforce that on average brings in far less than half of what it takes to survive in the regions of the world in which its members work. Slater can’t be held exclusively responsible, of course, but his legacy includes a significant contribution to the global gender wage gap.
Due to the manner in which they foster and restrict global trade—indeed, how they limit those who might profit—patents, even more than other forms of IP (including copyrights, which have been around exactly as long), form the backbone of American-style capitalism. Thus our political economy rests on gendered constructs—and it always has. Yet since the first patent was awarded in 1790 for a process of making potash, a salt used in both soaps and artillery, patent protection has been described as an unbiased stimulator of creativity, a protective measure for the makers of products that offers exclusive rights to profit from patented goods.
Patents are uniquely aimed at incentivizing cultural products, elements of processes, or technological solutions in “science and the useful arts,” as the US Constitution explains. They are guaranteed only on successful application and cover a significantly shorter period than copyrights (twenty years from the date of application, in most cases). There are six main types of patents: utility patents, which apply to processes, machines, products, materials, or a significant improvement to previous versions of any of these; design patents, which cover the ornamentation or decoration of a manufactured good; plant patents, which cover distinct varieties of asexually reproduced plants; and three varieties of patents that, more or less, correct, update, or stand in for previously awarded or pending patents.
Theoretically, of course, exclusive rights to profit may spur makers toward innovations—or theft—but there’s very little evidence to show that economic gain truly stimulates the creative process. And there’s none to show that women and other marginalized folks have been offered similar incentives to create, however much they persist in doing so. In fact, the notion of patents as creativity stimulators is fairly quickly revealed to rely on an unproven logic that ignores inherent masculine privilege and confuses legal rights, financial profit, and free expression to create the myth of US ingenuity and entrepreneurship. More than anything else, patenting seems to uphold the myth that we Americans pull ourselves up by our bootstraps (#US 216544, patented by Henry M. Weaver of Mansfield, Ohio, in 1879).
Yet who made that boot? What patenting deliberately fails to acknowledge are the communities and contexts that often lead to what we call invention and discovery. “Histories of colonialism and cultural and racial stereotypes have often led us to overlook the knowledge contributions of the poor,” Sunder summarizes. Among the noted areas in which the poor—farmers, in this case, often in developing nations—have seen their contributions overlooked are agriculture and medicine. Recently, herbs long known in India to promote certain healing effects, for example, have been patented by US companies. It’s a situation that began in 1931, with the first plant patent, which covered distinct varieties of asexually reproduced vegetation. It was the first time that life forms became subject to intellectual property claims, but it was not the last.
In 1988, a sea change: the first patent was awarded for the ownership of, and rights to profit from, a living mammal. #US 4736866 was granted to Philip Leder and Timothy A. Stewart of Harvard University for the creation of the OncoMouse™, a mouse they’d bred for cancer research. No longer did patents merely cover the vast domain of “everything under the sun that is made by man.” For the rodent was at least partially “made” by an impregnated female mouse, unnamed in Leder and Stewart’s application.
Whether to mice or men, offspring-producing females then ceded their formerly exclusive domain: the creation of new mammalian life forms. The changes to patenting were significant, but also not, for the new patent revived earlier, ambivalent definitions of certain key terms: an “invention” could be found; a “novelty” could be elsewhere common. The new domain of patent ownership also revived an even older notion of patents, from the sixth century, when they were intended to aid exploration and colonization. Is it not colonizing to invent new forms of mice? So the patenting of mammalian life in 1988 hardly happened suddenly so much as it suddenly laid bare the gendered intentions of IP policy. But even that had a precursor, a quarter century earlier.
Up until the mid-1960s, most things under the sun that were “made by” women weren’t truly owned by anyone at all. The forms of cultural production to which women were often consigned were not considered “authored” in the contemporary sense and were therefore not eligible for IP protection. Dinner just appeared on the table and clothes emerged from closets. Of course women were actively dissuaded from participating in patent-heavy STEM fields, partially because babies. And babies didn’t require IP registration; they were simply birthed. Some of them grew into men, who could be said to be “made by” pregnancy, specifically, or maternity, more generally.
But in 1964, a patent (#US 1970000062143) was awarded to William Wright and Ralph Meyerdirk for the first articulated arm scanner, also known as the sonogram machine. Heralded as a means by which pregnant women could bond with unborn fetuses, ultrasound technology had been commercially available since 1963 and had risen to popularity quite quickly. Sonograms made the previously hidden “mysteries” of “life” visible, although not exclusively to the women who, it was presumed, weren’t bonding with the fetuses growing inside them. Men—husbands, older brothers, male doctors, interns, medical technicians—were granted access to this previously unseen provenance, too. (I would be surprised if the female partners of pregnant women were regularly granted similar viewing rights.) This shift had a cultural parallel: in the 1950s, it was still common to announce, “We’re having a baby!” while the sentiment two decades later would be expressed with the biologically less likely, “We’re pregnant!” IP rights over an unborn creation may not have changed yet—they would, however, soon enough—but the cultural sense of ownership over pregnancy certainly did. Sex education adopted a spiritual tone, situating intercourse as the scientific foundation of life itself; the interiors of women’s bodies were photographed, stripped of actual women, who became mere background to fetuses, the nonconsenting models on the covers of news magazines. The ownership of pregnancy, in other words, became cultural through the wonders of ultrasound technology.
If we accept that sonograms provide pregnant women the opportunity to increase their emotional attachment to growing fetuses—and I’m not convinced we should, although my bias against growing a fetus may be coming into play here—we can believe that they would elicit a similar response in male partners. Or, perhaps, in men in general. After all, recent studies indicate that social media—which facilitates interaction between physically disconnected social actors—provides many of the same effects as in-person interaction.
The exact psychology at work, however, may matter less than the political impact of a technology that allowed men unhindered visual access to the interior of women’s bodies. In 1988, twenty-five years after ultrasound technology began allowing a broad swath of people greater insight into the mechanics of pregnancy, the first patent was granted guaranteeing ownership of mammalian life. Note that twenty-five is the exact minimum age required for election to Congress, which in 1988 was 98 percent male. So the men who voted on whether other men should have the legal right to own and profit from the creation of mammalian life forms were the first generation who from conception grew up—literally, and not metaphorically—under the watchful eyes of their fathers, all thanks to a device from which other men profited.
Masculine ownership of the means of reproduction, I suggest, was gradually normalized throughout the second half of the twentieth century: first in the early 1960s with the sonogram machine, and then as images of what the machine allows us to see circulated more and more widely through popular culture. This was furthered in the late 1980s, with the patenting (indeed, even trademarking) of the OncoMouse™. And while before 1988 there may have existed such a thing as a biological imperative, a drive described as natural that some suggest lies at the base of their desire to have children, it’s possible that maternity has lost some of its purely biological veneer since. It could be (and has frequently been) argued that the planet no longer requires or even supports an expansion of humankind, and that the biological imperative, if it ever existed, is no longer necessary, sustainable, or ethical. It may be only a leftover, a relic of life before technology.
These days at parties, I’m more likely to be asked how many kids I have than when I am going to have them, but my answer hasn’t much changed. Sometimes I offer a spluttering recitation of my full life and active career, and sometimes I state dully, “I’ve written over ten books.” I admit I have occasionally mentioned my cats. Whatever my response, I’m still expected to defend my selfishness, or confronted about my apparent disinterest in honoring my parents, or asked what is wrong with me. (While I do have several invisible chronic illnesses, not a single one of them played into my decision to remain childless, so they do not enter this conversation.)
Neither does the measure of my cultural production mitigate the implicit accusations, the sense (we all feel it) that I have erred. Still, weekly if not daily, I am called to explain my aberrational nature as a woman apparently capable of having children who won’t. It’s not only white cismen, of course, demanding justification. It’s my South Asian immigrant neighbors, the attendees of lecture tours, my students. They may express considerable investment in my work, but remain reasonably convinced that it is not enough. In such conversations, I feel it most clearly: being a woman who is a cultural producer is not sufficient to fulfill my obligation to society. I will not be valued, it seems, until I am a reproducer. I suspect this may be true as well for women incapable of having children, and may contribute to the sense women gain as they age that they are no longer of value to the world.
I submit this may be because, in a very basic way, IP has outlined a course of gender roles for masculine players and for feminine players and then policed them, at least partially, through the assignment of copyrights and patents. “We must understand intellectual property as social and cultural policy,” Sunder rightly contends of the laws that govern who we value as productive and who as reproductive. “Increasingly in the Knowledge Age, intellectual property laws come to bear on giant-sized values, from democracy and development to freedom and equality,” she argues. And indeed: these giant-sized values seem to show up in individual relationships—even between random partygoers, talking about the future.
My concern, and my frustration, is that this contributes to a sense that pregnancy is obligatory for women, yet cast as biological. If the entirety of intellectual property rights were established as a counterweight to women’s ability to birth babies, then women must keep reproducing as long as men keep producing. The logic of one proves the logic of the other. It’s starting to seem to me a sad race to make more stuff and more people to make use of it, not out of desire or joy, but because we are told we must, in a logic so deeply instilled we write it all off as natural.
I came to Chicago at the tail end of 1993, a moment that has been described as one of the most exciting in American music history. I went, at the time, to two or three shows a night, right up until I moved west in 1999. I wanted to go off-grid then, to escape not the electrical grid, but the urban one: the physical division of space along right angles, featuring sequentially ordered lot numbers, a logic that allows you to look up from any outdoor location in the city and know exactly how many houses you are away from your own. You can easily calculate how long it will take you to get to your house, or anywhere else, and the best means of crossing that terrain. And suddenly your life is efficient, predictable, and you feel accomplished for having figured out something so complex, simply by looking at an address. Even if you have just stumbled out of a show at the Abbey Pub at 12:45 drunk as shit and are trying to get to the Empty Bottle to catch the headliners before they climb offstage. A later analysis will reveal that each of the performers you traipsed around town for that night had different careers by the following decade’s end: one became an accountant, another joined the police force, two married and moved to a farm to raise twins.
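The grid’s promise of calculability is, at bottom, simple arithmetic: Chicago addresses count outward from the State and Madison baselines at roughly 800 address units to the mile, so any two addresses reduce to a distance and a travel time. Here is a minimal sketch of that arithmetic in Python; the venue coordinates are my approximations, the function names are mine, and the three-mile-per-hour walking pace is an assumption, not gospel:

```python
# A sketch of the grid arithmetic described above, assuming the
# standard Chicago convention of 800 address units per mile, counted
# out from the State Street and Madison Street baselines.

def grid_distance_miles(here, there):
    """Distance between two (west, north) address coordinates,
    walked at right angles, as the grid demands."""
    units = abs(here[0] - there[0]) + abs(here[1] - there[1])
    return units / 800  # 800 address units to the mile

def walking_minutes(here, there, mph=3.0):
    """Rough travel time on foot; the 3 mph pace is an assumption."""
    return grid_distance_miles(here, there) / mph * 60

# Abbey Pub, roughly 3420 west and 3800 north; Empty Bottle,
# roughly 2400 west and 1035 north. Coordinates approximate.
abbey, bottle = (3420, 3800), (2400, 1035)
print(f"{grid_distance_miles(abbey, bottle):.1f} miles,"
      f" about {walking_minutes(abbey, bottle):.0f} minutes on foot")
```

At a quarter to one, drunk, you would do better to flag a cab, but the point stands: the grid makes even that decision computable.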
You can calculate quite a bit about the future from any street corner in Chicago, but what you cannot predict is whether or not you will be happy there.
I found that first time away from the urban grid chaotic and distressing. Streets wandered aimlessly and adopted new names, there was no way of telling how long travel might take within the city, and people were often late for appointments. I returned to the grid four years later. I had to get on with my life. I wasn’t getting any younger.
Chicago’s grid system is more than a satisfying urban plan, however; it reflects the very origins of modern time, and the many efficiencies it birthed. Indeed, right at the intersection of Jackson and LaSalle downtown, there’s a plaque celebrating the adoption of standard time in 1883. Here’s a travel blog with details:
In the aftermath of the destruction of downtown Chicago by the Great Chicago Fire of October 1871, all of the central city was rebuilt, and pretty quickly, too. One of the architects in the city who got nearly more work than he could handle was W. W. Boyington, whose Chicago Water Tower and Pumping Station at Chicago Avenue and Michigan Avenue (then Pine Street) were among the very few structures that survived the fire. Among Boyington’s commissions after the fire were two on Jackson Boulevard at LaSalle Street—a prestigious address as the financial district was being rebuilt in that area. . . . On the northeast corner of the intersection was the Grand Pacific Hotel, a very swanky hostelry that stood across the street from the newly rebuilt Chicago Board of Trade Building, also designed by Boyington. . . . Then as now, the CBOT building had a large town clock by which many in the financial district set their pocket watches. But how was the rest of the city—or, for that matter, the rest of the state or the region—to agree upon the time?
Though it may seem strange now, there was no single national time system being used. Many towns and cities, especially along railroad lines, set their time according to the railroad’s clock. The railroads themselves depended on local noon as the determining factor, but that resulted in more than a hundred different local time zones. You can see how this made setting accurate train schedules difficult. . . . So when the railroads decided they had to do something to standardize the time system, you can bet everyone was interested.1
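The blog’s “more than a hundred different local time zones” is not an exaggeration; it is geometry. The earth turns 360 degrees in 24 hours, so local mean noon drifts four minutes for every degree of longitude. A quick sketch of that arithmetic, with approximate city longitudes and a function name of my own invention:

```python
# Local mean solar time shifts 4 minutes per degree of longitude
# (360 degrees / 24 hours = 15 degrees per hour). Longitudes below
# are approximate, in degrees west of Greenwich.

def solar_offset_minutes(lon_east, lon_west):
    """Minutes by which local noon at lon_west lags local noon
    at lon_east (both given in degrees west)."""
    return (lon_west - lon_east) * 4

PITTSBURGH, CHICAGO, ST_LOUIS = 80.0, 87.6, 90.2
print(round(solar_offset_minutes(PITTSBURGH, CHICAGO)))  # 30: Chicago's noon lags Pittsburgh's
print(round(solar_offset_minutes(CHICAGO, ST_LOUIS)))    # 10: St. Louis's noon lags Chicago's
```

A railroad running west thus picked up a slightly different local noon at nearly every stop, which is why the schedules were such a mess.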
You’ve probably heard this story before: the railways developed, and then implemented, standard time in order to ease travel across great distances. And people, because they love being on time for things, responded enthusiastically. Any closer inspection reveals this is a bald-faced lie, which we’ll get back to in a moment, but I can’t let you overlook the significant presumption embedded in this tale: that agreeing on a single notion of time is, in any way, desirable.
There was a time when I would have concurred that it was. And admittedly, it is hard to imagine a world that functions without being able to say, “Let’s all meet up here at 7:00 for this reading, and then I will read, and it will take between seven and ten minutes and then you can go home.” But just because that is how the world functions now does not mean that other functioning worlds are not possible.
Back to our travel blog:
[Boyington’s] Grand Pacific Hotel II . . . was one of the first . . . big, important hotels built in Chicago immediately after the fire. It occupied a square half-block bounded by Jackson, LaSalle, Quincy, . . . and Clark Street[s]. . . . The Grand Pacific was designed . . . in the then popular grand palazzo style and served both travelers and wealthy permanent residents, some of whom did business across the street [at the Board of Trade]. It was here that in 1883, delegates from all the U.S. and Canadian railroads held the General Time Convention to find a better, uniform way of setting the time.
Note, please, who we are told wanted time standardized: the robber barons who ran the railroads, the wealthy, and high-stakes financial investors. Let me take another moment to underscore that: the standardization of time did not emerge from a popular uprising.
A little over a century later, in 1989, an astronomer, employee of the US Army Laboratory Command, and time historian named Ian R. Bartky published an article called “The Adoption of Standard Time.” In it he revealed that standard time was not initiated by the railways at all; in fact, it was initiated by astronomers, who preferred to let the private interests of the railroads both do the dirty work of, and take the blame for, the fundamental shift that standardized time would require in the way US residents arranged their days and interacted with each other. Of course this was calculated. People already hated the rich, who ran the railroads, but harbored few to no opinions about the scientists who looked at stars. Those scientists, however, needed a better way of communicating across great distances of land what was happening at the same exact time. Why not see if the railways would get on board with this standard time thing, the astronomers figured, and do the dirty work? Pure science should not be sullied by such quibbles.