by Aaron Swartz
There’s just one problem: I enjoy deep discussions of punctuation and other trivialities. I could try to justify this taste—some argument that we should think about everything we do so that we don’t do everything we think about—but why bother? Do I have to justify enjoying certain television shows as well? At some point, isn’t pure enjoyment just enough? After all, time isn’t fungible.
But of course, the same drive that leads me to question punctuation leads me to question the drive itself, and thus this essay.
What is “this drive”? It’s the tendency to not simply accept things as they are but to want to think about them, to understand them. To not be content to simply feel sad but to ask what sadness means. To not just get a bus pass but to think about the economic reasons getting a bus pass makes sense. I call this tendency the intellectual.
The word “intellectual” has a bit of a bad rap. When I think of the word I hear a man with a southern accent sneering at it. But this stain seems appropriate—the idea has a bad rap.
And why is that? One reason is that many people simply don’t like to think about things. Perhaps it reminds them of school, which they didn’t enjoy, and they don’t want to go back there. Another is that they’re busy people—men of action—and they don’t have time to sit and think about every little detail. But mostly it’s just because they think it’s a waste of time. What’s the point? What difference does it make what you think about punctuation? It’s not going to affect anything.
This is the argument that’s often used when demonizing intellectuals. As Thomas Frank summarizes the argument:
The same bunch of sneaking intellectuals are responsible for the content of Hollywood movies and for the income tax, by which they steal from the rest of us. They do no useful work, producing nothing but movies and newspaper columns while they freeload on the labor of others.
When I think of intellectuals, though, I don’t really think of Hollywood producers or politicians or even newspaper columnists. But the people I do think of seem to have something else in common. They don’t just love thinking, they love language. They love its tricks and intricacies, its games, the way it gets written down, the books it gets written into, the libraries those books are in, and the typography those books use.
Upon reflection this makes perfect sense. Language is the medium of thought, and so it’s no surprise that someone who spends a lot of time thinking spends a lot of time thinking about how to communicate their thoughts as well. And indeed, all the intellectuals that come to mind write, not because they have to or get paid to, but simply for its own sake. What good is thinking if you can’t share?
This contrasts with how intellectuals are commonly thought of—namely as pretentious elitist snobs. But real intellectuals, at least in the sense I’m using the term, are anything but. They love nothing more than explaining their ideas so that anyone who’s interested can understand them. They only seem pretentious because discussing such things is so bizarre.
This stereotype actually seems more like the caricature of the academic than the intellectual. (It’s perhaps worth noting that most of the intellectuals I can think of aren’t academics or at least have left the academy.) Far from being intellectuals, academics are encouraged to be almost the opposite. Instead of trying to explain things simply, they’re rewarded for making them seem more complicated. Instead of trying to learn about everything, they’re forced to focus in on their little subdiscipline. Instead of loving books, they have to love gabbing—up in front of class or at office hours with students or at professional conferences or faculty meetings.
Not that there’s anything wrong with that. At the beginning I declined to justify my being an intellectual on any grounds other than pure personal enjoyment. And here, at the end, I can’t think of any better justification. Certainly people should think deeply about their actions and the world’s problems and other important topics. But the other ones? That’s little more than personal preference.
Getting It Wrong
http://www.aaronsw.com/weblog/gettingitwrong
October 12, 2006
Age 19
Anyone who’s spent any time around little kids in school, or even read books about people who have, knows that they’re terrified of getting the answer wrong. Geez, you don’t even need to hang around little kids. When you’re out chatting with a bunch of people and you say something that shows you didn’t know something, you look embarrassed. When you’re playing a video game and not doing well, you try to come up with an excuse. People hate failing, so much so that they’re afraid to try.
Which is a problem, because failing is most of what we do, most of the time. The only way to stretch your abilities is to try to do things a little bit beyond them, which means you’re going to fail some of the time. Even weirder are the competitive situations. If I’m playing a game that relies solely on practice against someone who’s practiced more than me, I’m probably going to lose, no matter how good a person I am. Yet I still feel degraded when I do.
Anyone who wants to build a decent educational environment is going to need to solve this problem. And there seem to be two ways of doing it: try to fix the people so that they don’t feel embarrassed at failing, or try to fix the environment so that people don’t fail. Which option to pick sometimes gets people into philopolitical debates (trying to improve kids’ self-esteem means they won’t be able to handle the real world! Preventing kids from experiencing failure is just childish coddling!), but for now let’s just be concerned with what works.
Getting people to be OK with being wrong seems tough, if only because everybody I know has this problem to a greater or lesser degree. There are occasional exceptions—mavericks like Richard Feynman (why do you care what other people think?) often seem fearless, although it’s hard to gauge how much of that was staged—but these just seem random, with no patterns suggesting why.
It seems quite likely that a lot of the fear is induced by a goal-oriented educational system, obsessed with grades for work (A, B, C) and grades for students (1st, 2nd, 3rd). And perhaps the fear of being wrong you see in older people stems from having been through such experiences in childhood. If this is the case, then simply building a decent non-coercive environment for children will solve the problem, but that seems like too much to hope for.
Perhaps the solution is in, as some suggest, building self-esteem, so that when kids are wrong on one thing, they have other things to fall back on. I certainly see this process operating in my own mind: “Pff, sure they can beat me in Guitar Hero, but at least I can go back to writing blog entries.” But self-esteem is like a cushion: it prevents the fall from being too damaging, but it doesn’t prevent the fall.
The real piece, it would seem, is finding some way to detach a student’s actions from their worth. The reason failing hurts is because we think it reflects badly on us. I failed, therefore I’m a failure. But if that’s not the case, then there’s nothing to feel hurt about.
Detaching a self from your actions might seem like a silly thing, but lots of different pieces of psychology point to it. Richard Layard, in his survey Happiness: Lessons from a New Science, notes that studies consistently find that people who are detached from their surroundings—whether through Buddhist meditation, Christian belief in God, or cognitive therapy—are happier people. “All feelings of joy and even physical pain are observed to fluctuate, and we see ourselves as like a wave of the sea—where the sea is eternal and the wave is just its present form” (p. 191).
Similarly, Alfie Kohn, who looks more specifically at the studies about children, finds that it’s essential for a child’s mental health that parents communicate that they love their child for who they are, no matter what it is they do. This concept can lead to some nasty philosophical debates—what are people, if not collections of things done?—but the practical implications are clear. Children, indeed all people, need unconditional love and support to be able to survive in this world. Attachment parenting studies find that even infants are afraid to explore a room unless their mother is close by to support them, and the same results have been found in monkeys.
The flip side is: how do we build educational institutions that discourage these ways of thinking? Obviously we’ll want to get rid of competition as well as grades, but even so, as we saw with Mission Hill, kids are scared of failure.
While I’m loath to introduce more individualism into American schools, it seems clear that one solution is to have people do work on their own. Kids are embarrassed in front of the class, shy people get bullied in small groups, so all that really leaves is to do it on your own.
And this does seem effective. People seem more likely to ask “stupid” questions if they get to write them down on anonymous cards. When people fail in a video game, it only makes them want to try again right away so they can finally beat it. Apparently when nobody knows you’re getting it wrong, it’s a lot easier to handle it. Maybe because you know it can’t affect the way people see you.
Schools can also work to discourage this kind of conditional seeing by making it completely unimportant. Even Mission Hill, which ensured every classroom was mixed-age, still had a notion of age and clear requirements for graduating. What if school, instead of a bunch of activities you had to march through, was a bunch of activities students could pick and choose from? When people are no longer marching, it’s hard to be worried about your place in line.
But can we take the next step? Can schools not just see their students unconditionally, but actually encourage them to see themselves that way? Clearly we could teach everybody Buddhist meditation or something (which, studies apparently show, is effective), but even better would be if there was something in the structure of the school that encouraged this way of thinking.
Removing deadlines and requirements should help students live more fully in the moment. Providing basic care to every student should help them feel valued as people. Creating a safe and trusting environment should free them from having to keep track of how much they can trust everyone else. And, of course, all the same things would be positive in the larger society.
Too often, people think of schools as systems for building good people. Perhaps it’s time to think of them as places to let people be good.
EPILOGUE
Legacy
http://www.aaronsw.com/weblog/legacy
June 1, 2006
Age 19
Ambitious people want to leave legacies, but what sort of legacies do they want to leave? The traditional criterion is that your importance is measured by the effect of what you do. Thus the most important lawyers are the Supreme Court justices, since their decisions affect the entire nation. And the greatest mathematicians are those that make important discoveries, since their discoveries end up being used by many who follow.
This seems quite reasonable. One’s legacy depends on one’s impact, and what better way to measure impact than by the effect of what you’ve done? But this is measuring against the wrong baseline. The real question is not what effect your work had, but what things would be like had you never done it.
The two are not at all the same. It is rather commonly accepted that there are “ideas whose time has come,” and history tends to bear this out. When Newton invented the calculus, so did Leibniz. When Darwin discovered evolution through natural selection, so did Alfred Russel Wallace. When Alexander Graham Bell invented the telephone, so did Elisha Gray (before him, arguably).
In these cases the facts are plain: had Newton, Darwin, and Bell never done their work, the result would have been largely the same—we’d still have calculus, evolution, and the telephone. And yet such people are hailed as major heroes, their legacies immortalized.
Perhaps, if one only cares about such things, this is enough. (Although this seems a rather dangerous game, since the future could wake up at any moment and realize its adulation is misplaced.) But if one genuinely cares about one’s impact, rather than simply how that impact is perceived, more careful thought is in order.
I once spent time with a well-known academic, who had published several works widely recognized as classics even outside his field, and he offered some career advice in the sciences. (Actually, come to think of it, there are two people of whom this is true, suggesting the phenomenon has broader significance.) Such-and-such a field is very hot right now, he said, you could really make a name for yourself by getting into it. The idea being that major discoveries were sure to follow soon and that if I picked that field I could be the one to make them.
By my test, such a thing would leave a poor legacy. (For what it’s worth, I don’t think either person’s works fall into this category; that is to say, their reputations are still deserved even by these standards.) Even worse, you’d know it. Presumably Darwin and Newton didn’t begin their investigations because they thought the field was “hot.” They thought that, through doing it, they would have a significant impact, even though that turned out to be wrong. But someone who joined a field simply because they thought a major discovery would soon come from it could never enjoy such a delusion. Instead, they would know that their work would make little difference, and would have to labor under that knowledge.
The same is true of other professions we misconceive as being important. Take being a Supreme Court justice, for example. Traditionally, this is thought of as a majestic job in which one gets to make decisions of great import. In fact, it seems to me that one has little impact at all. Most of your impact was made by the politics of the president who appointed you. Had you not been around for the job, he would have found someone else who would take similar positions. The only way to have a real impact as a Supreme Court justice would be to change your politics once appointed to the bench, and the only way you could prepare for such a thing would be to spend the majority of your career doing things you thought were wrong in the hopes that one day you might get picked for the Supreme Court. That seems a rather hard pill to swallow.
So what jobs do leave a real legacy? It’s hard to think of most of them, since by their very nature they require doing things that other people aren’t trying to do, and thus include the things that people haven’t thought of. But one good source of them is trying to do things that change the system instead of following it. For example, the university system encourages people to become professors who do research in certain areas (and thus many people do this); it discourages people from trying to change the nature of the university itself.
Naturally, doing things like changing the university are much harder than simply becoming yet another professor. But for those who genuinely care about their legacies, it doesn’t seem like there’s much choice.
CONTRIBUTOR BIOS
Aaron Swartz (1986–2013) was an American computer programmer, a writer, a political organizer, and an Internet hacktivist. He was involved in the development of RSS, Creative Commons, web.py, and Reddit. He helped launch the Progressive Change Campaign Committee in 2009 and founded the online group Demand Progress. He is survived by his parents and two brothers, who live in Chicago.
Lawrence Lessig is the director of the Edmond J. Safra Center for Ethics at Harvard University and a professor of law at Harvard Law School. He was a founding board member of Creative Commons. He lives in Cambridge, Massachusetts.
Benjamin Mako Hill is an assistant professor in the Department of Communication at the University of Washington and a faculty affiliate at the Berkman Center for Internet and Society at Harvard. He is a participant and leader in free software and free culture communities.
Seth Schoen is senior staff technologist at the Electronic Frontier Foundation in San Francisco, where he worries about technology users’ freedom and autonomy. He and Aaron were friends for over a decade; they first met at the U.S. Supreme Court in 2002.
David Auerbach is a writer and software engineer who lives in New York. He writes the “Bitwise” column for Slate.
David Segal is the executive director and co-founder of the activism organization Demand Progress. He previously served as a member of the Providence City Council and as a Rhode Island state representative. He ran for Congress in 2010, backed by much of the “netroots,” organized labor, and the Rhode Island progressive movement. During his tenure at Demand Progress he has helped lead various grassroots efforts to protect Internet freedom, including the successful defeat of the Stop Online Piracy Act (SOPA). He co-edited and wrote much of a book about that effort, called Hacking Politics. His writing on public policy matters has appeared in a variety of publications. He holds a degree in mathematics from Columbia University.
Henry Farrell is associate professor of political science and international affairs at George Washington University. He works on a variety of topics, including trust, the politics of the Internet, and international and comparative political economy. He has written articles and book chapters as well as a book, The Political Economy of Trust: Interests, Institutions and Inter-Firm Cooperation, published by Cambridge University Press.
Cory Doctorow is a Canadian-British blogger, journalist, and science fiction author who serves as co-editor of the blog Boing Boing. He is an activist in favor of liberalizing copyright laws and a proponent of the Creative Commons organization, using some of its licenses for his books. Some common themes of his work include digital rights management, file sharing, and post-scarcity economics. His novels include Down and Out in the Magic Kingdom and Little Brother.
James Grimmelmann is a professor of law at the University of Maryland. He studies how laws regulating software affect freedom, wealth, and power.
Astra Taylor is a writer and documentary filmmaker. Her films include Žižek!, a feature documentary about the world’s most outrageous philosopher, which was broadcast on the Sundance Channel, and Examined Life, a series of excursions with contemporary thinkers. Her writing has appeared in The Nation, Salon, Monthly Review, The Baffler, and other publications. Her most recent book is The People’s Platform. She lives in New York City.