
The Best American Magazine Writing 2016


by Sid Holt


  Code culture is very, very broad, but the geographic and cultural core is the Silicon Valley engine of progress. The Valley mythologizes young geniuses with vast sums. To its credit, this culture works; to its shame, it doesn’t work for everyone.

  At any moment some new thing could catch fire and disrupt the tribal ebb and flow. Instagram was written in Python and sold for $700 million, so Python had a moment of glory. The next mind-blowing app could show up, written in some new language—and suddenly everyone would start taking that language more seriously. Within eighteen months your skills could be, if not quite valueless, suspect.

  I was in a meeting once where someone said, “How long will it take to fix that?” One person, who’d been at the company for years, said, “Three months.” A new person, who’d just come from a world of rapidly provisioned cloud microservices, said, “Three minutes.” They were both correct. That’s how change enters into this world. Slowly at first, then on the front page of Hacker News.

  Programmers carve out a sliver of cognitive territory for themselves and go to conferences, and yet they know their position is vulnerable. They get defensive when they hear someone suggest that Python is better than Ruby, because [insert 500-comment message thread here]. Is the next great wave swelling somewhere, and will it wash away Java when it comes? Will Go conquer Python? Do I need to learn JavaScript to remain profitable? Programmers are often angry because they’re often scared. We are, most of us, stumbling around with only a few candles to guide the way. We can’t always see the whole system, so we need to puzzle it out, bit by bit, in the dark.

  4.2 The Thing About Real Artists Is That They—

  As a class, programmers are easily bored, love novelty, and are obsessed with various forms of productivity enhancement. God help you if you’re ever caught in the middle of a conversation about nutrition; standing desks; the best keyboards; the optimal screen position and distance; whether to use a plain text editor or a large, complex development environment; chair placement; the best music to code to; the best headphones; whether headphone amplifiers actually enhance listening; whether open-plan offices are better than individual or shared offices; the best bug-tracking software; the best programming methodology; the right way to indent code and the proper placement of semicolons; or, of course, which language is better. And whatever you do, never, ever ask a developer about productivity software.

  Meanwhile, the executives who run large programming teams have to actually ship software. “Ship” is a cult word. If they don’t ship on time, managers could get a lower rating on their performance reviews and end up making only inordinate, as opposed to obscene, amounts of money. Wine cellars are at risk, not to mention alimony payments. As managers, their job—along with all the trust falls and consensus-building and active listening—is to reduce ship risk, which comes in many forms: bad bugs; features that were promised to bosses or clients that distract from boring, utterly necessary features; or test servers that crash at night.

  One of the greatest ship risks is anything shiny. This is where languages are particularly risky. An experienced and talented programmer can learn a language in a week, but a middling one is going to take much longer. Meanwhile, exciting, interesting programming languages always come with a list of benisons, promises of speed or productivity or just happiness. No, really. Happiness is a serious selling point for languages, and people have written blog posts where they analyze how people discuss code. According to an analysis by GitHub user Tobias Hermann, PHP coders are far more likely to use the word “hate” in their Reddit comments than Clojure programmers; Clojure programmers are far more likely to use the word “cool” than PHP programmers.

  There are many blog posts on how to persuade your manager to switch to a new language. Experienced managers, who bear scars and were often coders themselves, become practiced at squinting and coughing and saying things like, “No, the switching cost is just too high right now,” or, “Maybe we could do a two-week trial project when we build the analytics reporting engine.”

  Then the programmers shuffle back to their standing desks and complain until the product is shipped. Or else they just quit, because Lord knows there are jobs out there. For programmers, particularly the young ones, there are jobs everywhere.

  Managers and old coders have fewer options. It’s often better to just keep working and shipping, even if the code starts to look ugly, even if there are nominally better solutions, even as the technical debt accrues around you because in a few years everything will change. Maybe you’ll get promoted and the new manager will have the will and motive to tear up everything you did, cursing, and start again (perhaps using a new language) with the goal of making something much simpler. Or the entire industry will spasm and everything you’ve done will need to be thrown away and rebuilt along new lines anyway. (From desktop to web, from web to mobile, from mobile to…quantum? Who knows. But there’s always something.)

  Some code is beautiful and you want to read it, reuse it, and take it into your program and your heart. Some code is annoying and pretentious, and some code looks good at first and turns out to be kind of mean. Estimating code quality is a big part of programming. Go on—judge.

  Somehow it keeps working out. The industry is always promising to eat itself, to come up with a paradigm so perfect that we can all stop wasting our time and enter a world of pure digital thought. It never happens.

  4.3 We Still Need to Choose…

  Nine weeks into the re-architecture, you have asked TMitTB to come by the office and talk about next steps.

  You’ve noticed that his team has started to dress like him. One of the women is in tall boots and has done something complex with her hair. She’s wearing a black leather jacket. Nothing ostentatious, just cooler. She was previously all Patagonia. Is this how programmers dress? How did they get their own executive style?

  “PHP,” he says, “well—it is what it is. The team had a good time at PHP[world]. But I think the thing we might have learned…”

  He doesn’t pronounce the brackets, of course, but you approved the expense, and that’s how they write it, bracketed. It’s good they had a good time, because it cost you $25,000 to send them to that conference and put them in hotels and feed them, and you have no idea whether that was money well spent or not.

  “…is that we really need to move off of PHP.”

  Oh. Well. There’s your answer.

  “We’re all agreed that PHP isn’t the language for our next five years.”

  “Which one would you say is?”

  “Ay, there’s the rub,” he says, and you have to remind yourself to not show him your real face right now. If he quotes Hamlet again, though…

  “Well,” you ask, “which language do you want to use?”

  He looks confused. “I mean, it doesn’t matter,” he says. “I don’t write the code.”

  Then who does? And you realize, right now, the answer is no one.

  5.1 What Is the Relationship Between Code and Data?

  Data comes from everywhere. Sometimes it comes from third parties—Spotify imports big piles of music files from record labels. Sometimes data is user-created, like e-mails and tweets and Facebook posts and Word documents. Sometimes the machines themselves create data, as with a Fitbit exercise tracker or a Nest thermostat. When you work as a coder, you talk about data all the time. When you create websites, you need to get data out of a database and put it into a webpage. If you’re Twitter, tweets are data. If you’re the IRS, tax returns are data, broken into fields.

  Data management is the problem that programming is supposed to solve. But of course now that we have computers everywhere, we keep generating more data, which requires more programming, and so forth. It’s a hell of a problem with no end in sight. This is why people in technology make so much money. Not only do they sell infinitely reproducible nothings, but they sell so many of them that they actually have to come up with new categories of infinitely reproducible nothings just to handle what happened with the last batch. That’s how we ended up with “big data.” I’ve been to big-data conferences and they are packed.

  5.2 Where Does Data Live?

  It’s rare that a large task is ever very far from a database. Amazon, Google, Yahoo!, Netflix, Spotify—all have huge, powerful databases.

  The most prevalent is the relational database, using a language called SQL, for Structured Query Language. Relational databases represent the world using tables, which have rows and columns. SQL looks like this:

  SELECT * FROM BOOKS WHERE ID = 294;

  This implies that there’s a table called BOOKS and that one of its rows holds a book with an ID of 294. IDs are important in databases. Imagine a bookstore database. It has a customer table that lists customers. It has a books table that lists books. And it has a clever in-between table of purchases with a row for every time a customer bought a book.
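That three-table bookstore can be sketched in a few lines of real SQL. Python's standard library happens to embed the SQLite engine mentioned below, so the sketch is runnable as-is; the table and column names here are illustrative inventions, not anyone's actual schema.

```python
import sqlite3

# An in-memory database; the sqlite3 module embeds a full SQL engine.
db = sqlite3.connect(":memory:")

# The three tables: customers, books, and the in-between purchases table.
db.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE books     (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE purchases (customer_id INTEGER, book_id INTEGER);
""")

db.execute("INSERT INTO customers VALUES (1, 'Ada')")
db.execute("INSERT INTO books VALUES (294, 'Moby-Dick')")
# One row for every time a customer bought a book.
db.execute("INSERT INTO purchases VALUES (1, 294)")

# Who bought what? Join the in-between table to both of its neighbors.
rows = db.execute("""
    SELECT customers.name, books.title
    FROM purchases
    JOIN customers ON customers.id = purchases.customer_id
    JOIN books     ON books.id     = purchases.book_id
""").fetchall()

print(rows)  # [('Ada', 'Moby-Dick')]
```

The purchases table holds nothing but pairs of IDs, which is the whole trick: rows in one table point at rows in others, and the JOIN stitches them back together on demand.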

  Congratulations! You just built Amazon! Of course, while we were trying to build a bookstore, we actually built the death of bookstores—that seems to happen a lot in the business. You set out to do something cool and end up destroying lots of things that came before.

  Relational databases showed up in the 1970s and never left. There’s Oracle, of course. Microsoft has SQL Server; IBM has DB2. They all speak SQL and work in a similar manner, with just enough differences to make it costly to switch.

  Oracle makes you pay thousands of dollars to use its commercial enterprise database, but more and more of the world runs on free software databases such as PostgreSQL and MySQL. There’s even a tiny little database called SQLite that’s so small, so well-behaved, and so permissively licensed that it’s now in basically every smartphone, available to apps to help them save and load data. You probably have a powerful SQL-driven database in your pocket right now.

  5.3 The Language of White Collars

  If you walk up to some programmers and say, “Big corporate programming,” they’ll think of Java.

  Java is a programming language that was born at Sun Microsystems (RIP), the product of a team led by a well-regarded programmer named James Gosling. It’s object-oriented, but it also looks a lot like C and C++, so for people who understood those languages, it was fairly easy to pick up. It was conceived in 1991, eventually floating onto the Internet on a massive cloud of marketing in 1995, proclaimed as the answer to every woe that had ever beset programmers. Java ran on every computer! Java would run right inside your web browser, in “applets” (soon called “crapplets”), and would probably take over the Web in time. Java! It ran very slowly compared with more traditional languages such as C. What was it for? Java! They also had network-connected computer terminals called JavaStations. Java! Kleiner Perkins Caufield & Byers even announced a $100 million Java fund in 1996. But after all that excitement, Java sort of…hung out for a while. The future didn’t look like Sun said it would.

  Java running “inside” a web browser, as a plug-in, never worked well. It was slow and clunky, and when it loaded it felt like you were teetering on the edge of disaster, a paranoia that was frequently validated when your browser froze up and crashed. Java-enabled jewelry, meant to serve as a kind of digital key/credit card/ID card, also had a low success rate. But Java was free to download and designed to be useful for small and large teams alike.

  Here are some facts about Java to help you understand how it slowly took over the world by the sheer power of being pretty good.

  It was a big language. It came with a ton of code already there, the “class library,” which had all the classes and methods you’d need to talk to a database, deal with complex documents, do mathematics, and talk to various network services. There were a ton of classes in that library waiting to be turned into objects and brought to life.

  It automatically generated documentation. This was huge. Everyone says code deserves excellent documentation and documentation truly matters, but this is a principle mostly proven in the breach. Now you could run a tool called javadoc, and it would make you webpages that listed all the classes and methods. It was lousy documentation, but better than nothing and pretty easy to enhance if you took the time to clean up your code.

  There were a lot of Java manuals, workshops and training seminars, and certifications. Programmers can take classes and tests to be officially certified in many technologies. Java programmers had an especially wide range to choose from.

  It ran on a “virtual” machine, which meant that Java “ran everywhere,” which meant that you could run it on Windows, Mac, or Unix machines and it would behave the same. It was an exceptionally well-engineered compromise. Which made it perfect for big companies. As the 2000s kept going, Java became more popular for application servers. Creating a content management system for a nongovernmental organization with 2,000 employees? Java’s fine. Connecting tens of thousands of people in a company to one another? Java. Need to help one bank talk to another bank every day at 5:01 p.m.? Java. Charts and diagrams, big stacks of paper, five-year projects? Java. Not exciting, hardly wearable, but very predictable. A language for building great big things for great big places with great big teams.

  People complain, but it works.

  5.5 Liquid Infrastructure

  “Enterprise” is a feared word among programmers because enterprise programming is a lot of work without much to show for it. Remember healthcare.gov, the first version that was a total disaster? Perfect example of enterprise coding. At the same time, programmers respect big systems—when they work. We respect the ambition of huge heavy machines running big blobs of code. We grew up reading about supercomputers. Big iron is cool, even if the future seems to be huge cloud platforms hosting tons of cheap computers.

  But Java is also in wide use at Google. It’s a language for places such as General Electric and Accenture. These aren’t startups, but if their product schedules slip, so does their revenue, and they are beholden to the public markets. Gigantic data-driven organizations are structured around code, around getting software made. But that doesn’t mean their teams are huge—Amazon, for example, is famous for its two-pizza rule: “Never have a meeting where two pizzas couldn’t feed the entire group.”

  These companies have cultures that know how to make software. They have whole departments dedicated to testing. The process is important because there are so many moving pieces, many of them invisible.

  Academic researchers often produce things that basically work but don’t have interfaces. They need to prove their theses, publish, and move on to the next thing. People in the free-software community often code to scratch an itch and release that code into the digital commons so that other people can modify and manipulate it. While more often than not this process goes nowhere, over time some projects capture the imagination of others and become part of the infrastructure of the world.

  Java, interestingly, profits from all this. It’s designed for big corporate projects and has the infrastructure to support them. It’s also a useful language for midsize tasks. So the libraries that you need to do things—image processing, logging to files, full-text search—keep appearing at a steady clip, improving on the standard libraries or supplanting them entirely.

  Eventually, people realized that if they didn’t like the Java language, they could write other languages that compile to Java “bytecode” and run on the Java virtual machine (JVM). So there are now many languages that run on top of Java. Some are versions of well-known languages, such as Jython and JRuby. Others are totally new, like Scala, which is one of the languages that Twitter began to use when it outgrew Ruby.

  The point is that things are fluid in the world of programming, fluid in a way that other industries don’t seem to be. Languages are liquid infrastructure. You download a few programs and, whoa, suddenly you have a working Clojure environment. Which is actually the Java Runtime Environment. You grab an old PC that’s outlived its usefulness, put Linux on it, and suddenly you have a powerful web server. Now you can participate in whole new cultures. There are meetups, gatherings, conferences, blogs, and people chatting on Twitter. And you are welcomed. They are glad for the new blood.

  Java was supposed to supplant C and run on smart jewelry. Now it runs application servers, hosts Lisplike languages, and is the core language of the Android operating system. It runs on billions of things. It won. C and C++, which it was designed to supplant, also won. A lot of things keep winning because computers keep getting more plentiful. It’s weird.

  5.7 What About JavaScript?

  Remember Netscape, the first huge commercial web browser? In 1995, as Java was blooming, Netscape had a problem to solve. It displayed webpages that were not very lively. You could have a nice cartoon of a monkey on the webpage, but there was no way to make the monkey dance when you moved your mouse over it. Figuring out how to make that happen was the job of a language developer named Brendan Eich. He sat down and in ten days created a language called JavaScript.

  JavaScript’s relationship with Java is tenuous; the strongest bond between the languages is the marketing linkage of their names. And the early history of JavaScript was uninspiring. So the monkey could now dance. You could do things to the cursor, make things blink when a mouse touched them.

  But as browsers proliferated and the Web grew from a document-delivery platform into a software-delivery platform, JavaScript became, arguably, the most widely deployed language runtime in the world. If you wrote some JavaScript code, you could run it wherever the Web was—everywhere.

  JavaScript puttered around for years in the wilderness, as Java did, too, but without the resolute support of a corporate entity like Sun.

  Then, about a decade ago, people began to talk about Ajax—the idea that you could build real software into a webpage, not just a document, but a program that could do real work.

 
