The Silo Effect


by Gillian Tett


  It was a message he hoped to keep spreading. By 2014, he had moved on once again in his career, leaving City Hall to return to the private sector. But in his spare time, he worked as a fellow in urban science at the University of Chicago, where he taught classes on how to help governments use data more effectively.21 He hoped this would help convince some of the young computer scientists to jump across boundaries and silos. Most of the young tech geeks he saw dreamed of being the next Mark Zuckerberg, working in glamorous, freewheeling start-ups. To them, the idea of working for government of any sort was anathema. But Goldstein hoped he might broaden their minds. “We need to get more technology people into government,” he observed. “So I try to tell the students to think of doing something different.”

  To mark his fortieth birthday milestone, his wife contacted colleagues and friends and asked them to send a birthday email greeting. To her surprise, fifty-nine of his former colleagues sent affectionate messages. Some came from City Hall and the Chicago Police Headquarters. Others were sent by his tech friends at OpenTable. But there was also a message from Rod Gardner, his former training officer in the 11th District, the man who had once cornered him in the police gym.

  “It said: ‘I always thought that you were an [FBI] plant. LOL!’ ” Goldstein recalled. He liked to think it was a compliment. But it was also a small testament to how surprising life can sometimes be. Particularly when you are willing to jump out of your normal box.

  6

  (RE)WRITING SOCIAL CODE

  How to Keep Silos Fluid

  “We want to be the anti-Sony, the anti-Microsoft—we look at companies like that and see what we don’t want to become.”

  —Senior Facebook executive

  JOCELYN GOLDFEIN SAT AT HER desk in a scruffy, open-plan office at the Palo Alto headquarters of Facebook, and felt a flush of shame wash over her. She stared with horror at her computer. A voice in her head asked: How could I be so stupid?

  Five weeks earlier in the summer of 2010, Goldfein, thirty-nine, had joined the fast-growing social media giant, hoping to build a new chapter in her career. The move had seemed wildly exciting. A no-nonsense woman with sleek brown hair and a cheerful, dimpled face, Goldfein was a rare creature by Silicon Valley standards: a female computer scientist from the hallowed ranks of Stanford University who held a top management role. Before joining Facebook, she had worked for seven years at VMware, a group that made cloud computing technology. She started as a computer engineer who loved to write code; she was particularly passionate about “triaging bugs,” as she liked to say (computer jargon for sorting and fixing glitches in code). “At VMware, I built a name for myself by being like this monstrous triager of bugs! I, like, triaged one thousand bugs in my first month at VMware!” she later recounted. But by the time she left VMware in 2010, she had been promoted to general manager of engineering, a role that put her in charge of hundreds of engineers.1 That made her a prime catch for a fast-growing company such as Facebook. And though Goldfein did not initially have much interest in the social media group, since she hated working for big, bureaucratic companies, she was converted to the Facebook dream after she met its founder, Mark Zuckerberg.2 “There was no question after I met him that Mark was by far the most impressive founder of all the founders I met. It’s sort of a trite thing to say, in hindsight, that’s like, ‘duh.’ But he was amazing.”

  So in July 2010, she turned up at Facebook’s trendy “warehouse” style offices in Palo Alto. But then events took an unusual twist: instead of being thrown into a management job, she was told to join an “onboarding” induction course known as “Boot-camp,” a six-week training program for new recruits.3 This request was unusual, particularly given that she had already been deputy head of engineering of a vast company. But everybody joining Facebook got hazed together, no matter their age and rank; it was a company rule that all new entrants should experience a joint introduction process, like recruits joining an army—or Goldstein joining the Chicago police.

  So the new recruits were all pulled together into a room and asked to start working on some beginner projects, sitting side by side at a desk. Goldfein was treated just like the rookies, and given a humdrum task: triaging five bugs in the system. She was thrilled; fixing bugs was her speciality. Like most computer engineers, she had loved solving problems ever since she was a child, a trait she first learned from her grandmother in Northern California. “This woman did logic puzzles for fun; she would solve Rubik’s Cubes as a hobby,” Goldfein said. “She taught me to do those when I was a kid and it was sort of an epiphany when I got to programming. It’s sort of the same thing.”4 But as she started battling with her bugs, Goldfein noticed something odd: three out of the five bugs she was supposed to be fixing did not seem to pose any problem after all.5 Is it a trick? She suspected so. But then she realized that there was a simpler explanation: Facebook was such a fast-growing company that its computer scientists were rewriting code at a furious pace. Any bugs attached to the old systems kept disappearing from view since those pieces of code were just not being used anymore.

  Did that matter? Most computer engineers might say no. In Silicon Valley people prefer to build new products for the future, not clean up boring problems from the past. But Goldfein had a tidy, precise mind that liked to keep her world neat.6 “Bugs don’t sound like a very sexy thing, but if you are determined to ship high-quality software, one of the things you need is information about what is the state of the software,” she explained. “A bug database can be a faithful representation of the state of the world . . . [but only] if you are really scrupulous about your bug hygiene.”

  So as she sat at her desk, she decided to improve Facebook’s hygiene by creating a program that would track old bugs, send emails to anybody who had ever worked on the code, and then kill the obsolete bugs. “At Facebook things move so fast that if something has been untouched for three months, it is a pretty good sign that it is irrelevant. So [this system] would trigger an email to everybody copied on the bug and then if another three months went by and nobody responded . . . it would just auto-close it.”
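  The auto-close rule Goldfein describes is simple enough to sketch. Below is a minimal illustration in Python of that two-step logic—nudge everyone copied on a stale bug, then close it if a further quiet period passes—not Facebook’s actual code; the Bug record, its field names, and the send_email function are all invented for the example.

    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from typing import Callable, List

    STALE = timedelta(days=90)  # the "three months" in Goldfein's account

    @dataclass
    class Bug:  # hypothetical record; any activity is assumed to update last_touched
        id: int
        last_touched: datetime
        subscribers: List[str]
        nudged: bool = False
        status: str = "open"

    def reap(bugs: List[Bug], send_email: Callable[[str, str], None],
             now: datetime) -> None:
        """One pass of the reaper over the bug database."""
        for bug in bugs:
            if bug.status != "open":
                continue
            idle = now - bug.last_touched
            if bug.nudged and idle >= 2 * STALE:
                bug.status = "auto-closed"  # nobody answered the nudge
            elif not bug.nudged and idle >= STALE:
                for person in bug.subscribers:  # everybody ever copied on the bug
                    send_email(person, f"Bug {bug.id} looks dead. Still relevant?")
                bug.nudged = True

  Run in one burst over a backlog of thousands of long-dead bugs, a loop like this would fire off an enormous volume of email within seconds—which is roughly what happened next.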

  She christened her program the “Task Reaper” and she set about testing her creation on a small scale. But then disaster struck. As she typed code on the computer, she accidentally pressed the copy and paste keys on her keyboard and implanted her embryonic code into the entire live Facebook system. Within seconds, the pilot Task Reaper had identified 14,000 dead bugs in the system and dispatched hundreds of thousands of emails to all the associated Facebook staff. The volume was overwhelming. The company’s email system crashed, freezing the Facebook network, and locking all the staff out of their messages.7 Furious howls erupted across the office. Goldfein was horrified. Crashing the office computers was an appalling mistake for a rookie employee to make. She assumed there would be unpleasant repercussions. “I was brand-new, nobody knew me from Adam—and there was a pretty strong reaction,” she explained. “Our sales team couldn’t interact with customers, engineers couldn’t do code reviews. The email [was] kind of the lifeblood of the company.”

  But then she encountered another surprise: when other Facebook staff rushed over to find out what had happened, they seemed more interested in asking why she had created her Task Reaper program than in punishing her for her mistake or complaining that she had trespassed into the territory of two existing departments, the “exchange” team and the “bug tools” team. “There was no one saying: ‘How dare you?’ ” she recounted. “The people I expected to be most mad at me actually just sort of rolled up their sleeves, waded in, and started fixing the problem.”

  This reaction was different from anything Goldfein had experienced. At other big companies, teams tended to guard their turf from outsiders—and each other—and there was so much competition between teams that they did not welcome incursions. Indeed, it was because she had seen so much tribal infighting at other big technology companies that she had initially been so wary of joining Facebook. She hated large bureaucracies—and silos.

  But as she looked around at Facebook, she realized that the social media giant seemed different from what she had seen before. It was not just that the company’s work practices seemed somewhat unstructured, like the graffiti used to decorate its walls.8 It also seemed less plagued by the internal rivalries and rigidities seen at its rivals. The silos and bureaucratic structures that had undermined Sony did not seem to exist inside Facebook, or not as far as Goldfein could see.

  Was that just a happy accident? At the time, Goldfein did not know. But the answer is crucial to the bigger themes of this book. In the first half, I described how silos can sometimes have a pernicious impact on institutions and social groups, making people blind to risks and opportunities. The tale of Sony that I described in Chapter Two was a classic example of how silos can crush innovation: there, its once creative engineers ended up embroiled in endless turf wars, unwilling or unable to cooperate. However, this tale was not unique to Sony, or Japanese culture; destructive silos exist in many large institutions, even—or especially—those which have been successful in the past. Microsoft, General Motors, and UBS are just some examples.

  But what is perhaps even more interesting than the question of why some individuals and institutions are damaged by silos is why some groups do not suffer from this curse to the same degree. Why do some companies, people, or entities avoid the type of turf warfare and tunnel vision that beset Sony and UBS? What can we do to avoid those problems? In the last chapter, I suggested one micro-level reason: people who are willing to take risks and jump out of their narrow specialist world are often able to remake boundaries in interesting ways. Traveling in a mental sense, if not in a physical sense, can set people free from silos, if nothing else because it enables them to imagine a different way of living, thinking, and classifying the world.

  But while the story of how individual people can combat silos is interesting, it is only part of the issue. The other big question is whether institutions can find ways to silo-bust on a bigger scale. Can the type of journey that Brett Goldstein embarked on in Chicago be replicated on an institutional level? In this respect, the story of Facebook offers ideas that could be applied in many institutions. The company has sparked a revolution in how we all communicate and interact with each other across the world. Facebook has helped people to remodel their ties and identities in all types of communities and friendship groups. But what is less well known is that the group has also tried an experiment in social engineering inside the company itself, by influencing how its employees interact. In particular, Facebook officials have spent many hours worrying and thinking about their employees’ cognitive maps, social structures, and group dynamics. That has prompted them to implement deliberate experiments to stop silos developing inside the company and prevent Facebook from suffering the fate of a company such as Sony.

  These experiments are still at an early stage. The company is barely a decade old. But though the initiative is still developing, it throws up some fascinating lessons. The engineers who have developed these silo-busting experiments at Facebook have done so by borrowing ideas from the world of social science about the ways that humans interact—and then trying to apply them in a practical sense to their company. Most importantly, they have deliberately tried to do what anthropologists constantly do, but most companies do not: think about how their employees define the world, classify their surroundings, and navigate boundaries. Pierre Bourdieu would have felt at home.

  IT IS NOT SURPRISING that Facebook has launched so many internal social experiments. After all, since its inception the secret of its success has been to marry quantitative computing skills with soft analysis about humans’ social ties and then turn this into a hard-nosed business plan. The company’s leaders are fascinated by both computing and social code. They know that blending these two can produce a corporate gold mine.

  The origins of the company are legendary. Back in late 2003, Mark Zuckerberg, then a Harvard sophomore studying psychology, first dreamed up the idea of creating a “Facemash”—later called “The Facebook”—website to connect students to each other.9 Zuckerberg hit on this idea even though he was not a wildly social creature himself. He spent most of his time immersed in computers and computing code. Yet despite his outsider, geek status, or perhaps because of it, he also had an innate instinct for what made humans tick, and how to play on their insecurities and need for interaction. His venture started small. In the winter of 2003 Zuckerberg began talking to some of his fellow students about building a website that would list all of the Harvard students next to their photographs. Then, in February 2004, Zuckerberg and Eduardo Saverin, a junior, launched The Facebook.10 They built the site at a breathless pace, expanding it to other colleges. Then Zuckerberg dropped out of college and moved west to Palo Alto, where he rented a small, scruffy house with fellow computing enthusiasts. The site exploded. By September that year, Facebook had developed one of its first defining features: the “Wall,” which enabled members to write titbits of information and comments on profile pages. It expanded to cover not just colleges, but high schools and corporate entities.11 Then Sean Parker, the legendary entrepreneur, raised finance from investors such as Peter Thiel and the venture capital group Accel Partners.12 As Facebook grew, it introduced a series of iconic features: “News Feed” (which collects in one place feeds about friends); “Platform” (a system to let outside programmers develop tools for sharing photos, taking quizzes, and playing games); “Chat” (a tool to enable users to talk to each other); and “Like” (a feature that registers approval about posts). Its popularity swelled and more milestones were passed. In the autumn of 2007 Facebook sold a 1.6 percent stake to Microsoft for $240 million and created an advertising partnership. The following year, it hired Sheryl Sandberg, the glamorous and savvy Washington insider and former Google executive, as chief operating officer. In June 2009, the company notched up another big achievement: it became the most popular social media site in the world, eclipsing MySpace.

  But what was striking about Facebook’s rapid growth was not simply its vast number of users; the other startling issue was how the site reshaped patterns of social interaction. As Facebook took off, groups of people who had never been connected could link up online, swapping stories, news, and ideas. Long-lost friends were reunited, memorials staged, birth announcements posted, and jobs advertised. Facebook enabled people to “cluster” together in groups online, mimicking the way they interacted in the real world. People could collide with new people and ideas online. But they could also huddle with familiar friends. Social media created both the potential for people to open up their social world and to restrict it into self-defined groups, or cyber tribes.

  Most Facebook users never gave any thought to the underlying structural patterns that were created by this “clustering” and “collision” dynamic. They just wanted to connect with “friends.” But Zuckerberg and his fellow computer scientists at Facebook took a more analytical view. When they looked at the complex ties that created friends in the real world and cyberspace, they did not just see a warm, undefined lump of emotion. Instead, they saw patterns behind these emotional links. And while disciplines like anthropology, psychology, or sociology have analyzed social interactions with soft, nonquantitative techniques, the Facebook engineers were part of a fast-expanding breed of data scientists who assumed they could study social interactions with mathematics, not just ideas. To them, human connections were akin to the electronic components of a computer screen or mathematical model: something that could be mapped. “We are not anthropologists. Most of us are trained as computer scientists. But we are really interested in how people interact, how systems work, how people communicate,” Jocelyn Goldfein observed. “Because of our computing training, we tend to think of human organization problems as graph problems—we look at systems, nodes, and connections. And when you look at the world like that, it can get some really interesting results.”
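  Goldfein’s “nodes and connections” framing is easy to make concrete. Here is a toy sketch in Python—assuming nothing about Facebook’s internal tools, with invented names and ties—in which people become nodes, working relationships become edges, and clusters and bridges fall out of the connectivity.

    from collections import defaultdict

    # A toy organization as a graph: people are nodes, working ties are edges.
    ties = [("ana", "bo"), ("bo", "cy"), ("ana", "cy"),    # one tight cluster
            ("dee", "ed"), ("ed", "flo"), ("dee", "flo"),  # a second cluster
            ("cy", "dee")]                                 # a single bridge tie

    graph = defaultdict(set)
    for a, b in ties:
        graph[a].add(b)
        graph[b].add(a)

    # Tie counts hint at who connects the groups; the lone edge between
    # "cy" and "dee" is the kind of thin bridge that, if it frays, leaves
    # two silos where there was one network.
    for person, contacts in sorted(graph.items()):
        print(f"{person}: {len(contacts)} ties -> {sorted(contacts)}")

  Seen this way, a silo is simply a dense cluster with few edges leading out of it—something a data scientist can measure, and a manager can try to rewire.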

  IN THE SUMMER OF 2008, Facebook quietly passed a little milestone. Its leaders realized that the company was expanding so fast that it employed more than 150 computer engineers.13 Outside Facebook nobody knew—or cared—that the company had passed this 150 number. After all, whenever Silicon Valley start-ups are successful, they explode in size. When Goldfein was working at VMware before joining Facebook, the staff there swelled from a few hundred to 10,000 in just seven years. Rapid growth is considered a badge of honor.

  But when the top Facebook managers realized they had broken the 150 threshold they became uneasy. The reason lay with the concept known as “Dunbar’s number”—the theory developed by British evolutionary psychologist–cum–anthropologist Robin Dunbar. Back in the 1990s, Dunbar conducted research on primates and concluded that the size of a functioning social group was closely related to the size of a human, monkey, or ape brain.14 If a brain was small, the size of a monkey’s or ape’s, say, the creature could only cope with a limited number of meaningful social relations (a few dozen). But if a brain was bigger, as for a human’s, a wider circle of relationships could be formed. Humans did this, Dunbar argued, via “social grooming,” conventions that enabled people to be closely bonded. Just as primates created ties by physically grooming each other’s skin by picking out nits, humans bonded with laughter, music, gossip, dance, and all other ritualistic day-to-day interactions that develop when people work or live together.

  The optimal size for a social group among humans was about 150, Dunbar suggested, since the human brain had the capacity to maintain that many close ties via social grooming, but not more. When groups became larger than that, they could not be held together just by face-to-face bonds and social grooming, but only with coercion or bureaucracy. Thus, bands of hunter-gatherers, Roman army units, Neolithic villages, or Hutterite settlements all tended to be smaller than 150. When they grew larger than that, they typically split. In the modern world, groups of fewer than 150 tend to be more effective than bigger units, he argued, and humans seemed to instinctively know this. Modern college fraternities tend to be smaller than this. Most companies’ departments are below this threshold. And when Dunbar examined how British people exchanged Christmas cards in the early 1990s (which he considered to be a good definition of a friendship circle in British culture back then, before the advent of platforms such as Facebook), he discovered that the average number of people reached by the cards that somebody dispatched to different households was 153.15 “This [150] limit is a direct function of relative neocortex size, and this in turn limits group size,” as Dunbar wrote. “The limit imposed by neocortical processing capacity is simply on the number of individuals with whom a stable inter-personal relationship can be maintained.”16

 
