Sid Meier's Memoir!



  But when your opponent’s pieces are right in front of you on the table, it’s easy to guess where they are. Board game designers often tried to solve the problem with a complex system of fake pieces known as dummy counters, but these were awkward at best. By comparison, it was actually less work for a computer programmer to leave items unrendered on the screen. We didn’t have to hide anything; we just chose not to draw it in the first place. Cardboard had served its purpose admirably when it was all we had, I thought, but transistors were obviously superior.
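  A minimal sketch of that idea, assuming a simple sight-radius rule and a made-up Unit structure (modern Python for readability, nothing like the assembly code of the era): hiding information on a computer amounts to filtering the draw list, so unseen pieces are never rendered at all.

```python
import math
from dataclasses import dataclass

@dataclass
class Unit:
    owner: str   # "player" or "enemy" -- invented for this sketch
    x: int
    y: int

def visible_units(units, player_x, player_y, sight_range):
    """Return only the units that should be drawn this frame.

    Enemy pieces outside the player's sight radius are simply never
    rendered -- the software equivalent of a dummy counter, minus the
    cardboard.
    """
    shown = []
    for u in units:
        if u.owner == "player":
            shown.append(u)  # always draw your own pieces
        elif math.hypot(u.x - player_x, u.y - player_y) <= sight_range:
            shown.append(u)  # enemy spotted: draw it
        # otherwise: draw nothing at all
    return shown

# One friendly unit, one spotted enemy, one enemy that stays invisible.
units = [Unit("player", 0, 0), Unit("enemy", 3, 4), Unit("enemy", 40, 40)]
print(visible_units(units, player_x=0, player_y=0, sight_range=10))
```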

  The problem was that NATO Commander was boring. For one thing, the limited scope of the map stole a surprising amount of momentum. There was some indefinable quality about seeing the world spread out before you, ripe for the conquering, but scrolling slowly back and forth across multiple screens sucked the energy right out of that experience. It turned out the game did need to take up an entire table, after all. This may have been the reason I became fixated on a zoomable map for Silent Service; I can’t remember. Frankly, though, the map wasn’t the biggest issue.

  When I played games like Risk as a child, my friends and I would crowd around the board, sharing our triumphs and good-natured taunts together. Invade our country, and we would take it personally; help stop the invader, and we would remember the favor. When one of us came too close to winning, the rest would team up to bring him down, and no computer would ever threaten to thumb wrestle me over who got to play Australia. Each player brought their own personality to the interaction, and even their own mood on a given day, and my algorithms simply could not replicate the camaraderie of friends around a table, egging each other on and learning from each other’s strategies. My coworkers and I still regularly played board games together in the company break room, in fact—proof that even people who made computer games for a living understood the value of the in-person gaming experience. I’m admittedly biased, but I’d love to see someone crunch the numbers on productivity and job satisfaction in companies that choose gaming over other forms of team building.

  Perhaps Turing had been right after all, with his belief that good AI had to involve social skills. Up to this point, I hadn’t realized that community was such an important part of the fun when it came to wargames—nor, unfortunately, did I realize it now. Instead, after wrapping up Silent Service, I went back to the wargame genre, and persisted in banging my head against it for the next three projects in a row.

  Crusade in Europe, Decision in the Desert, and Conflict in Vietnam were, like the original Ace series, closer to what would one day be called a game plus two expansion packs. All three were based on an engine I developed for Crusade in Europe, which was itself a reworking of the original NATO Commander code. With each successive release, we tried to add more historical depth, which is where I mistakenly thought we were going wrong. It didn’t fix the gameplay issues, but it did lead to some moving narratives, at least.

  To help us, we hired a historian and former Princeton professor named Ed Bever, who happened to write strategy game reviews for Antic magazine in his spare time. In addition to his deep understanding of military scenarios both past and present, he had once written that NATO Commander was “exciting and exacting,” so obviously we thought he had good taste in games as well.

  Among other talents, Ed was masterful at navigating the dichotomy between fun and solemnity. Real battles could be sensitive subjects, and it wouldn’t be appropriate to put in quite the same level of destructive joy that we could get away with in other titles. This was especially true for Conflict in Vietnam, and the problem reared its head in a surprising number of places.

  “One issue which aroused strong feelings was what to call losses on the status display screen,” Ed wrote in the designers’ notes. “We reverted to total casualties for two reasons. One was to avoid offending those who lost relatives in Vietnam and therefore might find it offensive to count bodies, even in simulation. Second, the body count creates a misleading impression of the casualty ratios, because many Americans survived wounds that would have killed Vietnamese.”

  This was the first time we’d ever put disclaimers alongside the historical information in our manual, and it never quite sat right with me. Not because I thought we should have been less delicate, but because I realized that I would rather create things that didn’t require disclaimers in the first place. All three of the Command Series, as the wargame trilogy came to be known, provided a solid simulation experience and profound historical lessons—but I don’t think they necessarily counted as games.

  The most elementary, defining feature of gaming is its interactivity. Players may not be rewarded for every choice, but the control over the outcome must be primarily in their hands, otherwise they’re just watching a movie that demands occasional button jabs. In this case, there were not only too many historically predetermined parameters, but I had also introduced too much AI into the units. My hope had been to eliminate pesky micromanaging, but instead players had ended up with very little to do. They could even choose to do nothing at all, instead watching the game play out entire simulations against itself. Many reviewers were impressed with this—or at least they thought they were. But in reality, it wasn’t fun to watch more than a couple of times. Like a computer endlessly calculating pi, it was conceptually neat, but not really all that interesting long-term.

  These problems were only compounded by the fact that the conflicts were too recent to make any ending feel particularly happy. Even a swift, decisive victory still left the player asking, “But at what cost?” I’ve always felt that our role as game designers is to suspend reality, not examine the pain of real moral dilemmas. There’s a place for that in art, certainly—and videogames do count as art—but it’s generally not a place where people want to spend their time after a long day at the office. Even setting aside the added intensity of first-person engagement vs. passive observation, games are expected to sustain their audience far longer than any other art form. A trip to a museum or a tragic film might demand up to three hours of uncomfortable soul-searching, but game designers are asking you to commit somewhere between twenty and a billion hours with us. Not many people are willing to wallow in life’s toughest moments for that long, and at the very least, I didn’t want to wallow in them myself for an entire year of development. It had taken longer than usual for me to learn this lesson, but finally, I broke free of the wargame obsession, and returned to the skies.

  * Achievement Unlocked: We Didn’t Start the Fire—Collect Billy Joel, the Ayatollah, and Kennedy.

  5

  COLLECTIVE EFFORT

  Gunship (1986)

  AS I WAS SEARCHING FOR WHAT to work on next, several years’ worth of drama was coming to a head in the computer hardware world, where Atari and Commodore had entered a war that was at least as much personal as it was business. The dispute was complicated, involving hostile takeovers, ex-employees of both companies defecting into new ventures, and financial contracts being physically lost and then discovered again. The end result was that each company was claiming ownership of a technology that neither one had developed, and both had filed multiple suits against the other in court. At the center of it all was the latest holy grail of processor technology: the 68K chipset, code-named Lorraine.

  For the record, I always felt like code-naming projects was self-aggrandizing, and we never did it at MicroProse. My games were always just “the submarine game” or “the D-Day game” until we came up with a real title just before launch. But to be fair, our products already came with a sense of anticipation built in—the word “game” alone implied something exciting was in store. If you’re designing hardware, I suppose “the faster gray box” doesn’t work that well as a code name. These days, our publisher does make us use code names, as project teams have expanded and corporate espionage has become a real issue for the industry. Emails are all too easy to leak, and I understand the need for secrecy. But it does sometimes lead to a “Who’s on first?” kind of conversation when someone isn’t sure which off-the-wall code name goes with which project. Personally, if I don’t want to tell people what I’m working on, I just don’t tell people what I’m working on.

  In any case, the ownership of Lorraine would take years for Atari and Commodore to finally settle, but in the short term, neither one could stop the other from using it in their next generation of hardware. I didn’t care one way or the other about the corporate politics. A technology arms race was a great thing from our perspective, and having the 68K processor in both the Atari ST and the Commodore Amiga just meant we could deliver superior games to twice as many people.

  Without a specific topic in mind, I went to work on a new 3D engine for the Amiga, which would end up being my only project on that machine before it bit the dust. The Amiga wasn’t a bad computer by any means, but it failed to live up to its promise in sales numbers, and to a small business like ours, that mattered. We’d spend up to a year fine-tuning a game on whatever computer we started with—tweaking the visual layout for a particular screen resolution, optimizing sound effects with a certain audio chip in mind, and so on. Then when it was done, we’d spend only a few months shoehorning the code onto the other major systems. Our initial release was always going to be the best version of a game, so it made sense to maximize the experience for the greatest number of customers by developing on whatever was most popular at the time. The Amiga had a dedicated cult following, but it never rose to the top, either in homes or in our offices.

  This trend wouldn’t become apparent for at least a year, though, because developers were given prototypes of new hardware to work with long before they were sold to the public. So I tinkered away on my new 3D engine, imagining what kind of games we might someday make with it, while down the hall, the rest of the company continued to work on the established platforms.

  One of these titles in progress was a helicopter game for the Commodore 64 called Gunship, which was created by Andy Hollis and a new designer named Arnold Hendrick. It was heavily influenced by the pen-and-paper role-playing games that Arnold had started his career making, including the somewhat radical concept of permanent death. You could save your progress and continue accruing victories at a later date, but Gunship gave no option to reload from a saved game after a failed mission. If you died, you died—although some players pulled off a deus ex machina rescue by quickly ejecting the floppy disk before their data could be overwritten. Other atypical features of the game included naming your character and choosing your helicopter’s weaponry while staying under maximum weight requirements, similar to allocating skill points in a traditional RPG. A level-20 wizard in a Dungeons & Dragons campaign could run from a battle or spend a night sleeping at the inn to replenish his stats, while the Gunship helicopter pilot could sit out a mission under the guise of sick leave, or take some needed R&R off-base. These character mechanics had been tested for more than a decade by board gaming veterans, and Gunship would be one of the first to successfully bring them into the digital realm.

  Notably, players could also choose whether to automate helicopter landings or manage them personally, a process we made sure to explain in the manual this time.

  But while the design mechanics were breaking boundaries, the flight mechanics were just breaking. We knew there would be an issue with unfamiliarity, since this was one of the first helicopter simulations ever to go on the market, and we planned to flatten the learning curve with a colorful frame of reminders called a keyboard overlay—a lost relic in today’s world of ergonomic peripherals that are barely thicker than cardboard themselves. But our playtesters assured us that the collective, as the lever controlling a helicopter’s lift is known, operated intuitively enough. The main problem, they reported, was speed.

  Helicopters fly slower than their winged cousins, but they are more responsive side to side, and for a game programmer, rotating the world is harder than zooming past it horizontally. It would take several seconds for a plane to bank around from one position to the next, but a helicopter could turn sharply and even spin in place, which meant we had to render 360 degrees of landscape in a three-dimensional arc faster than we ever had before.

  I offered up my new 3D engine, and the team eagerly took it, even though it would require a complete overhaul of Gunship’s underlying program. The Commodore 64 was less powerful than the Amiga that I’d been creating it for, but the new engine was still more efficient than anything else we had. Together, Andy Hollis and I spent months retrofitting the code and attempting to make the old computer perform like a new one.

  Everything came down to frame rate, or the number of times per second that the computer could redraw the screen. Change one tiny thing in the foreground, like the pointer on an altitude gauge, and the computer could do it quickly. But change the entire background, and things got a lot choppier.
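  A toy model of the trade-off, with invented numbers (the real budget was counted in processor cycles, not an abstract fill rate): the achievable frame rate is roughly how many pixels the machine can push per second divided by how many you dirty each frame.

```python
# Toy model of frame rate vs. how much of the screen changes per frame.
# All figures are invented for illustration only.

SCREEN_PIXELS = 320 * 200        # redrawing the entire background
GAUGE_PIXELS = 8 * 16            # nudging one instrument needle
FILL_RATE = 250_000              # pretend pixels the machine can push per second

def frames_per_second(dirty_pixels: int) -> float:
    """Frames per second if this many pixels must be redrawn each frame."""
    return FILL_RATE / dirty_pixels

print(frames_per_second(GAUGE_PIXELS))   # ~1953 fps: foreground tweaks are cheap
print(frames_per_second(SCREEN_PIXELS))  # ~3.9 fps: full-background redraws hurt
```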

  We set our sights on four frames per second, which wasn’t so lofty. Even my original Star Trek game on the servers at General Instrument had run that fast, though of course moving text wasn’t a fair comparison to a swerving hillside. Other games we’d made at MicroProse had run faster, but four was the bare minimum. Anything less would leave the game unplayable.

  So far, we had three.

  “I need one more optimization run,” Andy would lament late into the evening, begging me to find a calculation that didn’t need to be performed, or a piece of information that didn’t need to be stored at that exact moment. “I know you can come up with one more idea.”

  The schedule had already been delayed significantly by swapping out the engine, and if we couldn’t get the speed up soon, it would be time to start throwing out parts of the game like loose ballast, until whatever was left could stay in the air.

  Fortunately, we managed, and the game went on to sell over 250,000 copies and win “Action Game of the Year” from Computer Gaming World. I wish I could sum up how we fixed it, but the math is long, complicated, and (I’ve been assured) boring. The important thing to note is that it wasn’t one lightning-bolt solution, but dozens of incremental changes, many of which we couldn’t take credit for. We had to find ways to do our job better, but we also had to take advantage of other people who were doing their jobs better: new technology, new compression algorithms, new ways to implement standard subroutines. Gaming is a collaborative effort, and it’s silly to think that any one person can claim all the glory. As my first experience in the CES vendors’ hall had proven, our industry was not made from one peerless, monolithic booth, but tens of thousands of small ones—some with mismatched tables, perhaps, but all with something to contribute.

  The only place that gave me that warm and fuzzy collaborative feeling more than CES was the Computer Game Developers Conference. I didn’t attend the very first CGDC, which was founded by designer Chris Crawford—best known at that point for a game called Balance of Power and a book titled The Art of Computer Game Design—and consisted of twenty-seven people sitting on the floor in his house. But I did make it to the second gathering six months later, at a Holiday Inn outside San Jose in September 1988. By that time, attendance had quintupled and lunch was catered, though we still ate standing up, doubling our paper plates so they wouldn’t spill. Entrance fees were nominal, and organizers had to race to the bank with at-the-door proceeds in order to prevent the check they’d given the hotel from bouncing. I’m pretty sure that was also the year that Chris began delivering keynote speeches in costume. One year he cracked a whip at us to illustrate the power of subconscious creative urges; another year he delivered an impassioned theatrical performance comparing game design to Don Quixote, which he ended by grabbing a heavy metal sword and galloping through the audience.

  “For truth!” he roared at us. “For beauty! For art! Charge!”

  Toward the end of my first conference, the organizers surprised Chris with an award for being “Zee Greatest Game Designer in Zee Universe,” illustrated by a large plastic light bulb trophy. Other awards were given out, but in general the organizers made a point to give them only to publishers, not individual designers, because they felt competition would fracture the community and create bad blood. MicroProse won an award for our playtesting department. I guess at the time we must have been shipping with fewer bugs than everyone else. Mostly I think we were just ahead of the curve in having a quality assurance team at all—one discussion at the conference centered around whether professional testers were even capable of providing unbiased feedback, with their paychecks coming in the form of dollars instead of fun. Fortunately, the topic quickly evaporated, perhaps after everyone realized that this line of thinking could logically extend to our own compensation, as well.

  By the second or third year, I was giving presentations myself, and by the tenth I was on a “Legends of Game Design” panel with industry mainstays like Ron Gilbert, who had been programming for HesWare just before they went under and went on to design the revolutionary new SCUMM engine for LucasArts, which delighted programmers with its improved efficiency but perhaps raised some eyebrows with its thematic acronyms (the accompanying program tools were named SPUTM, SPIT, FLEM, MMUCUS, BYLE, and CYST). But even sitting on a dais in front of hundreds of people, I never felt removed from the other attendees. CGDC was the one place where we were all friends and equals, and everyone had something to talk about even if they weren’t given a podium. Design in the 1980s was a largely independent activity, so no one was passing business cards or networking in the modern, rung-climbing sense. No one was protective of their status. We were just excited to have a community, and to be around others who understood our love for gaming in a way that our friends, and sometimes even our families, didn’t. It wasn’t that gaming was looked down on by the rest of the world, necessarily, but it was sometimes glanced at sideways in confusion. Later decades would give rise to new flavors of mainstream fear about gamers and their obsessions, but back then the worst accusation an outsider would have leveled at us was that gaming was a frivolous pastime with minimal benefits—not as long as a book, not as pretty as a TV show, not as healthy as a sport. But in that respect, I don’t think it was much different than other niche interests. Surely jazz musicians would have an equally hard time explaining just what’s so special to them about riffing on a piano for hours on end, while architects would thrill at the chance to finally geek out with someone over the geometric peculiarities of Frank Gehry. There are lots of rare breeds in the world, and CGDC just happened to be the place where my rare breed gathered. I don’t think any of us could have imagined back then the kind of cultural domination that gaming would someday achieve. We simply shared ideas, and turned each other on to games we might not have heard of yet, and ate a whole lot of cookies.

 
