Replay: The History of Video Games

by Tristan Donovan


  Joost Hoing was one of the crew’s members: “We competed with other crackers around the world to crack a game fast, good and small.[5] I enjoyed the fact that if you ‘won’ by doing the best crack, the whole world copied and played your version of the game and saw your name on the screen.”

  As competition between crackers intensified, The 1001 Crew started adding intros to games to let the world know who cracked them. “We basically ‘tagged’ the game to show who did it,” said Hoing. “Before the game started it showed something like ‘Cracked in 1983 by 1001 Crew – Hit the Space Bar’. These intros had to be very small in size since it had to fit with the complete game. In order to show our programming skills, we created more and more impressive intros with bouncing logos, colour bars, music, etc. Again as small as possible in size.” The intro demos created by The 1001 Crew and another Dutch group called The Judges spread across Europe as computer users shared illegal copies of games the Dutch teams had cracked. “Everyone in the Commodore 64 world knew our name. We were famous,” said Hoing.

  The 1001 Crew and The Judges had thrown down a gauntlet to other crackers. ‘Match this,’ their demos effectively said. Within a year dozens of demo crews had formed to try and do just that, spending their nights cracking games and creating new demos in a programming and hacking arms race. A pan-European subculture of cracking and demo making called the demoscene had emerged. While the Netherlands was its spiritual home, the demoscene was a Europe-wide movement that could be found everywhere from Scandinavia and Italy to Britain and West Germany. By the late 1980s demo crews were travelling to demoparties, weekend-long sessions of non-stop programming – a geeky version of the illegal rave parties that emerged around the same time across Europe on the back of acid house music. The demoparties were marathon contests of programming one-upmanship that culminated in individuals’ work being shown on video projectors attached to large speakers. Not that attendees spent all their time bathing in the light of their computer screens. “I went to a few but hardly touched a Commodore 64 there,” said Hoing. “We were around 18 at the time so we were more into discos, music, girls, beers, etc.”

  For Europe’s game industry the demoscene was both angel and devil. On the plus side, game designers could enhance their games by plundering the numerous programming breakthroughs of the demoscene. “I know music and sprite routines from demos were used in a lot of games,” said Hoing.

  Many demo makers later renounced their connections with the cracking scene and became professional game developers. By the mid-1990s the diverging interests of those interested in writing demos and those who enjoyed cracking had caused the movement to split in two. With the demoscene going legit, even more of those who cut their teeth making demos resurfaced in game studios. Finland’s Future Crew, which started making demos on the Commodore 64 in 1986, is a case in point. After the crew fizzled out in 1994 some of its members resurfaced in Finland’s leading game development companies including Remedy Entertainment, the makers of Max Payne, a film noir action game released in 2001 and built around impressive ‘bullet-time’ slow motion effects similar to those seen in The Matrix movies.

  “We’re all over the place,” said Alex Evans, a British game developer who started out making demos under the name Statix before joining Peter Molyneux’s Bullfrog studio in the late 1990s. “If you’re in the games industry or in the demoscene you can see the interconnection is very, very strong. There has been a huge crossover into things like mobile and downloadable games, where you have to fit brilliant experiences into tiny spaces, which is what the demoscene has been doing for many years.”

  Before the split, however, game companies saw the demoscene and its crackers as the enemy: law-breakers who smashed their expensive attempts to prevent illegal copying and gave away free copies of their games, cutting into sales and profits. “Piracy held the industry back,” said Bruce Everiss, who became the operations director of short-lived UK games publisher Imagine after selling off his Liverpool computer store Microdigital. “If no-one’s paying for stuff then stuff doesn’t get done. It’s that simple.” Not that the game industry’s dislike of the crackers resulted in any direct action. “Did game companies attempt to stop groups like 1001? Never,” said Hoing. “Police had other things to do than going after a bunch of kids cracking games.” The crackers were only part of the widespread piracy of games in the 1980s. In schools across Europe children swapped games with abandon, aided by the ease of tape-to-tape copying. “Anyone who was at school at that time will remember the swapping of games,” said Everiss. “It came from nowhere. One year people weren’t swapping games, the next year they were. You would only sell so many games because once it was out and about everyone swapped it.”

  For Everiss, the sudden rise of schoolyard pirating of games killed Imagine, a Liverpool firm that dominated the UK games industry during its brief two-year existence. Founded by Mark Butler and David Lawson, both former employees of Liverpool’s first game publisher Bug-Byte, Imagine achieved instant success with its debut release: a run-of-the-mill shoot ’em up called Arcadia. Arcadia became the best-selling Spectrum game of Christmas 1982 and turned Imagine from a start-up into one of the wealthiest game companies in Europe. But success was followed by excess. “It was really very, very heady,” said Everiss. “We were inventing the industry as we went along. Up until Imagine, the industry had been a kitchen-table industry. Imagine was the first UK company to have things like a sales team, marketing people. We were the first to do multi-lingual packaging. We put programmers into offices, which was a new thing and then started using sound and graphics artists.”

  Imagine became living proof of the dream of many bedroom programmers: that they could get filthy rich making video games. The company’s plush offices boasted a garage filled with fast sports cars. At the age of 23 Butler was a symbol of the 1980s yuppie dream: a young man who had become rich through his entrepreneurialism. Imagine formed its own advertising agency and started expanding across Europe. At one point the company tried to rent the disused revolving restaurant on top of Liverpool’s St John’s Beacon tower, only to be put off by the excessive rent demanded by its landlord, the local council. “That was typical of us,” said Everiss. “We thought it would make a good executive office being up in the air going round and round in circles.” Most excessive of all was Imagine’s decision to pour huge sums of money into developing Bandersnatch and Psyclapse, which it described as the first ‘mega-games’. These games would come with hardware add-ons that, Imagine claimed, would enhance the abilities of home computers such as the Spectrum and usher in a new era in video games. It was not to be. In July 1984 Imagine went bust, its money drained away by over-expansion, the slow progress on developing the mega-games and falling sales due, at least in part, to piracy. The implosion was captured blow-by-blow by a BBC TV documentary crew who had set out to tell the story of Imagine’s success, but instead recorded its very public demise.

  Imagine weren’t the only company to bite the dust around that time. The number of UK companies publishing Spectrum games peaked at 474 in 1984. The following year just 281 remained and by 1988 the number had tumbled to just 101. The industry became increasingly polarised between big publishers such as Ocean Software, who built business empires on the back of games based on blockbuster movies and popular TV shows such as Robocop, Miami Vice and Knight Rider, and budget publishers such as Mastertronic, which sold games for as little as £1.99 compared to the usual £8.99. By 1987 around 60 per cent of games sold in the UK were thought to be budget games. “At £1.99 it was hardly worth copying the game, you could have the real thing,” said Everiss. The middle ground of companies that released full-price original games steadily lost ground, unable to compete on price or recognition. “At one stage we tried to launch a mid-price range and were just stuck in the middle. It was difficult, you had to be in one camp or the other,” said David Darling, who founded Warwickshire-based budget game publisher Codemasters with his brother Richard in 1985.


  The same was starting to happen in France. Infogrames, whose founders were laughed at by French venture capitalists when they asked for investment back in 1983, swallowed up Cobra Soft as well as Ere Informatique. Meanwhile, Guillemot Informatique, a leading distributor of computer equipment based in Montreuil, launched a game publishing business called Ubisoft in 1986 that quickly expanded across Europe. Both Infogrames and Ubisoft would go on to become multinational gaming giants. The wilder elements of Europe’s early games industry started to leave the business. Surrealist game maker Mel Croucher sold off his game company Automata UK for 10 pence in 1985, while Jean-Louis Le Breton quit games to become a journalist. The European industry was growing up. Companies merged, expanded, created marketing teams and professionalised. Soon the games business was dominated by companies such as Ocean, Infogrames and US Gold, a UK publisher that rose to prominence converting American games onto home computers that were popular in Europe.

  Formed in Birmingham by Geoff Brown, a former teacher and singer in progressive rock band Galliard, US Gold was a triumph of business nous over creativity. Brown bought his first home computer, an Atari 800, just as home computers began to take off in the UK. “There weren’t many people in the UK owning an Atari, so those who did were enthusiasts and if you were an enthusiast you were prepared to look for the games,” he said. “I got hold of a US magazine called Compute! that had all these wonderful games I had never heard of. The screenshots looked brilliant, so I thought I’m going to get myself one of those.” The game he chose was Galactic Chase, a 1981 game from Stedek Software. It was a straightforward copy of the arcade game Galaxian, but its production quality was miles ahead of what was being developed in the UK. “A lot of the UK programmers were still writing in BASIC. These guys were writing totally in machine language,” said Brown. “It was light years ahead of anything the UK was doing.”

  After making some money importing Galactic Chase to the UK, Brown bought an airplane ticket and headed to the US to sign up more of the games being made by the North American computer game business that had come to the fore after the spectacular collapse of Atari.

  [1]. A revolutionary group of French artists, philosophers and academics that began as an artistic movement but evolved into a political movement led by Guy Debord, a French intellectual and war game enthusiast. Debord’s manifesto The Society of the Spectacle summed up the movement’s politics with its theory that people had become spectators in their own lives.

  [2]. Adventure, humour, leftfield and ‘a willingness to make fun of anything’.

  [3]. Thomson’s computers became France’s equivalent of the UK’s BBC Micro after the French government made them the basis of a national programme to put computers in every school.

  [4]. The word Schriften in the watchdog’s original name referred to print or printed media, although the law that created the regulatory body never limited its role to this.

  [5]. Crackers often sought to compress games into smaller amounts of memory, so that they took less time to download from bulletin board systems or loaded more quickly.

  Arcade action: A British teenager tries out Yu Suzuki’s 1989 coin-op Turbo Out Run. Paul Brown / Rex Features

  11. Macintoshization

  One afternoon in 1975 a Harvard University student decided to write a seven-year plan that would result in the birth of one of the world’s biggest game publishers. It may have been the days of Pong but Trip Hawkins, the student in question, was already electrified by the new world of video games. “From the moment I saw my first computer in 1972, I knew I wanted to make video games,” he said. “I had a strong feeling that people were meant to interact, not to sit passively like plants in front of the TV. I was already designing board games but saw instantly that a computer would allow me to put ‘real life in a box’.” Just before he wrote his plan, Hawkins had read about the opening of one of the first computer stores and the Intel microprocessor, a computer-on-a-chip. He knew then that video games would become a mainstream form of entertainment. Technology, however, was against him. The computers of the day were still too expensive and too primitive to allow Hawkins to realise his dreams.

  So instead Hawkins decided to spend the next seven years preparing for 1982, the year he believed would be the moment when technology would have caught up with his dreams. “By then, I figured, there would be enough hardware in homes to support a game software company,” he said. He adhered to his plan with religious devotion. He tailored his degree in strategy and applied game theory so that he could learn how to make video games. He took an MBA course to get the business skills he needed to run his future company and carried out market research into the computer and games console business. In 1978 he joined Apple Computer where he honed his business skills and, thanks to the stock options he got when the company floated on the stock exchange in 1980, amassed the funds he needed to start his game business. “I made enough in my four years at Apple to know I could completely fund the company if I wanted,” he said.

  And as 1981 came to a close, Hawkins was finally ready, but by then the video game boom was already well under way. “I actually felt late,” he said. “Because of the success of Atari’s early hardware and a cottage industry of Apple II software companies, I counted 135 companies already making video games but I had a unique vision and thought I could compete and become one of the leaders. This is what happens to you after you hang around with Steve Jobs for a few years.”

  Sticking rigidly to his plan, Hawkins quit Apple on New Year’s Day 1982 and set about forming Electronic Arts. Hawkins’ vision for Electronic Arts echoed the old Hollywood studio system that emerged in the 1920s, with its plan to control game development, publishing and distribution. Electronic Arts would make games on multiple platforms, package them in boxes not plastic bags, and distribute them direct to retailers. It would also promote its game designers as if they were movie directors – artistic visionaries of the new era of interactive entertainment. The company’s publicity materials set out its ‘games as art’ rhetoric: “We are an association of electronic artists united by a common goal. The work we publish will be work that appeals to the imagination as opposed to instincts for gratuitous destruction.” Other publicity materials asked “can a computer make you cry?” and promised games that would “blur the traditional distinctions between art and entertainment and education and fantasy”.[1]

  But by the time Electronic Arts released its first games on 21st March 1983, the North American game business was going down the tubes. “Atari officially crashed in December 1982,” said Hawkins. “The media, retailers and consumers vacated the console market in 1983, leaving Electronic Arts in a void. Start-ups like Electronic Arts had to focus on the Apple II, Commodore 64, etc. But those markets never got very big because the computers were more expensive and harder to use. They were really a hobby market more than a consumer market.” The post-Atari world of the home computers was an inhospitable landscape for those hoping to make a livelihood out of video games. “It was a brutal time,” said Bing Gordon, Electronic Arts’ head of marketing and product development at the time. “We entered the dark ages of interactive entertainment. The five years between 1982 and 1987 were hard, hard, hard. Each Christmas, all the experts at leading newspapers reminded potential customers that the video game business had died with Atari and would never return.”

  What market did exist was splintered; fragmented across myriad home computer systems each with different technology and capabilities. It was also a market riddled with piracy, unlike the cartridge-based consoles of old. “People would steal your game. They wouldn’t buy it, they would copy it,” said Rob Fulop, a game designer at Imagic, the former console starlet that tried unsuccessfully to survive the crash by making computer games. The differences between the hardware of computers and consoles, meanwhile, required game designers to rethink their work. Controls shifted from joysticks to keyboards. Games moved from being stored on microchips in cartridges to floppy disks. “You had long load times, a lot more memory and higher resolution visuals than you did on video game consoles,” said Don Daglow, who became a producer for Electronic Arts after Mattel abandoned the Intellivision console. “You had the ability to save a game on disk, so we could do games that could take longer because you could save. Floppy disks allowed us to be more ambitious.” But computers were also slower. “Game companies had been concentrating on action games for consoles and computers weren’t fast enough at that time to really do a good job with an action game,” said Michael Katz, who quit Coleco as the crash set in to become the president of San Francisco-based computer game specialists Epyx.

  Home computer users were also a different type of consumer compared to the console owners that game companies had grown up with. They were older, more educated and more technically minded.[2] “The video games before the crash were all specifically directed at young people, while computer games were directed at an older audience,” said Chris Crawford, who became a freelance game designer after Atari’s implosion. The differences in hardware and consumer tastes led game designers to move away from action games towards more cerebral, complex and slower forms of game. “Games prior to the crash sought to appeal to the mass market, but post-crash games became increasingly geared towards dedicated game players who wanted complexity and this further alienated the non-hardcore audience,” said David Crane, co-founder of game publisher Activision.

  Most of Electronic Arts’ debut games reflected this new era of complexity. Foremost among these games were M.U.L.E. and Pinball Construction Set. M.U.L.E. was a computerised multiplayer board game based on supply and demand economics that cast players as colonisers of a faraway planet, trying to scratch a living. Its transgender creator Dan Bunten, who later became Dani Bunten Berry after a sex change, drew inspiration from Monopoly and Robert Heinlein’s novel Time Enough for Love, a sci-fi retelling of the trials of America’s old west pioneers. In the game each of the four players commandeered plots of land to produce energy, grow food and mine ore in a bid to become the richest. But while Monopoly was about cut-throat competition, M.U.L.E. was tempered by the need for players to work together to ensure there was enough food and energy for all of them to survive. M.U.L.E. was a commercial failure, but its careful balance of player competition and co-operation made it a seminal example of multiplayer gaming.

 
