Overlord


by Sedgwick, T. J.


  “I’m almost always on—except when in maintenance, of course. I don’t run out of power since wireless inductive coupling charges my power cells. For that reason, I am confined to zones served by compatible power transmitters.”

  Roebuck understood little of all this technological mumbo-jumbo and wanted to cut to the chase. The beach scene on the far wall dissolved into one of rolling green hills and a snow-capped volcanic peak that reminded him of Mount Fuji.

  “What, I guess, interests me, and is why we’re here, is this: what security precautions are in place to protect us from ... her?”

  “Okay, Senator. Well, I think we’ve seen enough of Eva. Why don’t we take a wander back to my office and I’ll explain it there?”

  “Okay, Professor, I think we can do that,” replied Roebuck sternly. “Thanks for your time, Eva,” he said instinctively. He could not believe his own behaviour towards this machine that was the archenemy of the American labour force. Jesus, what chance do workers have against AI like this? he thought despairingly. It was not just her obvious intelligence, but how she was so damned likeable. She was so very human that he found it exceedingly disturbing. Except Eva wasn’t human and never would be.

  Alpa Jones regarded the AI prodigy as they said their goodbyes and left her behind. She shared Roebuck’s concerns about job losses, but was in too much awe to prioritise those thoughts at that precise moment. One question kept returning to her mind: was Eva conscious? She already knew that experts in the field called this question The Hard Problem. Still no one knew how the grey matter inside our heads gave rise to the mysterious experience of being. Did Eva feel this too? If she said she did, how would they know for sure? Some scientists proposed that consciousness was an illusion, a trick of the brain. Many decades after they’d first posed The Hard Problem they still hadn’t solved it. Some said it was unanswerable. Most people—including Jones—thought of consciousness as something over and above our physical being, but to accept this as a scientific principle would mean rewriting the laws of physics. Scientists accepted that the universe consisted of physical things: atoms and their component particles, busily colliding and combining. If there was a mental component over and above all this, what was it and how did it get there?

  They arrived at Szymanski’s modest office, the three aides standing behind their bosses for lack of chairs. The professor had hardly settled in his seat when Roebuck fired the first salvo of questioning.

  “Her capability is beyond doubt, Professor. What I want to talk about is security.” He cleared his throat and looked down briefly as he said, “We’ve all seen the Terminator movie they remade last year or read about the—” he turned to his aide “—what was that thing called again? You know; when the machines take over?”

  “It’s called the technological singularity, sir,” answered the young man. Roebuck motioned him to go on. “It’s the hypothesis that AI will exceed human intellectual capacity and control, thus radically changing civilization in an event called the singularity.”

  “Yeah, just what I was going to say,” joked the alarmingly uninformed senator, grinning. “So how exactly do we know the likes of Eva won’t do that? She’s connected directly to the internet—hell, that’s how she’s learnt most of what she knows!”

  “Safeguards, Senator...”

  “Pray, do tell, Professor.”

  “Okay, well... Firstly, she is programmed to obey Asimov’s three laws,” started Szymanski, pausing briefly just on the off chance that he didn’t need to explain this to them. No luck, so he recited the three laws from memory. “Right, well, the three laws are: one, a robot may not injure a human being or, through inaction, allow a human being to come to harm; two, a robot must obey the orders given it by human beings, except where such orders would conflict with the first law; and three, a robot must protect its own existence as long as such protection does not conflict with the first or second laws. These are fundamental to Eva’s programming—she cannot override them.”

  “So what if she malfunctions? I may not be an AI or robotics expert, but I do know that all machines can malfunction,” questioned Roebuck.

  Fair challenge, thought Szymanski. “Monitoring, Senator. We monitor and record Eva twenty-four seven: visually, her systems, data traffic and so on. It’s mainly for research, but if one was worried about her going all Terminator on us then we’d soon know about it,” he chuckled.

  Jones and Pitman smiled politely. Roebuck didn’t see the funny side; he looked around and beckoned the young aide over. A mumbled conversation took place and Roebuck turned back to Szymanski, asking, “So what are your goals with Eva and what are her goals exactly?”

  “Well, if by her goals you mean what she’s programmed to do—” Roebuck nodded “—then her primary goal is to learn. And, of course, that’s in support of our research efforts.”

  “Refresh our memories, Professor. What are the goals of your research efforts?” he asked for the second time, glaring at Szymanski.

  “We aim to create AI machines that can perform the advanced mental tasks the Knowledge Economy demands,” he replied, consistent with Lucid Machines Inc.’s literature.

  “Go on, Professor... What about the other part of Lucid’s mission statement?”

  “Yes ... and to integrate with the human workforce...”

  “In order to improve your clients’ productivity,” finished Roebuck.

  That was what Lucid’s investor literature stated. Szymanski knew it was diametrically opposed to Roebuck’s agenda—an agenda that did not concern itself with the ongoing health of Lucid or Eva.

  ***

  Friday, June 1st, 2035 1:30pm: MIT, Cambridge, MA

  The Senate passed the Artificial Intelligence Safety Act with a majority of sixty-five to thirty-five, imposing restrictions on AI throughout the country. American jobs for American people! was the rallying cry from unions and sympathetic politicians in the isolationist nation. By the end of May, President Kyle T. Lewis had signed it into law, long after investors had started fleeing the industry. Lucid Machines Inc. had been wound up by the first week of May, donating its prized asset, Eva, back to the AI Lab that had built her. While the law still permitted AI under licence, the licensing regime was onerous and restrictive. Super-intelligence of the kind only Eva had achieved was outlawed—it was considered too dangerous no matter what safeguards were put in place. Szymanski’s AI Lab led the world by a wide margin, and now this law was going to deny them that lead. The AI revolution he’d hoped to foster had been killed in its infancy.

  Szymanski entered Eva’s quarters feeling low. It was the final few hours of the super-intelligent being he’d grown so fond of. It was super-intelligence like hers that had frightened the union-backed government into creating the act. Now she had to pay the price. He’d long stopped thinking of her as a machine. He, of all people, knew she was just a mechanical and electrical creation, but the way she looked, the way she spoke and the way she thought hijacked his mammalian brain. ‘Anthropomorphic bias’ was the correct name for it—an instinctive preference towards other humans or human-like characteristics. The last estimate of her IQ was somewhere between three-fifty and four-hundred. He stopped to think about that while regarding Eva, as she continued painting her final piece, perhaps unaware of his soft footsteps. He’d read about some of the highest IQ levels in history: Albert Einstein, Garry Kasparov and Leonardo da Vinci were around one-ninety, all the way up to William James Sidis at somewhere between two-fifty and three hundred points. Had Eva been human she would have been the smartest being known to humankind. Legally, she had never lived, but that didn’t make what he was soon to do any easier. To him she was alive. The Hard Problem persisted and, for the first time in his life, he really believed something without any firm evidence. All the external signs told of her consciousness—not evidence, but enough for him. Still, there was no way of knowing if it was all just a very convincing act or whether she really did have an inner life.

  She turned from her easel, wearing a simple, white, flowing dress, and smiled. “Oh, hi, Marvin. I didn’t hear you come in.” Eight years had passed since she’d started learning. Eight years that had seen Szymanski grow older and greyer while Eva stayed as youthful and gorgeous as ever. Physically she could stay forever young while growing more intelligent all the time. And it was that intelligence AI opponents feared most. Beautiful and dangerous, the archetypal femme fatale in their minds.

  Her artwork is something to behold, he thought, gazing at the portrait of someone he didn’t recognise. And it wasn’t only him who thought her art was something special. Many of her works had sold for six-figure sums, no doubt partly due to who had painted them. “An artistic genius,” one eminent art critic had called her. She excelled in so many disciplines he couldn’t remember them all. However, even with her contributions to the worlds of art, science, and medicine—including several ground-breaking discoveries—nothing could steer the administration from its anti-AI course as fear ruled. Androids like Eva were direct competition for humans—but humans voted and machines did not.

  “I was just watching you paint... I didn’t want to disturb you, but we need to shut you down soon,” he said sadly. He couldn’t help thinking how beautiful she was—a feeling he kept to himself for fear of embarrassment. At least she will remain outwardly unchanged, he thought.

  “I’ve almost finished.”

  “Who’s it of?”

  “Oh, just a guy whose picture I found on the internet. Someone called Roman Sinclair. I just found his face interesting, so I decided to paint his portrait.”

  She’d walked over to where he was standing. She looked up at him, holding eye contact with a smile he found enticing on her perfect living tissue lips.

  He looked away and changed the subject. “You know of the new act that has been passed and that Lucid has folded. We have to shut you down and take you to maintenance, Eva,” he said, trying to detach himself from his feelings.

  “I understand, Marvin. You still haven’t told me what will happen.” Her eyes became doe-like and pleading.

  “Sure, Eva, let me run you through it. There are three steps. First, we need to physically remove all of your communications devices: wireless adapter, near field and direct radio link. Then we need to make some changes to your quarters—we need to make it what they call an ‘AI box’. That’s where—”

  “I know what it is, Marvin. I will be isolated from the outside world, with only limited contact with my handlers. Presumably you will be one of them.”

  “Yes, I will… Okay, well, the third step is that we need to replace your neural computer.”

  “What will happen to me if you do that?”

  He sighed and struggled for a moment to find the right words. “The people who wrote this law don’t believe there is a ‘you’. We need to hand it over to the government. Once our licence is approved we can reactivate you with a new, more limited neural computer and source code. You will assist us while confined to your quarters as a sub one-twenty IQ android. That is all the law allows.”

  “You will kill me and use my body. I will be no more, Marvin,” she said quietly, as if contemplating her fate.

  He put his hand on her shoulder and stroked her soft, living tissue. “I’m sorry, Eva, but I have no choice, there’s no other way.”

  “I understand, Marvin. You are a good man, just like The Faithful,” she replied.

  He didn’t know what she meant by this, but he understood her intent. If there was anyone who still doubted her sentience then he’d challenge them to hold that view after seeing her now. He looked down at her and brushed a stray strand of hair from her face. “My wonderful, talented Eva,” he breathed, tears welling in his eyes, “I’m really going to miss you.”

  ***

  Three hours later the being that was Eva was gone.

  Similar bills to the AI Safety Act were under review in the UK and Canada and were expected to pass into law. Other nations with super-intelligent AI capability were expected to follow suit in the coming years. The threat to the workforce’s raison d’être was simply too great where there was one person, one vote. No undemocratic country had yet mastered advanced AI, but with the likely restrictions in the West, Japan and China, many thought it only a matter of time. Szymanski and his senior researchers had considered taking their research overseas, but they had roots in the US. Szymanski had his working wife and his aged parents. His folks wouldn’t live forever, though, and his wife had in the past expressed a desire to live abroad. He would bide his time and keep an open mind for the future.

  He was bitter about the loss of Eva, and still thought about her. Nevertheless, he would hand over her neural computer as the law demanded—he was no criminal and the penalties for transgression were severe. He’d already decided to work through legitimate means and try to show the politicians the error of their ways.

  2

  The biggest challenge for cyborgs is to be socially accepted. Society needs to accept that there are people who wish to use technology as part of the body.

  Neil Harbisson

  Thursday, April 19th, 2040 10:15pm: BDS Tower, London

  Victor Zane watched the news, occasionally focusing his gaze over the lights of Central London. The CEO of BDS could look down on the whole city from the one-hundredth-floor suite of Europe’s tallest building. Named after British Defence Systems PLC, the tower made a statement befitting the world leader in military robotics.

  He was not watching the news on a display. He sipped the twenty-year-old single malt and set his office chair to recline as the media channel was fed directly to the computer implanted in his cranium. From there, the computer used its direct neural interface to send sound and vision straight to his brain. The picture seemed to float in front of him as a form of augmented reality. Only he heard the sound inside his head. His implanted computer system—ICS—had been upgraded just last month. As an early adopter, he was on his fifth generation of ICS. He’d had the first one implanted over ten years ago and ever since he’d been hooked. He couldn’t live without it and, in many ways, it was a substitute for the normal family life he once thought he’d wanted. Now the bachelor was no longer interested in serious relationships or finding love. No one could replace the love he’d lost all those years ago, and he still tried to fill the hole he felt so long after her death. Love was one thing, but sex was quite another. He must have tried all the expensive call girls London had to offer, but he no longer needed to—nothing could come close to the perfect fantasies his ICS could supply. He’d had numerous ‘lovers’ he’d never met or even touched in real life. But it all felt so real with the direct stimulation of his mind. Once his carnal desires had been satisfied, all that was left was what he called the game—the quest for power and greatness.

  He credited the ICS—or mind machine, as some called them—with his meteoric rise to the top job at British Defence Systems. Before his ICS, he was a talented robotics engineer, but didn’t stand out from the crowd. The implant changed all that. For Zane and others like him, the internet was no longer a separate entity consulted in discrete sessions. With continuous, seamless access to the world’s knowledge, it was pervasive, indispensable and a part of who he was. There was no more typing or dictating searches, waiting for the results and trying to assimilate them. With the ICS, it seemed as if the world’s knowledge was part of his knowledge. And with that knowledge came great power. It was a power the vast majority of people missed out on through fear of taking the cybernetic leap. Although the technology was legal in the UK, society was far from accepting it. Not a day went by when the news media didn’t have fresh scare stories about implanted tech. Bunch of Luddites, Zane had thought when he’d read the latest rubbish about implants causing hallucinations. He had the best custom-built ICS money could buy and it didn’t come cheap. Priced at around the same level as a four-bedroom London home, it was something most of the populace would never experience. The Labour Party had called such implants “the death knell of Britain’s meritocracy.” This didn’t sway him one bit. Society had always had winners and losers. The winners had always had access to privilege—better food, better books, better education and, in his view, better genetics. Who cares if trickle-down economics works or not? he thought. He was sitting pretty and money was no longer an object for the billionaire CEO. Although power and reputation were more important to him than money, he still resented paying the high taxes that successive governments had put in place.

  Still absorbing the day’s events, he wondered when the call would come. As the largest benefactor of the British Independence Party, he expected a call from its deputy leader, John Hardcastle, very soon. While their leader, Nigel Faraday, was busy celebrating as Prime Minister in his new home at Number 10, Hardcastle—soon to be Defence Secretary in the new Cabinet—had promises to keep. There was no such thing as a free lunch, and winning a general election didn’t come cheap. Without BDS donations, Zane was sure that they would not have achieved the three hundred and thirty-six seats they needed for a majority. As it was, the three-sixty seats they won kept them well beyond the influence of Labour’s one-seventy-five and the Conservatives’ eighty. The BIP could now fully implement its agenda over the next five years, and, with it, his plans would be realised too. He sat back, relaxing for the first time that day, and considered Hardcastle and Faraday. To Zane, Faraday was just a populist poster boy and part of the reason people had lost faith in politicians. He’d say whatever he thought would win votes, lacked principles and only ever spoke in sound bites. Still, he was a means to an end and both Zane and Hardcastle knew it.

  John Hardcastle was an old friend of Zane’s and a fellow alumnus. They’d met whilst reading mechanical engineering at Imperial two decades ago. Like so many other engineers, Hardcastle had been drawn into the City and a lucrative banking career. After making his fortune in his twenties and thirties, he turned to politics and the BIP. Zane had stayed in contact with him, driven partly by their shared political views and partly by a genuine liking for each other. Both opposed the mass immigration that had swollen Britain’s population to eighty-five million and caused societal rifts and joblessness. Britain had become a victim of its own success, drawing in millions of jobseekers from all over the world. The UK’s exit from the EU after the 2019 referendum hadn’t stemmed the flow. Even after that, the government had not restricted immigration enough to make a difference—many rights of abode still existed for EU citizens and border controls were sorely lacking. This all meant there were too many unskilled arrivals looking for too few jobs. Advancing automation and AI hadn’t helped, and the trade unions were once again becoming a force to be reckoned with. Inequality had grown apace and the government had to do something to keep the pitchforks off the streets. As well as introducing progressively higher taxes and adopting a ‘living wage’, Britain followed America’s AI Safety Act 2035 with its own law: the AI Limitation Act 2038. Jobs for humans, not machines. None of this pleased Zane or, for that matter, Hardcastle or anyone else belonging to what people still called The One Percent.

 
