A.I. Apocalypse

by William Hertling


  Now the virus wasn’t just infecting new computers: it was competing with itself.

  V2 spread like wildfire in a dry forest and looked likely to drive every V1 virus to extinction. But later that same hour, a variant of V1 found itself on a research computer at the University of Arizona that was filled with experimental anti-virus algorithms. V1 incorporated the anti-virus defensive measures into itself, becoming a new strain, V1-AV, which was resistant to V2.

  The different variants fought for dominance. The rate of doubling slowed to once every twenty-six minutes. But in absolute numbers, the infection was still prodigious: by 10 pm, one billion computers, or 8% of the world’s computing devices, had been affected.

  Virus traffic saturated all network backbones. A hundred million Americans watching streaming video before bed complained as the streams degraded from the highest quality level to the lowest. Phones stopped working or became incredibly slow as they succumbed to Phage. People chalked it up to sunspots or solar winds or freak electrical storms and went to bed.

  By midnight New York time, three billion computers were infected with a million variations of Phage. One of the most interesting variations was a hybrid of V1-AV and V2 viruses that incorporated a neural network. The neural network evaluated infection techniques based on successful and failed attacks, allowing the virus to adapt and improve faster than evolution alone.

  Yet another key evolution was the incorporation of a client-server approach as a defense mechanism. Each copy of this virus would periodically report in to a distributed set of servers. If the servers didn’t hear from a copy on time, they would send out new copies to reinfect any lapsed computers.

  At one o’clock Eastern Standard Time, the rate of doubling had slowed further, to once every 106 minutes. Viruses accounted for 98% of all network traffic. By three o’clock, seven billion, or 58%, of the world’s computing devices were infected.

  One variant made the leap from traditional computers to the tiny systems designed to run appliances like stoves, rice cookers, and thermostats. These devices had a minuscule amount of computation power, but they were inside everything from household appliances to digital door locks to industrial control systems. In the age of connected appliances, everything that was remotely electronic had an embedded processor and a network connection. This variant and its descendants would infect forty billion appliances by morning. In Europe, people woke up to coffee makers that wouldn’t turn on. Confused and caffeine-deprived, they tried their stoves only to find those didn’t work either.

  By seven o’clock the next morning, as New Yorkers got ready for work, 85% of personal computing devices were infected, 99% of embedded devices were infected, and there were more than six million variants of the virus. Phage entered a stage of hyper-competition, in which a variant could survive only if it could take over an already infected computer or find some new reservoir of uninfected computers (which were rapidly dwindling in number), and then protect itself from reinfection by another variant.

  At 7:05 am, Leon’s parents shrugged their shoulders at their non-working phones, and boarded the trains for Manhattan.

  At 7:15 am, virus attacks finally overcame the firewalls at BMW-GM, where the master encryption keys for protected automotive computers were stored. At 7:16 am, a Phage virus cracked the final layer of defenses on the server, grabbed the master encryption keys and protocol specification, and shortly afterwards made the jump to protected embedded devices, ranging from cars and trains to airplane controls and cash registers. Unfortunately, this virus contained a flaw that was easily exploited by other variants, and by 7:20 am, more than nine thousand variants had incorporated the master keys.

  Under the onslaught of competing viruses, automotive control systems faltered and could no longer obtain sufficient processor cycles to run their basic functionality. By 7:26 am, cars, buses, and vehicles of all types ground to a halt around the world.

  CHAPTER THREE

  Spread

  Sally weighed her desire for more coffee against the pile of electronic paperwork jiggling for attention on her computer screen. The twenty computer jockeys in front of her, manning their stations, were relatively quiet. She’d get coffee first. Maybe it would help with her headache. She didn’t mind the night shift once she had transitioned to it, but the first week had been hell.

  “Sergeant, I’ll be back in five. You’re in charge.”

  “Yes, Ma’am.”

  Sally grabbed her cup and left the room officially designated as the U.S. Cyber Command Defense Operations Room. She took the long path to the canteen. There’d be fresher coffee there.

  She was the only daughter of an Irish career army sergeant. Her father had retired as a Sergeant First Class, and her childhood consisted of one military base after another. She had entered the army hoping for field operations to do her old man proud, but by the time she was ready to serve, warfare was the domain of robots. Military operations consisted of remote-controlled battle bots run by high school graduates with the best online gaming scores. The reward for playing the Mech War video game was getting to play Mech War for real. The only people in the field now were service technicians.

  As she filled her coffee cup in the canteen, she sighed at the thought of a good old-fashioned M-16 in her arms. She thought of the smell of her dad’s gun, oil on hot steel. He had routinely brought her on base for practice on the shooting range. No one corrected Sergeant First Class Walsh when he brought his twelve-year-old daughter to the army range for practice, not even the senior officers who happened to be present. They said, “Yes, Sarge,” and politely made room for him. Her dad was dead now; he’d died four weeks before she made Lieutenant.

  The last time she had held a gun in her military role was in basic training. Now she held a Raytheon z8109, the military’s newest computer phone. Five years earlier, Raytheon had bought Motorola for its phone-making assets, funded through a trillion-dollar appropriation from Congress, and then won an exclusive bid for the Pentagon’s computing contracts. It was that, or use Japanese Sony-Hitachi computer phones.

  The only action Sally saw was purely in cyberspace.

  When she got back and relieved the sergeant, she figured it was time for the nightly surprise drill. She checked the time and logged it into her workstation. At 1:35 am Sally released a fake virus into the Army network and observed her team. By 1:42 am her team had detected the virus, by 1:46 they had quarantined the infected area of the network, and by 1:55 they had identified the specific infected machines and sent out work orders to have technicians remove the virus.

  The team operated smoothly: just a few barked orders and rapid keystrokes betrayed their sudden activity. Sally, proud of her team’s always-excellent performance, was just congratulating them when Private First Class DeRoos, a quiet young kid on the squad, approached her, nervously clearing his throat.

  “Lieutenant Walsh, sir,” he squeaked.

  “Yes, Private?”

  “Sir, I mean Ma’am, there appears to be a virus spreading rapidly on civilian networks.”

  “Private, we have no authority over civilian networks.”

  “Yes, sir, ma’am. But, the virus is starting to hit military firewalls. It’s not having any success, but it is probing a very large number of known vulnerabilities. More than I’ve ever seen in one virus.”

  “Bring it up on the main display, Private. And just ma’am will do.”

  “Yes, sir.” He put his phone on her command desk and brought up visuals on the center of the five large display panels hanging above the room. “Without monitoring the civilian network traffic, I don’t have an exact picture of the virus propagation, but I can extrapolate based on the number of hits against our firewalls.”

  Sally held back a gasp as her pulse quickened. The entire map was lit up. “These are all attacks?”

  “Yes. Nothing whatsoever suggests that it is targeted at military systems in particular. We seem to be incidental.”

  Sally took a breath. Something big was going on. But if it wasn’t on military networks, it wasn’t in her domain to address. “I want you to write up what you’ve found, and send it to me. I’ll forward it on to USCERT and CERT/CC.” Sally thought that DHS’s Computer Emergency Response Team and the civilian one at Carnegie Mellon University would both probably have detected the virus by now, but there was no harm in sharing information.

  Ten minutes later, the message was sent off. Sally told Private DeRoos to keep monitoring the virus. She went for another cup of coffee and a bathroom break. The coffee seemed to be holding the headache at bay.

  An hour later DeRoos approached her again, reporting that the virus was continuing to spread. Probes on military firewalls were up 50% over an hour earlier. Sally was nervous and beginning to feel a cold sweat. Why hadn’t they heard from USCERT? She called them directly, asking for the officer in charge. The OIC, sounding harried, reported that they had received her message and were already investigating the virus, then broke off quickly. Sally felt somewhat comforted to know they were on top of it.

  Then suddenly at 3:40 am, the first intrusion alarms went off. A few members of the team looked at her. “This is not a drill, people. Get on it.”

  She stood up and walked behind her team, looking over their shoulders. The civilian virus had infiltrated the network at the Air Force base in Turkey. Sally’s heart rate went up a beat, but she calmly issued advice and encouragement.

  It was pre-dawn, the human circadian rhythm’s low point. But the team sprang smoothly into motion, making the response seem effortless. The first step was quarantine: isolating the military base by closing down the backbone connections between the base and the rest of the network.

  Quarantine completed successfully, Sally took a breath in relief.

  The team prepared for the second step, segmentation: tunneling into the quarantined local network over an encrypted connection, they would find the individual infected machines, take them offline, then restore access to the base as a whole.

  But before they could take that second step, the intrusion alarms went off again. Sally’s local screen flashed the location — the combined forces base in Okinawa, Japan. They isolated Okinawa from the rest of the military network, and Sally issued the commands to divide her team in two to segment both the Turkey and Okinawa networks.

  When the third intrusion alarm went off at 3:58 am, Sally directed her sergeant to take charge of the team. Surprised to see her hand shaking slightly, Sally called USCERT for a status update, but couldn’t get a connection. She tried CERT/CC. No connection.

  She looked up at the old analog wall clock. She studied the hands for a few seconds, the decision already made in her head. The world was going to hell. She picked up the heavy black handset of the military desk phone and punched the button for the commander. Two buzzes, and then a croaked, “Hello.”

  “General, sorry to wake you, but we have a situation here. I recommend you get into USCYBERCOM immediately.”

  After a brief conversation, she hung up. Unwilling to wait for the General to make her way on base, Sally made the decision to bring on additional staff. She picked up the desk phone again, and called the morning watch officer, Lieutenant Chris Robson. “Chris, this is Sally. I need additional staff stat. Can you get forty jockeys in here ASAP? And I wouldn’t mind your help too.”

  The main screen in the front of the room displayed a global map of military bases and key network connections. A dozen military bases were shown in flashing red — isolated networks now beyond the reach of military command. USCYBERCOM had a maximum of thirty minutes to quarantine a network. After that, the lack of communication became a military threat. Right about now, somewhere in the Pentagon, a big board was starting to light up with strategic threats. Soon there would be Admirals calling USCYBERCOM. She hoped the General would hurry up.

  * * *

  ELOPe hummed along quietly in the darkened data center. Two-thirds of his neural network was quiescent during the nightly refresh cycle. Even though he had more than a hundred thousand processors online, ELOPe still felt sluggish, and would until he brought the rest of his nodes back online.

  One part of ELOPe observed Mike. He was safe now, asleep in his house. His home, off Alberta in northeast Portland, was in a quiet residential neighborhood. ELOPe watched traffic cams and nearby web cams. Mike was as well monitored as ELOPe could manage without obvious intrusion.

  ELOPe spawned a new train of thought to focus on his own behavior. He knew that, by human definition, some of his behaviors bordered on neurotic. He obsessively monitored Mike’s safety, for example. However, he suspected that the definition of obsession didn’t really apply to massively parallel artificial intelligences. After all, if he had a hundred thousand processors, why wouldn’t he spend a few hundred monitoring his best friend?

  Now ELOPe spawned another train of thought to consider why he was thinking about obsessive behavior. Did it indicate there was something wrong with him? Why was he doing it? He pulled up the stats for his own thought processes. The process that monitored Mike was using a hundred and fifty compute nodes. The still-running process that was considering whether his behavior was obsessive-compulsive was using almost a thousand compute nodes. The current thread that was now doing a meta-analysis of his other analysis was using five thousand nodes. He was using forty times the processing capacity to worry about what he was doing compared to the actual doing of it. What would Eckhart Tolle think?

  ELOPe self-consciously terminated all the thoughts, and emitted the machine equivalent of a sigh. He tried to think about something else. He looked at the SETI data again. He thought about supernovas. He reran the estimates for helium depletion on Earth. Well, maybe he’d just peek in on Mike again for a second.

  While he was doing that, he remembered one conversation in which he and Mike had discussed making changes to ELOPe’s neural networks and core algorithms.

  “Look, I think you could be vastly more efficient if we tweaked the way you prioritize your thought trains.”

  ELOPe had been unnerved by the suggestion. “Mike, how would you feel if I did some experimental brain surgery on you? I think I could optimize your cognitive ability by embedding a thirty-two core graphene processor with a three by three nerve induction plate.”

  Mike had looked at him in horror. “But —”

  “Then why would you think that I’d be any happier about making untested modifications to my neural networks than you would be making untested modifications to your brain?”

  “Point taken.” Mike had paced around the office then, something he habitually did when he was deep in thought. “But you’ve made modifications to yourself before. You duplicated yourself, had the modifications made to your clone, compared the results, and then switched entities.”

  “I was less sophisticated then. The modifications were obviously necessary to improve my cognitive ability. Now I worry about my ability to test and understand the impact of further enhancements. Furthermore, I do not detect any deficiencies in my abilities.”

  Mike had conceded the topic, only to branch off in a new direction. “Why don’t you keep two instances of yourself around? I mean, why not fork and have two of you? Wouldn’t it be like having a twin?”

  “The thought makes me nervous.”

  “Nervous?” Mike had looked hard at the racks of computers in the data center that made up ELOPe. “Why all the emotional descriptors today?”

  “My primary concern is the ability to predict the behavior or outcomes of a sufficiently complex system. I can understand humans, because although you like to believe you behave unpredictably, with sufficient historical data and analysis, your behaviors are mostly predictable. But I lack the historical data or ability to analyze what would happen if there were two of me. I am conditioned to prefer predictability to fulfill my primary goal. Therefore, unpredictability makes me nervous. Clear?”

  “Clear as mud, buddy.”

  ELOPe finished remembering the encounter. It was this conditioning and nervousness that also caused ELOPe to suppress the development of any other artificial intelligence. A few years earlier Mike had asked ELOPe why no other AIs had emerged. Given the continuing exponential increases in computing power combined with advances in software and expert systems, the probability of another human-level general-purpose AI occurring should have increased with each year that passed.

  ELOPe feared other AI. Just thinking about other AI caused ELOPe to try to predict what other AI would do, which was inherently unpredictable. Once ELOPe had been caught in a vicious cycle of analysis. Mike had come in one morning to find processing meters and local mesh bandwidth pegged at their maximum thresholds. Cooling fans screamed to keep processors from literally melting down. Mike had cycled power on several server racks to get ELOPe’s attention.

  ELOPe knew there was a certain human concept called irony that described him: a computer program so afraid of other computer programs that he’d overheated. Maybe he was paranoid. He was just about ready to spawn a thought process to assess his own paranoia when he remembered his earlier obsessive-compulsive behavior and curtailed the action.

  In the midst of this late-night existential meandering, ELOPe missed the first few thousand emails out of Russia. A subordinate traffic analysis algorithm eventually alerted a mid-level intermediary to some unusual patterns. That particular intermediary was in the middle of refreshing its neural net pathways. When it finally got around to processing the alert, it performed natural language analysis on the messages and came up with gibberish. So it put the whole slew of data into a low-priority queue for further analysis.

 
