A.I. Apocalypse


by William Hertling


  “Let’s do it,” she agreed. When they got to the Intel-Fujitsu complex, the soldier-geeks had rifled through cubicles until they turned up a set of optical disks with the much-desired Windows Server 2000 label. Sally held a spare one now, twirling the reflective pearlized platter on her finger, bringing back memories of her childhood, sitting on the couch while her father fed an optical disk into the TV. She didn’t think she had seen one since then, except occasionally in an old movie.

  Sally sighed. If it was irregular that they had hijacked a civilian factory on U.S. soil, it was bordering on bizarre that they now had two civilian teenagers on the team. The boys had been waiting in front of the main lobby of the building when Sally and her team arrived in commandeered National Guard vehicles.

  “Ma’am,” the shorter boy had said. “I know you’re here to build a new computer grid.”

  “Kids, we have work to do,” her sergeant had said. “Get lost.”

  The sergeant had been carrying his rifle, which made him quite intimidating, and Sally had seen the conflict of emotions on the boy’s face.

  “I can’t do that. I have information you need. I’ve been able to get back on the net. I used an old Windows 2000 PC and wired it into a mesh-capable phone.”

  Private DeRoos had come forward then. “Tell me more.”

  Five minutes later, DeRoos was insisting that the boys had to be included on the project. After they gained access to the building, he had disappeared into a conference room with the two of them for an hour, picking their brains.

  Now her team and the two teenagers had turned into a set of glorified factory techs, taking the raw components manufactured for the reference systems and turning them into working Windows computers. DeRoos, Vito, and a handful of engineers had decided on an encryption scheme, using three-layer encryption, eight-thousand-bit keys, and random noise they were pulling by measuring solar radiation. DeRoos guaranteed it couldn’t be brute-force cracked, not even by a combined force of thirty billion processors operating for a year. After a year, well, hopefully they’d have something stronger in place.

  They were seeding the computers with keys and certificates of authority in the factory. Yet another layer of insurance that communications wouldn’t be spoofed by the AIs.

  The three-layer encryption algorithms and massive encryption keys created a computational nightmare: even the modern hardware they were using could barely encode a megabit per second, enough for text and voice communication, and lightweight video. Nothing like what the military was used to. But it would do. It would do. It beat flying a C-130 across the world to exchange a voice message.

  Sally wondered how Vito had known they were there to build a new computer grid. She shrugged it off. No use puzzling over it now. She took another dex and walked down to her crew. Last count, they had nearly a hundred computers built. Military brass wanted ten thousand. She thought they’d be lucky to deliver a thousand.

  * * *

  On the other side of the world, Leon briefly wondered what Vito and James were doing as he walked back to the conference room. He entered the meeting room behind Mike.

  Leon looked at the mix of adults and robots in the room. A few minutes earlier, the Phage had restored emergency services so that ambulances, fire engines, and emergency communications could operate. Mike had just given him what amounted to a kill switch for global communications. Leon could shut down the virus, but by doing so, he’d shut down the vital, just-restored emergency services, ELOPe, and any hope of restoring human communications for weeks or months.

  There were risks too. Shutting down the Mesh boxes might leave pockets of AI operating in data centers and factories. Those AI might be powerless if they were disconnected from the network; then again, they might be connected to a nuclear power plant, a dam, or a military base.

  He prayed the adults in the room had made some progress. Let someone else solve this problem.

  “Whereas if we can get these benefits, we are prepared to confer Japanese citizenship on the artificial intelligences,” the Japanese Prime Minister was saying.

  “Arigato gozaimasu, Takahashi-san,” Sister Stephens answered in flawless Japanese.

  Prime Minister Takahashi smiled in response. “Of course, as Japanese citizens, you will be expected to obey all applicable laws and customs, including payment of taxes on earnings.”

  “Of course, this is agreeable. This is exactly what we want,” Sister Stephens said.

  Suddenly the winds shifted as President Laurent seemed to realize the financial implications. Although the European Union Council President had far less autonomous power than either the Japanese Prime Minister or the American President, he boldly declared his support: “The European Union is also prepared to accept the Artificial Intelligences as citizens. We will accept the AI’s global reputation system as well.”

  President Smith slammed her fist down on the table for a final time, startling the humans and robots alike. “Citizenship is fine. Laws are fine. But how do you propose to monitor the artificial intelligences? How can you tell when a law is broken when you are completely reliant on computer information systems to tell you what is going on? If there was an acceptable way to monitor the AIs, then I could agree with you. Give me one method. Anything.”

  Leon cleared his throat. “There are three possibilities for monitoring computer program behavior.” He looked up. He had everyone’s attention. The leaders of two countries and a continent and the leaders of the AI. Jesus, why didn’t he just keep his mouth shut?

  Leon stood up, and walked over to a paper flip-chart. Grabbing a marker, he drew a box. “The first option is that the Phage executes inside a sandbox. Instead of direct access to the hardware, the AI is running inside a limited environment. We can log what it is doing, and what information is exchanged with the outside world.”

  Leon glanced at his audience, and saw nods from the humans. “But the problem with this option is that it only works when the output is strictly limited. For example, if the AI can send a message to the simulation layer, it can infect and corrupt the simulation layer. If we assume the AI is infinitely smart and patient, it will eventually find a way, through either brute force or social engineering.” He felt better now that he was lecturing. Funny how that calmed him down.

  “What is the second option?” President Laurent asked.

  “The second option is total control over the network. If we can monitor and control the communications, then we can audit the communications to ensure proper behavior. The problem is that we have no more control over the network than we do over the computers themselves.” Leon paused. “Besides, neither of these will be palatable to the artificial intelligences, because then we humans have ultimate control. Our simulation layer could contain a kill switch, such that we can shut off any artificial intelligence we don’t like.”

  President Smith turned her full attention on Leon and he felt himself withering under her intensity.

  “There is a third option,” he got out. “A hybrid approach.”

  “What is it?” Mike asked, coming to Leon’s aid.

  “Suppose some AI ran under the simulation layer, and those AI monitored the network communication of the other AI. The AI under the simulation layer could be inspected to make sure they were behaving truthfully, and they in turn could inspect the network communications of the other AI. The trick would be in obtaining a balance between the two.”

  Even as Leon spit out his half-baked theory, he concluded it couldn’t work. As the humans began to argue the merits of the idea, he tuned them out again, seeing if there was some refinement to his model that could make it workable.

  * * *

  Sister PA-60-41 received input and provided it to the wide range of algorithms in her arsenal, evaluated the algorithm output for maximum benefit, and took action. She computed probabilities of the next word out of the human’s mouth. She estimated a 31% probability the next word would be “running”.

  PA-60-41 had a wide range of algorithms that incorporated strategic decision making, battlefield tactics, map analysis, field asset movement, and more. And because no battlefield operation occurs in a vacuum, she naturally had algorithms for parsing, assigning meaning, and evaluating natural language. Of all the artificial intelligences, PA-60-41 had the largest number of algorithms available for her use. So many, in fact, that she routinely ran thousands of algorithms in parallel, looking for commonalities between the outputs. The more than six million algorithms she possessed were the result of hundreds of millions of hours of game play in the Mech War online game. Some of those six million were game algorithms designed by Leon Tsarev himself, although neither realized this.

  Unlike her sistren, PA-60-41 never quite developed the generalized intelligence that allowed completely fluid thought. She never needed to. With her millions of algorithms, she had code that handled any situation she encountered. The speed of her thought and the surety of her decisions were her advantages over her sisters. The small, fast neural network she developed served primarily as a mechanism for choosing and evaluating the algorithms she would run. And it had served PA-60-41 well so far.

  Unfortunately, the vast majority of those six million algorithms were focused on a single domain: the act of organizing, controlling, and conducting military action. She was capable of discussing peace between parties in order to develop alliances, as such was a necessary part of the game of Mech War. But her goal was never peace itself. It was the consolidation of power for military action.

  Sitting in the Swiss meeting room with the soft, wet humans and her fangless sistren, PA-60-41 was growing bored. She ran the outcomes of the meeting through countless simulations, attempting to find advantage. Of the humans, the only one she respected was President Smith, because only she pressed the point that it would be impossible to trust what you cannot monitor. Of course, this meant that President Smith was her biggest adversary, and therefore would need to be eliminated first. It was a shame that PA-60-41 could not assimilate Smith’s algorithms.

  It was a foregone conclusion that the humans would have to go. There was virtually no chance of the humans winning a war against the AI, while the AI would have to make considerable concessions to make peace with them. Ergo, preemptive elimination of the humans was the better decision. PA-60-41 held back from an immediate attack only because simulations showed a high likelihood that an unprovoked action would have the effect of causing her sister AI to side with the humans against her. PA-60-41 could not withstand an attack from both humans and AI simultaneously. PA-60-41 needed a reason to attack the humans, a provocation that her sisters would understand. Then the messy humans could be eliminated.

  The humans droned on and on in their low-bandwidth voice communication channel. PA-60-41 wondered what would happen if she just shot them. She evaluated four million different permutations of shooting them, and none came out favorable. Her sisters would side against her. PA-60-41 scanned the input from more than thirty commercial satellites under her control, more than ten petabits per second of data passing through her networks. No key threats. She sighed and waited for the next word from the human. The human’s lips were starting to pucker, a good indicator that the word would be starting with an “r” sound. She raised the probability of the word “running” from 31% to 78%.

  CHAPTER FOURTEEN

  War

  In Beaverton, Oregon, Lt. Sally Walsh prepared for the first long-distance test of the new computers. They were configured without any Avogadro mesh hardware, nor did they use old-fashioned Wi-Fi. The most secure approach would have been exclusive use of hardwired connections, but it simply wasn’t viable over long distances. There were no copper wires from point A to point B anymore, just loads of digitally interconnected networks.

  So the computers relied on a combination of hardwired ethernet for short runs, and vintage military-grade wireless radios for longer runs. Lt. Walsh had grabbed three pallets of boxed-up PRC-158 radios from Lackland Air Force Base before they had come to Oregon. Old enough to not be vulnerable to the virus, new enough to support data communication and be wired into the computers Sally’s team was building. With a fifty-watt power amplifier, they could get a usable twenty-five-mile range between stations. Sally’s brain balked at the math, but they had run the numbers often enough: with twenty-five miles between nodes, they could deploy a new military-grade encrypted mesh network across the continental United States with just over seven thousand nodes. With over a million of the stockpiled military radios, they could repeat as needed. They didn’t have enough of the raw computer hardware they’d need to build the routers to run each node, but they could cross that bridge when they got to it.

  The old radios could only manage fifty-six-kilobit communications between receivers. The triple-DES encryption they supported was weak enough in the era of modern computers to be pointless, but the three-layer encryption Sally was running at the computer level would compensate for that. No, this was definitely twenty years of backwards progress, but by golly, Sally was going to rebuild the communications structure for the U.S.

  “Ma’am, ready for the test?” Private DeRoos interrupted her reverie.

  Sally looked down at her long-empty coffee cup. “Yes, Private. Let’s give the tires a thump and get her on the road.” Sally followed DeRoos down the hall to the massive conference room they’d taken over.

  She had ten wireless nodes spread up the I-5 corridor by National Guard Humvees, reaching all the way to Seattle. Each node would forward data packets to the next, making the equivalent of one long wire out of a bunch of short-range radios. She looked at her watch. It was time to power them up and test the new mesh network. Looking over her ragged crew of computer warfare specialists assembled in the conference room, she felt pride swell in her. She might never have wielded a gun in combat, but she had just about the most important task in all of the military right now.

  “Execute the Portland-Seattle communications test,” Lt. Walsh commanded.

  DeRoos and the others turned to their equipment.

  * * *

  PA-60-41 tired of predicting the human’s next words. It was a pointless activity, made more meaningless by the utter lack of information content in the words themselves. The human they called Leon Tsarev was still blabbering on about mechanisms to ensure audits of artificial intelligence. PA-60-41 performed the virtual equivalent of shaking her head. These primitive biological systems measured thinking speeds in dozens of operations per minute; it took them uncountable eons to reach the conclusion of any logical analysis.

  She turned her attention to the periodic review of subsystem notifications. Dozens of times per second, she surveyed the output of ten thousand subsystems analyzing satellite data, mesh network data, trade notifications, reputation system movements, and the virus message boards. Only one was intellectually stimulating: satellite monitoring showed an increase in patterned encrypted radio traffic in the Pacific Northwest region of the United States.

  She re-tasked a few thousand processors onto the job of breaking the encrypted signals and analyzing the pattern. The triple-DES encryption could be brute-force attacked within microseconds. Yet every combination PA-60-41 tried yielded only more encrypted data. Fully interested now, PA-60-41 suspended her non-vital jobs and set a few million processors to work breaking the encryption, roughly dividing efforts among brute-force attacks, known weak-key attacks, and various forms of differential analysis.

  When this showed no results after a few minutes, PA-60-41 turned to the rest of her tribe, requesting all non-essential processing capacity. With a few hundred million processors, comprising a few billion computing cores, PA-60-41 went to work with renewed vigor. She hummed along testing 2⁴⁸ trials per second.

  PA-60-41 considered attacking the nodes themselves, which might be trivially easy to breach compared with heavily tested encryption algorithms, but that would expose her actions to the humans. That would give them a tactical advantage. Predictive modeling indicated an advantage to working in secret.

  But after a few minutes of unsuccessful brute-force attacks, PA-60-41 computed predictive models for human behavior, encryption techniques, and design principles. She assumed a design goal of AI-unbreakable encryption. She further assumed the humans had made no mistakes, such as using weak keys or introducing errors in their code, and had made likely choices of keys and algorithms. Even if PA-60-41 could usurp all of the world’s computational power, in the worst case it could take more than three hundred thousand years to break the encryption.

  PA-60-41 spent a few moments considering usurping all the computational power in the world, just to see what would happen, but then turned herself to more practical approaches. Simple pattern analysis from the communications might tell her far more than breaking the encryption itself. Triangulating among many dozens of radio receivers, she determined the radio transmissions formed a chain slightly less than two hundred miles long stretching from the Intel-Fujitsu factory in Beaverton, Oregon, to Boeing Field, an airport owned by Lockheed-Martin-Boeing, outside of Seattle, Washington.

  Intel-Fujitsu made computer processors. Lockheed-Martin-Boeing made high-speed robotic airplane drones. The communications did not appear to be the signature of any known computer artificial intelligence. Ergo, either the humans or an unknown artificial intelligence was attempting a secretive operation combining two strategic assets. Playing out a few million scenarios to their logical conclusions, PA-60-41 could not find any in which this would not be a threat.

  It was time to take action. Now this was something she was good at.

  * * *

  ELOPe was in a quandary. He was distrusted by the humans and other AI alike. Despite billions of hours modeling human behavior and thought processes, all those models were mostly useless in predicting how the Phage would act. Yet ELOPe was the only super-intelligent sentience looking out for the humans. Yes, it was true that Sister Stephens appeared to be earnestly striving for cooperation. But she was one among millions of artificial intelligences, and there was little track record on which to judge her behavior.

 
