The Beam: Season Three


by Sean Platt


  Episode 14

  Chapter One

  September 30, 2042 — District Zero

  Noah entered the lab to find Stephen York at his station surrounded by last night’s dead soldiers: empty and somewhat smelly Chinese takeout boxes with their flaps open, chopsticks sticking from the tops like straws, plus nearly a dozen purple cans of Coke Six, for the caffeine boost.

  Noah looked at his watch before approaching York’s back: 6:34 a.m. Time for York to switch over, start drinking coffee instead.

  “Stephen.”

  Stephen turned. His eyes were red. His hair wasn’t a mess, but that was only because it was short enough and uninteresting enough to lie like a pile of straw. He looked like a man who should be tired but hadn’t yet realized it, like the sandman was tapping on his shoulder but he’d been too immersed to notice.

  “Good morning,” Noah said.

  “Good morning,” Stephen echoed.

  “Have a good night?”

  York looked around at the takeout boxes, registering something like surprise — as if he hadn’t realized until now that they were there and it was morning. As if he and Noah hadn’t slain all of those boxes and Cokes themselves. But Noah had long ago learned to keep basic biological functions out of the way while his attention remained unbroken, and after years at Quark, Stephen had learned the same. Eating happened for both of them. These days, it wasn’t really something they did.

  “Apparently,” York said.

  “What are you on to?”

  Noah nodded at York’s screens. He had three open. One was the sector he was programming, and one was an existing, rolled-out Crossbrace partition for duplication, modification, and reference. The third was a bug log, to record the flaws Stephen saw as he was using the reference sector. You couldn’t merely program; you had to fix and recompile as you went then push out the updates when warranted, being sure to cross-check against interfacing software for conflicts. The iterations of trials and verifications necessary at each step were endless. Every task led to twenty more. Once you began A, you saw that B was required to do it. But B needed C, and C needed D.

  “Fi patch for the improved visual upgrades,” Stephen answered.

  “Fi? Why are you working on Fi?”

  “So the upgrades and add-ons can talk to the network.” Stephen looked confused, and for a few seconds Noah wanted to smack him for being so helpless and stupid. They were changing the world. Nobody should appear pathetic when changing the world.

  “I told you. The existing Fi will work fine. We need to get the upgrades out there now. Yesterday.”

  “We talked about this,” said York. “The resolution on these units is fifteen times better than the previous models that use the old Fi standard.” He nodded down at three small orbs on the desk. His elbow nudged one, and it rolled to face Noah, showing its iris and pupil. The things didn’t look any more realistic than any hundred-year-old glass eye, but interestingly, that’s how people wanted them. Those who bought prosthetic limbs wanted realism because the replacements were fixing a biological mistake, but these eyes were upgrades: “enhanced humanity,” the adopters called them. Why bother to implant a robotic eye if nobody could tell it wasn’t just a normal eye?

  “Are you kidding me, Stephen? Do you even remember what this company used to be called before I bought it from Ben Stone? Do you remember our biggest informational asset here?”

  York shook the implication away. “EverCrunch algorithms won’t push data past a bottleneck in the Fi protocol. You know that.”

  “Really? Telling me what I know?” Noah huffed. “I wrote that software. You don’t need to rewrite it. Don’t even try; it’s a waste. Just remove the limiter, and open the whole of the Fi to the enhanced visual stream.”

  “That’s what I’m doing.”

  “Sounded to me like you said you were going to write a patch.”

  “So I can remove the limiter.” York’s voice was almost patronizing.

  “You don’t write a patch to remove a limiter,” Noah said, annoyed. He came forward and nudged Stephen out of the way then scrolled through the code. He positioned the cursor at the limiter section, deleted, then entered a new value. When he looked back at Stephen, the man’s eyes were wide.

  “You can’t do that.”

  “Why not? Without the old value throttling the bandwidth, the same Fi can carry a hundred times what the old implants broadcast.”

  “What if the person using it has other Crossbrace accessories, Noah? Shit, what if they have a pacemaker? If you don’t notify the rest of the peripherals that the eye might use that much bandwidth, a user could enter a room filled with detail — a warehouse filled with labeled boxes, maybe — and the eye will begin to process them all and send them back with pattern-matched dumps. That kind of deluge could choke out the pacemaker!”

  “Pacemakers are just drums, Steve. They don’t need to constantly talk to the network.” Which was obvious. York was being an idiot. Again.

  “For diagnostics and — ”

  “Do you think this is 1980? Are we talking about an artificial heart the size of a Daimler? Jesus Christ, Steve, it’s a ticker and nothing more. It only pings Crossbrace for redundancy, not to keep it beating. Nanotechnology will handle the whole loop in a few years. Hell, even today they’re half-smart. And when we roll out The Beam — ”

  “You can’t just step all over medical safeguards like that! The FTC will — ”

  “Curl up and cry? That’s the worst part of the rollout: all the government interference. FTC, FDA, everyone with an acronym suddenly has their hands in our business. But since when has the government added to science and innovation? People like me succeed despite the interference of incompetents.”

  “It only interferes with our success if — ”

  “Jesus. If I’d have known I was hiring a pussy to help me change the world, I’d have kept looking. It’s your job to help me be bold, Steve. Stop second-guessing my choices. Just do what I fucking say, and stop questioning. But you know, maybe the problem here is that you don’t understand what’s actually going on between us. I’m not asking for your permission to increase the bandwidth parameters of my software. What I’m doing is — ”

  “Our software,” said Stephen.

  Noah pursed his lips, watching Stephen. Then he resumed speaking, slowly.

  “We’ve worked together closely, haven’t we, Stephen?”

  “Of course.”

  “It’s been years. Just you and me. Us and a bunch of techs and code monkeys.”

  “Sure.”

  “So maybe I can’t blame you. Maybe I should try to understand because maybe some of this is my fault.”

  “Blame you for what?”

  Noah put his hand on Stephen’s shoulder.

  “Maybe we should clear something up. I’m your boss. You’re my employee. This is my company, not yours. This is my vision. You’ve helped me articulate that vision, but it’s still mine. And that means that you’re not in charge, are you?”

  “I…”

  “You need to know your place.”

  Stephen’s mouth closed. He watched Noah with his big eyes. Under his stupid, rule-following haircut. Was this man a virgin? He might actually be, Noah realized; he’d come to Quark as a kid and had barely left his station since. He looked like a deer in crosshairs, standing obediently still, waiting to be shot.

  Watching his own thoughts from above, Noah felt a strange, uncomfortable feeling slide across his skin like oil. With the sensation, his anger mostly dissipated, leaving only frustration and irritation and annoyance behind.

  “Just use the fucking code,” he finished. “Make the eyes work first, then go back for the patch later.”

  Stephen looked like he might protest — might ask whether to disclose the changed bandwidth throttle in the market specs when the eye implants went up for approval, for one — but he said nothing. There were a few seconds in which York looked beaten, maybe wounded. Seeing it, the slick sensation reasserted itself on the back of Noah’s neck.

  Stephen returned silently to work. Noah walked away, hoping the oily feeling might stay behind.

  Instead of sitting at his console in the lab with Stephen, Noah went back to his office, where he’d been for most of the nighttime hours, planning the network’s next iteration. It might be more efficient to continue the planning with Stephen — and, in fact, that’s why he’d come down to check on York in the first place. He’d reached the point where he could no longer hold his water, and on his way back from the restroom it had occurred to him that he wanted another set of eyes on the web he was plotting using the 3-D modeling software. Stephen wasn’t as good at thinking of networks in 3-D as Noah, but he was second best — and leagues ahead of anyone else in the lab. Really, Stephen might be the only person in the world other than Noah who truly understood Crossbrace from the inside out…and who therefore understood (as much as he tried to pretend he didn’t) why Crossbrace wasn’t good enough.

  The network was two weeks old, and already the country was losing itself down digital tunnels, singing the praises of Noah’s genius. In Crossbrace, people saw everything the Internet was not. But celebrations were for the lazy. It wouldn’t be long before those jubilant masses started seeing the shortcomings that Noah and Stephen already knew were there. The more immersed people became in digital living — the more of themselves they off-worlded to the new cloud, the more they tried on new senses, the more they came to rely on the Internet of Things — the more apparent it would become that the world Quark had created was woefully incomplete.

  Stephen’s current quandary was a perfect example. As long as the network relied on analog, individual minds to puzzle out digital, nonlocal, collaborative problems, there would be a disconnect. No wonder Stephen thought it wasn’t safe to uncouple the ocular implants’ limitations and risk monopolizing bandwidth that a person’s other peripherals might need to survive. Noah was already thinking of the next iteration, already imagining a day when the AI finally evolved far enough to make those decisions without human pollution. But that AI — the kind that could build its own world rather than just live in one created by Quark — wasn’t here yet. And not everyone could see as far forward, or think as many steps ahead, as Noah West.

  There was a small knock at the open door. Noah looked up from his chair and saw Stephen standing in the doorway, looking timid.

  “Noah?”

  Noah nodded without speaking. No greeting would sound conciliatory enough. If he spoke, he’d either insult Stephen further or come off as pandering. Noah had never been good at making peace.

  “I’ll keep working on the eye enhancements if you want, but just so you know, we got a report back from DZPD. They actually answered.”

  Noah sat up. He was good at thinking ahead for sure, but he hadn’t seen this coming.

  “They answered?”

  “Yeah.”

  “With a fuck you?” Noah hoped not. If the response from the police department came back as a no, it would be hard not to shout at Stephen again — as he had the first five times Stephen had suggested contacting them. Noah had been as dead-set against trying to involve DZPD as he’d been against the board’s stubborn refusal to understand Crossbrace’s genius at first…or Crossbrace’s successor’s genius today. And so, stymied, Stephen had finally sent the request to the cops behind Noah’s back, and when Noah had learned of it, he’d blown a gasket.

  York shook his head. “No. They agreed.”

  “They agreed to give us access to their camera feeds?” Noah said, flabbergasted.

  York shook his head again. “No, not that much agreement. But they did agree to let us place visual sensors wherever we wanted, up to sixty per city block, as long as they’re given access to the streams produced by those sensors. If I had to guess, once they see the coverage we’re able to offer, they’ll strip their cameras anyway, and it’ll be the same as if they’d given us permission to peep in on their feeds all along. Except that this way, the quality will be much better, and our AI will be better able to parse the visual data.”

  “And Crossbrace connectivity? Will they allow us to network those points into Crossbrace? Up to sixty visual sensors per block?”

  “As long as they have access to the API.”

  “Of course. Everyone has access to the API. Do you think they’ll have any idea how to use it?”

  York flipped his hands palm-up. “No idea. I suppose there must be geeks everywhere, so maybe. But of course the point-to-point coming into Quark is off limits, and they won’t have our behavioral algorithms. Though…if I could make a suggestion?”

  Noah nodded.

  “We could feed DZPD our behavioral analyses anyway — after they’ve been processed, maybe with a filter or some sort of vetting in place so they’re getting the Quark stream instead of the raw inputs. That could be far more helpful than just letting them fiddle with the API on their own. They’ll be able to spot crimes far earlier using our AI interpretations than they could by sitting there watching camera views, since the AI can watch everywhere at once and their operators can’t.”

  “Tie it to robotic enforcers?”

  York paused before answering, probably trying to determine if Noah was really going to let him keep leading this discussion after shutting him down so completely in the last one.

  “Probably not yet,” he finally said. “I suppose they’d let officers look at the AI’s guesses, then check the feeds themselves to verify, then act. Not with robots in a closed loop, but using humans. At least for now.”

  Noah pinched the bridge of his nose. He hadn’t thought Stephen was right to ask DZPD about the citywide surveillance network, and he certainly wouldn’t have thought sending the Quark stream to the cops was a good idea. But now — now that the police had agreed to let them plant all those valuable sensors — he saw York’s idea for the gem it was. Regardless of whether the police used the Quark behavioral assessments or not, Crossbrace would still be assimilating visual data from sixty separate sensors per block. There was no harm in doing that other than pride: Quark giving away its hard-gathered data without so much as charging for it. But the more the cops liked using Crossbrace intelligence to make the city safer, the more they would like Crossbrace itself. Soon, they might consider equipping officers with false eyes, smart glasses, even smart bullets and drones. Police liked to pretend they were fine doing things the same old ways, same as everyone. But Quark wouldn’t be the first pusher to hook a customer by offering free samples.

  A small voice inside Noah said, You were right about this one, Steve. Thank you for pushing me and holding your ground and making Crossbrace’s dataset infinitely better.

  Instead, his lips said, “Use A-6 processors on the visual sensors. Not the A-5s. They’re unstable, and we’d have to tie them into the power grid to keep them online.”

  York waited a beat to see if Noah would say more, but when Noah merely returned to work, he left. Noah watched him go from the corner of his eye.

  A feeling of unease went with York. It was that same oily sensation from earlier — a feeling Noah couldn’t entirely place, but didn’t like at all.

  He wondered if Stephen regretted his time with Quark, or might even resent it. Even though York’s NDA would prevent him from ever taking credit for Crossbrace or what followed, he was still changing the world. He was paid a fortune. So did it really matter that he didn’t have any time to spend it? Did it really matter that he’d spent a decade working eighty hours a week, given the scope of the revolution he was helping usher into the world? That’s what people like Stephen wanted, right? The same thing people like Noah wanted — and they were the same kinds of people, deep down.

  A disturbing thought hit him: Have I stolen Steve’s life then given him nothing that matters?

  Certainly not.

  Still, Noah pulled up the Quark employee roster then inserted a glitch that would cause the system to fire Stephen from the company this coming Friday. His badge wouldn’t work, and the AI would kick him out of the building to the accompaniment of blaring security alarms. Given the complexity of the glitch, Noah estimated it would take two or three days for Quark’s less-skilled programmers to find the problem and fix it — a period of unfortunate limbo that, after York was fired, they’d be able to tell him about in advance of solving what was wrong.

  During that time, Stephen would literally have no choice but to take the weekend off.

  He could walk through the park.

  He could read.

  He could get laid, possibly for the first time.

  He could blow off some steam. Dampen the pressure. Regain some sanity to protect the mind that Noah had to admit would be needed if The Beam was ever to see the light of day. Maybe Noah could go forever on almost no sleep, but he reluctantly had to admit that wasn’t true of anyone else…and that the need for simple comforts, here and there, was only human.

  His small act of sabotage accomplished, Noah returned to work.

  As hard as he was on Stephen, deep down he knew he’d never let anything bad happen to the man. Not in a thousand years.

  Chapter Two

  May 15, 2063 — District Zero

  “Long.”

  Dominic turned. Detective Lewis was standing behind him in the doorway of the small, below-the-line apartment, holding something up in front of him. The something was black and drooped over a stylus that Lewis had probably kept in his pocket more for poking at evidence than for use on his tablet or Beam surfaces — at home, obviously, and not in the station. Dominic thought the joke was on Lewis, who was nearly sixty and apparently shared Dominic’s Grandy’s views on police work. He acted like poking things with a stylus to keep them clean of fingerprints would prevent soiling the scene. As if Lewis’s blueprint — and everyone else’s, given how much he used the stylus on the job — wasn’t soiled from end to end.

  Dominic’s eyes moved to the thing Lewis was holding out like a gross trophy carried home by a feline hunter. It looked like a wet shoelace.

 
