Version Control


by Dexter Palmer


  Then:

  She watched as paramedics chiseled a large block of foam from the other side of the car, placed it on a stretcher, and carted it away, its custard color shot through with thin red streaks of blood. A small hand stuck out from it, unmoving.

  Oh no.

  Oh God.

  Sean.

  Later, when the time came to portion out the blame, the insurance agent told Rebecca and Philip that bad luck favored complex systems. “The thing you have to understand is that the mass introduction of self-driving cars onto highways made auto liability cases insanely complicated,” he said to them in his small office with its thin walls through which Rebecca could hear the sounds of a woman weeping, a man cursing. “In the twentieth century,” the agent continued evenly, as if these noises were nothing to be concerned with, or as if they were so common that he’d learned to tune them out, “assigning fault to one actor or another in an accident was a fairly straightforward process. But with autonomous cars it’s more difficult. Because unlike brakes or pistons, the actions of artificial intelligence routines aren’t entirely predictable by design. They make thousands of decisions based on the contexts in which they find themselves, and it’s impossible to know what those contexts will be. And if they make decisions that perhaps lead to tragedy, is the owner of the vehicle to blame? Is the manufacturer of the computer software? Or someone else entirely? Who can say for sure? New Jersey’s a no-fault state, which makes things a little less complicated in some ways, but in many states this still isn’t even completely settled law. Which is why when you purchase an autonomous vehicle, you sign one title for its hardware, and a second, separate license for its software, which, unfortunately, has the effect of limiting the extent of the manufacturer’s liabilities in the case of…” He sighed, and it was a practiced sigh that Rebecca was sure he’d made a hundred times before. “An incident,” he finished.

  He paused to let that settle in, and then continued. “This isn’t legal advice: I’m just speaking from experience. You’re going to want to go after the manufacturer. And what they’re going to do is pull the data your vehicle sent back to home base, the same data I’ve got right here. And…look. You took the wheel, right? When all this happened.”

  “Of course I did!” Rebecca said. Then, with a plaintive squeak: “It asked!”

  “I understand. But there’s this clause in your software license. Basically, when you take control of the vehicle in circumstances like this, you relieve the manufacturer of liability from the consequences of any actions that the software took immediately prior to your assumption of control. In plain English, the manufacturer is going to argue that you can’t blame the software for getting you into a mess if you then tried to get yourself out of it. Because how do we know the driver, by taking the wheel, didn’t make the situation worse?

  “I know, I know. It’s bullshit. But just eyeballing this, there are so many possible complicating factors here that…assigning blame? In a manner that would hold up in a court of law? You could try, but…look at who they are, and look at who you are.

  “As for knowing how and why it happened: honestly, I don’t think you ever will, not with any certainty. My advice—and as a father of two I know how you feel—is: just accept that it happened, and find a way to move on from here.

  “I’m sorry.”

  That was not a satisfactory answer for Philip, though, and so, with the dogged persistence of a physicist used to spending years on projects with little chance of success, he did his research. But he didn’t get very far. The corporate functionaries he could manage to get on the phone had their tongues tied by considerations of liability, and the company documents he got his hands on dissolved into jargon when they threatened to touch on the truths he needed. By lurking in less reputable areas of the Internet he was able to acquire schematics for a few makes of autonomous vehicles, as well as various versions of the AI routines that powered them, and with that he was able to propose a conjecture of what might have happened that day. But it would forever sting him that conjecture would be his only consolation.

  What might have happened, Philip thought, was this:

  Most autonomous vehicles had three methods of performing the crucially important task of confirming their own position in spacetime, as well as the spacetime positions of other nearby cars. They had an average of a dozen cameras mounted on their exteriors; they regularly broadcast their positions to each other via Wi-Fi nodes; and they sent their positions to satellites that also relayed those positions to other cars in turn, as well as to other interested parties (automobile manufacturers; insurance agencies; companies who’d paid to place innocuous personalized advertisements in the corners of windshields; the NSA). The three methods were intended to provide a high degree of redundancy, since knowing exactly where a self-driving car is and how fast it’s moving is of utmost importance.

  However, one particular carmaker, not the manufacturer of Rebecca’s and Philip’s car, had announced their intention on the day of the incident to push a firmware update that, in the patch notes, only vaguely said that it was designed to “improve functionality.” Somehow, autonomous-car aficionados had gotten hold of the update in advance and found out that it in fact decreased functionality: specifically, it cut off access to an extremely popular, remarkably addictive video game that millions of commuters played on their windshields each morning on the way to work. So when the predictable instructions circulated on social media for how to use USB sticks and a homebrewed mod to uninstall the upgrade and revert to the prior version of the firmware, while still telling the manufacturer that the firmware was up to date, at least a few car owners took advantage.

  The problem with this was that the new version of this particular car model’s AI routine did have some specific advantages, among which was a decreased latency in the constant transmission of the vehicle’s location in spacetime. However, those cars that had been hacked to run the prior version of the firmware, with its greater latency, could still communicate with the rest of the system, which in turn assumed that those hacked vehicles were actually running the most up-to-date version with the lesser latency.

  So because of this, the three methods that vehicles had of confirming each other’s positions—camera; satellite; Wi-Fi—could no longer agree in the cases of those cars still running the firmware’s prior version. The difference in the transmission latency was only fifty milliseconds, but that is long enough for a car traveling at sixty miles per hour to move four feet. Vehicles in the close proximity of those that were covertly running the obsolete firmware became confused: the location signals they were receiving from Wi-Fi and satellite agreed with each other, but not with what their electronic eyes told them, and they were unsure which to trust. In most places, such as the rural highways that stretch across the Midwest, this didn’t matter; on crowded roads where the cars traveled fast and stuck close, that four-foot discrepancy was incredibly important.

  Different makes and models of autonomous cars dealt with the perception of this discrepancy in different ways, depending on the rigors of their programming. The smartest ones played it safe, assuming that the cars running the obsolete firmware were both in the places indicated by the cameras and the places indicated by their Wi-Fi and satellite signals, receding from them accordingly. Other cars, among which was Rebecca’s, detected the discrepancy in the information sources but had no real idea how to process the error: in a circumstance like this the best course of action was to pull off to the side of the road, hand control over to the driver, and refuse to reactivate the autonomous systems until a mechanic had performed a diagnostic. Had Rebecca’s car been in the right lane at the time of the incident, things might have turned out relatively fine for her and Sean. Such little things only become important after the fact.

  But Rebecca’s car was in the middle lane and barred from escape, which put it at the mercy of those few cars that had the poorest programming, those that, when detecting the discrepancy between the spaces where the erroneous signals said the cars were and the spaces where their own cameras said they were, assumed that the cars only occupied the positions where both those spaces overlapped. In their electric eyes the vehicles shrunk to the size of motorcycles, leaving four feet of empty space to accelerate into, to get their passengers to their destinations just a little more efficiently, just a little faster. Suddenly—and Philip discovered that several accidents with similar causes happened within minutes of each other on the same day on the country’s most crowded roads, in Los Angeles and Tampa and Washington, DC—the badly programmed cars accelerated and rammed those in front of them at a life-threatening speed, the momentum transferring between the closely spaced vehicles and sending them ricocheting crazily against each other like billiard balls just after the break. Most roads saw no trouble at all that day, but those few where circumstances met the necessary unfortunate conditions descended into chaos.

  And finally, when the blame was all shared out and it came time for Rebecca to swallow her spoonful, her mother was there beside her.

  “When it came down to it,” Rebecca said, “the car was a coward.” She and Marianne were back on the couch where she’d laid her head on her mother’s lap the night before and wept. “The car was totally fine with taking you to get groceries or going on a Sunday joyride, but when the situation got tough it chickened out. And the thing is that when I could have let it decide what to do anyway, I took the wheel. My hands just jerked up like I was a puppet on strings and I did it before I even thought. And then the car was like, Well, good luck to you then.”

  “You did that because you’re a mother and a human being,” Marianne said. She had a drink in her hand: sparkling water in the glass tumbler that usually held her vodka and cranberry. Most of the time when she and Rebecca had these mother-daughter chats, she mixed vodka and cran for the both of them without asking, and even though Rebecca hadn’t told her mother about the drinks she’d had before getting into the car that day—the drinks were what one might call an unacknowledged complicating factor—she had read her mother’s offer of Pellegrino in the place of Absolut as an acknowledgment of a certain possibility, and of a silent signal.

  “Because you’re human,” Marianne continued, “you couldn’t do anything other than what you did. When the people you love are in danger, you act on instinct.” Looking at the floor, she slipped into a short reverie: “I remember when I was a new mother. With you. And I was learning how to read all those noises you were making: your giggles and your whimpers and your cries. You cried a lot. And it didn’t take me long to interpret those cries even if I was in another room, you know? One was like, She needs her diaper changed, and another one would make me think, That’s something she’s going to have to get over, because it’ll only get worse from here. But there was one. And I can’t really describe it, and I only heard it out of you two or three times. But when I heard it, it was like I got this haze in my head, I dropped whatever I was doing and I ran to you. I was drawn to you like you were a magnet and I was an iron filing. You know?”

  Slowly, not looking at her mother, Rebecca nodded.

  “Same thing with you and Sean,” Marianne said. “You’re flesh and blood, and you saw your son in danger and did what flesh and blood instinctively does to save its own. And imagine what would have happened if you hadn’t: if you’d just sat on your hands and let a calculator take care of things. You realize that thing doesn’t have magic powers, right? You see that at that point there was no way to stop things from turning out any way other than the way they did, right?”

  “Yes,” Rebecca lied.

  “You and I would still be sitting here on the couch, just the same. Except you’d be saying to yourself that you failed because you didn’t even try. But because you were human, you had no choice but to try, and you did the best you could do.”

  Marianne embraced her daughter. “Now,” she said. “Have you talked to Philip: I mean really talked to him about this?”

  “I…he’s keeping it bottled up. I try to talk to him and he won’t talk back. He tried to figure out why it happened and he couldn’t figure it out.

  “I…I don’t know what’s going to happen, Mom. I really don’t know.”

  —Philip. Philip? I’m home.

  —Rebecca.

  —Philip. It’s dark in here.

  —I want it dark. Rebecca.

  —Philip, why don’t I just turn on—

  —Please don’t. Sit down. Rebecca.

  —Philip.

  —I need to ask you something.

  —…

  —…

  —Well, what?

  —I need to ask you if you were drinking in the car. If you’d been drinking before you got in the car.

  —Ohhhhhaaaaaah Philip.

  —Because—

  —Aaaaaaah. Aaaaaaaaaaaah.

  —Because if—

  —Out of all the questions you could ask. You ask that question. That question.

  —I need to ask you if you had a drink before you got in the car.

  —I’m not sitting here asking who thought it was a good idea to buy a car that was driven by a fucking computer.

  —That’s not the same.

  —It is the same it is. This is about blame and nothing else.

  —…

  —…

  —You’re not answering me. I’m concerned that you’re not answering me.

  —…

  —…

  —…

  —Rebecca.

  —I didn’t, okay? I can’t believe you’re asking me this. And I doubly can’t believe I’m answering. How much must I hate myself. How much.

  —…

  —…

  —…

  —Oh what you don’t believe me? Is that what you’re going to say next? Say it.

  —…

  —…

  —…

  —Philip.

  —I…I choose to believe you.

  —Well, I’m glad you’re choosing to believe what’s true. That’s mighty big of you.

  —But Rebecca.

  —After what happened the only thing you care about is evidence and proof. I cannot believe this.

  —Rebecca.

  —You realize there was nothing I could have done, right? Do you think I wouldn’t give everything I could if I could get things to turn out another way?

  —But Rebecca. You have to stop. I really can’t…I don’t want to see you drinking again. I wish I had said something earlier.

  —…

  —If I tell you I believe you, then you have to stop drinking.

  —Are we making some kind of a bargain? Are we going to decide what the truth is based on a bargain now?

  —No, this isn’t a bargain.

  —…

  —I believe you. But you have to stop drinking. Or…I don’t know.

  —…

  —…

  —…

  —We can get through this. But Rebecca, I don’t want to see you drinking again. I can’t find that you’ve taken another drink, ever again. Or I don’t know.

  —…

  —Rebecca.

  —Okay. I will.

  —If you need help, you need to go somewhere and get some help.

  —I don’t need help. I’ll stop.

  —Okay.

  —…

  —…

  —You really believe me, right?

  —I believe you.

  —Oh God I’m so sorry.

  —Come here.

  18

  NO DIGGITY

  Quitting drinking meant that you had to live with your memories. Not all the time. But every once in a while, you’d wake up to see that your mind’s projectionist had discovered a fresh print of one of the most luridly crimson episodes of your past, and was displaying it against the featureless white of the ceiling above you. And you had to lie there in bed, and look at it while you listened to your husband grunt his way through his morning push-ups (though perhaps he saw his own memories projected on the carpet beneath him as it bobbed closer and farther away from his head). And without recourse to alcohol’s anesthesia, you had to live with what you saw there.

  Philip stood and looked down at Rebecca. “Fifty-eight,” he said, breathing heavily. “Fifty-eight. A record.”

  Rebecca blinked away the vision before her (the convertible with its hind end tipped into the air; the woman beginning to fall) and turned on her side to face him. “That’s excellent, dear.”

  “I bet I could do a hundred.” He rolled his sore shoulders. “I’m going to do a hundred someday.”

  Philip cocked his head in consideration of the woman before him (and this was one of the things that Rebecca would remember later, when she combed through her memories: that quizzical tilt of the head before he spoke). “What are you up to today?” he said.

  “A shift at Lovability from ten to one. Then shopping for a dress. Did I tell you? We’re invited to a party in New York, you and I. By one of my old girlfriends, from back in the day. Britt. I don’t think you’ve seen her since the wedding, but she’s getting the band back together. Mostly so she can show off her perfect life, I think. So I need a new dress. I have to look good. And so do you: we’ll worry about that later. You’re going, by the way: no’s not an answer.”

  “Shopping for clothing seems so tedious for women,” Philip said, wiping sweat off his face with the hem of his shirt. Rebecca noticed (and would later remember herself noticing) that Philip’s stomach was, if not quite a six-pack, rock hard. Maybe doing all those push-ups hadn’t been such a bad idea after all.

  She playfully reached out and slid a fingertip down the thin line of hair that began at his navel and disappeared beneath the band of his boxer shorts. The Philip Steiner of years past might have caught the hint and climbed back into bed, but this morning his mind seemed elsewhere.

 
