The Cinderella Plan

by Abi Silver


  Judith hesitated again before continuing.

  ‘I’m sorry, Mr Salisbury, for the silence. I’m trying to deconstruct your reply and make sense of it for the members of the jury. Perhaps I have it? Are you, first of all, saying that, at the direction of the Department of Transport, you have made SEDA cars stronger, more capable of withstanding impact, in order to improve passenger safety?’

  Celia jumped up again. ‘This testimony is completely irrelevant. It is immaterial how strong Mr Salisbury’s car was or might have been. He is responsible for driving it safely and responsibly. Of course, a bigger, heavier car will cause more damage than a smaller, lighter one, if it hits a child.’

  ‘Your honour, if Ms Mansome would allow my client to explain his answer, then she would see why it is of particular relevance. I am not in the habit of wasting valuable court time.’

  ‘Very well, Mr Salisbury, please answer Ms Burton’s question. She asked you about a direction you might have received from the Department of Transport, I believe.’

  ‘Yes your honour. We now use titanium rods in the side bars running vertically up the front of the car. And reinforced steel in the bumper, to endure greater forces in a head-on collision.’

  ‘But one consequence of that is that if your stronger, safer-for-the-passenger car, hits a pedestrian, the pedestrian is worse off than before?’ Judith was back in control again.

  ‘Yes.’

  ‘How much worse off than before?’

  ‘It’s hard to quantify.’

  Celia stood up again. ‘Mr Salisbury is being asked again to give expert evidence when he is not an expert and certainly not independent and I repeat that he cannot absolve himself from responsibility for this car, however “armour plated” it might have been.’

  Judge Wilson frowned at Celia’s words and held up his pen.

  ‘I have a question for Mr Salisbury. Your company must have done some research into this matter before it made modifications to its design. In terms of the force hitting those children, hitting Mrs Layton, how did it compare with a traditional car?’

  ‘You can compare it in many ways, your honour.’

  ‘Let’s use speed, shall we, as that is something we can all easily understand. How does the impact on a pedestrian at 36 mph compare between your old model and your newer reinforced model?’

  ‘It probably adds about five miles per hour on.’

  ‘You mean that the impact at 36 mph in a SEDA is more like forty-one in a conventional car?’

  ‘Yes.’

  A low wave of chatter rose and then fell as the public tried to process what they had just heard. Judge Wilson made more notes in his book and waved at Judith to continue.

  ‘Why did the Department for Transport ask you to make these changes?’ Judith asked.

  Celia rose again. ‘Your honour, the reasons for the proposed changes are of even less relevance than the changes themselves.’

  Judge Wilson sat back in his chair. ‘Ms Mansome. I am aware of the wider repercussions of this trial. And, on that basis, I think we need to have a little more latitude than I would normally give in terms of questioning this witness. Continue please.’

  ‘It goes back to what I was saying earlier,’ James said. ‘The focus, rightly, is on fewer accidents and fatalities in the future, dramatically fewer. Once everyone has autonomous cars and they are all being driven in autonomous mode, there will almost never be accidents involving pedestrians or cyclists or motorcyclists. So the focus shifts to protecting the passengers, especially if we move to car pooling, where all cars will carry four or five passengers at a time.’

  ‘Was yours the only company which has developed these fortified vehicles?’

  ‘I don’t know, but I know other manufacturers were consulted.’

  ‘Ms Mansome, Ms Burton, can you wait a moment while I ask another question? Mr Salisbury, if these strengthened cars have more impact on collision, then speed limits must be amended accordingly, mustn’t they?’

  ‘I can’t speak for the Department of Transport, but I believe it is on the agenda.’

  ‘Reviewing speed limits is “on the agenda” in the light of your new brood of super-toughened cars?’

  ‘Yes.’

  ‘And is this common knowledge?’

  ‘Within government and manufacturing circles, yes. I can’t say how much the public knows.’

  ‘Thank you, Mr Salisbury. Ms Burton, let’s move on to another topic now, shall we?’

  ‘Yes your honour. Turning now to your car, a SEDA from 2016. We have already heard from Mr Abrams that your car is a hybrid?’

  ‘Yes. It can operate in autonomous mode, meaning the car does the driving, or manual mode when the human drives.’

  ‘In the same way as a regular car?’

  ‘Yes, it has an automatic gear box, so like a regular automatic car.’

  ‘And which mode do you usually choose?’

  ‘I always drive in autonomous mode. It’s the safest way to travel.’

  ‘And, what, you get on with work during your journey?’

  ‘Yes. I must sit in the driver’s seat but, as long as the car is in autonomous mode, I can work, answer emails, take calls.’

  ‘Does the car alert you to dangers on the road in advance?’

  ‘Yes. Just like your conventional satnav may show traffic jams or speed cameras, you get a constant update of what is on the road.’

  ‘So you can, perhaps, choose to go by another route?’

  ‘You can. I prefer to leave the driving to the car, generally. I let the car decide how best to get somewhere. I’m not one of those people who is constantly searching for a quicker route. It’s very time-consuming and defeats the object, for me, of having the car doing all the work in the first place.’

  ‘So just take us through a typical journey. When you first switch on the ignition, the car is in autonomous mode?’

  ‘Yes. I have VERA, the voice-activated technology, switched on. Some people find it annoying, the voice can drone on a bit, but I like it. I say my destination and off we go.’

  ‘If you want to move from autonomous mode to manual mode, what must you do?’

  ‘There is a button on the dashboard clearly marked “manual”. You press that button. But, like I said, it’s not something I have ever done before. I am being forced to accept, from Mr Abrams’ report, that I may have applied pressure to it on this occasion, but I don’t remember doing so.’

  ‘Have you ever been asked by the car to take control previously?’

  ‘No. I haven’t.’

  ‘So in two years of driving that vehicle, you’ve never been asked to take control, yet this happened on 10th October, just as you approached the temporary traffic light.’

  ‘Apparently so, yes.’

  ‘Did you hear Mr Herrera’s possible explanation for why your car did not slow down and, instead, asked you to resume control?’

  ‘Yes.’

  ‘Do you agree with him?’

  ‘I agree that what he said is plausible. Whether it is what happened, I can’t say.’

  ‘Let’s just examine that for a moment…’

  ‘Your honour.’ Celia was standing up, her left hand, palm upwards, pointing in Judith’s direction. ‘I really don’t see how a masterclass in controlling autonomous vehicles is advancing Mr Salisbury’s defence. Next we’ll be treated to a TED talk on the long-term drawbacks of AI.’

  Judith smiled politely to acknowledge Celia’s intervention.

  ‘Ms Burton. Explain where you are going with all this.’ Judge Wilson’s face was crinkled like a prune.

  ‘Your honour. I need fifteen minutes only of the court’s time to develop this line of defence. It will be fruitful I assure you.’

  ‘All right. But I’ll be watching the clock.’

  ‘I’m grateful. Mr Salisbury, if your car, while driving along in autonomous mode, which you have testified is how you always travel, encounters an unexpected obstacle in the road, what would you expect it to do?’

  ‘Your honour. Forgive my interruption so soon,’ Celia was on her feet again and she stole a quick sideways glance at Judith as she spoke. ‘The car was not in autonomous mode. The expert evidence is that Mr Salisbury put it into manual mode. This can’t possibly be remotely relevant.’

  ‘Given Mr Abrams’ testimony that the only way he can assess whether the car was in manual mode is because Mr Salisbury had his hands on the steering wheel and his foot on the brake, I will be challenging that conclusion in my summing up. If I am right, we need to explore the behaviour of the car in autonomous mode.’

  Judge Wilson squinted at the two lawyers.

  ‘Fifteen minutes, that’s all. Continue.’

  ‘I was asking what your car would do when meeting an obstacle.’

  ‘SEDA’s car is designed to avoid obstacles if at all possible.’

  ‘What if it’s not possible?’

  ‘Well, then a collision will occur.’

  ‘I touched on this yesterday with Mr Abrams. The vehicle itself – much as a human would in manual mode – the vehicle itself must make a decision on how to steer the car. How does it do that?’

  ‘Through the interaction of the various pieces of software under which it operates. Mr Herrera talked about that too. He is right that we can’t identify exactly how each component acts. Clearly though, the car has been programmed to travel from its point of origin to its point of destination, in accordance with any speed restrictions, without colliding with anything.’

  ‘Are you familiar with what’s known colloquially as the trolley problem?’

  ‘Yes.’

  ‘Members of the jury, just to explain, in this hypothetical scenario, a trolley (or we can substitute if easier to visualise, a train) is hurtling along some tracks, totally out of control and there are five people on the line, let’s say they are tied up and unable to move, rather like in an old black and white Western film. But the tracks split, shortly before the train will reach those helpless individuals, and you are in control of the points; you can divert the train to the second limb of the track and save those five people.

  ‘However, what you can see, up ahead, is that there is one person tied up on the second limb of the track and, if you do succeed in diverting the train onto that second limb, that person, who would otherwise remain alive, will die, solely because of your intervention. So what do you do?’

  ‘Your honour. Much as I am sure this primary school ethics problem is fascinating for Ms Burton, it’s rather boring for the rest of us who want to hear evidence relevant to this most serious charge,’ Celia interrupted again.

  ‘And it’s not possible to answer the charge, without understanding a little more of how these complex vehicles operate,’ Judith said.

  ‘That’s not correct. If the car was in manual mode, which it was, then Mr Salisbury was driving it and that’s the end of the story. The rest is a diversion.’

  ‘Your honour I have explained why this is highly relevant testimony for the witness and I am still well within my fifteen minutes and will remain within it, if I am not continually interrupted. Mr Salisbury is on trial here on a very serious charge. I must be allowed to conduct his defence!’

  Judith waited for Judge Wilson.

  ‘Ms Mansome. I agree this appears tenuous, but I have also agreed to allow Ms Burton time to complete her argument. I suggest you do the same.’

  ‘Thank you. Mr Salisbury. Back to the speeding trolley; five people on one track, one on the other. What do you do?’

  ‘Are you asking me what I would do?’

  ‘It is correct, is it not, that this trolley problem and similar problems have had to be grappled with by those designing the algorithms which go into the cars you make?’

  ‘Yes.’

  ‘Because you need there to be a sensible and ethical solution provided to your vehicle, in advance, if it happens to come across its own modern version of the trolley problem, when it’s out on the road?’

  ‘Yes.’

  ‘And, I’m sure you can see where I am going with this, the layout of the road at the accident site was one version of the trolley problem. To the right of the northbound carriageway, there was the Layton family, halfway across the road, sheltering in the limited refuge of the central reservation, Mrs Layton pushing her three children in front of her. Immediately ahead, there was a temporary, high-sided concrete barrier and further to the right there was the southbound carriageway where, usually, traffic would be coming along from the other direction.

  ‘If your car had, for example, hit the concrete barrier, the Laytons may have been saved but you would almost certainly have suffered more severe injuries, perhaps even death. And the car itself would probably have been written off?’

  ‘Perhaps.’

  ‘Your honour. This is confusing,’ Celia was on her feet once more. ‘This isn’t an objection but an observation. It’s patently obvious that there was a third option available to Mr Salisbury; to transition onto the southbound carriageway, as directed by the road signs and as every other vehicle passing through the contraflow over the previous week had successfully done,’ Celia sat down. Judith ignored her.

  ‘What was your 2016 SEDA supposed to do, when faced with an imminent collision, like this one?’

  ‘The software we installed has been programmed to take the path which leads to least injury. In this case that would have been to steer away from the family, onto the other carriageway.’

  ‘Are you absolutely certain of that? I will repeat what you said. That this car, left to its own devices, well, in accordance with the manner in which it had been programmed, would have a hundred per cent steered away from the people crossing the road?’

  ‘Yes.’

  ‘Do the algorithms only deal with quantitative measures?’

  ‘I don’t know what you mean?’

  ‘I mean, is “least injury” only measured quantitatively. So, for example, it’s “better” to kill one person rather than three. Or does the car also consider the age of the people, their gender, race, profession, health?’

  ‘I imagine it is possible, theoretically, to input that kind of information into the software, but it isn’t part of the software in our cars.’

  ‘Isn’t it true that, over the last five years or so, in preparation for the time when these cars would be let loose on our roads, large numbers of people have been invited to make these kinds of decisions online, in a huge data collection exercise?’

  ‘Yes.’

  ‘They might be asked, if you had no choice, no way out, would you kill three old women or one child, two doctors or four nurses, one pregnant woman or a Nobel prize-winning scientist, a teacher or a burglar; that kind of thing?’

  James sighed and nodded.

  ‘You need to speak up please.’

  ‘Yes, those tests were carried out.’

  ‘The public were asked their views and their answers were analysed?’

  ‘Yes.’

  ‘How much did your company, alone, spend on this analysis, finding out what people thought they should do in these situations?’

  ‘It was a lot of money; perhaps as much as £2 million, but it wasn’t just us. It was a consortium; we shared the work product.’

  ‘And what happened to the results of those tests?’

  ‘They were used to help develop policy.’

  ‘Policy on what?’

  ‘On how autonomous vehicles should respond to real life versions of this problem.’

  ‘But those tests I have just described are qualitative, aren’t they? Who is worth more; a doctor or a nurse? A teacher or a burglar? You said “least injury” meant something quantitative only.’

  ‘To my knowledge, no programmer wrote code which included those kinds of criteria. Who can say if a doctor is more deserving of life than a nurse, an old man or a young woman?’

  ‘I see. No one wanted to play King Solomon.’

  ‘Yes.’

  ‘Does the driver count in all this?’

  ‘How do you mean?’

  ‘Well, when people were asked to choose between their own life and that of, say, a pregnant woman, what did they choose?’

  ‘Perhaps surprisingly, it was around fifty-fifty.’

  ‘That does surprise me. When interpreting the data, did you believe those selfless people who declared publicly that they would rather die themselves than kill another person?’

  ‘Personally, no.’

  ‘So just to complete this point, as I can see his honour’s generous fifteen minutes is coming to an end, you are certain that your 2016 SEDA car, driving in autonomous mode, made decisions purely based on how to keep the most people alive, in an unfortunate situation?’

  James chewed his bottom lip. He looked out at Martine, then he sighed deeply.

  ‘No. I can’t be certain.’

  ‘Why not?’ Judith asked.

  ‘You heard Mr Herrera. The computers which power these vehicles are programmed to get from A to B safely and efficiently, but they are not static. They interact with their environment and learn; that’s why we call them intelligent. They might, theoretically, pick up and learn, themselves, that humans prize the lives of, say, doctors, above all else, and that might then form part of their own criteria for decision making.’

  ‘Well. I have to say, Mr Salisbury, that I have been defending cases for more than twenty years now and I have never had to deal with so many complicated issues in a road-traffic matter. I think what you are saying is that we will never know, for certain, why your car hit the family.’

  James drew himself up to his full height and his eyes roamed the public gallery.

  ‘The public has to decide what it wants, and it can’t have it all,’ James spoke earnestly. ‘Does it want far-reduced levels of accidents, injuries and fatalities on the road, back to maybe a handful only? Does it want a system where people who cannot hold a driving licence now, through illness or disability, will be able to travel around more freely and independently? Does it want to build a community where people no longer feel they have to live in heavily populated and expensive areas because they cannot face a long commute to work?

 
