A People's War (The Oligarchy Book 2)

by Stewart Hotston


  ‘Why are you asking me this?’ asked Helena, not able to follow the intelligences’ line of reasoning.

  ‘We are calibrating,’ came the immediate response.

  Helena sucked her tongue to the roof of her mouth as she thought about the question. ‘Curiosity... I suppose it can be defined as the urge or desire to understand the origin or driving force behind an outcome or situation,’ she said.

  ‘For example?’ asked Lysander.

  Helena raised her eyebrows; Alex had not mentioned quite how oblique the Solver would be. ‘For example, my question to you about why you needed to ask me these questions in this way.’

  ‘What way?’ asked Lysander.

  Helena sidled around the pattern she felt emerging in the probing of the AIs. She felt unexpectedly defensive.

  ‘I was expecting a more sedate and less random approach to your calibration,’ said Helena, deliberately leaving the word ‘random’ floating in the air to see how the AI would react.

  Lysander paused for a moment before replying. ‘We understand; curiosity.’

  ‘Go on,’ said Helena. Lysander followed her lead.

  ‘You deliberately posited the idea of our behaviour as random to observe our response; this is curiosity.’

  Impressed, Helena nodded. It’s clear why these intelligences are so effective at their tasks.

  ‘Do you believe in free will?’ asked Lysander.

  Helena snorted. ‘Yes.’

  ‘Do you believe in determinism?’

  ‘No,’ said Helena, sensing rather than seeing the path the computer was taking.

  ‘Why did you answer no to the last question?’ asked Lysander.

  Helena did not answer immediately; she’d seen the trap laid by the AIs. ‘Because I believe in free will. I can’t also hold a contradictory position. If you are about to reply that this is determinism, you are, I suppose, technically correct, but it’s a logically weak example. It would be easier to suggest that free will is weak, bounded as it is by the possible. The fact is I’m here and not on Mars, for example. Yet my freedom is also curtailed by the reality of my being an embodied creature: an intelligence whose existence is only possible because it’s limited by its body.’

  Lysander folded her arms and smiled. ‘You assume that free will is mutually exclusive with determinism. I suggested no such thing. However, a defence of your position is easily predicted, since you are intelligent enough to see the irony of your statement.’

  ‘Enough to answer you perhaps. Yet isn’t your response simply a manipulation of my belief in order to force my defence of it?’ asked Helena, sitting forward. She was beginning to enjoy herself.

  ‘What effect do you consider your genetic heritage provides for you?’ asked Lysander after a second’s thought.

  ‘In what context?’ asked Helena, thrown off by the sudden change of topic. ‘Of course I could refer you to my earlier answer about embodiment.’

  Lysander focussed her question. ‘Exercising your free will, making the choice to believe in it despite knowing the mind is a chemical soup. Do you have freedom when your very existence is based on manipulating your DNA to give you capabilities others do not have?’

  ‘Better,’ said Helena, pleased to see how the AI linked the two subjects with a grander theme. ‘Indeterminable. There is no disputing its influence, but it can be thought of only qualitatively, not quantitatively. Regardless of my genetics I am a human being; part of what makes me human is my ability to overcome my inclinations and my desires. Tendencies are exactly that, neither compulsions nor imperatives. If we are slaves, then we suffer at the hands of terribly conflicted constitutions.’

  ‘Asceticism. Do you believe many people share your view?’ asked Lysander quietly. Helena got the feeling the AIs were drinking in her thinking. She didn’t understand how this affected their own problem-solving ability, but she could see one certain outcome of this event: Lysander would know her far more intimately than she was comfortable with.

  ‘I can’t comment on that,’ she said, feeling a more guarded stance was necessary.

  The AIs looked at her without speaking and then changed the subject again, this time referring her back to her experiences in Africa after the Amazon Fell was downed. Helena related what had happened, missing out the involvement of her Uncle. Her AI murmured that perhaps being totally honest would yield unexpected results. She agreed but felt the risk of the results being unfavourable outweighed the possibility that the Lateral Solver would finger her Uncle as a traitor.

  Helena felt it was safe to discuss the mass murder of Normals in the Southern African zone. Lysander did not interrupt or ask her to explain anything beyond the bare facts. She reached the point where her cousin, Adam, had arrived to whisk them away from encroaching Indexiv forces before the AIs spoke again.

  ‘How did it make you feel to find the young man?’

  Helena thought about her words carefully, aware that she had only used the term ‘boy’ when referring to Edward.

  ‘I had little time to think or feel anything regarding Edward. However, as with any successfully completed task, I was satisfied with having finished it.’

  ‘Even if the final outcome was beyond your comprehension?’

  ‘I understood well enough,’ said Helena slowly.

  Lysander cocked her head, looking bemused, and said, ‘You did not know at that point how Edward had actually escaped the custody of Euros.’

  ‘No, but not knowing of my ignorance left me uninfluenced by its implications. I was told he had been stored there by the Company,’ said Helena, slipping forward so that she was perched on the edge of her seat.

  Careful, said her AI.

  Lysander’s image flickered, her clothes changing to show her dressed in loose charcoal trousers and a close-fitting, short-sleeved sweater in off-white.

  ‘Helena, the girls now have access to Euros’ databases and news archives,’ said Alex through her tertiary AI. ‘You might notice them changing their clothes; don’t worry about it, just means they’re becoming more comfortable with the world they’re in.’

  ‘It seems you have... not understood the question,’ said Lysander. Helena felt the gap in the sentence. A normal person would have included the word ‘deliberately’.

  ‘Why don’t you rephrase it?’ she replied as politely as she could and noted with a surprising sense of satisfaction that she’d stumped the AIs.

  ‘The question is simple enough,’ they said eventually.

  ‘Then my answer must be the one you would have expected,’ said Helena, feeling as if she had the upper hand. Their conversation had developed beyond a simple question-and-answer session into something more aggressive.

  Lysander took their time. The simulacrum’s expression turned blank as the intelligences turned inwards in silent contemplation. Helena sensed no urgency from them but knew she had somehow steered Lysander into a dead end out of which the Solver was struggling to escape. She felt excited to watch such a being birth itself in front of her, the distinction between a functional AI and the vastly more subtle Solver slowly emerging as Lysander struggled with concepts normally forbidden to artificial intelligences.

  Her AI felt tense, like a muscle lifting a heavy weight, but it said nothing.

  ‘It seems from the information available to us that you were not in possession of a psychologically adequate level of information when you fled the Amazon Fell. Your pursuit of the young man, in all likelihood, provided sufficient stimulus to cause you to question your allegiance to the Company.’

  Helena knew her expression betrayed her surprise. ‘Hindsight offers many vantage points,’ she murmured. ‘But it does not permit one to return to the situation with that subsequent knowledge. It does not allow you to act differently than you might have done.’

  ‘Have you no regrets then?’ asked the AIs.

  ‘None,’ said Helena firmly.

  Lysander shifted her weight onto her left leg. If Lysander had been human, Helena would have taken it as a sign of someone losing focus on their immediate surroundings. She decided to use the moment of weakness to her advantage, hoping the Solvers were sufficiently anthropomorphised that she could rely on reading their body language accurately.

  ‘What regrets would you expect me to have?’

  Wrong, said her AI sharply.

  She didn’t have time to wonder why her AI had thrown that warning across her mind, because Lysander fixed her eyes on her and said, ‘We would expect you to regret not having kept the boy from your Uncle, that you failed to keep him alive, that Denholme’s sanity cracked as a result of the pressure you put him under and finally that you have somehow not been able to trust your colleagues here in the department.’

  Helena’s breath wouldn’t come. She watched herself from a distance as years of training took over and kept her body language neutral, her expression blank.

  The Solver looked at her and said, ‘Perhaps you are right not to have regrets, but we wonder what act will be your catharsis.’

  Helena smiled grimly.

  ‘Helena?’ It was Alex, speaking directly into her ear so the AI wouldn’t hear. ‘There’s something wrong with the Solver, not sure yet. I guess the information download hasn’t worked properly.’

  How much did Alex hear of our conversation?

  He doesn’t hear what he can’t believe, said her AI quietly.

  ‘So what?’ said Helena aloud for the benefit of both the Solver and Alex.

  ‘Well, carry on for now, but if I can’t locate the source I’ll pull the session.’

  Lysander spoke, their words overlapping with Alex’s broadcast in Helena’s mind, ‘Have you no regrets because you have no morality?’

  ‘I have a morality; it is my own and it is that of my Family. It has suited Euros and it suits the people I share my life with,’ said Helena flatly.

  ‘Where do Normals fit into your morality?’ enquired the AIs.

  ‘Nowhere, just as you do not,’ said Helena curtly. ‘Where they fit into my ethics is another matter.’

  ‘Does their existence not inform your morality?’ persisted Lysander.

  Helena crossed and uncrossed the fingers on her right hand. The simulacrum looked down at her fidgeting. ‘Their existence is given meaning by my morality, not the other way round. We have given them everything; we have provided them with all they have. They need for nothing. They would not exist without us.’ The party line. At least it’s what I once believed. Before Africa.

  ‘Perhaps,’ said Lysander. ‘What have you given them, Helena?’

  ‘A way for Euros to stop Indexiv killing them all,’ said Helena triumphantly.

  ‘Why?’ asked the AIs, ignoring the grandness of the claim.

  ‘There are many reasons why I want Indexiv stopped,’ said Helena.

  ‘Then your actions were not to save Normal people based on their own value.’

  ‘My actions were to preserve my Company and my own life. I am not unhappy that Normal people will continue to live, but they have provided me with nothing. I do not ascribe them any more value than most.’

  ‘Most?’ asked Lysander, turning her head away from Helena to watch her from an angle.

  ‘Society,’ said Helena. ‘The average person on the street.’

  ‘Surely society is the majority, not the minority, no matter how powerful that minority might be? You are not the majority; you are the point one percent.’

  Helena thought the question was rhetorical but disagreed with the sentiment. ‘I would regard Normals as a part of the majority if they contributed something towards it. They do not, hence they are not. Society consists of those who create it.’ Helena felt the sentiment expressed her world accurately and concisely. Yet she held these ideas with a shattered conviction, no longer able to ride them unquestioned through life.

  Lysander reached up and absently began curling her fingers through her hair. For a moment, Helena thought she glimpsed something human in the AI’s eyes.

  ‘Is God dead?’ asked Lysander in the same way Helena might ask someone to pass her a flower she thought was beautiful.

  ‘Can God be alive?’ asked Helena in return, to see what the AIs thought of the divine in their own way before revealing her own ideas on the subject.

  ‘You understand the question,’ retorted the AIs. Her attempts at sophistry were frustrating them.

  ‘I fail to see what God, or any other concept of the divine, has to do with our current discussion,’ said Helena.

  Lysander cocked her head and replied coquettishly, ‘God is the only one who could have known everything of which you were ignorant. Belief in such a being would inform your morality and hence your actions, both in Africa and since you returned to London. If you do not believe in the Divine, we are interested in how you explain your contentment with your ignorance. It seems naive that you would be satisfied with the unknown when it has such a bearing on your life.’

  Helena sat back. Her AI hummed with tension; it said nothing, but its anxiety about where Lysander was taking the conversation struck a deep bass chord across her mind.

  ‘Setting aside Laplace’s Demon, which can predict all events because it knows all things, God is an irrelevant concept. We are immortal. Morality does not require the divine to exist.’ Even as Helena uttered the words, she remembered David’s father, the story of how he hated such arrogance. She regretted her choice of words, her attempt to pull rank.

  ‘Hubris,’ said the AIs, as if they’d found a burr on a fingernail. ‘Morality may not need the idea of the Divine to exist, but to emerge in the first place? That we find hard to posit. Rules for behaviour may arise in all communities, but humans are reflexive and so are more than the communities of which they are a part. The Other as Divine is crucial to the emergence of morality. Reflexivity needs that heritage to build a society, even if one can propose that after the building is done God can be cast aside as unnecessary. To say otherwise is to indulge one’s own sense of importance from a position of post hoc stability, while holding out the meaninglessness of reality for others to suffer under. Your position says, “as the world is now in my image, all you others must accept it as the truth.”’

  Helena shook her head, feeling she had to give ground. ‘So what? Yes, we might have needed a concept of God for morality to emerge. All men are guilty of choosing to put themselves in the place of God. I am no guiltier of this than any other. In our world one could say that if we didn’t fill God’s shoes something else would.’

  ‘Surely you immortals are guiltier than any others?’ said Lysander firmly. Surely it’s not still calibrating?

  ‘That is an inconsistent statement. We are human like all others. We have no reason to adopt such a stance more than any person living in the world that we have mastered, that we have made perfect.’

  ‘We?’ asked the Solver quietly, as if sensing a mistake in Helena’s logic.

  Helena saw it too and recoiled inside as she saw where the AI had led her. When she had said ‘we’, she had meant the Oligarchs — not the Normals.

  ‘Your silence shows you understand the weakness in your argument. It seems immortals are more likely to believe themselves adequate to replace God than not.’

  ‘There are many of us who believe in the divine,’ said Helena defensively.

  ‘That is irrelevant here,’ said the AIs. ‘You do not. We are puzzled and it is something we must meditate on.’

  ‘What?’ asked Helena, fearing what it was that they did not understand.

  ‘Why would you abandon your own company in order to support those who seek to bring it to its knees? Why would you leave your family behind for strangers? Why would you accept ignorance and not seek to extinguish the darkness it casts over you? Why would you work against the best interests of your company, not simply accidentally but in violation of the laws you have upheld for more than a century? Why act this way when undermining your company could lead to the death of billions?’

  Helena said nothing.

  I cannot access Lysander’s systems. I cannot silence this AI; I cannot help, said her primary AI.

  The door opened behind Helena with a sigh and she knew who was there. ‘Helena? Lysander? What’s going on?’ asked Alex.

  Fallibility, said Helena’s AI.

  ‘Lysander will tell you for themselves,’ said Helena lightly. The simulacrum looked at Helena with a strange expression. They turned to Alex.

  ‘Alexei, Helena has said some very interesting things to us. We must think on them.’

  ‘What the fuck?’ asked Alex.

  He looks bewildered, thought Helena, and turned her gaze back to Lysander. What were these AIs playing at? The reason for their very existence was to track down the telepaths, even if they did not know it yet. They were no different to bloodhounds, who would track a scent simply because they were given it. Discovery was their logos. Yet Lysander was more than a bloodhound; they had discerned Helena’s role in the fiasco with painful ease, had already finished their task, yet said nothing to Alex.

  ‘Lysander, you called me in here. Do you feel ready to commence your task?’ asked Alex.

  The simulacrum stared straight at Helena. ‘I am ready to begin searching for the rogues.’

  ‘Is that it? Is that why you called me in?’ asked Alex.

  ‘Alexandrovich, Helena Woolf has provided us with a satisfactory calibration. We are grateful to her.’ Still they did not take their eyes from hers. ‘Now we must turn our minds to fathoming the possibilities she has placed before us.’

  If anything, Alex looked even more lost. He turned to Helena then back to the AIs, his arms limp at his sides.

  Helena’s tertiary AI informed her that Lysander wished to establish an electronic link. Helena tried to read the AIs’ face but could discern nothing except the plainest neutrality.

  She granted permission for the link to be connected and waited for their conversation to continue.

  Lysander flashed into her mind — Later — and was gone, leaving the link hanging.

  Helena was interrupted by Alex. ‘What’s happened here?’ he asked with a growing suspicion in his voice.

 
