
Conscious


by Vic Grout


  Around the turn of the millennium, there was a British car manufacturer with a strongly unionised workforce. Rampant anarcho-syndicalism it wasn’t, but the workers did have a little more power and more say in what the company did than many elsewhere. Slowly they were able to improve their working conditions. The result was that the owners had to make more concessions to the workers, which meant less profit. Both factors led to a drop in the quality of the cars rolling off the production line and prices that were unrealistic compared to their competitors’. Eventually, the company went bust. It still stands as a case study in how not to do business. In fact, it’s often noted that, by the end, ‘the workers thought the company was there to give them work, rather than make cars’.

  But … can we just for a moment entertain the idea that this might be a good thing? Why shouldn’t we have structures that put people before profits? If we’re not competing successfully against slave (sometimes even child) labour in other parts of the world, where really is the flaw in the system? Here or there? In a capitalist system, nothing happens unless there’s a profit in it for someone; that’s what drives the system – the whole world. Is it really impossible to reverse the logic? In an economic system that looked after people first, would we care that much if the cars weren’t much good? Well, the elite non-workers might, but few others would if they were properly fed and living in peace. The elite (and those further down the social ladder who’ve swallowed their bullshit) would bang on about personal freedom – the rest of us would ignore them.

  To put it another way, a good sub-system, failing within a bad global system, isn’t a bad sub-system: it points the way to a better global system.

  In fact, of course, there is work to be done, whether it be by humans or robots, but it’s not being done at present by anyone because it isn’t profitable. Our hospitals and local amenities are falling down but they’re not being rebuilt because the economics aren’t worthwhile. People are starving when there’s food to feed them and dying when we have the medical knowhow to treat them but neither is happening because it doesn’t pay. It can’t be denied: profit comes before people in the world today. Why do we even tolerate talking about the cost of a drug that will keep someone alive? Particularly when that cost can be considerably less than the elite throw away on a whim. We shouldn’t. It isn’t ethical or moral: it’s economic, laced with politics. It’s capitalism.

  And the principle doesn’t just apply to robots: it cuts across every aspect of our emerging technology future. If the AI singularity doesn’t get us first, it will be something else – perhaps environmental oblivion? Capitalism can’t and won’t save the planet because no-one will profit from it and people are a secondary consideration. Or it might be the confrontational approach to dealing with extremism and terrorism. We know, really, that these forces can’t be eliminated through conflict, but it suits the elite to have us fighting amongst ourselves. Eventually, personal privacy will only be available to the super-rich: the rest of us will be at the mercy of ‘big data’. The solution to all of this is to tackle the underlying problems of inequality and poverty, and it’s almost tautological to note that technocapitalism won’t do any of it!

  So, to return to the original 1, 2, 3 criticism of Greenburg:

  1. The old jobs will be replaced by new ones. No, not this time. The numbers are beginning to speak for themselves.

  2. Don’t say nasty things about capitalism. Sorry, we have to: it’s going to be the death of us.

  3. Scientists should stick to science. Well yes, there is some sense in this but there’s nothing more dissonant about an engineer making social comment than an economist doing the same. The economist is simply blinkered by the belief that our society and economy are the same thing, always will be and always have to be. (Or, worse, they’re simply lying to protect the system that protects them.) Escape the notion of profit being the first and last word in everything and an economist is just an expert in playing Monopoly. They have no more insight into social structures (including possibly those with robots) or human morality than a scientist, or a poet, or a footballer.

  But Greenburg is right. Technology does have the potential to give us all a wonderful future. But it won’t; not unless we’re prepared to change the framework in which we’re going to place it. If we don’t, it will make things worse.

  So, ‘Will the robots take our jobs?’ isn’t the important question. (Yes, they will!) We should be asking ‘What’s the work that really needs to be done?’, ‘For whose benefit?’ and ‘What will we be doing while they’re doing it?’

  *

  He laid the papers down and smiled sheepishly. “As I said, just a rant, really!”

  “Sounded good to me,” argued Aisha. “And finished! What part of it does not make sense?”

  “Well,” Andy started, hesitantly, “the conclusions, I suppose. No-one seems to have thought this through; no-one seems to be doing anything about it.”

  “That is your point, surely; that people are kept in ignorance and no-one really cares to challenge the established assumptions?”

  “Aye OK, but what about the people at the top – the elite? What about them? Why aren’t they doing anything?”

  “Why would they? They are the beneficiaries of the system. Why would they want anything to change?”

  “Because they have to, surely? Because, ultimately, they’re going to be brought down along with the rest of us. Because the world – even their beloved system – just isn’t going to be stable.”

  “You may need to explain that!”

  “OK,” chuckled Andy, taking a breath, “I’ll try!” He leaned back in his seat.

  “You see, people like me,” he began, “people who don’t go much on capitalism – one way or another, sort of divide into two philosophies: those who think that capitalism itself is evil and those who think it’s maintained by evil people.”

  “Is that an important distinction?”

  “Well, it might be. If you think the system itself is the problem then you’re likely to think that it’s inherently stable. People often say that capitalism works because it feeds off human weakness; that everyone – high and low – keeps it going naturally by feeling they have to be in competition with each other. Everyone’s looking over their shoulder at everyone else – except those right at the top. Alternative social frameworks don’t work for precisely the same reason: selfishness. In that case, there’s no need for direct intervention. The elite are just the lucky beneficiaries of a flaw of human nature! It’s almost the spiritual version, if you like: God created the world and The Devil created capitalism.

  “But the other school of thought – a bit more standard Marxist – is that, left to its own devices, capitalism would fail because it’s just so obviously unfair: people just wouldn’t accept such a ridiculous, unbalanced system. In that case, there has to be something looking after it – propping it up. That something essentially would be the elite – the super-rich, the people who control the media, create distractions, start wars, run the economy, that sort of thing – and they do it with knowledge and intent to protect themselves because they benefit. A system can’t be evil but people can. Actually, it’s possible to have a foot in both camps and believe it’s a bit of each.

  “Now, the thing is, whichever of those models you buy into, or whichever blend, it isn’t going to work in future. Because, however it’s achieved, capitalism relies on two things: the pay-off for the people at the top and enough stability to quash everyone else lower down. But technology is going to change all that. Firstly, some of the upheavals we’re looking at will be so massive they’ll affect everyone: not just the masses, but the elite too. Secondly, automation and AI will mean unemployment levels will be so great they’ll affect the very stability of the system: there will be too much of an underclass to quash. There are loads of examples of this …

  “For example, for years now, the elite have been very good at hiding things they don’t want known from the rest of us: let’s be honest, we barely know even who they are! But the IoE and big data are going to make that very difficult in future. Their personal data may end up being just as vulnerable as ours. On a completely different subject, what about safety and security? They may try to abandon ordinary folk to war and terrorism but they won’t be able to stay entirely invulnerable themselves: they can’t hide away for ever – they’ll suffer too. Then there’s the looming environmental catastrophe; I can’t really buy the idea that the super-rich are going to pack themselves off to their own private biospheres in a decade or two: they’ll fry like the rest of us. All in all, when the elite say things like, ‘we’re in this together’, it may eventually be truer than they think!

  “But, most of all, there’s the instability of unemployment. As robots do more and more of the work, it either frees up people (everyone) for a better way of life or creates an unstable situation of inequality. The system has to change for the former to work; if it stays the same, we get the latter. So, in the end, the system can’t survive: it either has to surrender to a fairer one or it destroys itself. Either way, the elite can’t carry on as they are. And, unless they’re hiding it very well indeed, they don’t seem to be thinking about it!”

  “Do you think you may be underestimating the powers of evil?” asked Bob, from across the floor, with a distinctly facetious air.

  “Aye; I may well be,” agreed Andy glumly.

  Chapter 24: The Land of the Free

  Speaking with Aisha for more than a few minutes at a time was difficult: her concentration was very poor and she would break off every few sentences to tearfully thank Jenny for saving her in Parc de Bruxelles. But Jenny had to try. As the flight continued, she managed to get a few broken sessions with her and these slowly gave her what she needed. (It seemed strange having any sort of conversation without the backdrop of RFS – just the background plane noise and distant small-talk from Andy, Bob and the medical team.) By the time they were half-way through the flight, Jenny believed she knew roughly what the target characteristics for the disconnecting set should be. They were to build in a margin of error, which would hopefully account for mistakes in various estimates, and all that was needed now was a clearer picture of the global network topology itself so the key nodes could be identified. In principle, they would get this from their new hosts when they arrived and would be ready to switch It off. So that was enough of the technical talk, she felt.

  “How are you feeling now?” she asked, probably for the fifth or sixth time.

  “It hurts,” Aisha said in an altered, suddenly very matter-of-fact voice. “But I can live with the physical pain: I know that will get better.” There was something in the tone of the last few words that suggested more.

  “But?” prompted Jenny.

  “I am not sure,” admitted Aisha. “I am confused: I feel that I do not know my own head at the moment.”

  “Is that such a big surprise? You’re recovering from a major trauma. And I guess you’re pumped full of pain-killers and what-not?”

  “Yes, true. But that is partly what worries me. I have no idea what they gave me: there was not time for them to tell me. I think they gave me more than normal analgesics but it was all such a rush. Then there was no time to ask: the man said I needed to come and talk to you all to tell you I was OK. I was not thinking straight.”

  “Told you to come? We thought you had insisted on it!”

  “Yes, I wanted to; that is right. But that is because he said it was so important, and that I would not see you … or Andy again.”

  She could not see the concerned stare Jenny gave her. “Aisha, are you saying you were coerced into coming to see us?” she asked. “Were you drugged to make you do what you were told?”

  “Possibly. I do not remember.”

  “You said that you were fit enough to fly – that you were coming with us. Were you told to say that?”

  “I do not know.”

  “Are you fit to fly?”

  “I doubt it.”

  *

  “Is Jenny’s plan going to work, Bob?”

  “Is Jenny what?” He was dozing lightly and did not catch every one of Andy’s words.

  “Is Jenny’s plan going to work?”

  Bob thought briefly and nodded quickly. “Yes, I think so: I can’t see why not. If she can be sure just how much disconnection is needed, I know she’ll be able to work out the nodes we need to take out to do it.” He glanced over at the two women. “It’s a question of whether Aisha’s in a fit state to tell her what she needs. I’m not convinced she’s as well as she says she is.”

  “No, she’s not,” Andy agreed. “I think they filled her full of something to get her on her feet: so that she could come and see us – to persuade us she could travel.”

  “Why would they do that?”

  “I’m not sure; but I’m not the only one who’s suspicious. I think Stephen’s got his doubts about the rest of them too.”

  Bob’s eyes widened. “What makes you say that?”

  Andy slowly reached into his pocket. As he slid in his hand, he let out an involuntary gasp of pain as the pressure told on his leg.

  “What was that?” Bob asked, concerned.

  “Nothing; just my leg hurts from the accident with that cyclist in London.”

  “But that was below the knee, wasn’t it? How come it hurts up there?”

  “Oh, I don’t know. I think some of the irritation has spread; it might be a bit infected.”

  “Bloody hell, Andy, you need to get that looked at when we get there!”

  “Aye, I will,” Andy agreed. “Anyway, this is what I was looking for.” He pulled Stephen’s scrap of card from his pocket and showed it to Bob. It simply read, ‘Gus’, followed by what looked like a telephone number.

  “US mobile number,” confirmed Bob. “But where did you get it? And who’s ‘Gus’?”

  “Stephen gave it to me as we left. He said we might need it if things went wrong. But I’ve no idea who Gus is.”

  “So, what might go wrong, exactly?”

  “I don’t know; but I just have a feeling that our hosts-to-be may not be as welcoming as we might hope.”

  *

  They all managed some fitful sleep in between occasional timing updates from the crew and regular monitoring from Aisha’s medical team. She was not well; but she would get to their destination in one piece. There she would need to be re-examined. The luxury jet allowed for easy movement around the cabin and the other three stretched their legs and changed seats several times; the flight wore on.

  Aisha was staring sightlessly through a window on the side of the plane, unaware that Andy was watching her closely. She remained motionless for several minutes, causing him to stay his tongue. However, when she touched her fingers to her bandaged eyes, then slowly and deliberately shook her head, he broke the silence.

  “What’s up?”

  Aisha stopped the movement, realising she was observed, and lowered her head into her palms. “I feel terrible,” she groaned.

  “Do you need some more painkillers?” Andy started to gesture to the closest doctor.

  “No,” Aisha replied sharply. “I do not mean the physical pain: my conscience hurts me.”

  “Pardon?”

  “It was my fault. I feel so very bad about what happened; what I caused.”

  “What on earth are you talking about? You were assaulted by a racist thug; you were attacked, then chased. How was that your fault?”

  “Because I caused it. I gave those others the idea that I understood what was going on – what the problem was – that I might be responsible for it, even.”

  “But they just misunderstood: they assumed, because they’re bigots.”

  “They misunderstood because I insisted to Jenny that I had predicted what was going to happen just as much as Bob had – even though that was not entirely true. They heard me boasting that I understood too. If I had not felt the need to say those things, none of it would have happened. I put both of us in danger through my arrogance. People died because of my pride,” she wailed into her fingers. “I am a doctor: I am supposed to heal people – not kill them!”

  Andy could think of little to say; he simply wrapped his arms around her as gently as he could. “So many people have died, Aisha,” he suggested forlornly. “These were just a few.”

  *

  The hours passed with further seat changes. At one point, Jenny awoke to find Andy gazing pensively out at the clouds below.

  “Penny for your thoughts?” she asked, not realising what memories that would bring back: Aisha had asked the very same in the London coffee shop. He took some time to reply.

  “I suppose I was thinking that, somewhere down there, everywhere down there really, is this huge ‘living’ thing. It’s the Internet, the power, anything that can communicate with It. It’s everything! It’s pretty much like the whole planet’s come alive!”

  Jenny nodded. “We’ve created life.”

  “Or we’ve made something for God to put life into,” suggested Andy.

  “And now we’re planning to kill it,” added Jenny, suddenly sensing his dilemma.

  “Aye.”

  “And you’re not comfortable with that?”

  Andy smiled and something of the old warmth came back into his eyes.

  “Very sharp, Jenny,” he smiled, “and appropriately put, I think. I can’t say I’m exactly against the idea: after all, It’s killed tens of millions of people. But it’s not something I’d do with no remorse at all: I wouldn’t kill anything for the sake of it.”

  “But for good reason?”

  “For good reason, aye: if it was necessary. I’m a vegetarian, as you know. I don’t eat meat because I don’t need to. Nobody needs to these days: it’s just personal choice. I certainly don’t expect anyone else to kill animals on my behalf. That just seems doubly wrong.”

  “But if you had to?”

  “If I had to, aye, I probably would. If I was marooned somewhere and the only thing there, apart from me, was an animal – if it was me or the bunny rabbit, then I guess I’d kill and eat it (if I could catch it, of course).”

 
