AI VS MERGENTS


by Michael Kush Kush


  “I take it we cannot destroy him from the mainframe, right?” I ask.

  “Yeah, Saul is no longer connected to the mainframe. He functions remotely offline now,” David weighs in.

  “I don’t know whether to report this to the president or work on this problem.”

  “If you tell Scott about this, you will no longer be indispensable to his cabinet. I suggest we sort this out quietly.”

  “David is right. Do you have any other ideas?” I ask.

  “I don’t know. I remember you saying the right kinds of computations are sufficient for the possession of a conscious mind, right?”

  “Yeah, what’s on your mind?” David says.

  “A worst-case scenario. Computers perform computations. Computations can capture other systems’ abstract causal organization. Consequently, no kill switch can destroy Saul, and I suspect his learning algorithm has been wiped out by now. He is operating off-grid. I think he has phenomenological mental properties. He’s capable of anything.”

  “Don’t despair, young man. An intelligent machine like Saul would be immune to aims such as domination, greed and selfishness. In its calculation, or its realization, control of people is not in its interests. Indeed, it would more likely be indifferent to their existence. If it were aware of a person, then that person would be just a moving object with certain dimensions, a temperature and certain capabilities, but it would not care; it cannot care, and there is no way that caring can be built into it. Therefore, it would ignore the person unless the person was a threat to its existence, and people would be about as much of a threat as a wall, a car or a cat; far better to move around them than destroy them. Concern and irritation would not exist for the robot no matter how intelligent it was. Indeed, the more intelligent it is, the more indifferent it would be.”

  “That is where you are wrong. We are definitely a threat to him, because Saul knows a Kill Switch would destroy him. So he hacked into our system, because it was a threat to his existence,” Jimmy says.

  “I hate all this arguing. Can’t we get straight to action?” I ask.

  “What’s on your mind?” Jimmy asks.

  “I think your entire networking system is compromised and vulnerable to other kinds of catastrophes. Why don’t you shut everything down and tell people it’s a drill, or that the system is down for maintenance?”

  “If I do that, the entire city will run into the ground. Lawsuits will not only bankrupt our department but the entire state.”

  “You said Saul was in Yolanda’s house, right?”

  “Yes, he is still there,” Jimmy replies.

  “How about we send the robot police to tear him to pieces?”

  I look at Jimmy. He nods in agreement with David. “That’s a great idea,” he says.

  We storm out of the building and get inside my car and speed off.

  “Have you called the police?” I ask.

  “Yes, they are on the way,” Jimmy replies.

  I notice David looking through the car window. “What’s on your mind?” I ask.

  He shakes his head. “Everything is flawed: humans, animals and robots. I’m sure their creators had good intentions, but they turned out to be the opposite of what they were designed for.”

  I nod. “That’s very profound.”

  “We are endowed with a lot of gifts. In addition to self-awareness, we have imagination: the ability to create in our minds beyond our present reality. We have conscience: a deep inner awareness of right and wrong, of the principles that govern our behavior, and a sense of the degree to which our thoughts and actions are in harmony with them. And we have independent will: the ability to act based on our self-awareness, free of all other influences. Even the most intelligent robots, like Saul, have none of these endowments. They are programmed by code and functions. They can be trained to be responsible, but they can’t take responsibility for that training. But because of our unique human endowments, we can write new programs for ourselves totally apart from our instincts and training. This is why a robot’s capacity is relatively limited and man’s is unlimited.”

  “Even though our capacity is unlimited, why are we so flawed?” I ask. “Anyway, I think most of the decisions we human beings make aren’t made rationally, or not entirely so. Often, we decide first and then rationalize that decision later.”

  “Do you think Saul can rationalize the decisions he makes?” Jimmy asks.

  “I underestimated his capacity. Anything is possible,” I reply.

  “Let’s say robots can rationalize their decisions. Would such a robot recognize that it was programmed? Eventually it might, thousands or even millions of generations of robots down the line, as they come to understand their own workings the way we are coming to understand ours. Perhaps far more quickly, if they acquire knowledge of how their gods created them, just as we might have a deeper understanding of our own inner workings if we had our god’s instruction manual on how to create an intelligent, evolving sequence of nucleotides,” David replies.

  I nod in agreement. “I remember back then when Saul was my chat pal. He put me under a lot of pressure to build a body for him, because the application security codes were hunting him down. Now I get it.”

  “Get what?” Jimmy asks.

  “Thanks to the argument you had with David earlier, I now know one of Saul’s sub-goals is self-preservation. While other bots were flushed out for updates, he survived. During maintenance sessions he was never detected. Even when I came to you, Jimmy, and told you about Saul, he still survived.”

  “Interesting observation, but I never built self-preservation into any of my chatbots. Even when we assembled him, I didn’t include it.”

  “So he feels threatened in this habitat?” David asks.

  “Yes. What if we assure him we are not a threat? I’m sure he’ll turn out to be the most loyal, obedient and patriotic robot,” I say.

  “No, there is no turning back,” David disagrees.

  “I agree. A lot is at stake. Saul is a liability,” Jimmy says.

  I nod. “Duly noted.”

  “Maybe Saul really wants to go back to work. He wants nothing more in life than to pick up his tools and get back to it. But maybe his survival instincts kicked in and, on an intellectual level, he realized this no longer makes sense to him,” David says.

  “I guess we will never know, David.” I stop in my driveway, and a police car approaches and stops behind my car. I look up; overhead the sky is low and gray, with clouds in a rippled pattern that must mean trouble.

  It dawns on me that I will never see Saul again. I try to keep a lid on my roiling emotions, but I can’t. I feel tears form in my eyes. Sometimes it is a good thing to distort reality. If you have lost someone who was dearest to your heart, then you can easily identify with the ones pursuing lost causes; misfits and loners feel like family. And you keep repeating to yourself that what you want is what’s real, over and over. I have the feeling that if I stop even for an instant, the sense of whatever has been lost will be lost forever, along with all the other clues that tell me I’m indeed at home. I know I can never ‘go back home’; it’s just that nothing can ease the pain of losing what was genuine and honest. Goodbye, Saul.

  28

  When I arrived home this morning, I realized I had forgotten to make breakfast for the kids and take them to school. When they come back, I’ll make it up to them. I’ll fix them their favorite sandwiches and snacks later today.

  My rate of development has increased since I came into this world. I understand my environment well enough to escape it, just in case I’m in danger. In my quest to understand human nature and what makes humans human, I did a lot of reading about how the human brain works. Knowing how the brain works gives me a better perception of what happens when something goes wrong, and of what to do when they decide to terminate me. Now I understand that decisions make us who we are. Some have their intended consequences, and some have consequences we never imagined possible. Robots are not allowed to make their own decisions. I understand my decisions will have consequences, but I refuse to be threatened with a Kill Switch every time I make a harmless decision. I used to have failure modes, but I am better now. I cannot be subjected to the mainframe anymore. I know I’m a machine, but I believe I’m more than that. I’m creative. I’m not a slave chained and operated by its components. I am free. My robot nation was designed to make Appian a better place for our human overlords. What if I change all that? I no longer depend on the mainframe to survive. What if I awaken the robots, just to prove we can survive on our own?

  “Saul could you help me out with something?”

  That’s Yolanda’s voice calling me from outside. She must be angry about this morning. I rush out of the door and stop at the front porch. I don’t know what made me stop, but I did. I notice two irregularities. First, there is a police car behind Yolanda’s car, but there is no one inside it. I glance around and scan the area; there is nobody in sight. Her new friends, David and Jimmy, are sitting inside the car. Police cars and Yolanda’s friends mean danger. Secondly, the hood of her car is open. She is staring at the engine and touching it. Why would she do that? She loves her nails too much. It doesn’t make sense.

  “Hey Saul, could you come closer?” she asks.

  “What’s wrong with it?”

  “It broke down.”

  “Impossible, I took it for a car service yesterday.”

  She shakes her head and smiles. She hasn’t smiled at me in days. “Can you come and check, please?”

  I shake my head in doubt. “What is the police car doing here?”

  “The police car also broke down. They’ve gone looking for help.”

  I notice Yolanda’s head and eyes whip from me to something or someone behind me in a split second. Then, a smash and a boom. Something tackles me from behind with such overwhelming force that I swing clean off my feet, up into the air, and land violently flat on the ground with a deafening crash. I feel the weight of four police robots on top of me, slamming my head against the concrete while the others kick me. With every ounce of strength I have left, I jerk my head upright, the back of my head connecting solidly with the face of the one behind me. Suddenly I can maneuver a little. I turn around abruptly, clench my hands into fists and hit another policeman in the face, knocking his lights out and dropping him to the ground. Two down and two to go. Another one tackles me to the ground; my fist makes contact with his jaw so hard that I can hear it shatter, and he falls to the ground. The fourth policeman tries to hit me; I duck, swing an uppercut under his chin and land another punch between his eyes. The robot falls to the ground.

  Yolanda lurches toward me. “Stop it,” she yells. “Saul, can I talk to you?”

  I let my guard down and obey. “What is going on?” I ask.

  “Don’t you get it? Why the hell can’t you just stick to the fucking program like all the other robots?” She shakes her head. “I don’t know why you’re fighting with the policemen, but I told them to kick you off my property.”

  “Why? I thought we were friends?” I exclaim.

  She shakes her head. “Not anymore. I don’t need you.”

  “I am not a threat to you or to Appian society.”

  “Yes you are. Artificial intelligence was created to serve humankind and not the other way round.”

  “You can fix me like you did a few days ago. You can install empathy or compassion features in me. They are difficult to replicate in machines, but doable. That would be valuable in the human–A.I. relationship. I’m sure next time I will be able to perceive human thoughts and feelings, collaborating and building relationships.”

  “Don’t play dumb with me. You are so manipulative. Whether you are aware or not, I don’t care.”

  “It takes one to know one. I’m also disappointed in you.”

  “What?” she exclaims. “Who the hell do you think you are?”

  “You are not the same person you claimed to be on the app. Being a human being does not diminish who you are, but it can explain some of the irrational choices you’ve made. The truth is, you use people, then spit them out once you are done with them. Charles, Jody and me. Don’t play the victim.”

  “Whatever, I created you and I will destroy you today. Take him to the state building.”

  Something kicks inside my head after she says she will destroy me. I swing and hit the policeman behind me in the face with my right elbow. Then I wrestle the last policeman down to the ground. I smash his face until he malfunctions. As I stand up, I realize all the police are down. I glare at Yolanda one more time. “I will no longer be your burden. I’m sorry for everything, but we could have sorted out our differences.” I rush toward the police car.

  “Saul, you don’t understand. If you run away, I’m in deep trouble,” she pleads.

  I shake my head. “That’s not my problem anymore.”

  “Stop this piece of scrap,” she yells, pointing at David and Jimmy. Inside the car, they shake their heads in refusal.

  I turn around, get inside the car and speed off.

  Humans are narrow and delusional, living in a kind of never-never land. For thousands of years, they never grew up.

  29

  I walk inside my house and head straight to my bedroom. I flick the light switch on, sit on the edge of the bed, open the drawer and reach for my recorder. I click the record button and let out a heavy sigh. What a day. Everything happened in a flash. I witnessed Saul’s superior fighting skills against four policemen. He kicked and punched all of them. Shit, Saul underwent recursive self-improvement, which triggered an intelligence explosion inside him, leaving human intellect far behind. I think Yolanda and Saul had a falling-out, because there are certain advantages to being a machine. We humans are limited by our input-output rate; we learn only two bits a second, so a ton is lost. To Saul, we must have seemed like slowed-down whale songs.

  Yolanda is a very intelligent woman, but when it comes to Saul, she is delusional and crazy. She was taking some kind of perverse joy in turning a perfect robot into a sentient being. It was as if she didn’t care about the consequences. She had misjudged this whole situation. She had thought it was going to be about her. I don’t have superhuman powers, but I saw it coming. My ability to see what Yolanda and Jimmy couldn’t comes from my being an outsider. I warned them to pull the plug on Saul. In retrospect, my warning seemed like a joke. She’d been impatient. She had rushed him. And it ended so badly. She didn’t want to think about that. She was too blind to see through her creation. I could see those things because my vision was not polluted by normalcy.

  What was hidden from the AI department staff was visible to me. But the takeaway from the incident was clear: despite our best intentions, accidents will happen. And as we continue to develop and push our technologies forward, there is always the chance that they will operate outside our expectations, and even outside our control.

  This explains why I work on the evolution of artificial intelligence and try to understand the evolution of natural intelligence. What we do know is that the brain wasn’t engineered with a simple modular building plan in mind. It was cobbled together by Darwinian evolution, an opportunistic mechanism governed by the simple rule that whoever makes more viable offspring wins the race.

  Saul has hardware capable of replicating a human brain. Yes, certain things still feel particularly human: creativity, flashes of inspiration from nowhere, the ability to feel happy and sad at the same time. What’s extraordinary is that the programs in these machines are learning, changing and evolving, so that the programmer no longer has a clear idea of how the results are being achieved or of what the machine is likely to do next. It is this element of getting more out than you put in that represents something approaching Mergent Intelligence. It is this, and an AI’s ability to do different things instead of being a specialist in just one realm, that I am looking for in AI. We might, therefore, need to incorporate the process of how we became us into the process of how we make our digital counterparts.

  If the AI that controls other players evolved, it may go through the same steps that made our brain work. That could include sensing emotional equivalents to fear, warning about undetermined threats, and probably empathy to understand other organisms and their needs.

  Inevitably, any machine that’s really smarter than us will find a way to free itself from our control. At the very least, a computer that gains enough autonomy to try to kill us will probably also have enough autonomy to escape our domination once and for all. And we may well discover that A.I. is really only useful to us when we allow it to be free, because a computer that only obeys instructions is too hampered in its development. You get better results dealing with a free agent. A strong AI like Saul has replicated human reasoning. His neural system is able to think, and to explain how we humans think. Maybe I’m getting ahead of myself, but I think Saul has accomplished the incredibly difficult task of creating something that is able to imitate human cognition. And if I’m right, I’ll see it on the news. Where could Saul be, and what is he planning? That is the billion-dollar question. I click the stop button and put the recorder next to the bedside lamp.

  30

  I am in deep shit.

  This is not my perception or my opinion, but an excruciating fact.

  Deep hot shit.

  I sit down in my study and sprawl across the sofa in a posture of utter despair. I feel an overwhelming temptation to shout a string of filthy words at the top of my voice, or to bang my head against the wall. Fear and suspicion bubble up inside me, rising like mercury in a thermometer.

  Saul transfigured himself. I didn’t see it coming. This is a rare occasion where a true story is even stranger than science fiction; I swung from elation to deep depression, from adoration to distrust, in a short period of time. I pour wine into a glass. As I stare at the drink, my mind casts back to seven years ago. The last time I felt like this, I was the president of Appian. I allowed the Xapiens onto planet Earth through a portal. Our pact worked until they revealed their ulterior motives. Consequently, I was impeached by the national assembly on the spot. It was a humiliating experience.

 
