by Phoenix Ward
Karl started to zone out a bit, making it more difficult to feign interest. He had heard all the same sound-bites and arguments for the better part of a decade. To him, it was all just ignorant white noise at this point.
He remembered being a high schooler when the installation industry had started to boom. At first, it was more like a mortuary service—something the ultra-rich would throw money at in order to preserve their loved ones forever. Historically, it had only been a novelty. Important people, scientists and politicians alike, were installed so students on field trips could ask them asinine questions over and over.
When Karl was young, it had always been that an I.I. could only be created after someone’s death. In the last moments of a person’s life, the electrical data that made up their personality would be downloaded with a neuroscopic recorder, then reassembled into interactive code by programming professionals. It was illegal, and borderline impossible, for an I.I. to exist while the person it was created from lived.
Then the Man With Two Bodies happened. It was only six years prior that an installed intelligence had been accidentally created before the death of Chris Santson. When the two of them argued on live television, the world exploded. The theory had been that Chris and his I.I. should agree on everything, since they had identical minds. Because they did not, the incident spurred a great deal of interest into what makes an I.I. tick.
With a bit of research and some genius creativity, it was discovered that installed intelligences were actually different individuals from the minds they had been created from. A branch grown from a trunk is no longer the tree once it is cut away. Thus, Chris’s I.I. became the first legal citizen to have never been born. It was probably the most controversial Supreme Court ruling of the twenty-first century.
Some I.I.s didn’t want to start all over, however. They had all the memories their organic counterparts had. It seemed alien to many to begin life anew as some sort of fake infant with the memories of an adult. Thus, the Z-8 clause was introduced.
It was a small amendment to the Santson ruling that allowed I.I.s to continue on as their old identity in every legal sense, so long as they had provided written consent before their organic death. This amendment, despite how much it helped millions of new I.I.s, only threw more fuel on the fire that was anti-I.I. sentiment.
Immediately, half the world seemed to ignite into an uproar. Religious organizations were outraged, saying that only God had the right to determine what was and what was not a human being. This helped pave the way for the birth-to-citizenship movement. Those people lobbied hard for legislation, stating that citizenship could only be granted to those with proof of their birth. The movement was doomed from the beginning, however. Plenty of real people with human bodies lack proof of their own creation. Immigrants from states with poor documentation, for example, had no birth certificate. The proposed legislation would have stripped them of their status as people.
After that failed, the movement rose from the ashes like some bigoted phoenix, reborn as the human economy movement. The basic premise was that, since humans had created the base economy the world functions on, then I.I.s had no right to participate in it. It was the first time the quiet whining about losing jobs to computers became a proud mantra. The more open-minded people in the movement wanted to create a digital currency that the I.I.s used and exchanged separate from human money. It was a bold idea, but had proved infeasible thus far.
I don’t care about the politics of it, Karl said to himself. I know an intelligent being when I see one. Any with a mind that can think is worth sharing ideas with.
The protest had been organized right outside the campus Karl was speaking at today. If he could just manage to break out of the crowd, it’d only be a dozen yards or so before he was home free. Just as he was about to peel away, the attention turned to him.
“You, sir!” the woman who had been yelling herself pink called.
Karl stopped and spun around, looking for some other poor soul she could be addressing. When he found no one—and instead saw all the faces pointed at him—he gave an exhausted sigh.
“You don’t seem to be here for the protest. Where are you headed?” she cried.
“Oh, me?” Karl said, though he knew who she was speaking to. “Nowhere. Just going to meet a few friends.”
“A few friends?” The woman’s tone was skeptical. “Are you sure? You’re not going to attend that ridiculous lecture on proge ‘psychology,’ are you?”
“No, no way,” Karl replied. “I definitely hate those computer guys. They’re just so… scientific. Grrr!”
He continued to inch his way out of the gathering, counting every step as a crucial amount of distance.
“They’re only going to fill your head with lies,” the woman said, ignoring Karl’s sarcastic reply. “You’ll be told that proges are like humans. But they aren’t human. A proge can’t hold you or wipe away your tears. Pixels can’t love, sir.”
“I don’t know about all that,” Karl retorted, now breaking the perimeter of the protesters. “I’ve never measured the quality of my relationships in so superficial a manner.”
Karl spun around and made for the campus sidewalk like a man who had just found the only bathroom in a hundred miles. Once he was out of earshot, he shook his head and chuckled.
Lecture
“Psychological tendencies between an installed intelligence and the average human are not so different, despite any preconceived notions you may have,” Karl said into the microphone. The auditorium sound system was hooked up to most of the audience’s cerebral computers, but still output audio for those without an implant. It took some concentration to get used to hearing his voice booming out of the speakers. “In fact, that should come as no surprise, since the human mind and that of an I.I. are just two tools built from the same blueprint.”
The lights pouring onto him were far brighter than he would have preferred. He felt like he had to continue squinting down at the college students listening to him. After a minute or so of trying, he gave up and allowed himself to be blinded. He didn’t need to see any faces anyway.
“Before I go any further, I just wanted to thank your instructors, the campus, and yourselves for having me here today,” Karl said. “My name is Dr. Karl Terrace, and I am what we call an installed intelligence psychologist. I analyze the mental functions of an I.I. much like you would a human mind. I’m here to talk to you about the similarities and differences between a human and an I.I. and the amazing possibilities created by combining the two.”
He paused and gauged the interest of his audience. It was like the entire auditorium had taken in a deep breath of suspense, holding it until he continued on. He grinned.
I have them in the palm of my hand, he thought.
“That’s right,” he said, building on the anticipation he had cultivated. “At this moment, there are scientists toying with the idea of connecting a human mind with an I.I. in what we are calling the ‘mindshare’ process.”
There were a few excited murmurs coming from the audience, and Karl could hear shuffling as people leaned over to speak to their neighbors. He reveled in every fascinated sound.
“Think about it,” Karl continued, timing his words to intrigue the crowd the most. “One day, we might have biologists long dead still working on cures for the future’s worst diseases. Picture a world in which Einstein was still alive. How much sooner could mankind have discovered renewable energy? Would World War III have even happened? What problem wouldn’t be easier with two heads tackling it?”
He focused on the text document his cerebral computer displayed in the corner of his vision, skimming through his speech notes. There was nothing written down aside from a few bullet points. He preferred the natural feel of “winging it.” Installation fascinated him to the point that he could blab ceaselessly about it if allowed to.
“You see, this mindshare process has been in development almost as long as there have been installed intelligences. The original pioneers of the cerebral computer are believed to have worked to make their devices capable of reading the code I.I.s are written with, or to come up with some way to translate it. Their records, which were kept secret for decades, have helped countless modern-day scientists give birth to a functional form of that process.
“In the near future, the I.I.s of experts will be able to continue their work through the eyes and minds of young and upcoming scientists. They will have unlimited access to the host’s sensory information, allowing for more accurate experimentation, especially in regards to human anatomy and biological reactions.
“However, it doesn’t stop just there. Mindsharing is really a precursor to what might come to be known as digital telepathy. If we can interface with I.I.s to the full extent that they can feel what we feel, and see what we see, why can’t that same process be adapted for human-to-human communication? Entirely non-verbal; entirely universal. The internet for human minds, if you will, but with sensory information.”
“That’s the exciting future of installation technology,” Karl said after a long pause. “I, however, want to talk to you about the promising present of installation psychology.”
The audience was difficult to read. Some folks seemed to be hanging off his every word like they had a chemical dependency. Others, though, looked upset. They sat with furrowed brows and frowning lips—almost like he was offending them. Karl shrugged it off. He was used to those looks.
“Plenty of people, for many years, have incorrectly attempted to utilize installed intelligences in the same manner we use programs and machinery. The fact of the matter is that we have never beheld a tool such as this. It transcends simple machines, or even complex codes. Imagine a hammer that can think. A search engine that can feel. A vehicle that can hope.
“There are two approaches when dealing with a sentient tool. You can use it like any other instrument and force it to bend to your will with no motivation or conversation. Or, you can inspire it. You can convince installed intelligences to want to do what you want them to do. Putting the ethics of forced labor to bed for the moment, it still remains practical to treat I.I.s more like an employee or a partner than an instrument.”
Those in the audience with pouting faces seemed to grow even more annoyed, and something about that delighted Karl.
“Legally, technically, perhaps even spiritually, these intelligences are human equals to each and every one of us,” he continued, watching his words salt the grumpy folks’ wounds. “Treating them as such is not only right, but it is the most beneficial response for all of mankind. A respected mind, digital or organic, is more likely to create. If a being believes their work will gain recognition based on its merit, they will put more effort into it. This is common sense.
“Resentment is not viable fuel for creation. If you have disdain for a group of people, why would you work to solve problems that primarily affect only them? For example, an I.I. could work to cure complex cancers. They don’t have organic bodies, however, so why would they do that? They do it because they love us. With affection, the intelligence will want to cure our ailments and keep us safe.”
Karl could see that not all of the audience was with him. Some were immersed in the information, jotting down each tidbit on their tablets or in their C.C.s. But Karl still needed to reach several holdouts.
“There was an experiment run a few years ago that helps demonstrate what I mean,” Karl explained. “You see, scientists wanted to find out who would be more compassionate, I.I.s or organic humans, and what would motivate them to be so.”
The psychologist used the tools available to him to bring up clips of the study in question. He played them for the audience.
“Volunteers were taken under the pretense of a test on their reaction times,” Karl explained. “They would fill out a questionnaire of unimportant inquiries, then meet with a researcher for an interview. Now, during this interview, the researcher would either be condescending, demeaning, and cold or he would be respectful, complimentary, and warm. The researcher, who was actually a planted actor, would then fake a medical emergency: a heart attack, a seizure, something of the sort.
“The study found that I.I.s and organic humans were nearly identical in their responses. Those who were chastised and criticized were much more likely to hesitate and merely call for assistance, and those who were treated with respect would attempt to soothe and comfort the victim themselves while waiting for help. In fact, I.I.s were even more responsive than their human counterparts. This is likely due to an expectation to be ostracized, as many I.I.s are used to in society, leading to a more dramatic improvement when treated with kindness.
“Studies like this show us what kind of results we can achieve from proper motivation, and that I.I.s differ very little from humans, psychologically. The possibilities are endless.”
Some heads bent down over tablets while their owners tapped Karl’s words onto them. Others stayed locked on the podium.
“Many of my peers have been exploring these possibilities,” he said. He scanned over the young faces. “For instance, I’ve been working with an intelligence that is in the process of writing a novel. We meet on several occasions and he’ll share his notes with me. He likes to bounce ideas off me, as well, and gauge my reactions as I look over his work. I have observed the intelligence express excitement when talking about possible plot points. He becomes frustrated when he can’t get past a scene. Sometimes, he even grows dismal about his talent.
“It’s important to realize that these are human emotions, not simple emulations. A program cannot get frustrated, nor can it become excited. It is a human mind, through and through.”
One person got up and left.
“A human mind, however, comes with human flaws,” Karl said, transitioning to his next topic. “A colleague of mine specializes in the study of installed intelligence mental health. She has encountered a vast spectrum of human disorders and syndromes, from simple anxiety to advanced autism. Some organic mental health patients may feel as though their ailments are being de-legitimized by testing I.I.s, but few realize the benefit of such a merger.”
He took a look at the clock in his internal retina display to check how much time he had left. He was cutting it close.
“My friend recorded her study of disorders, and many progressive professionals have used her research in application to organic patients. Of course, hormones play a large part in a person’s emotional state, but the base concepts of mental health remain the same in both I.I.s and organic patients.
“In fact, since I.I.s cannot receive medication for their ailments, we’ve seen research into alternative treatments increase tenfold. It suggests that, since several I.I.s have already overcome serious depression and anxiety through various therapies alone, you or someone you know could as well.”
Someone off the side of the stage gave Karl a wave. He nodded in response.
“Well, as my time here wraps up, I just want to urge you young folks to consider the field of installed intelligence psychology. Every additional mind we add to an effort makes it easier. That remains true whether the body it belongs to is made of flesh or not. Thank you.”
Feedback
The camera panned to the judge. She had a bored look glazed over her eyes. Too much makeup surrounded them, sparkling in the light of the studio bulbs. Her long black hair fell straight from her head like a curtain, not deviating in a single curl or wave.
“But the defendant is an installed intelligence,” she argued.
“Yes,” the man hastily retorted, his voice strained with anxiety, “but he is still a legal citizen, ma’am. He should be held accountable for his actions like any other person.”
“So what exactly did he say?” the judge pressed further. “What makes this claim ‘slander’?”
“He posted on a swap-meet website that I had abused his pet dogs. The post included my photo, a satellite image of my home, and even the name of the company I work for.”
“Wait, wait,” the judge lady stopped him. She waved her hand as if he was getting hysterical, and the camera quickly showed a smirking bailiff. “His pet dogs?”
“Yes, ma’am,” the young man continued. “You see, he created a false persona online of a disabled retiree that lives in my hometown. He made up a convincing story of my alleged attack, but failed to provide any evidence at all.”
“How do we know that you did not attack the dogs, then?” she asked.
The man’s cheeks started to glow red, and though he tried to maintain his poise, his anger sped up his speech. “Ma’am, if anyone had ever seen me harm an innocent animal, why did they not file a police report?”
The judge looked to her bailiff with a scrunched-up expression of consideration. “Good point,” she commented. She turned back to the rest of the courtroom. “So what exactly are you asking for?”
“Your honor, the defendant caused me to lose my job and made it staggeringly difficult to get a new one,” the plaintiff said. “I estimate that this caused me to lose over seventy thousand dollars. I am asking for another ten thousand for the pain and humiliation this caused me and my family.”
“You know, I’ve never done one of these settlements against an I.I., but I’m going to rule in favor of the plaintiff,” she announced before slamming her gavel down. “I sure hope Corey has deep digital pockets, because he owes you big.”
Karl nodded in satisfaction before turning the television off. It’s a good thing the I.I. lost the case, he thought. To be truly human, you have to accept the consequences of your actions.