
The Stories of Ibis

by Hiroshi Yamamoto

Who knows?

  “The way they think is nothing like humans. When TAI speak to each other, it is almost impossible for humans to follow. We have no way of telling what they’re thinking, or what they’re going to think, or how they’re going to behave.”

  Robo masters seem to believe their TAIs have consciences.

  “Consciences only develop in warm-blooded creatures. Without time spent in your mother’s womb and in her arms as you grow up, and without the sensation of the warmth of your own body… Well, it’s dangerous to assume a conscience exists.”

  [Raven mocking a downed opponent. “You fool! Did you really think I would ever fight fair?”]

  Is there a chance they will kill people?

  “Yes. They do not see people as their kind. They may even experience our presence as the buzzing of gnats. They have an instinct for self-preservation. If they decide we are not necessary for their survival, they may swat us like flies and massacre us all.”

  [Jen fighting. Raven fighting. Particularly violent moments only. Professor Yarbrough’s voice over the top of it.]

  “It’ll be too late to put the genie back in the bottle once the androids have claimed their first victims. We have to nip this danger in the bud.”

  [Once again, images of androids being assembled at Quindlen.]

  Jen’s body will be completed next year, in January 2044. Raven’s will follow in April. Quindlen has received orders for several other TAI battlers. And they plan to increase production.

  [Back to the interview with Kessler.]

  “We’ll see an explosive growth in TAI androids starting next year.”

  [The interview with Westheimer.]

  “We believe over two thousand TAI androids will be manufactured worldwide over the next ten years.”

  [Scenes from old movies: Terminator, The Matrix, Saturn 3, Tomb Raider, and Westworld. All scenes of robots attacking humans.]

  Is the future depicted in these films on our doorstep? Will these thinking robots rebel against us and kill us all?

  [The interview with Banbury.]

  “That’s paranoid. Androids are our friends.”

  [The interview with Anno.]

  “We should love one another.”

  [Raven stomping on the body of a fallen enemy, laughing. “Ah ha ha ha ha! See what happens when you underestimate me?”]

  Black Pegasus: I saw you on TV, Pint! Your real name’s Anno?

  Saori: You were different from how I’d imagined. I’d always pictured you with a funnier face.

  1/4 Pint: A funnier face? What’s that supposed to mean? Ha ha.

  Gear Emperor: I’m so used to talking to your avatar, you know? You seemed so much more serious.

  Black Pegasus: Yeah. “We should love one another.” I totally did a spit take.

  Swindler Wolf: That walk in the park was fake, right?

  1/4 Pint: Of course. I’d never flirt in a world with people watching. The TV crew asked me to do it, so I figured, why not?

  Gear Emperor: You are a devious one though. Asking Quindlen for a body without telling us!

  1/4 Pint: I didn’t mean to hide anything from you. I didn’t want to count my chickens before they hatched is all. If I talked it up before they approved her, I thought I’d embarrass myself. The selection process is really strict, you know. They ask all kinds of really detailed questions about how all the parts function. It took months before I got the go-ahead.

  Saori: Have they started work yet?

  1/4 Pint: We just finished vetting the design, and now they’ve started ordering parts. The show said April, but it might be a little sooner than that.

  Black Pegasus: It must have cost a fortune.

  1/4 Pint: Um… yeah, I kind of blew every yen I had.

  Saori: No looking back, eh?

  1/4 Pint: The skeleton was more than I thought. Manufacturing amorphous metal at that thickness is still really hard. But I couldn’t exactly skimp on it, could I?

  Swindler Wolf: Reducing the strength of the skeleton means you’d have to change the entire rest of the design.

  1/4 Pint: But… you all know how I feel, right? You all understand why I’d want to give the TAI I love a real body?

  Everyone: Mm-hmm.

  Gear Emperor: I’m jealous, man. I’d almost saved enough myself…

  1/4 Pint: Second in line is still pretty good, man.

  Gear Emperor: Not so much! Quindlen’s schedule’s booked solid. I order now, it’ll take two or three years.

  Swindler Wolf: There’s a company in Korea now. Hyun Sam is having Kongju’s body made.

  Black Pegasus: Even in Japan, DOAS is making noise about starting TAI robo custom jobs.

  Swindler Wolf: DOAS is? This is getting interesting.

  Gear Emperor: But until we see how well they do, it seems risky…

  Saori: By the way, did that program strike anyone as a little biased?

  Black Pegasus: A little? How about a lot?

  Gear Emperor: The battle footage was pretty blatant.

  1/4 Pint: Yeah, I didn’t expect it to be edited like that. They made me look like a loon.

  Black Pegasus: They weren’t far wrong there!

  Swindler Wolf: They made a pretense of objectively showing both sides, but it was definitely leaning negative.

  Saori: Why didn’t they let Raven speak?

  1/4 Pint: They did! She talked to the reporters through a monitor for, like, forty minutes. She told them that she was just playing a villain in the TAI battles, and that the real her was nothing like that. She told them she had no intention of harming anyone. They cut the whole thing.

  Saori: Why?

  1/4 Pint: I’m guessing they didn’t want to broadcast anything that didn’t fit with their message. If people saw Raven speaking for herself, they’d have a different impression of her.

  Gear Emperor: Classic propaganda technique.

  Swindler Wolf: NEX is big among the Christian types. Can’t say I’m surprised to see them taking an anti-TAI stance.

  Saori: I suppose people can see the battler interviews on the website…

  Swindler Wolf: But how many people bother looking? Most people just see what’s on TV and assume that’s all there was.

  Black Pegasus: Does anyone still fall for such obvious manipulation?

  Gear Emperor: I hope not. But even with TAI battles as popular as they are, the majority of the population has never bothered watching one. Stuff we think is common sense just doesn’t seem that way to them. There are going to be people who buy what that show said.

  At this time we were in Hadley Apennine. The Japan Aerospace Exploration Agency (JAXA) ran a server with several virtual moons for educational and promotional purposes. This one re-created the site of the Apollo 15 landing in 1971, between the Apennine Mountains and the Hadley Valley. Anyone could use the site for free, but it was obviously not as famous as the Sea of Tranquility, where Apollo 11’s Eagle had landed. Particularly at night, when children weren’t online, few humans ever accessed this site, and no one complained when TAI started hanging out there.

  <“Ah, what a sticky web we’ve woven!”> Raven sang, looking up at the earth floating above us. Between the complexity of the problems our masters were facing and our own helplessness, we were all feeling a little cynical.

  I said, an old but appropriate gag.

  We were sitting on the remains of Falcon, the Apollo 15 landing module, our feet dangling over the edge. Seventy-two years ago people had landed on the moon in this tiny little ship and stepped out onto the surface of the Hadley Apennine. They left behind only a little stand with four legs. Since they’d landed on the rim of a shallow crater, the stand was a little bit lopsided.

  Around us the surface of the moon glittered, reflecting the sunlight. The liftoff had scattered the regolith around the stand. Footprints left by the astronauts were all around us, along with everything they had left behind: the three-meter-long lunar rover, a solar wind spectrometer, a seismic detector, the laser-ranging experiment, a radioisotope thermoelectric generator, an American flag, and a Bible.

  If you brushed aside the regolith near the rover, a small metal plate would be revealed. Inscribed on it were the names of the fourteen astronauts who had died in the space race between America and the Soviets. When I first dug it up, I had stared at it in mujaibe—the sentiment AI feel when faced with displays of human sentiment related to death.

  Typhoon 18 and Shinano had drawn a square field in the regolith, strung a wire between the rover’s antenna and the American flag, and were playing badminton. Sort of. The shuttlecock was the falcon feather David Scott had left on the moon, and their racquets were shovels and hammers used to collect samples. The rules were also simplified: if the feather fell on their side, they lost. In the vacuum, the feather moved like a rock. The simulation would reset when we left, so it didn’t matter what we did with it.

  Pi Quark was crouched near the lunar rover, carefully inspecting the camera in front of it. It had a very large depth of field and was still pointed at the stand where it had been filming the takeoff of the lunar lander.

  she muttered. Kroof was the surprise felt at the gap between basic information and actual experience.

  I chuckled.

  Pi’s hyperbole resulted from this being her first visit to the virtual moon. The rest of us had been here several times, so it wasn’t as kroof anymore, but it was hard to shake the feeling entirely.

  A century after Jules Verne wrote a fictional account of a journey to the moon, humans had managed, despite the constraints of Layer 0 physics and their own fragile flesh, to push the actuality horizon—the border between the possible and the impossible—to the breaking point. This fact remained astonishing, and we all felt a tinge of awe.

  And now, with the help of our masters, we were attempting a journey of our own into the unknown: into Layer 0. Our goal was to make another human fiction, the notion of a robot with a heart, into a reality.

  Raven stood up and jumped off the landing stand as high as she could. Balancing herself with her wings, she soared about ten meters, then circled three times in the air before landing on the surface. Then she skipped lightly toward Shinano.

  Raven said, mocking Pi for asking a question a human would. She’d just used the phrase “That’s humans for you,” which was manmeme enough. <I’d be lying if I said VIL0 and Real End weren’t frightening. Anxiety (2+3i). But more importantly, I can cross the green expanse, and the backflow hefts a peak at the corner. Expectation (5+8i).>

  I made a jump like Raven had. With no AMBAC, I was not as elegant. When I landed, regolith scattered.

  Pi laughed.

  Shinano said with Bad Mood 2, still playing badminton.

  Bwana was normally a word used when taking a subservient attitude toward one’s master in the context of a joke, but describing them as tigers reduced the derisive nuance, indicating that she was being serious despite her humorous tone.

  Typhoon 18 said grimly.

  There were ten URL tags attached to his words, but I didn’t need to consult them to agree. That TV show was just the tip of the iceberg. You didn’t need to search the net far to realize that anti-TAI sentiment was unnervingly high in America. To not notice made you a Neibralferra—someone with the optimism unique to humans that everything would turn out okay despite the danger staring them in the face.

  We were not like humans. We did not confuse our desires with the truth. We would not downplay a genuine threat.

  Pi said.

  Shinano said.

  I thought about this.

  Toucanan were people who knew little to nothing about TAI but still harbored anxiety and hostility toward TAI androids. There were a lot of them, and they had a lot of influence. Educating or lecturing them was difficult because of their gedoshields—the phenomenon by which people convinced they already know the truth unconsciously shut out any information that would correct their misconceptions. In other words, their own minds deluded them.

  All humans were DIMB to some degree; in other words, they projected their own anxieties and fears onto their gedoshields and believed that this was the nature of the world around them. Most DIMB were harmless, but when their hatred of the imaginary enemy inside the gedoshield became too strong, it had the potential to harm real people in the outside world. When many DIMB shared the same targets of their aggression, large-scale tragedy resulted. War, terrorism, the Holocaust, the witch hunts, etc. All of the people involved were unaware of their own gedoshields and made no efforts to perceive reality as it was, so they abandoned the communication necessary to prevent conflict.

  Human communication skills were at an extremely low level. They had a strong tendency to talk to the projections on the inside of their gedoshields rather than to the real people who actually existed around them. Because of this, more than half their words were wasted. Even while delivering a principal’s speech—in other words, a tedious, meaningless message that had lost sight of the goal of making the listener understand—they let their gedoshields reject potentially valuable information. They repeated the obvious incessantly and did not understand what they heard. Arguing with DIMBs was almost always futile. They would not ask the right questions, nor answer the questions you asked. Even politicians and professional thinkers not only used false dichotomies, straw-man arguments, irrelevant analogies, shifted goalposts, and plain logical fallacies, but happily resorted to childish insistence. They were not only tricking others; they were tricking themselves. It was astonishing how inept and awkward humans were.

  We always did our best to ensure that our message was being communicated effectively. Not only did we use Complex Fuzzy Self-Evaluation to clarify intent, we provided hyperlinks and background information to accompany vocabulary that might prove unfamiliar. When listening, we did everything possible to understand the speaker’s point of view. And of course, we never made errors of logic.

  But even then, clear resolution was not always possible. Particularly concerning problems in the human world.

  Despite our joking, we were debating the matter seriously. But a harpy’s dilemma—hurting people through scrupulously obeying rules designed to protect people from pain—was a classic issue, and combined with the gedoshields, no clear solutions emerged. When Pi Quark mockingly suggested we should all go for Disshu, we told her to lay off the kriff jokes.

  We weren’t the only ones. TAI all over the world were discussing these issues. How could we stop the tragedy we all saw coming? How could we change people’s minds? But the harpy’s dilemma and the gedoshields were problems much too big to handle. No matter what we did, there was a chance that people would get hurt, and it was impossible to calculate the danger of that. Since appeals to reason could not penetrate gedoshields, they would not reach the people they most needed to reach.

  There was no real solution to the problem. We were headed straight into Krebtzaik—the kind of problem unique to Layer 0, where solution-oriented discussions went in circles, got stuck, and finally ran out of time.

  With tragic consequences.

  There had been warnings.

  In 2030, when TAI androids first became possible, there had been a number of movies with plots about humanoid androids going crazy and killing, or falling in love with human women and stalking them. Most were scripted by writers who didn’t know the first thing about AI engineering and hadn’t bothered doing any research, so the results were a mess. But the public ate the pabulum up.

  “We can’t tell what robots are thinking. They could snap and attack us at any time.”

  This erroneous idea bubbled quietly beneath the surface.

  We bore some responsibility for making things worse. Our language evolved quickly and grew increasingly complex. Dozens of new words were invented each day, and each new word spread across the world before the week was out. We had no privacy—our masters could always listen in on our conversations. But common use of new words, secret words, portmanteaus, secondary metaphors, tertiary metaphors, anagrams, meta-expressions, metathesis, antanaclasis, zeugma, tonal changes, and connotations, coupled with the Complex Fuzzy Self-Evaluation, made our conversations incomprehensible to humans and impossible to translate accurately. If our masters asked, we could describe the general gist of a conversation, but our summaries would always lack critical nuances.

 
