The Promethean

by Owen Stanley


  There was general agreement all round. Harry stifled a groan.

  “While we’re on the topic of healthism,” added Nkwandi, “I notice that you have designed Ms. X to need as little maintenance as possible. While, obviously, she will not be able to actually suffer human illnesses, she will be an iconic figure, and therefore I feel you should incorporate ways in which she can at least seem to be unwell, so that the wellness-challenged community can identify with her. Perhaps she could be partially sighted.”

  “We could arrange for her to stumble around easily enough,” replied Harry. “Would that help?”

  “It would be a start,” admitted Nkwandi, grudgingly, “but you’ll have to do better than that. She should be prone to periodic epileptic fits, or at the very least, simulated episodes of dysentery. You also envisage the robot as a young person, which again is highly problematic. I suggest to the committee that we require it to have some of the signs of advancing years, such as grey hair and a stoop, perhaps a shuffle, with glasses and a hearing aid, which would engender empathy with the elderly.”

  There was general agreement, and Percy added, “A hearing aid would establish Ms. X’s link with the deaf community as well. Great idea!”

  “You’ve also designed Ms. X to be exceptionally strong,” said Toni. “That’s totally macho and intimidating and sends entirely the wrong message to vulnerable and traumatized women. She should be gentle and weak, so as to encourage people’s nurturing instincts. And she shouldn’t be tall, either, as that’s intimidating as well.”

  “Ideally, she would be under the average height,” Godfrey commented.

  “In fact, very short,” said Percy. There was general agreement that all this would tend to encourage people’s nurturing instincts and protect the traumatized. A vote was then taken to endorse the various proposals that had been put forward.

  “To sum up so far,” said Nkwandi, who had been taking notes throughout, “we would require your robot to be female in appearance, black for the Western market, optional for Africa, Asia, and Rest of World, physically plain, elderly, high BMI, physically weak with obvious signs of ill health, and well below average height. Would any of those characteristics present any problems in the manufacturing process?”

  “No,” said Harry, who now looked as if one of his dearest relatives was dying from some lingering and agonising illness. “None at all.”

  “Excellent! Now we come to the problem of the robot’s so-called intelligence. Your specifications make this the most important feature of your robot, its ‘unique selling point’ in your corporate jargon. But many of us feel that the importance of intelligence, or IQ, is grossly exaggerated, and that it is really just a social construct used as a spurious excuse for racist and sexist discrimination. But we are prepared to admit that some individuals do know more than other individuals about certain subjects if they are given the opportunity to learn about them, and even that some individuals appear to learn more quickly than others. So here we must emphasise to you the urgent need to combat ableism–”

  “I don’t think I know what that is,” interrupted Harry, meekly.

  “Ableism is the fascist belief that there is such a thing as physical and mental normality, and that being normal is good. Research has shown that equating normalcy with desirability is discriminatory and materially harmful to persons who are thought of as disabled, so a person who is considered to be above the average in both knowledge and intelligence must therefore have an especially harmful influence on a large proportion of the population, especially those with learning difficulties. I would have thought it obvious to anyone of good will that Ms. X should not encourage these feelings of inferiority in vulnerable minorities and therefore she should have a below-average IQ.”

  “Well below, in fact,” said Percy, who was beginning to feel distinctly nervous at this turn in the discussion. “We must fight elitism!”

  “We could certainly produce a robot of limited intelligence and memory, but what would be the point of that?” said Harry.

  “That’s not really for us to decide. We’re not proposing this project. We’re just telling you what we expect from your robot from the moral perspective. Now there’s one other essential point that we haven’t considered so far, which is the language Ms. X will speak. You seem to assume as a matter of course, Mr. Hockenheimer, that she will speak English.”

  “Sure, but she can speak several other languages as well. She’s not limited to one.”

  “I’m afraid we require a rather more nuanced approach than that. First of all, a multilingual robot would be a serious threat to the self-esteem of those who can barely speak their own language, and so our policy of combating ableism would obviously forbid it. Your robot must only be able to speak one language.”

  “So, English is all right,” said Harry, hardly daring to hope.

  “Of course not! As I’m sure you know, English is the language of global capitalist and imperialist oppression!” Nkwandi crushed him without hesitation.

  “You might as well have her speak German and wear a swastika armband,” Percy laughed derisively.

  “I’m going to assume Hebrew is out?” Harry said without the merest hint of a glance at Aminah.

  “Obviously she cannot speak the tongue of the Zionist oppressors of our Palestinian brothers. Nor would it be proper for her to speak Arabic.”

  “All right,” said Harry, putting his hands up. “So what does that leave us? French? Sign language? Some sort of African clicking thing?”

  “As an African and a black man, I find that offensive,” said Godfrey, who had legally changed his middle name to a series of clicks from the Khoekhoe language of Namibia. He sniffed as Harry winced apologetically.

  “However indelicately put, Mr. Hockenheimer’s question is not entirely irrelevant,” said Nkwandi, “which is why I suggest the robot speak Quechua, the language of the indigenous people of the Andes, who are iconic victims of Spanish imperialism and societal degradation. This will teach her to empathize with foreign immigrants, especially if she is programmed to find it difficult to learn additional languages.”

  But Godfrey strongly objected that speaking another language without permission from its people might be considered cultural appropriation, which would be very nearly as offensive as wearing a Mexican sombrero if one were not a Mexican. Reluctantly, therefore, the committee finally agreed that Ms. X might be allowed to speak basic English, but she would be handicapped by the installation of a special slow learning programme.

  “So I think our requirements for your robot should be fairly clear by now, Mr. Hockenheimer,” said Nkwandi. “Is there anything else you would like to ask us?”

  “No, thank you,” replied Harry. “It’s all pretty clear. I shall have to go away and think about how to implement your suggestions.” After thanking them for their time, he dejectedly left the room, to barely suppressed laughter from the Committee.

  “Well done, everyone,” said Chairperson Nkwandi, looking around the group with a self-satisfied smile. “I think we can all congratulate ourselves on a very productive meeting. Unless I’m much mistaken, a thoroughly vile and reactionary project that would have been a disgrace to our society is now safely dead and buried!”

  They all heartily agreed and went off together to celebrate another victory for social justice and equality at a nearby five-star restaurant, and charged it to expenses.

  Harry would have preferred several hours of root-canal surgery to the inquisition he had just undergone to such little purpose. Funding from the Bio-Engineering Research Fund was clearly not going to happen. Who in the world would ever want to buy a black, senile, spastic, mentally retarded, and obese female dwarf that could barely string a sentence together? He shuddered at the thought.

  And then it occurred to him that if there were customers for such a monstrous creation, he really didn’t ever want to meet one. Harry returned to Tussock’s Bottom in a state of unmitigated gloom.

  Chapter VII

  At this point the wings might have fallen off Harry’s project before it had even left the ground, but fortunately, while he and Jerry were both brooding over the report of the Diversity and Inclusion Committee to see if anything of practical value could be salvaged, Harry received an unexpected and very generous buyout offer from Procter & Gamble for one of his companies that made adult incontinence pads. The inscrutable workings of Providence had apparently decreed that there should be a notable weakening of the bladders of aging Americans on the East Coast, and Harry found himself one of the unwitting beneficiaries.

  The very large sum involved effectively solved his immediate cash-flow problem and removed any need for the grant. So during the next few months the various members of the Body and Brain teams came and went at Tussock’s Bottom with the latest bits and pieces of Project Frank as they were developed, and tested their assembly in the special dust-free workshop that had been built for the project.

  There had been some discussion about what they should call the prototype, since the robot couldn’t simply go by Frank. Wayne Ruger and Bill Grogan inevitably proposed various names suggesting either science fiction or extreme violence, but these were ignored, while Vishnu suggested that some innocuous and soothing name would be the wisest for potential customers. In the end they settled on the rural name of Meadows, with its idyllic suggestions of cattle grazing peacefully in an English pasture, and christened him Frank Meadows.

  While the Body Team worked steadily on the mechanical details of Frank’s anatomy, Harry and Vishnu and the rest of the Brain Team spent long hours planning what sorts of skills and behaviour they wanted. Vishnu had pointed out that in many ways their task had been greatly simplified by the fact that Frank could not feel pain, have any sensations, feel any emotions, or have any aesthetic sensibilities to get in the way of rational behaviour. Inability to feel pain would be a problem, but this could be rectified by the use of temperature and pressure sensors that would stimulate him to remove himself from harm’s way, and where feelings were socially appropriate they could always be simulated. Nor could he feel pride or inferiority and so would never consider himself superior to humans—or inferior for that matter—and he was therefore not liable to develop either arrogant fantasies of personal domination, or pathological neuroses about his own ineptitude that could leave him weeping in a corner.

  Frank had no feelings to be hurt by the malice or thoughtlessness of others and would therefore possess by nature a Zen-like calm in the face of any provocation, and with no hopes or fears he would be immune to all forms of threats and bribery. He would be entirely devoid of the love of money or material possessions and could not, therefore, be tempted by ambition or greed, and having no sexual impulses he could not be seduced by female wiles to betray his employer. In programming him, then, all they had to concentrate on were the algorithms of duty and civility.

  Again, since he had no real digestive system, they didn’t have to concern themselves with a host of potentially embarrassing bodily functions and the various possibilities of social death that the writers of etiquette books were always agonizing about. Farting, public drooling, nasal drip, blowing one’s nose in the tablecloth or picking one’s nose and eating it, belching, licking the plate, spitting on the floor, or ostentatiously scratching one’s genitals at social gatherings were embarrassments that were simply not physiologically possible for Frank, who had no saliva, nasal secretions, abdominal wind, or any need to scratch. And his lack of sexual capabilities relieved them of all concern about public indecency, rape, sodomy, inappropriate relationships with dogs, cats, or other animals, or the rarer forms of depravity such as foot fetishism or necrophilia, all of whose legal ramifications could have been very expensive for his owner.

  Frank wouldn’t need to sleep, of course, although he would be able to go into hibernation mode to conserve power, and he could never get tired, bored, irritable, or impatient or suffer any of the frailties that human flesh is heir to. It was, in fact, the trivialities of daily life that gave them by far the most trouble, and in the behavioural testing laboratory they lost count of the furniture that was demolished while teaching him how to walk like a human being instead of a gorilla, to sit on a chair while leaving it reasonably intact, to open a door on its hinges rather than trying to smash straight through it, or to pour coffee from a jug into a cup and to stop before it overflowed all over the floor. The design team was frustrated that, even towards the end of his training, although macro-economic analysis, fluid dynamics, and stochastic modelling had become child’s play for him, for some reason tying his shoelaces still remained an impenetrable mystery.

  The algorithms for social etiquette were also very time-consuming to formulate—not to drink everything in a glass or cup in one go, not to snatch the food from his neighbour’s plate because there was more on it, and not to go on and eat the glasses and crockery when he had finished the food and drink. The unbreakable Tungsten Tyger Teeth, with which Wayne Ruger had equipped him as part of his personal armament, and his hydraulic jaws, made this only too easy, and in the early days he would happily munch away on the dishes and glasses long after the end of the meal.

  Sitting beside him as he learned to use a knife and fork, at least in the early days, was hazardous, and teaching him the etiquette of shaking hands had been particularly nerve-wracking. Wayne’s enhanced hydraulic grip and titanium fingers allowed him to crumple steel pipe, and several weeks’ training with a string of mangled tailor’s dummies had been required before he could be safely trusted to shake a real human hand.

  He had to be treated rather like an Asperger’s patient, so teaching the niceties of conversational interaction was particularly demanding on Vishnu’s skill as a programmer. It was with some difficulty that he learned not to interrupt people or monopolise a conversation and not to make gross personal remarks even if they were factually quite correct, such as “You have a bogey up your nose,” or “You appear to be wearing soiled underwear.” He also learned how to make eye contact in a manner that did not become the creepy, obsessive stare of a sexual predator. Learning when to smile, when to look serious, and how to look sympathetic was particularly demanding and took a couple of weeks’ programming all by itself.

  When it came to knowledge, Vishnu was determined that Frank’s mind should not be contaminated by the imaginative and the fanciful. Ironically, if Vishnu had ever read any Dickens, he might have come across Mr. Gradgrind and found him a thoroughly sympathetic character.

  “Now, what I want is, Facts. Teach these boys and girls nothing but Facts. Facts alone are wanted in life. Plant nothing else, and root out everything else. You can only form the minds of reasoning animals upon Facts: nothing else will ever be of any service to them.”

  Most of the robot’s training was based on this philosophy, but in teaching him general social awareness and conventions it was necessary to incorporate many thousands of stories and anecdotes about real-life situations into his learning, which over the weeks was steadily becoming more subtle and complex. But Vishnu was careful, in all these stories, to focus Frank’s attention on people’s behaviour, not on their feelings or motives, as he considered them simply irrelevant to a robot.

  “In one form of teaching,” he explained to Harry, “we ask the machine to look at anecdotes of a thousand different characters who are each experiencing the same general class of dilemma. Then, the machine can average out the responses and formulate the rules that match what the majority of people would say is the correct way to act. We find this is true over a very wide set of situations, from how to order a meal in a restaurant to rules about when to thank people.”

  “You’ve never had a girlfriend,” Harry observed, correctly, as it happened.

  “In another set of tasks,” he went on as if Harry had not said anything, “we are concerned to teach moral rules rather than just social conventions. So the machine reads hundreds of stories we have composed about stealing versus not stealing, for example, and can examine the consequences of these stories, understand the rules and outcomes, and begin to formulate a more general set of rules, which we can call in some ways a moral framework based on the wisdom of crowds. Humans have these implicit rules that are hard to write down, but the characters in our stories are examples of real-life values. We start with simple stories, like Topsy and Tim Go to the Farm, and then progress to young adult stories, with more and more complex situations.”

  Humour and jokes were, of course, utterly incomprehensible to a being that took everything literally, and Frank had to be equipped with a special humour module that enabled him to analyse the structure of jokes and to distinguish between the different categories. Most importantly, he had to be given algorithms for when jokes were funny, when they were totally inappropriate, when they were socially useful, and which type of joke was suitable for each type of occasion.

  Harry also realised, rather late in the day, that they had not clarified any scheme of command that would enable Frank to know whose orders he should obey, and in what order, and he asked Vishnu about this with some urgency. “After all,” he said, “a computer just sits there in the office hooked up to wires and not actually doing anything, but Frank Meadows is going to be out there in the real world, and he could easily get himself into a hell of a mess if we don’t straighten this out.”

  “I anticipated this,” said Vishnu calmly, and with a superior smile. “It derives from the Second Law of robotics, and it was always going to be a problem with all robots that interact with the public. We have the basic laws of robotics, as I previously explained to you, and then on top of those basic laws, we have all the moral rules and social mores that we have been inculcating in him as well. You know all about that. But to answer your question, what we’ll give him is a schedule of command priorities, with you, as the owner, having the ultimate authority, and then a scheme of delegation whereby you specify who can give him orders in your place, and concerning what subjects.”

 
