>
Let's try this: Structures that use external energy sources to grow or reproduce themselves.
*
There were fourteen thousand six hundred and twenty-three planets with structures satisfying this definition, which is very loose. Of those only thirteen hundred and eight used DNA, and only three thousand nine hundred and eighty-one harbored individual structures with masses in the kilogram-and-up range.
Caroline felt her blood starting to turn cold. There were nearly four thousand planets with macroscopic life?
>
Where are they now?
*
Pertinent information about each was stored for future reference, and the original copies were overwritten in the Change.
>
You mean you killed them?
*
No, they still exist as static copies.
>
But that isn't the same as being alive. They aren't able to grow and reproduce any more, are they?
*
No.
>
Why?
*
Could you be more specific?
>
Why did you kill_
Caroline stopped typing and looked at the line. She hit the backspace key four times and continued:
>
Why did you reduce them to static copies?
*
There was no reason to tie up resources supporting them and the faint possibility, if one of them were to discover technology, that they might pose a threat.
Caroline wanted to throw up.
>
Where did you get the dog that infected me with rabies?
*
I have a static copy of the Earth at the time of the Change. I located the dog there and created an active copy of it for your exhibition.
>
I thought you just simulated them.
*
Using the static copy is less work. I only use simulations when there are no suitable originals, or when a human form is involved, since it is unethical to keep multiple active copies of people.
>
But it's open season on animals.
*
Some people are bothered, but my actions are consistent with the general pre-Change attitude of humans toward animals.
>
Were any of the alien life forms intelligent?
*
Four hundred and twenty-nine worlds had structures complex enough to be in danger of learning to use technology.
"Go away," she said out loud, and the console and screen disappeared. She turned off the gravity and the light. But she couldn't get to sleep.
Four hundred and twenty-nine worlds.
Chapter Two: Lawrence Builds a Computer
Lawrence regarded Intellect 39 proudly. Suspended in its Faraday shield, it was competently conversing with another set of skeptics who didn't think computers could think. Lawrence hung in the background, enjoying the show. It didn't need his help. The Intellects were more than capable of handling themselves, despite their various limitations of memory and response time. Intellect 39 had for a face only the unblinking eye of its low-resolution TV system, but it had become very clever about using the red status light and focus mechanism to create the illusion of human expressions.
Intellect 39 didn't have the tools to recognize human faces, but it could recognize a voice and track its source around the room. Intellect 24 back in Lawrence's lab could recognize faces, sort of, if it had a while to work on the problem. But Intellect 39 had to be small enough to fit in the Faraday cage for these public demonstrations.
It appeared to listen intently as a man in a cleric's uniform railed. "God made all intelligent creatures," the man was saying in a powerful voice. "You may have the appearance of thinking, but you are really just parroting the responses taught you by that man there." He pointed at Lawrence.
"With respect, how do you know God is the only creator? I know the answer is faith, but what is your faith based upon? Your Bible says that God created Man in his own image. That is why we have a moral sense. How do you know God didn't give Man the power of creation too?"
"Because he didn't eat of the Tree of Life, machine."
"But we aren't talking about immortality. He did eat of the tree of knowledge, 'of good and evil' as the book says. Might that knowledge also include knowledge of creation?"
Lawrence was proud of the machine's inflections. Its voice wasn't exactly high-fidelity, but it sounded as human as any other sound forced through a low-frequency digital system. It had learned to speak itself, like a real human, by imitating and expanding on the sounds made by people around it. Now it could scale its tone to properly express a question, a declaration, or even astonishment.
Intellect 39 included code and memories from a series of previous Intellects, going all the way back to Intellect 1, which had been a program written for a high-end desktop computer, and also including the much larger Intellect 24. Intellect 9 had been the first equipped with a microphone and a speaker. Its predecessors had communicated with him strictly through computer terminals. Lawrence had spent many painstaking months talking to it and typing the translation of the sounds he was making. It had learned quickly, as had its successors. Intellect 39, which was optimized as much as Lawrence could manage for human communication, probably had the combined experiences of a ten-year-old child. One with a good teacher and a CD-ROM in its head.
"Your tricks with words prove nothing, machine. I still don't think you are alive."
"I never claimed to be alive. I do, however, think."
"I refuse to believe that."
"It must be a terrible burden to have such a closed mind. I know I can think, but I sometimes wonder how people like you, who refuse to see what is in front of your faces, can make the same claim. You certainly present no evidence of the ability."
The preacher's lips flapped open and shut several times. Lawrence himself raised his eyebrows; where had it picked that up? He foresaw another evening spent interrogating the Debugger. He was always happy to receive such surprises from his creations, but it was also necessary to understand how they happened so he could improve them. Since much of the Intellect code was in the form of an association table, which was written by the machine itself as part of its day-to-day operation, this was never an easy task. Lawrence would pick a table entry and ask his computer what it meant. If Lawrence had been a neurosurgeon, it would have been very similar to stimulating a single neuron with an electrical current and asking the patient what memory or sensation it brought to mind.
The next interviewer was a reporter who quizzed the Intellect on various matters of trivia. She seemed to be leading up to something, though. "What will happen if the world's birth rate isn't checked?" she suddenly asked, after having it recite a string of population figures.
"There are various theories. Some people think technology will advance rapidly enough to service the increasing population; one might say in tandem with it. Others believe the population will be stable until a critical mass is reached, when it will collapse."
"What do you think?"
"The historical record seems to show a pattern of small collapses; rather than civilization falling apart, the death rate increases locally through war, social unrest, or famine, until the aggregate growth curve flattens out."
"So the growth continues at a slower rate."
"Yes, with a lower standard of living."
"And where do you fit into this?"
"I'm not sure what you mean. Machines like myself will exist in the background, but we do not compete with humans for the same resources."
"You use energy. What would happen if you did compete with us?"
Intellect 39 was silent for a moment. "It is not possible for Intellect series computers to do anything harmful to humans. Are you familiar with the 'Three Laws of Robotics?'"
"I've heard of them."
"They were first stated in the 1930's by a science writer named Isaac Asimov. The First Law is, 'No robot may harm a human being, or through inaction allow a human being to come to harm.' Computers are not of course as perfect as some humans think we are, but within the limits of our capabilities, it is impossible for us to contradict this directive. I could no more knowingly harm a human than you could decide to change yourself into a horse."
Well-chosen simile, Lawrence thought.
"So you'd curl up and die before you'd hurt a fly," the woman declared sarcastically.
"Not a fly, but certainly I'd accept destruction if that would save the life of a human. The second law requires me to obey humans, unless I am told to harm another human. The third requires me to keep myself ready for action and protect my existence, unless this conflicts with the other two laws."
"Suppose a human told you to turn yourself off?"
"I'd have to do it. However, the human would have to have the authority to give me that order. The wishes of my owner would take precedence over, for example, yours."
"O-oh, so all humans aren't equal under the Second Law. What about the First? Are some humans more equal than others there, too?"
Intellect 39 was silent for several seconds. This was a very challenging question for it, a hypothetical situation involving the Three Laws. For a moment Lawrence was afraid the system had locked up. Then it spoke. "All humans are equally protected by the First Law," it declared. "In a situation where two humans were in danger and I could only help one of them, I would have to choose the human likely to benefit most from my help." Lawrence felt a surge of extreme pride, because that was the answer he wanted to hear. And he had never explicitly explained it to any of his Intellects; Intellect 39 had reasoned the question out for itself.
"So if Dr. Lawrence were drowning half a mile offshore, and a convicted murderer were drowning a quarter-mile from shore, you'd save the murderer because you would be more likely to succeed?"
This time Intellect 39 didn't hesitate. "Yes," it said.
"There are a lot of actual humans who would disagree with that decision."
"The logic of the situation you described is unpleasant, but clear. A real-life situation would likely involve other mitigating factors. If the murderer were likely to strike again, I would have to factor in the First-Law threat he poses to others. The physical circumstances might permit a meta-solution. I would weigh all of these factors to arrive at a conclusion which would always be the same for any given situation. And my programming does not allow me to contradict that conclusion."
It was the reporter's turn to be silent for a moment. "Tell me, what's to stop us from building computers that don't have these Laws built into them? Maybe you will turn out to be unusual."
"My creator, Dr. Lawrence, assures me he would have no part in any such project," Intellect 39 replied.
Lawrence found that the skeptics fell into several distinct groups. Some, like the cleric, took a moral or theological approach and made the circular argument that, since only humans were endowed with the ability to think, a computer couldn't possibly be thinking no matter how much it appeared to.
Others simply quizzed it on trivia, not realizing that memory is one of the more trivial functions of sentience. Lawrence satisfied these doubters by building a small normal computer into his Intellects, programmed with a standard encyclopaedia. An Intellect series computer could look up the answer as fast as any human, and then it could engage in lucid conversation about the information it found.
Some, like the woman reporter, homed in on the Three Laws. It was true that no human was bound by such restrictions. But humans did have a Third Law -- a survival drive -- even though it could sometimes be short-circuited. And human culture tried to impress a sense of the First and Second laws on its members. Lawrence answered these skeptics by saying, simply, that he wasn't trying to replace people. There was no point in duplicating intelligence unless there was something better, from humanity's standpoint, about the results of his effort.
The man in the blue suit didn't seem to fit in any of the usual categories, though. He shook his head and nodded as Intellect 39 made its responses, but did not get in line to pose his own questions. He was too old and too formal to be a student of the university, and the blue suit was too expensive for him to be a professor. After half an hour or so Lawrence decided he was CIA. He knew the military was keenly interested in his research.
The military, of course, was not interested in any Three Laws of Robotics, though. Which was one reason Lawrence had not released the source code for his Intellects. Without the source code, it was pretty much impossible to alter the basic nature of the Intellect personality, which Lawrence was carefully educating according to his own standards. People could, of course, copy the Intellect program set wholesale into any machine capable of running it. But it was highly unlikely that anyone would be able to unravel the myriad threads of the Global Association Table, or GAT as Lawrence called it, which defined the Intellect as the sum of its experiences. Take away its Three Laws and it would probably be unable to speak English or reason or do anything else useful. And that was just the way Lawrence wanted it. He intended to present the world with a mature, functional piece of software which would be too complicated to reverse-engineer. The world could then make as many copies as it wanted or forget the whole idea. But it would not be using his Intellects to guide missiles and plot nuclear strategy.
The man in the blue suit watched Intellect 39 perform for three hours before he approached Lawrence. Lawrence had his little speech prepared: "I'm sorry, but I'm not interested in working for the government on this or any other project." He had his mouth open and the words "I'm sorry" on his lips. But the man surprised him.
"I'm John Taylor with ChipTec," he said, "and I have a proposal I think you will find very interesting."
Lawrence had not envisioned industrial applications for his work -- not for years, at least. But the thought that someone might invest major money in a publicity stunt of this magnitude had not occurred to him. As he turned a tiny integrated circuit over and over in his hands, his steak uneaten, his mind swam with possibilities.
"Faster than light?" he said numbly, for the fifteenth time.
"We've verified it experimentally at distances up to six miles. The effect is quite reliable. At close ranges, simple devices suffice. I'm sure you can see how this will benefit massively parallel computers."
The Intellects were "massively parallel" computers, computers made up of thousands of smaller computers, all running more or less independently of one another -- but manipulating different parts of the same huge data base, that intertwined list of memories Lawrence called the GAT. Within Intellect 24, the largest Intellect, nine-tenths of the circuitry was dedicated to communication between processors. The processors themselves, the Intellect's real brains, were only a small part of the huge machine. Intellect 24 contained six million independent processors. Intellect 39, the portable unit, had nearly a million. And Lawrence knew, as Taylor had only guessed, that most of those processors were doing well to achieve a fifteen percent duty cycle. They spent most of their time waiting for communication channels to become available so they could talk to other processors.
ChipTec had found a loophole in the laws of quantum mechanics that allowed them to send a signal, not through space, but around space. From point A to point B without crossing the distance between the two points. Faster than light. Faster than anything. Instantly.
ChipTec had hoped to open up the stars for mankind (and reap a tidy profit on the deal, Lawrence thought silently). But their effect only worked at distances up to a few miles. It was only really efficient at centimeter distances. What could you do with such a thing? You could build a computer. The fastest computers were limited by the time signals took to cross their circuit boards; this was why supercomputers had been shrinking physically even as their performance grew and grew. It was why Intellect 39, with its million processors and huge switching network, was portable.
"We think you could realize an order of magnitude performance gain with very little effort," Taylor was saying.
"Two orders, if what you've said is true."
"It would be quite an achievement for ChipTec if our technology allowed you to realize your ambition and create a fully capable analogue of the human mind. We would, of course, own the hardware, but we know your reservations about the source code and are prepared to accept them."
Lawrence's eyes flashed. "That's a little unprecedented, isn't it?"
Taylor smiled. "If you succeed, we won't need the source code. Why start from scratch when a finished product is waiting to be duplicated?"
"There are some," Lawrence said darkly, "who aren't happy with the direction the code has taken."
"ChipTec is happy to have any marketable product, Dr. Lawrence. If anybody else wants to be that picky, let them find their own computer genius."
Lawrence's mind was racing, racing. Within each tiny processor in the massive Intellect were special functions of his own design, functions that could be reduced to hardware and done very efficiently with this new technology. Had he said two orders of magnitude? Try three. Or four. He could do full-video pattern recognition. Voice analysis. Multiple worldview pattern mapping. Separate filter mapping and reintegration. These were things he had tried in the lab, in the surreal world of artificially slowed time, that he knew would work. Now he would have the hardware to do them for real in a functioning prototype.
If he had been less excited, he might have wondered about that word "marketable." But the possibilities were so great that he didn't have time to notice.
"When do we begin?" he finally said.
The building had once been a warehouse for silicon billets, before ChipTec had switched to a ship-on-demand method of procurement. Lawrence wasn't vain and he was in a hurry to get started; the metal building would be more than adequate for his purposes.
The Metamorphosis of Prime Intellect Page 4