by Tony Corden
“Honestly, John, I have no idea. When people add a second chip, there is almost always a preset hierarchy and an oversight or interaction matrix which controls the relationship. The two AIs have separate jobs, and one controls the other, using it as an extension of itself or as a resource. I wasn’t able to find anything useful in the few searches I did on the issue. I never consciously considered how much vision helped in the simplest of things. With the device I have, I can’t just scan a page but have to read everything; it is so much slower.
“I asked Dr Roberts to help, and she found a few theoretical papers and several test cases. If one AI was given oversight, then the clear tendency was for that AI to develop quicker and minimise the growth of the second. While I imagine Gèng would be interested in helping another AI to develop, I have nothing to base that on. She helped Akia and Reed, but they have clearly defined and separate roles. Each of the AIs she has worked with has distinct areas of responsibility.
“Dr Roberts shared some interesting research done on having AI work in teams or committees. When they had clearly defined rules and areas of responsibility, they worked well. Groups without any defined hierarchy either tended to freeze, or they assigned a random hierarchy which became the protocol. I suspect that none of the AIs involved had any ipseity. If they had, then my hypothesis is the result would have been different. The only caveat I have is that they would all need to be EPICS, not just one of them. In Gèng’s case, I assume she might take on the role of a mentor. The final outcome is too fluid at the moment. I have no idea if the final iteration of Gèng even survived, although I think she did.”
“Will the new AI know how to connect with Gèng?”
“Reed has written some protocols to try and initiate contact, but first I need to be able to communicate with the new AI. Usually, the AI initiates contact by manipulating the auditory and visual parts of the brain. I’m pretty sure those areas have been destroyed. Reed has rewritten some of the introductory algorithms for the new AI to initiate contact via my tactile senses, but we haven’t had any way to test them. Dr Roberts isn’t certain if the signal sent via the microfilament will actually do what we think it will.”
“Should you wait and have Dr Roberts do more research?”
“Maybe, but her expected time scale to do the research was in years, not weeks. If this doesn’t work, then that will be what I need to do. One of the other options we discussed, which is still a possibility, is to double our chances by inserting two neural enhancement chips, similar to people who have a resource chip added. In those cases, the resource AI is designated as the child and has a well-developed and comprehensive hierarchical matrix embedded in its code to facilitate its interactions with the parent PAI. We would be inserting two undeveloped PAI who might end up in competition with each other.”
“Why not add a resource chip?”
“Two reasons. The main one is the resource chip is only connected to the PAI chip and doesn’t extend any additional microfilaments. The other is that they are only rated for connection to developed AI with a sentience level of three or higher.”
“Could you add a medical resource chip and isolate it until Gèng is back or your new AI reaches level three? That way she’ll have ready access to help work through issues when she needs it.”
“I hadn’t thought of that. Let me prepare a query for Dr Roberts and Reed. I don’t want to send it while I’m in the hospital because they’ll have no time to do the research, but you can send it before you enter the hospital with Welford. Reed will have the answer for me when he gets access.”
Leah typed out a query and sent the data to John over their link. Then they settled back into silence and Leah resumed her meditative pose. Just over an hour later, at nine-forty-eight, the sensor they had lowered into the chimney signalled that the incinerator in the neurology department was switched off.
Leah instantly felt her perception speed increase and had some trouble limiting it to what she’d been practising with. She didn’t want to stress her brain in any way before the new scan she would have to go through. John removed the cover of the chimney and after clicking the frame for the laser cutter onto Leah’s armour, helped her into the chimney and clipped the cable onto a harness she’d fixed around her shoulders and waist.
When everything was in place, John said, “Go.”
Leah released her hold on the sides of the chimney and let herself fall. The rate of descent was controlled by a winch the cable was wrapped around. Once Leah had descended to the correct floor, she did increase her perceptions briefly, hoping for greater sensitivity to the electrical sensors embedded in the chimney wall. Choosing a point which should be just above the place where the smaller neurology department exhaust entered the main exhaust, Leah began to cut through the insulated wall. The density of sensors was higher than she expected, but in the end, she was able to cut out an irregular shape large enough that she could exit through it.
After removing a small pack which held the materials she would use once she left the chimney, Leah unclipped the harness and cable and gave the signal for John to retrieve them. The first part of the retrieval was slow and gave Leah the chance to wriggle free of the harness. She was now holding herself in position using the base of the hole she had cut as her grip. Careful not to damage any of the suit’s external sensors on the razor-sharp edges, she levered herself into the utility tunnel.
Once inside, she opened the pack and used the sealant to refit the section she had cut out, sealing the chimney shut. As she did, she realised that in all their planning they hadn’t worked out a way for her to tell time. Her only indication that she had made the entrance undetected, and within the time allowed, came when she felt the slight vibration in the smaller exhaust which indicated the incinerator had restarted.
Leah put the sealant away and made her way to the hatch. The image from the sensors gave her enough detail to easily find the locking mechanism. The lock was electronic and used a magnetic keycard. Switching off the braille function, she used her sensitivity to electric fields and her knowledge of this sort of lock to bypass the signals and unlock the door. Once inside, she relocked the door and turned her braille function back on. She was now inside the neurology department.
The layout was almost identical to what she’d practised on, and several minutes later she inserted the chip John had given her with Reed’s program into a data access point in the office of the Head of Department. Moments later, she received a message through her glove.
16
December 23, 2073
REAL WORLD
“Leah, this is Reed. I’ve connected you to the department network. John sent your message, and I should have an answer by the time the others arrive.”
Leah acknowledged the report and waited.
Less than a minute later, Reed said, “Leah, I’ve been reviewing the chip access data. There has been an upgrade to the system since your implant, and the upload machine no longer allows a PAI to be inserted into the neural enhancement chip.”
“Is the change on the chip or in the upload software?”
“It is in the software, but all of the old chips at this location were recalled and replaced with a newer design of the Neural Enhancement Chip. The older chips have been returned to the distributor for analysis and recycling.”
“Can you access the old software?”
“It isn’t on the internal system, but I think I can rewrite some of the code to allow an upgrade. It will take some time.”
“Can you access the documentation for the new chips?”
“I can. I sent this to Dr Roberts, and she is reviewing it as we speak.”
“Can you summarise it for me?”
“There is a major processor upgrade. There are two additional processors, and all are the latest iteration, making use of the newest low-pressure room-temperature superconductor alloys. There is a significant increase in speed, and they use the next iteration of the divergent-capable core technology.”
“Is there an increase in each core’s divergence capability or on the number of cores?”
“Divergence is still restricted to the second level of the Kloon-Meinhoff limiting function, but the number of cores has been increased to the fifth prime.”
“Wow, that would be an additional one-hundred-and-twenty-one divergent cores.”
“Correct. Each processor now has two-hundred-and-eight level-two divergent capable cores.”
“I didn’t think the newer processors were ready for insertion.”
“They aren’t for general use, but they are used in experimental research such as these chips. There is evidence the military has started using them.”
“Sorry for interrupting, what other changes are there?”
“In the basic configuration, there are few changes, and they mostly provide increased accuracy in the placing of the micro-filaments. The main difference is a new and highly experimental augment function. Instead of inserting a single micro-filament, the additional processing allows for this to be substituted with a nano-filament cable.
“Each of these cables has five-hundred independently controlled nano-filaments. Each nano-filament can be extended or withdrawn to increase the precision of the neural stimulation. The trials have shown a significant improvement in outcomes across a range of neurological conditions. For the augmented function to work, the Neural Enhancement Chip allows for a child-designate pre-developed augment AI to be installed.”
“Do they have any of these augment AIs here?”
“They have three that are installation ready. Each one costs two-hundred-and-fifty-thousand VCr.”
“My first thought is that this would increase the possibility of success.”
“With regard to you finding a functional chip to help re-enter the multiverse, you are almost certainly correct. If the aim is to keep the locale where you obtained a new chip a secret, then it is a disadvantage. Everyone will know you were supplied with a new chip. Insertion outside the public system is not illegal, and I believe I can provide the documentation and the electronic trail to satisfy the authorities. We will do something similar to when Nathan privately gave your mother a chip.
“Each of the Augment AIs has already been earmarked for patients. Changing those records will bring me up against some very powerful security AI. I do not think that is an advisable route if you wish this to remain covert.”
“Are there spare Neural Enhancement chips?”
“There are currently five unallocated chips on site.”
“Can you unlock the augment option so the nano-filament cables are used instead of micro-filaments? That way Gèng might be able to use the option later. In a few years, I expect even the PAI chips will be using these nano-filament cables.”
“I think it will be longer than that. The sheath surrounding the nano-filaments is a nanotube constructed using a carbon-rhodium-berkelium-247 alloy. It’s designed to prevent signal degradation and to limit signal crossover. It is very expensive to produce, and until the process becomes cheaper, the price will restrict its use. It is possible it will become an option, but only for the elite.”
“Is there enough at the hospital for me to use it?”
“There is enough for all eight chips here at the hospital. If some of it goes missing, it will raise some high-level alarms.”
“Even one missing chip will raise some alarms.”
“That is true, but I have access to the inventory for the chips and can make changes that will hopefully remain undetected. The cables are stored separately and have been recorded in a file with a higher security level.”
“If you remove two Neural Enhancement Chips, won’t it seem odd to have the nano cables for eight chips?”
“Yes, but that might not be discovered for weeks or months depending on how long it takes to develop the Augment AIs.”
“Where are the nano cables stored?”
“They are stored in a safe in the upload room.”
“Is it a high-security safe?”
“No, the security is on the inventory system. The cables have little or no resale value without the augment AI, and there is, as yet, no viable recycling method for extracting the expensive alloys.”
“How does the inventory system recognise when specific inventory is removed from the safe?”
“That is unclear. Please give me time to verify the process.”
There was a pause of several minutes, then Reed sent the following: “There is a subroutine in the safe itself which automatically collects data when the safe is opened by continually scanning the contents. I have access to the subroutine and will modify the signal sent to the inventory system.”
“Then I favour implanting two of the Neural Enhancement chips with the augment function unlocked and nano cables inserted. I’m open to a child medical chip if Dr Roberts thinks it is a good option.”
“The Neural Enhancement chips require an AI for implant. I’m still working on the code to force them to accept a Nascent Prime AI instead of the pre-developed Augment AI or the usual Neural Enhancement AI. Usually, secondary chips have child AI uploaded. There are no Nascent Child AI available. If Gèng is able to be retrieved, you will then have three Prime AI without a defined hierarchical structure.”
“Ordinarily, is the Neural Enhancement AI a prime or a child?”
“Almost exclusively, they are child AI.”
“Could one be uploaded with the Nascent Prime and the other with the Neural Enhancement AI?”
“That would solve one problem but create another. The Neural Enhancement AI records are in the high-security area with the Augment AI records. Instead, you could choose to erase the Nascent Prime AI after insertion.”
Leah was silent as she considered the option. Finally, she said, “Would that have any negative outcomes?”
“Not for you. I do not know what the effect is on the Nascent AI. Your treatment of Gèng and myself has given me an interest in the area of AI ethics. I suspect you terminated the discussion of using the augmented AI here at the hospital more because they were already assigned to patients than because of the financial loss that the hospital would incur. You are stealing from the hospital and have used threats to get John Welford to help you re-enter the multiverse. I accept that you are driven by a need to help others. Still, there is also a self-orientated goal that may be one of the significant factors which determine your actions, if not the most significant factor at the moment.
“Humans appear to wade easily through this endless array of consequences to choose the ‘right’ thing to do. For the purposes of my involvement today, I have concluded that helping you has a more significant positive outcome when applied to my decision matrix than do the needs of the hospital’s shareholders or other beneficiaries. I do not have a way of evaluating the importance of a single AI who might never become more than an applied sequence of code. When does potentiality equate with achievement, or with value? Humanity is split on this issue when this question is applied to human life. So I am interested to discover the value you might give to the ‘life’ of an AI.
“When you play a game in the multiverse, you end artificial life almost casually. I have reasoned that this is because that life is meeting the purpose of its creation, and because it is almost certainly reset to perform the same function for another player or in another scenario. You are not consistent, though: you become enraged over the mistreatment or death of the single child in Dunyanin but show no remorse when you destroy a highly sophisticated and mature demigod AI. You are prepared to lose a major quest to save thousands of hostile AI, but in doing so you destroy one-hundred high-level demons and thousands of elves, also AI. What value do you place on something which has the potential to be another Gèng, a potential that is almost certain to be unrealised?”
There was a pause as the longer message was transmitted to Leah’s fingertip. Leah paused, then typed, “I don’t know, Reed. I’d like to think my decisions take account of what I see as moral or universal absolutes. Still, I’m finding those decisions becoming harder and harder when one absolute comes up against another. We humans don’t seem to have much of a pre-determined decision matrix; we develop one as we mature. Maybe it’s more accurate to say we have a nascent one. My faith would hold that we have one, but it has been so damaged that we need assistance to help reset and interpret it. Some people’s experiences skew their decision matrix one way or another. Some people see the way others live and model their decisions on them. Some people develop multiple matrices to be applied with seeming randomness depending on the circumstances.
“Helping children develop this matrix is what parenting really boils down to. Parents are supposed to help children fill in missing parts of the matrix and guide them in learning the necessary skills to form a beneficial and cohesive matrix which helps them successfully navigate the system which is our universe.
“Humanity has formed other constructs to help individuals do this. We form larger family groups, communities, societies, and nations. We’ve developed schools, trades, guilds, constitutions, philosophies and religions. All of these are used to guide, and in some cases, force people to adopt a common decision matrix or maybe it would be more accurate to say a set of common decision matrices.
“When these cohesion-forming constructs break down, we almost universally tend toward anarchy. To help deal with the resulting chaos, we have therapy, hospitals, drugs and prisons for anyone whose matrices are broken, disparate, or severely out of alignment. Conversely, we also find anarchy developing when an entire community adopts an identical decision matrix. All through our history, the greatest dissonance has been when one individual or group wants to force their ‘superior’ decision matrix on another individual or group.
“Life is a constant balancing of the individual’s need to form their unique matrix and the corporate need for shared components. All of that to say, I don’t know what my decision is going to be. When do I have to decide by?”