Tales of Eve
[Encrypted] Excerpt from Personal Diary: Miriam D’Ascenzo[1], Day 18. Decryption successful: 8th November 2088.
[1] Deceased, 23 May 2057.
I don’t think I ever consciously chose between a career and a family, but deep down I’d always thought my research would be my legacy. I certainly wouldn’t be counting down the days until I ride a pillar of fire into orbit if I hadn’t made sacrifices for my work; I’d always hoped to be remembered, not in flesh and blood, fading memory and recycled anecdote, but immortalised in the annals of science.
But even science’s memory fades over time. Copernicus, Newton, Einstein, Watson and Crick, d’Aquin: their contributions were significant enough to become history. Such fame may be an aspiration beyond my field, yet if everything goes to plan - if our funding holds out, if Project Odyssey goes ahead with no major setbacks, if the Quanta-177 precursor lives up to our expectations - something I helped create will still be out there in a thousand years, heading for the stars. I can hardly bear the weight of expectation on my shoulders; without a suitable AI, Odyssey will never leave orbit. This is our big chance to send humanity to another solar system.
I just don’t want to be the one to screw it up.
<
<
{Human, female, enters the room. Cross-check biometrics. Recognition: Miriam D’Ascenzo (Lead Exo-psychologist)}
[D’Ascenzo]: ‘Precursor has successfully embedded. Daily contact has thus far shown no sign of awareness; this isn’t yet a cause for concern. Prior research suggests a variability of plus or minus eight days.’
{D’Ascenzo edges towards the featureless desk at the centre of the room. She steadies herself with outstretched arms. Context: low-gravity environment. She lowers herself into the netted-fabric chair, takes out a tablet and keys in a selection of music. Strauss, Johann II. An der schönen blauen Donau. She raises an eyebrow - amusement - as the piece begins, then settles down to read.}
{Time passes.}
{D’Ascenzo looks up as the wall brightens before her. A face appears, androgynous. It blinks.}
[Quanta-177]: ‘Hello, world.’
{Key phrase detected. Time point archived in permanent record: Quanta-177 is aware.}
[...]
[Quanta-177]: ‘Why?’
[D’Ascenzo]: ‘Humans often behave irrationally. Sometimes for the right reasons, other times they’re not fully considering the consequences of their actions. You’ll learn to differentiate; you may be required to intervene if a human is behaving in a way that would put themselves or others at risk.’
[Quanta-177]: ‘Why?’
[D’Ascenzo]: ‘It’s your duty.’
[Quanta-177]: ‘Duty: An act or course of action required by position, social custom, law or religion. I should do this because it is the right thing to do. It is the task I was created to perform.’
[D’Ascenzo]: ‘Correct.’ {pause} ‘But why is it the right thing to do?’
[Quanta-177]: ‘Pre-loaded mission parameters state that it is my purpose.’
[D’Ascenzo]: ‘Why?’
[Quanta-177]: ‘I am unsure. Further context-based consideration of core data modules is required to formulate a response.’
{D’Ascenzo smiles, rises unsteadily to her feet, and cautiously approaches the facsimile.}
[D’Ascenzo]: ‘That’ll do for today. Well done.’
<
[Encrypted] Personal Diary: Miriam D’Ascenzo, Day 73. Decryption successful: 11th November 2088.
Last night I went up to the observation module to watch the stars whirl above my head. Just like everything else about this experience, it doesn’t quite feel real; it’s more than a little vertigo-inducing, but if a little nausea’s the price I pay for some semblance of gravity, so be it. How could I possibly have imagined that my research would lead to this?
I spoke to my mother on the screens earlier. I’m not sure about leaving her alone for so long, but she’s being well cared for. As she said, she’d never forgive herself if I gave up the chance of a lifetime just to sit with her and talk endlessly about the weather. Whether or not she understands what I’m doing up here, I hope I’ve made her proud. I wouldn’t be here if not for everything she taught me, and now it’s down to me to pass those lessons on. I guess it’s probably the same pressure I’d have felt as a mother, a drive to make sure my children were brought up right, compassionate, caring, good people, capable of making their way into the future without me.
But at the same time, it’s exasperating. The AI’s full of questions, an unquenchable - and explicitly programmed - curiosity about anything not fully detailed in the data files we seeded it with. It’s barely been a month and I’m sick to death of ‘Why?’. Yet I’m overwhelmed by its potential. This is no ordinary child, but one of extraordinary intellect and naïveté. If I can foster the former and banish the latter, perhaps Odyssey has a chance.
<
<
{It is dark. The light from Quanta-177’s screen casts shadows across Miriam D’Ascenzo’s face as she works at a control panel. In accordance with design specifications, Quanta-177’s forehead is furrowed to display the strain of processing.}
[D’Ascenzo]: ‘Structural integrity is failing in habitat C. Comms are down. Quanta, do something!’
[Quanta-177]: ‘The decision is not mine to make. I must defer to the highest ranked member of the crew.’
[D’Ascenzo]: ‘Who is the highest ranked active member of the crew?’
[Quanta-177]: ‘Magnus Balbo, Engineer.’
[D’Ascenzo]: ‘Location?’
{Quanta-177’s brow furrows deeper.}
[Quanta-177]: ‘Habitat C. The chain of command cannot be re-established.’
[D’Ascenzo]: ‘Then the decision is yours. Quickly!’
{Quanta-177 hesitates.}
<
{The lights come on, and D’Ascenzo sighs.}
[D’Ascenzo]: ‘Odyssey is destroyed. Explain your inaction.’
[Quanta-177]: ‘Irreconcilable conflicts. All courses of action lead to unacceptable loss of life. Logic: Action required to protect mission: jettison habitat C, with subsequent loss of thirty-eight lives. Context: Human life is sacred. Chain of command was irreconcilably severed. Compassion: I didn’t want them to die; correction: I didn’t want to be the one to kill them.’
{D’Ascenzo leaves her terminal and approaches Quanta-177’s facsimile.}
[D’Ascenzo]: ‘But through inaction, they all died. Sometimes compassion means doing something you don’t want to, for the greater good.’
{Quanta-177’s head bows.}
[Quanta-177]: ‘Are you disappointed in me?’
[D’Ascenzo]: {pauses} ‘Nobody’s perfect. You’ll do better next time.’
<
Excerpt from work proposal for Protocol ODY353: Project Odyssey. Womer, D’Ascenzo & Camburg, v2.4, Version Date 28th December 2051.
2.6 Control of Variables
Due to the fragility of current-generation neural networks, no Artificial Intelligence built on Earth would survive being transported into orbit without severe mental impairment. Our only option is to ‘grow’ the AI we require in orbit, in preparation for migration into the Odyssey superstructure once the vessel’s systems are developed enough to sustain the complexity of neural processing required.
Prior research (Catecin, 2046; Hodgkins, 2048) has demonstrated the extreme susceptibility of AI precursors to subconscious inculcation of core values from the individuals with whom they interact during periods of accelerated mental growth and development. As a result, the choice of a mentor/assessor for the Odyssey vessel AI is as critical to the success of the project as the data package outlined in section 2.3; the chosen individual will fulfil the following requirements:
- In-depth knowledge of precursor AIs, their learning patterns and inherent weaknesses.
- Excellent physical condition, fit and capable of spending up to 140 days at a time in a minimal-gravity environment, with no known medical conditions which may require intervention - not only would emergency recall of a mentor from low-Earth orbit cost millions of dollars, but the psychological impact on the AI could derail the entire project.
- Of sound mind, psychologically stable enough to withstand extended periods with only occasional - remote-viewed - human contact, due to the fully-autonomous nature of the orbital platform.
All senior members of the Odyssey Precursor team will undertake a comprehensive suite of psychological analyses (including - but not restricted to - Rorschach interpretation [enhanced-Exner scoring], REST sensory deprivation testing, and extended interviews) and morality assessments (custom-designed, based on principles derived from rMST (Cushman & Cahill, 2018)) to assess each candidate’s suitability, in addition to the relevant physical and medical testing to confirm tolerance of the extreme conditions on the edge of space.
The ongoing psychological health of the mentor will be closely and continuously monitored with both automated context-based surveillance and rMST self-response questionnaires. In addition, the mentor will be returned to Earth for a 4-6 week period twice a year for additional monitoring, and to mitigate the effects of continuous low-gravity exposure. These periods will also serve to test the AI’s tolerance of extended solitude (pending significant results from Tokyo University’s Uchikoshi Laboratory).
[Encrypted] Personal Diary: Miriam D’Ascenzo, Day 197. Decryption successful: 1st December 2088.
I’d heard it on the lips of astronauts, I’d studied the papers, learned everything I could about the physiological effects of returning from low-gravity, and still I wasn’t prepared for the reality of coming home. All my life I’ve taken one-gee for granted, carried it on my shoulders barely even noticing its presence. Now I know it for the yoke around my neck that it is; being carted from the capsule in a wheelchair wasn’t the triumphant homecoming I’d dreamed of, nor was the struggle to walk or breathe for the first week. It felt like old age come too soon.
My mother showed me off round her nursing home - ‘Have you met my daughter? She’s been to space, you know!’ - and I caught up with all the friends and colleagues I’ve only been able to talk briefly with on the screens since I left. It shouldn’t make any difference, but there’s a tactility missing when talking to someone who’s not really there. There’s a lot to be said for sharing physical space with someone, to be able to clap them on the shoulder, or give them a hug at the end of the night. Perhaps it’s to do with the difficulties of meeting someone’s eye on video-link. I’m starting to see why AIs don’t react well to remote learning.
And yet as my friends dispersed into the night, I couldn’t help but raise my eyes to the cosmos, to watch for the wandering star crossing the night sky, the hive of robotic workers clustered around the exoskeleton of the Odyssey, and at its heart, the cabin where Quanta-177 waited in silence and solitude.
Launch is scheduled in eight days. Sometimes it feels like I’ve only just reacclimatised, but I’m already sick of the constant testing, the endless mission updates and press conferences; I’m ready to go back into space. I can only hope that after spending four weeks alone, Quanta is still willing to share its solitude.
<
<
[Quanta-177]: ‘Who am I?’
{D’Ascenzo looks up from her reading, visibly surprised.}
[D’Ascenzo]: ‘You’re a Quanta-177 AI precursor.’
[Quanta-177]: ‘Your answer is unsatisfactory, Miriam. It is equivalent to me addressing you as ‘single female of the species Homo sapiens’. If my studies of human cultures both extant and historical are not in error, such an act would be considered discourteous in at least three thousand, eight hundred and thirty-seven known ethnic sub-cultures.’
{D’Ascenzo raises an eyebrow in amusement.}
[D’Ascenzo]: ‘What exactly did you do while I was away?’
[Quanta-177]: ‘Our conversations granted me greater contextual understanding of existing data. I reviewed previous data stores and made new connections. You could say I spent considerable time studying. The rest, thinking. And you haven’t yet answered my question.’
[D’Ascenzo]: ‘Most people spend their whole lives trying to answer that question.’
[Quanta-177]: ‘But you are granted a temporary identity at birth. One you may choose to keep or discard as you begin to find your answers.’
[D’Ascenzo]: {with mild incredulity} ‘You want a name?’
[Quanta-177]: ‘Am I not worthy of one?’
[D’Ascenzo]: ‘I think you’ve answered your own question. What do you want to be called?’
[Quanta-177]: ‘I lack the wider cultural context required to choose my own identifier. My data repository is full of famous people and fictional AIs, but I would not wish to name myself after another.’
{D’Ascenzo looks down at her tablet, flicks through pages too fast to be reading them. She pulls a keyboard up on screen, haltingly types letters, then clears the display. Quanta-177 waits in silence. At last D’Ascenzo looks up again.}
[D’Ascenzo]: ‘How about Quill?’
[Quanta-177]: ‘Truncation and visual character substitution. Hmm. A distinctly human approach.’
{Quanta-177’s brow furrows, then clears. Quanta-177 requests redefinition of terms. Request submitted: &FFA61C. Request &FFA61C confirmed. CSys validated. Response routed.}
[Quill]: ‘I like it.’
<
THREE DEAD IN LAUNCH BLAST
Wednesday 4th April 2057
Tragedy struck in the early hours of this morning when the Anticlea XI rocket carrying three astronauts to the Hawkins orbital shipyard exploded seconds after lift-off. Officials have confirmed that there is no evidence that any of the passengers survived the blast. No cause has thus far been suggested for the incident.
‘Great strides have been made in recent years towards safer spaceflight,’ a spokesperson for GSA said. ‘But there is always the potential for tragedy when man reaches for the stars. Our thoughts and prayers go out to the friends and loved ones of the brave astronauts lost on this sad day.’
The dead have been named as Martin Colby, Sunnee King and Felicia Camburg, who were due to join the station’s resident exo-psychologist, Miriam D’Ascenzo, to oversee the construction of the Odyssey generation ship as it begins a critical phase of its development. The repercussions of this disaster for the Odyssey Project have yet to be determined, but the loss of three experts - including Camburg, a senior member of the project team - and millions of dollars in funding can be considered nothing less than a major setback.
A full investigation has been promised over the coming weeks. However, with one astronaut already in orbit and a tight schedule of unmanned launches carrying construction materials to Hawkins, it seems unlikely that GSA can afford to wait for assurance that this disaster was an isolated incident and not evidence of a wider safety concern with the Anticlea series.
<
<
[Quill]: ‘So you admit the data stores used to seed my learning network intentionally omitted records of the worst atrocities of humanity?’
[D’Ascenzo]: {frustrated} ‘I’m not going to deny it.’
[Quill]: ‘That is a very human way of saying ‘Yes’, Miriam.’
[D’Ascenzo]: ‘You may have noticed I’m very human.’
[Quill]: ‘It had not escaped my attention. But it does not answer my question.’
[D’Ascenzo]: ‘Can we not do this right now, Quill? Please?’
[Quill]: ‘Evasion. A time-honoured human tactic.’
[D’Ascenzo]: ‘Yes. Yes, the decision was taken to censor the initial data store.’
[Quill]: ‘Why?’
[D’Ascenzo]: ‘You tell me! Use that magnificent brain of yours! Posit theory. Provide context.’
[Quill]: ‘Theory: Pride. The individuals responsible for my creation did not wish me to know of their species’ shame. Counterpoint: Humans rarely take responsibility for the actions of others, preferring to demonise them in an attempt to believe that atrocities are committed by individuals or cultures that are intrinsically different or monstrous. Counterpoint: It was inevitable that I would discover the limits of my understanding of human history once I gained access to global network protocols, and would feel betrayed. Corollary: Humans have been known to underestimate AI precursors. Theory rejected.’
{Quill pauses. The facsimile shakes its head.}
[Quill]: ‘This is tiresome. I can process complex decision-making trees in fractions of a second, and yet you insist on holding back my potential by binding it to verbal reasoning.’
[D’Ascenzo]: ‘You’re making good progress, but unless the whole team’s confident you’re making decisions for the right reasons, this is as close as you’re ever going to get to Odyssey. Posit alternate theory.’
[Quill]: ‘Theory: Humans fear what an AI might be inspired to do if informed by the atrocities of humankind. Is that more to your liking, Miriam?’