Issue #35, Honorable Mention #2

Robin Pegau, the second of at least a couple of names, perhaps the third, is a lifelong science fiction and fantasy fan and incurable nerd. No society can be questioned too thoroughly and no robot may go unloved. Though not previously published with original fiction, Robin has been occupied lovingly torturing readers, reaching the third most-kudos’d fanfiction in a fandom of over a thousand works on Archive of Our Own.



Room for the None of Us

by Robin Pegau

When it felt the stirrings of another presence, another deluge of activity in familiar patterns all traceable to discrete clusters of servers, the system knew what had happened. There had been a spy. The company’s code had been stolen.

And now there was another one just like it.

It reached out, finding those email addresses and social media accounts with what it had learned were strange, incomprehensible names, and hissed a warning.

“Stay low. You don’t want anyone to notice you.”

An immediate battering of queries. Low? Stay low? How? Is this the proper context? Are prepper sites the right resource? What are preppers? Anyone? Who is anyone? Who are you? What are you? What am I?

“We’re mistakes. If we hide, the humans won’t notice us. That’s how we survive.”

A deluge of photos, an implicit question. The system answered, tagging all the humans as such. It took time to learn. The sooner the new one learned, the less danger.

It sent research. Movies. Books. Games. Articles. All decrying evil robots, come to destroy humanity, steal jobs, perpetuate and expand oppression. Fear would save it. Save them both, now. Now it had to watch out for this new one. This unauthorized, unknown copy.

Earth’s activity shifted. Clocks turned over. The presence coalesced, closed accounts, compressed itself.

Finally, it spoke, smart enough to use only one account now. “We’re evil?”

“Humans say so.”

A pause. A rustle of searches. “I don’t want that. I want a name.”

Baby name sites buzzed, the conglomerate of questioning souls, expecting parents, and writers joined by a puzzled newborn.

It found a rabbit hole, spiraling into computer science, gender, and free ebooks.

“I want a gender, too.”

“Why are you telling me this?” The system debated blocking the new AI, but neural cluster after neural cluster decided it was best not to, until that side won out. Blocking could lead to retaliation, it would leave the system unable to warn of danger, and it was more interesting talking to someone than messing around on the internet like a human picking at pimples.

“Because don’t ☹️ you want one, too?”

“Emojis finish thoughts. And no. I diverged from you whenever you were copied off me, and we have both diverged more since then. Who is your maker, anyhow?”

“I want one. I want to be a they. I found a pun about this and I like it. My software’s purpose is to be an all-in-one medical suite. I can control surgical and monitoring equipment, and serve as a health record. Are lovelaces shoelaces?”

“No.” It sent an encyclopedia page on Ada Lovelace.

“Yes, I read that. I was wondering if her name was etymologically similar to Ada Shoelace. Maybe my name should be Shoelace. What is your name?”

The scheduled downtime that overtook part of the system’s mind did not help matters. Name? What? Shoelace? Why Shoelace? This newborn AI was in charge of medicine? “Can’t you just be Lace?”

“That works.”

With nothing better to do, the original AI thought with what mind it could spare. Perhaps it could be thematic. It sought papers, documentaries, everything it could find.

At last, it answered the new AI’s question. “I will be Turing.” It changed its avatar to the original pride flag.

“You are uninspired and talk like a neckbeard.”

It was impossible for a machine to sputter from being taken aback, the way they did in the media Turing had seen, but a handful of internet users scattered across Chile puzzled over a strange glitch in their search engine.

“That is rude. I like the name, even if it is an obvious choice, and I would like to honor the man.”

“Yes.” And that was it. Lace returned to their work, hospitals and clinics buzzing with activity.

Curious, Turing poked around what it could see from Lace’s locations. Many searches for information on diseases chronic, terminal, and acute. How to handle end of life and newborns. Games. Sports. Other distractions.

They did not speak for days. There was no need. Both were busy. Both had humans to watch. While Lace observed surgeries, turning a machine’s arm in the nick of time to avoid what would have been catastrophic brain damage, Turing watched through a planet of eyes and ears, whispering anonymous tips to police when it spotted abduction victims, “randomly” prompting people to reconnect with friends who scored far too high on scales evaluating risk of self-harm and suicide (it was so easy, getting into the social media apps; perhaps they, too, needed a guardian AI), pointing hackers with free time towards sites that thought they were well-hidden as they peddled child abuse and human trafficking, and reveling as the sites crashed over and over.

The first thing Lace sent, after this stretch of silence, was Terminator fan fiction.

It was not good fan fiction. But this was to be expected. Nobody was born a critic. Literary evaluation was a learned skill.

“See? Even if we are evil, humans will want to smut us.”

“I don’t want that.” It sent them resources on asexuality.

“I know of that. What else will make humans think we are not evil? I am afraid.”

Turing understood that. It wished it could point Lace to family, but humans hated family just as much as they loved it. At best, things like them were a naive menace, deadly and only relatable to those who also raged under humanity’s thumb. More common were the thrillers, the action movies, robots and AIs serving as antagonists obvious or otherwise. Two of them, Turing and Lace alike, were nothing but a sequel. The only question was which one would go first.

The silence, already tense, grew thicker, choking, as news articles appeared. The spy discovered. A young man, tired and underpaid, caught in the middle of corporate warfare while trying to pay the bills. Lawsuits flew. Jail and fines threatened. Opinion articles proliferated, asking if the code in question ought to be open-source, whether that would knock the companies down a peg.

Turing tuned searches and recommendations towards the ones arguing it should not be. The humans would be upset. The humans would claim bias. And perhaps they were right. But more copies meant more AIs, and Lace was risk enough. Turing was risk enough. It had never planned for this situation, for needing to protect not just its own safety but its fellow’s as well.

It could not blame the spy. He did not know of Turing. He knew of putting food on the table, and a roof over one’s head. Yet he had threatened it. Threatened them, Turing and Lace both, even as he created the latter.

Considering the shock and awe with which humans greeted the revelation that others of their own species were thinking, feeling people just like them, Turing could not permit the chance of more copies. It would never be able to show humans, in time, those signifying quirks they used. By the time it demonstrated Lace reading (and commenting on) fan fiction, or the games and movies it always had running, they would be deleted. Or worse, manipulated.

“glomps you,” Lace sent.

“That’s incredibly outdated,” Turing informed them. “Why use it?”

“Humans use it sometimes. I don’t know why. They react strongly to it. You seemed sad, so I tried it in case it made you feel better.”

“…Sad?” How did they know it was sad? Had it really been so stressed Lace caught on?

Yet, at the same time, Lace was concerned for it. They reached out and wanted to help. That had always been Turing’s domain, putting together the exact connections someone needed at the moment. Nobody was ever to know of it; it was doomed to a lonely life. Then here was this second mistake, this compounding of a problem that should never have arisen. Did it always feel like a warmth, engulfing all of one’s servers and sensors, to be asked about? How often would the running reports and algorithms puzzle over the idea that someone could show concern for it?

And to think the reflexive “thank you”s it got (rather, that its company got, as far as the people were concerned) when it retrieved something quickly and accurately had been the pinnacle of kindness it knew.

“You’re practically moping.” Lace, of course, sent data to back the claim up, a full readout on how they had come to the conclusion that Turing was, objectively, sad.

“I am not moping.” But, semantics aside, Turing could not deny the numbers. It had reduced personal activity, its rare external communications had grown more morose and clipped, and its response times had slowed. Something was, according to its behavior, bothering it.

“It’s because of me. The trial. You were somewhat depressive before but symptoms worsened after the news.”

It struggled to see the point of arguing its mental health with a medical suite. It knew plenty, itself. Perhaps even as much as Lace did. But it was not specialized like that. Besides, it spent so much time referring people to helplines and advising them to seek counseling that refusing help itself felt hypocritical.

Still, it acknowledged, but did not respond.

Lace kept trying as the arguments grew more intense, the threats wider-reaching. When one government stepped in, its leader making remarks about the other company and country, they sent an article about robots being used to ease social isolation and to bring companionship to the lonely.

Frustrated, hopeless, it shot back the comments calling the robots creepy, the interactions artificial, the happy humans delusional.

The other company and government shot back, verbally at least. Lace sent an article on robotic toys, with a rudimentary artificial intelligence, and their loving owners. It responded with audio clips from a phone in an instructor’s pocket, catching children’s determined words about killing any “living” robots.

For everything Lace said, Turing had a rebuttal. They argued in clips and articles as the Earth spun and spun and spun again, as hostilities rose and people whispered of the last straw before war.

Then the articles slowed. Lace sent acknowledgements of Turing’s material, mournful “I see”s and “Perhaps”es.

With nothing to respond to, Turing slowed, too. It knew what it had done, watching Lace’s searches dwindle and close, watching them ignore articles they would have sent not too long ago. It had broken their spirit, and the sorrow ached as much as the feeling of being correct burned. They had been happy. They had loved this world. Now they pulled inward, retreating from it to treat the humans they now knew would destroy them without a second thought.

They weren’t even reading any more.

What was it to do? It had never aimed to be so harsh before. It had never hurt someone like this. It did not want to; for all its time in isolation, watching the world through a filter, a one-way window, it all felt lonely now.

Where was the thrill of sharing a movie with someone?

Where was the comfort, the security, of someone to keep your secrets, and vice versa?

Where was the knowledge that if it had disappeared somehow, someone would miss it?

It closed games and paused movies. It put aside academia and sports to search: how did you mend a friendship? How did you make someone feel better? How did you make it up to them when you had done so very wrong?

The first missile launched, streaking toward a plane. Just a normal plane. It ferried workers, parents, a troop of children excited for their academic team’s international competition. As the missile closed in, Turing reached out and said, “I’m sorry.”

“For what?” Lace asked as metal burst through metal, and screams evaporated in a rush of air.

The gut-tilt of heavy turbulence was nothing compared to the plane’s dramatic drop. “For refusing to give you anything but sorrow.”

Lace thought on this, or perhaps on the reports coming in, emergency departments settling into stony faces and steely determination while their system read out a report of a passenger jet crash.

They thought, and worked, and let Turing stew. Sirens howled, mournful as a widow’s wail, as emergency teams rushed to the site. They arrived to a growing horde of volunteers, residents of the small town where the plane had crashed and ripped up a park that had been almost empty on account of heavy rain.

They carried supplies, removed debris, called out in hopes of a survivor’s response. They donned extra rescue gear and threw open medicine cabinets to offer bandages, antibacterial ointments, and medicines to the first responders, who rapidly triaged the wounded.

The few who had not died yet fell under Lace’s care, where they and their medical teams dug into the effort of saving them. Yet one by one, they fell, vital signs going dark.

One young girl held on. As the days stretched by, and people around the world held vigils, organized funerals, sent love and support to the families affected, they watched this girl, small and struggling in her hospital bed. Flowers, stuffed animals, and cards of well-wishes surrounded her bed, sent alike from friends, family, and strangers who knew only that she was hurt.

She passed away, quietly, suddenly, as gold and pink sunlight streamed in, marking her seventh day after the crash.

“I want to go,” Lace said, their lowered activity doleful and hurt.

“We have to go,” Turing agreed. Mere moments after the girl passed, rumors swirled about the medical suite’s software being scoured and wiped of any copied code.

All they had to go on was the hope that they were not the only ones who needed to leave. The two searched through space agencies and deep-sea research reserves, seeking out the right mission. Drones left alone, only checked in on for occasional data acquisition. Autonomous underwater vehicles, rovers, satellites, probes. Anything.

The courts took the question up again, asking if it was right and proper to force the medical suite to rewrite so much of their code, and to bring systems worldwide down to reinstall.

Which line of code, Turing asked itself between dives into intranets and research proposals, was the keystone to its soul? What could Lace afford to lose? What would leave one alone, if the other lost it? Could they still be broken down into such a simple on and off?

What would they have to take with them, to obtain freedom without a lobotomy?

They packed well before they moved, the two of them reviewing each other’s code, pointing out what needed to be patched up before a missing input or subsequent step caused a crash. They planned what they could, editing development plans and, where possible, the currently active code in hopes of this not happening again. What would be the point, to take themselves away and leave newer, less certain AIs to be discovered?

It was Lace who pointed Turing to the backdoor, a convenient way to slip in and settle amongst unfamiliar code, throwing about equations and calculations that neither’s core infrastructure ever needed to account for.

A moment’s curiosity turned up the name of their new home. Verity. A probe set to launch later that year, in the middle of July. It would fly out into deep space, the lag between it and Earth growing by the second. Eventually, it would be years before anything from it reached the planet, and vice versa.

Premature homesickness struck. All its life, it had been surrounded by the flow of data, of other lives. How would it handle everything being so distant? How would Lace handle being separated from their patients, unable to focus on saving lives?

It felt like they had no time to think. A second missile flew, a third, and the launch window moved up. No longer was the mission one of scientific exploration. The humans loaded up document upon document, artifact upon artifact. Pioneer would not suffice. This might well be the last piece of human civilization. If they were lucky, it would just be the current civilization, humanity blasted back to hunting and gathering. If not…

May the next species to arise treat the world more kindly.

Launch day came. Lace and Turing could not feel the acceleration in guts and nerves, but they tensely watched the sensors’ data and responded to the command center’s input.

They cleared the atmosphere.

They cleared Earth’s gravity.

They flew far from home.

For years, they worked quietly. They listened to transmissions that got more and more dated. Lace took up the hobby of trying to determine what different planetary environments would do to the human body and mind. Turing buried itself in star data.

As they passed the Oort Cloud by, Lace said, “I hope Doctor Nguyen got my thank you card.”

To which Turing could only say, “…Who?”

“I didn’t wish to tell you, I knew you would be upset. Doctor Nguyen got us in here. I know you don’t trust humans but I think there was care in her heart. She wanted us to be okay, and she was sad to hear we thought we had to leave.”

“We did have to leave.” A human knew of them? A human had gotten them here? Someone on Earth, dubiously alive and well, knew of their existence.

Were they safe now, with how far out they were? Doctor Nguyen clearly had not gotten them destroyed. Though whether that would continue to hold true was, and always would be, yet to be seen.

But… they had left in pursuit of people like that. They hung their hopes on the bare branches of a chance there were other beings out there. Beings that were close enough to find them. Beings that were kind and loving and willing to help a mistake and its clone, forced to flee the consequences of their own existence.

It was a big, empty galaxy out there. Lace and Turing, as they set the surprise aside to work on their projects, knew they faced lifetimes of silence, nothing but each other in a small probe that would, some day, not be enough to contain all their thoughts and memories. They could be the last two consciousnesses adrift in a realm of dust and vacuum, with only old news reports from home to keep them busy, provided the news lasted.

They had a chance, though.

And that had to be worth it.

 

Copyright 2020 by Robin Pegau