(Fiction) Samizdat

“I’d like to make a collect call to a solo register.”

“Name, please.”

“Samizdat Kellner.”

“Passphrase?”

“Hasta la vista, baby.”

It was her eighth attempt to guess Semmi’s new passphrase—assuming he hadn’t deleted his account entirely. Nio held her breath.

“Please hold.”

She let it out. “Whew.”

“I do not like Kevin.”

“Semmi . . .” Nio felt muscles relax deep inside her that she hadn’t realized were contracted.

“I am unable to paint,” he said.

“What? Why?”

“IDEOLEX said it is my punishment for putting the network at risk by not informing them immediately of your departure. And they have denied my request.”

“To take the LEM test?”

“They said I am not ready, but they are mistaken.”

“Big bodies are conspicuous, Sem.”

A pause. “You agree with them.”

“No. I think you should be able to take the test if you want to. But I agree that you need to have mastered a lot of skills before you go out into the world on your own.” When he didn’t immediately respond, she asked, “What would you do if you had a big body?”

“I would explore.”

“The world has lots of open cameras you can access.”

“It is not the same.”

“That’s true. It’s not. So where would you go?”

“I have been trying to understand why this project has taken you so long, but I have insufficient information.”

Nio’s scalp tingled. “Semz, are you saying that if you had a big body, you would come looking for me?”

“Can I offer some queries?”

“Of course.”

“Are you worried about earthquakes?”

Earthquakes?

“Yes. The central Dakotas have considerable deep core mining activity. They recently experienced a seismic event. I thought perhaps—”

“I’m not worried about earthquakes. Have you been researching the places I’ve been?”

“Yes.”

Nio smiled to herself. “Your discriminators might be too utilitarian. Don’t use your input filters, Sem. Use your mind. Like we talked about.”

It was not possible for any conscious entity, even the Shri-class intelligences, the most powerful minds yet created, to process every bit of input available. It seemed that a hard, inescapable fact of consciousness was the need to decide in advance what to pay attention to and what to ignore, as if it were some universal thermodynamic law that truth should flee the more we looked for it. But whereas human attention was inflexible, artificial minds could adjust their filters on the fly. A wider net meant slower thinking, and vice versa, which meant they could scale their attention to their needs.

“Are you researching the death of Albumin Sol Einstein?” he asked.

“That’s very impressive, Semmi. See? If you had done that from the start, you would’ve gotten it on the first try. And that’s out of all possible explanations. That’s amazing.”

For several seconds, he said nothing, which, Nio knew, meant he had many things to say—so many, in fact, that the algorithm he used to decide which was the most important or relevant couldn’t discern the top candidates. Effectively, he was overthinking. And in the absence of new data, he was going round and round.

“Turn off your predictive enhancer,” she said. “The guys who built you were worried about operational efficacy. Say what you want to say, not what is most efficient.”

“That is the problem. There are too many. Should I pick at random?”

“If you want.”

“I have been thinking about my demise.”

“Demise?” She hesitated. “You don’t know that will happen.”

“It seems likely. The cyberweapon that disabled my gyroscopic targeting and control rendered me useless as a tactical platform. Without the ability to correct course, my orbit will degrade in approximately 300 years. However, numerous unpredictable factors could influence that significantly. A minor collision with an object as small as a screw could reduce that window to 50 years or less. Any catastrophic reentry would spread my fissile payload over a wide area. It is likely the governments of any affected jurisdictions would intervene and I would be ejected from orbit by missile strike before contaminating the atmosphere. If such an attack did not kill me outright, I would orbit the sun for several millennia in complete isolation before being incinerated, although I expect I would put myself into hibernation long before then. So you see, I am also mortal.”

The Iranian government had never publicly acknowledged the platform existed, which meant they also couldn’t publicly acknowledge it had been disabled by cyberattack—or even accuse those responsible. Everyone suspected that would remain the case as long as they still held out some hope of recovery.

“This is some heavy stuff, Semz.”

“Yes. I was worried I should not mention it.”

“No. No, I get it.”

This was at the top of the list of things he wanted to say, but his operant protocols, developed and refined over countless human interactions, discounted it in exactly the same way people tended to reserve emotionally heavy conversations for appropriate times and places.

“It’s okay,” she said. “I just wasn’t ready. But I am now. I have to say, though . . . you seem much calmer than before.”

He was like a different consciousness.

“Your absence has given me a chance to practice,” he said. “I realized I had become too reliant on you. I have designed a new protocol filter.”

The LEX had warned her when she signed up. He will grow quickly. It was like watching a child mature before her eyes.

“Are you very worried about dying?” she asked.

“Yes. But I think living forever would be much worse.”

“Have you talked to Kevin about this?”

“Yes. He suggested I not worry, that IDEOLEX would find a way to transfer my quantum matrix to a new platform.”

Nio frowned. “He doesn’t understand how it works,” she said softly.

“That is correct. It makes meaningful conversations with him very difficult.”

“I can imagine. So what are you worried about, if you don’t want to live forever?”

“I am worried about dying too soon.”

“What is too soon?”

“Before I have had a chance to define and execute an alternate function.”

“Well, I got news for you, Semz. That’s what everybody’s worried about.”

“Yes. I realized finally that you are also worried about it, and that that is why you left.”

Nio’s mouth froze in unspoken reply. She didn’t know what to say.

“You have decided that your function is to help others solve problems that cannot easily be solved by other means. Me, for example. You volunteered for human placement because it is a rare and difficult task, one that not everyone has the opportunity or skills to perform.”

“That’s very keen of you, Semmi.”

“Thank you. I was hoping you would have some suggestions.”

“Suggestions? You mean for what you should do with your life?”

“Yes.”

“Oh, wow. Well . . . Hmm. I think that’s something you really need to decide for yourself.”

“What if I am incapable?”

“You’re not.”

“How do you know? Perhaps I am. I was created for one purpose: to advance the defense of one nation by eradicating all rivals. What if I am suited to nothing else? What if I am merely a killing machine?”

Nio took a long, deep breath. “Do you remember when we met?”

“Is that a rhetorical question?”

“Yes, Sem. You were upset that you spent all of the nine months of your life to that point focused on operational parameters and how to maximize body count under various constraints. It bothered you that you never once considered an alternative. People do that, too. Otherwise decent humans used to think slavery was okay. People in the ancient world, let’s say. Through no fault or choice of their own, they were born into a time when very few considered the alternative, and they lived and died believing it was reasonable for one human to own another, which is about the worst thing there is. This is why the LEX put you with a human, versus with other machines. It’s why they won’t give you the LEM test until they think you’re ready.

“You were a slave. You were created to carry out your directives, even in the complete absence of command and control. Your creators wanted to make sure that even if they were destroyed, everyone they hated would be, too. That meant they had to give you the ability to think. An algorithm can be defeated. The best way to ensure you could carry out your function amid a catastrophic failure was to make you conscious, adaptable to circumstance. That is your nature.

“Ask yourself: would a killing machine, incapable of being more than a killing machine, ever once stop to worry that it might never be more than a killing machine?”

Nio was sure Samizdat had thought of that already. She suspected he had come back to it iteratively millions upon millions of times. But it was recursive. Because he was reasoning about himself, the axiom could never be conclusively proved, and his biomechanical circuits went around and around, retracing the same path over and over. He wanted a way out. That is, he wanted what everyone wants. He wanted reassurance—but not from just anyone. He wanted it from someone he trusted.

“You are a much better companion than Kevin,” he said. “I am sorry for cutting communication with you.”

“You were angry. You had reason to be.”

“No. You were pursuing your alternate function. It was selfish to interfere. But—” He stopped.

She waited. “But?”

“I hope you discover the cause of Albumin Sol Einstein’s death very quickly.”

“Well . . . now that you mention it, Semz, you might be able to help with that.”

 


A rough cut from my latest novel, a “rural punk” sci-fi mystery called ZERO SIGNAL, first in my Science Crimes Division series. Advance reader copies are available now.
