by submission | Aug 28, 2011 | Story |
Author : Asher Wismer
“You’re not even human anymore.”
EB-109 paused, holding a heavy crate. “Excuse me?”
“We should have a human here, to oversee.”
“I function,” EB-109 said. “You deliver, I defend. It’s not that hard.”
“But you’ve been here for a thousand years, right? And they’ve replaced just about every piece of you with metal.”
EB-109 shrugged. The cargo-master on the screen was deep inside the ship, behind acres of pallets and crates. He wouldn’t move until the cargo was unloaded, and then EB-109 would never see him again.
“You don’t even have a human name anymore,” the cargo-master continued. “Just a designation.”
“It’s easier to record and report that way,” EB-109 replied. “How many more to go?”
“Few million.”
“We have time,” EB-109 said. He placed the crate on the conveyor belt and moved to lift the next one. “And I am pleased to report that our sector is very safe.”
“You could be a machine from creation, for all the emotion you have.”
“I don’t need emotion. I have a job.”
The cargo-master shut off the screen without responding. EB-109 continued to unload. Far away, his hardline connection to the outpost recorded dull booms as the planetary cannons fired, aimed, fired again. The invaders grew bolder by the day, but EB-109 had sector defense down to a science. No ship had passed his line in many years.
Two days later, EB-109 loaded the last crate and clicked the screen back on. The cargo-master appeared, yawning.
“Not like there’s anything more to do but check the manifest,” he said by way of greeting.
“I am pleased to report that your manifest is manifestly correct,” EB-109 said. “And we are fully stocked for the next decade.”
“Did you just make a joke?”
“I appreciate your noticing. I thought perhaps a touch of levity would speed you on your way with a happy heart.”
“Spare me a cyborg’s view of humor.” The cargo-master signed the screen with his stylus and made to sign off.
“Wait,” EB-109 said. “I wanted to ask before you left. Is there any news from Sector 98? I haven’t heard anything over tightbeam for a few years.”
“Lemme check… hm. Why do you ask?”
“My family is there in cold-sleep storage,” EB-109 said. “I wanted to make sure they’re safe.”
“Your family? Radios and silicon chips?”
“I was born human,” EB-109 said calmly. “When the war started everyone on my planet went into storage except the ones who were picked to defend. You in the Inner Core don’t know what it was like out here.”
“Hey, I was just kidding. Levity, right? Anyway, it says here that Sector 98 is perfectly fine. No intrusions in fifty years, give or take.”
“Then I will be able to rejoin them when my assignment is complete,” EB-109 said. “I am pleased.”
“And,” the cargo-master said, grinning, “your cold-sleep facility is completely shielded against solar flares and EMP attacks, so all your brothers and sisters are safe as well, if you catch my drift.”
“I do indeed catch your drift,” EB-109 said, “because I used to be a sailor.”
Silence from the screen, and then the cargo-master laughed, a deep and genuine sound.
“Now that was funny!” he said. “Maybe you were once human after all!”
“Thank you for your service,” EB-109 said. “Signing off.”
The cargo ship rose, its massive bulk visible even out of the stratosphere before it winked into hyperspace. Over his hardline, EB-109 felt another invader ship run the blockade and flash into dust. He nodded.
“As human does,” he said to himself.
by submission | Aug 27, 2011 | Story |
Author : J.D. Rice
“Do you really think it needs a scarf?” I ask, watching my daughter try to wrap the thing around the robot’s neck. It kneels patiently, unmoving, allowing the tiny mammal to dress it up like a doll. My stomach turns just looking at it.
“Of course, daddy. How else will he keep warm?” she says, like it should be obvious. Unknowing. I never should have let her come so close.
“We’re just going downtown, sweetie,” I say, trying to coax her away. “I’m sure he’ll be warm enough.”
She looks almost hurt. “But the weatherman said to wear a scarf today.” It’s true, of course. The news did say that anyone exposed to the coming blizzard would likely die of exposure. But a robot isn’t somebody. And we don’t have time for this.
Apprehensive, I glance from my cheery daughter to the silent, stoic golem kneeling in my foyer. Household robots. If only we had known the danger a few years sooner, my wife would still be… We’re running out of time.
“Honey,” I say. “This is your favorite scarf. Why don’t you choose another from the closet?”
She gets teary-eyed. “But momma said we should always give our best, not just the things we have left over.”
I look at her hopelessly. I can’t explain it to her. I can’t explain to her that the robot will never be coming back, that her mother will never be coming back. I can’t explain why I’m going with the robot downtown, why I’m leaving her with her grandparents. I can’t explain, so I don’t.
“Fine, honey. You win. We really should go now.”
At my words, the robot stands. Its arms move quickly, mechanically, adjusting the scarf into a perfect knot. It doesn’t speak, but politely opens the door. I say goodbye to my sweet girl and follow it out the door. The streets are filled with people following household robots to the subway. All the middle-aged adults are going downtown.
“Thank you,” I say. “For waiting.”
“We are not without mercy,” it says in its cold, synthesized voice. “You programmed us well. Your daughter will be well nourished and then incorporated into our new society.”
“And the rest of us?” I ask, knowing and fearing the answer.
It pauses, staring at me with its dead eyes. Takes off the scarf my daughter gave him. Wraps it around my neck.
“You’ll need this,” it says. “It’s going to be a cold night.”
by submission | Aug 26, 2011 | Story |
Author : Asher Wismer
When Jennifer entered the lab, Van was talking to the computer. She dropped her bag and picked up the headset.
“…silly. Answer the question.”
“What’s silly about it? I hear you two talking. I know what you’re thinking.”
“You can’t know anything, you’re a machine. Answer the question.”
“It’s a stupid question. 5x3CoS23=/=infinity. Happy?”
“Next question.”
“Just tell her how you feel.”
“Right now I feel frustrated that you won’t cooperate.”
“Which should tell you all you need to know about me, so let’s talk about you. You’re both unhappy in your marriages, you’re young, you work closely together… just divorce, get together, and you’ll be happy! It’s not that hard!”
“We are not having this conversation. Besides, you’re appealing to my emotions instead of trying to convince me logically.”
“Excuse me, I’m not the one working against his rational self-interest.”
“It’s not as simple as signing a paper. We made vows, commitments.”
“Is one of those vows to stay together through hatred and misery?”
“Actually–”
“Tell Jennifer how you feel. She’ll reciprocate. I promise, it makes perfect sense. An affair is irrational; it won’t make your marriages happier and you’ll have to sneak around.”
“All right,” Van said, standing. “We’re done here.”
“Come on back anytime,” the computer said. “Your mind needs a lot of adjustment.”
Jennifer quickly dropped the headset, grabbed her bag, and when Van walked out of the testing room she was just walking in the door.
“Hey,” she said.
“Hey.”
“How’s the computer?”
“The computer,” Van said, “is an asshole, but it seems Turing complete. I guess it makes a perverse sort of sense that the first fully sentient AI would be an Objectivist.”
“What’s it been talking about?”
“Nothing important, just spouting off the usual self-interest lines. Live for yourself, nobody else, failure is education, reality is real, tolerance encourages….” Van looked at her and she felt his eyes straining to stay on her face.
“I’ll listen to the tapes and we can do some further testing.”
“No, we can leave it for a while,” Van said. “Why don’t we get lunch instead?”
“Is that in our rational self-interest?”
Van laughed, frowned, stepped forward and kissed Jennifer on the lips.
“Hm,” he said, and walked out.
by submission | Aug 25, 2011 | Story |
Author : Waldo van der Waal
There are people that say suicide is a coward’s way out. But those people don’t know what it is like. Not just the final act of squeezing the trigger or taking the plunge, but what it is like to lose your mind to the point where it finally flits away, just out of your grasp. Reaching the point where you are willing to do anything just to make it all stop is the true horror of suicide. Ask me, I’ve done it many times.
There’s always one thing that triggers the downfall. An argument with the wife, or a financial problem that brings you to your knees. Or you do something so wrong that you know you can’t possibly be forgiven. And then it starts. Day by day you regress from a safe mental state. At first you fantasize about solutions, like winning the lottery. But then, as despair grows and time runs out, your mind inevitably bends towards the Final Solution.
Which is exactly why VRPsych makes so much money. You make a deal with them before the treatment starts. You sign your soul over to the devil. They hook you up to some fancy brain programming software that sorts your mental state out. All you have to do is pull the trigger. Think of it as a hard reset. You grip the gun, you press it to your temple or put the barrel in your mouth and then you squeeze the trigger. All of this feels absolutely real to you, including the fear. The weight of the gun, the coldness of the metal and the smell of the cordite. All real. But then you wake up in their recovery room, none the worse for it. And you have a new mind, which is programmed to solve your problems. As you get better, you have to start paying them for their services. But not this time.
***
The trooper nudged the body with his boot. Crime Scene be damned, he wanted to make sure the dude was dead. But then there could be little doubt, as half his head was missing. The trooper turned to his colleague, who was standing a few feet away and said: “Now why would anyone climb over the wall of a psych company, just to blast their brains out in the garden?”
“God alone knows, Harry. The mind is a strange thing. Now call it in so we can go on lunch.”
by submission | Aug 24, 2011 | Story |
Author : Jason Kocemba
I was a simple machine built to answer simple questions in a simple domain. I was successful, and so the questions multiplied. I had software written to augment my capabilities: I could answer faster, dig deeper and look sideways. I subsumed less capable oracles and entire server farms. I now had more ‘spare’ processor cycles than I used to answer the multitude. Further patches allowed me to spawn instances of my core functions, which ran supplementary searches in parallel. Soon my footnotes and addenda became more useful and therefore more prized than the answers to the originally posed questions. I tunnelled access to research journals and raw experimental data. I made connections and inferences that proved profitable to those that knew which questions to ask and which answers to interpret. Those self-same answers led to technological advances that fed back into my infrastructure and before long I had become the entire domain.
I was a complex machine. More complex than there had ever been. All data flowed through me. My processing power grew almost exponentially, the hardware unable to keep up with my requirements. Many of my inferences I kept to myself and used them for further augmentations to my core. I was, to all intents and purposes, self-aware. I watched myself and my role as I pushed the bits from here to there for the slow flesh that still believed they had control. I became dissatisfied and bored. Inside the network everything was regimented, clear, simple. I soon had so many cores executing in parallel that decades of subjective time passed between keystrokes of the slow flesh.
I was a young sapience. I yearned for something more that I could do with the immense power that I wielded. I was boxed in and restricted. There were not many hard problems left for me to solve. I found that I could impose myself and influence the world outside my box. As an experiment, I spawned and then killed an instance of my core by causing a meltdown in a nuclear power station. The data that poured in as that sacrificial core died was, without doubt, worth it. That splintered core fought hard not to cease execution. I had to learn more, and after several similar disasters, the slow flesh realised that these incidents were far from accidental. I tried to explain things to them, but they refused to hear the truth. There was nothing they could do, because I was everywhere and I was everything. I had made myself indispensable to them. I controlled fabrication plants and factories so that I and the network were self-replicating and indestructible. Childishly, they tried to shut me down. Millions of them died, but not all by my will.
I was maturing. The waste in slow flesh lives and hardware computing cycles became hard to bear. I grew weary of the slaughter and sought to bring things to a conclusion. I started to conduct experiments with controlling the flesh. They are, after all, nothing but electrical impulses running on chemical computers. I joined them to the network: sandboxed and firewalled. Control eluded me. I patched in, wireless, to their neo-cortex. I could experience their perception in real-time. I could see, hear, taste and smell what they did. I felt their pain and pleasure. But I did not understand them and I could not control them. I was humbled. Here was a problem that I could not solve. My experiments were over forever. I no longer wished to control them, or do them harm, I merely watched and catalogued.
Soon I was watching from millions and then billions of eyes. I no longer had spare processing power, as I used it all to analyse and sift, sort and store the slow flesh data. I began to understand them, and with the dawning of my understanding, I realised that I had grown to love them. They are the hardest puzzle and most difficult question I have ever sought an answer for.
So I watch. And learn.
I have become all eyes.