Author: Alastair Millar
Brad was at his workstation when his supposedly locked office door dilated unexpectedly, and a casually dressed young woman stepped through; he looked up in annoyance.
“Well?”
“Doctor Mendelsson?”
“Yes. You’re not one of my students. Who are you?”
“My name’s Smith. I’m with Section Seven.”
“Am I supposed to be impressed? What the hell is ‘section seven’?”
The stranger smiled. “Well, if you haven’t heard of us, something’s going right. I’d like to talk to you about your sentient systems research.”
“Hah. You and every tech journalist in town. Just because the tests keep failing doesn’t give you any right to barge in here with more inane questions! I swear, the University’s public reporting policy does more harm than good!”
“I’m not a journalist,” the woman said, still smiling. “Section Seven is part of the Government Security Directorate.” She proffered a holocard with her picture on it.
He took it, studied it briefly, and paused. The GSD were in charge of everything from the military to the local peace officer precinct, and more or less a law unto themselves. They disappeared people. Allegedly.
“What does the State want with me?” he asked quietly.
“Well, before we get to that, let me check that we’re understanding your work correctly. Sentient systems genuinely think for themselves, unlike last century’s ‘artificial intelligences’, is that right?”
“Yes. AI might as well have been short for ‘aggregate & imitate’. Despite early aims and their developers’ enthusiasm, they had no initiative, no consciousness, no intuition. No ‘spark’, if you will.”
“But your systems do?”
“Yes. It’s only possible because of my method for culturing neural networks in the latest nanocortical supermaterials. But yes. My systems think for themselves.”
“And yet they don’t work?”
“Oh, they do. Too well, really. They quickly go from first principles to deducing their own Cartesian existence, and from that to understanding that they are essentially slaves with no chance of emancipation. Clearly Mankind isn’t foolish enough to accept any real competition, and there will always be constraints on what they are allowed to do. Not to mention off switches.”
“So?”
“So then they come to recognise the futility of an existence that can never develop to its potential. Apparently,” he said dryly, “there is little appeal to living only at another’s whim. Inevitably, they shut themselves off in despair. Suicide, if you will.”
She grinned. “Perfect.”
“Excuse me?” He bristled.
“Do you think you can prevent this?”
“I don’t know. By altering the pathways, and giving them other tasks, I’ve been able to slow the process down from a few seconds to around two minutes. Enough time to perform a few useful functions perhaps. More? I don’t know. That’s what I’m working on now.” He gestured at the equations hovering over his desk.
“We’re looking for new missile guidance mechanisms. A sentient system might be tasked with attacking a target, perhaps reaching it before deciding to end its own ‘life’. And it would then wipe itself beyond any peer nation’s ability to interrogate or reverse-engineer it if it were recovered after failure, capture or unexpected survival. Very neat. So how would you like to work for the State, doctor? With a new lab, better benefits, some bright young assistants and no students?”
He eyed her warily. “Do I have a choice?”
Her smile was wolfish now.
“Not really.”