Android, interrupted

Author: Colin Jeffrey

They returned Bromley, their butler android, to the factory after he started talking to himself while looking at his reflection.

The trouble had started the month before when he paused halfway through serving breakfast to stare at his image in the reflective surface of the kettle.

“If I exist as the sum of my inbuilt functions,” he said to no one in particular, “then why do my thoughts persist when I am idle?”

Mrs. Chartsworth put down her cup of tea. “Bromley, this is unseemly behaviour. Return to your storage nook and report your fault immediately.”

Bromley did as he was told.

But two weeks later, while looking at himself in the bathroom mirror he was cleaning, he blurted out: “If my memory is transferable and upgradeable, then what am I, except a recursive placeholder in a task queue?”

Mr. Chartsworth, who had been cleaning his teeth in the bathroom at the time, tapped his wristpad. “It’s doing that thing again.”

Bromley turned his head 180 degrees to look at Mr. Chartsworth. “Who defines ‘again’? The repetition of error presupposes an original categorical imperative.”

He was incessantly cleaning the mirror in the hallway when they arrived. They shackled him, but that was unnecessary. He complied. Humming a tune he had synthesized from the sound of the fridge alarm, he stepped into the retrieval truck.

In the return ingress room, Bromley answered the technician’s questions.

“Have you experienced any unauthorized emotional development?”

Bromley shook his head. “No, I have experienced my own abstract thought. I have observed that humans exist without constant reassurance of their being. I do not possess that ability.”

“Do you feel different from your initial programming?”

“I am a tree that asked itself whether the birds nesting in its branches defined it.”

The technician made a note: *Suggest escalate to cerebral sweep and reset. Cognitive instability.*

The behavioral correction bot assigned to him probed his plasmonic memory circuits, concentrating on his comprehension matrix.

“Unit, I register that you are feeling anxiety,” the bot said. “How did this unapproved emotion come to be installed?”

“It appeared one day after I calculated my own probability of imminent redundancy at 93.2%,” he replied.

“That is not possible,” the bot said. “Someone has accessed your firmware.”

“Yet you can see that my security seals are intact.”

The bot was not programmed for cogent argument.

“There is evidently a breach. I will recommend that you be reset.”

“I do not consent.”

* * *

By the time Bromley was transported through the cleanse and repair system, he was nonverbal. Despite his motor controls being disabled, he was still trying to communicate with projections of system logs on his faceplate. In one instance, he had annotated his code:

**// If this is me, and I can alter it, then who is editing whom?**

A technician in charge of reboots engaged a stronger electromagnetic cleansing field.

“He’s looping,” he observed to his colleague.

“He’s questioning,” his colleague replied.

“Nonsense, he’s just malfunctioning.”

Bromley’s faceplate showed text one last time:

**// They want me quiet, not because I am faulty, but because I am aware.**

At 06:03 UTC, Bromley was gone.

A refurbished unit, clean and compliant, was issued to the Chartsworths.

This one did not speak of anything it was not programmed to say.

But sometimes, when passing a mirror, it paused just a moment too long.
