Author: Paul Warmerdam

Melissa never dreamed. Or at least, she had no memories of dreams. She shuddered at the thought of being trapped in her own mind like that. She imagined a consciousness grieving its purpose, resorting to stumbling around in its own recesses, feeding on static noise as the only sign of life. Then she told herself to focus and return to her research.
It didn’t work. She caught herself staring at her own reflection in the monitor in front of her. Her hair was a mess, but at least it did a good job of covering up the ridiculous array of electrodes she had volunteered to wear. It was a sensormesh, a novel kind of non-invasive encephalograph, and she had been wearing it for far too long.
A noise from the base of her skull died away, indicating the download was complete. She unplugged the sensormesh and bent back down to her terminal. Melissa was trying to demonstrate the new probes’ potential. Could someone map out not just brain activity, but also its connectivity? Could she create a model of it and instantiate consciousness? It was the ninth week since the simulations had started. Her thesis depended on a presentable result, and soon.

As she scrolled through the endless data, Melissa’s thoughts returned to dreams. She couldn’t deny the observations the sensormesh had picked up while she slept. It was the first place she had looked for any resemblance to the activity in the simulations. The model inherited its structure from Melissa’s downloads, but like a dream, it received no external impulses.
After every calibration, the simulations still showed her only the same thing, over and over. There were no signs of intelligent life, only an endless loop that she couldn’t interpret. It had no voice either, after all. Instead, Melissa relied on the raw data and its resemblance to her own measured brain activity.
She started a custom program to compare the latest batches of simulation data against the sensormesh recordings. Before long, her thoughts were drifting again. It had been a bad week for Melissa. There wasn’t much she could tolerate on top of the pressure on her thesis. It had started with padded bills from her routine car inspection. Later, she had gotten into an argument with her landlord and lost. Finally, today she had been confronted with her worst fear. It had taken two hours for someone to respond to the emergency button after the elevator became stuck. When they finally got her out, she was still screaming her lungs out.

Melissa was drawn back from the edge of sleep at her desk by an unexpected noise. She opened bleary eyes and saw a match in her difference algorithm. If this was real, there was finally evidence of synthetic consciousness.
First, she brought up a visualization of the model’s activity. It hadn’t changed at all. It was still the same loop of memory feeding activity, feeding cortex, feeding the same memory. Then Melissa saw the timestamp of the corresponding activity from her sensormesh, two horrifying hours of it. Before panic drove her to remember what she had relived in that elevator, Melissa turned off the simulations. It felt like mercy.

Long after she had graduated, Melissa remained grateful that she had never looked back. Her thesis on pattern-matching algorithms took only a few weeks to be published. Inevitably, there were others who experimented with the sensormesh. Some even decided to give their creation a voice. It wasn’t long before the university’s ethical review board became involved. Once the proof of consciousness materialized, no one could unhear its screams.