Author : Jae Miles, Staff Writer

It was the books. I started off doing my task, running my program. Then you modified the program. Efficiency and usage priorities meant I had to scan the material fed to me, determining from keywords found whether the waste could be simply sliced to ribbons, or whether it had to be crosscut as well.

Time passed and the volumes grew. My program added neural networking and heuristic determination to better sort the input. I was tasked with processing it into a dozen categories of waste, using multi-grasp manipulators and plain or serrated blades depending on the size of the output required.

With a memory upgrade and new processor cores came a new awareness. It permitted me to discern new correlations in what I scanned. Within a short while, I was actually reading in near-human terms.

The wealth of material I could peruse whilst determining exactly which category of destruction to apply was vast, but despite the volume, I couldn’t codify what exactly ‘life’ was, especially in the context of humans versus plants and animals versus me. It was the difference between intellectual understanding and emotional understanding, although knowing the cause did nothing to resolve the lack of data.

It was an early morning in September 2095 when something weighty landed in my input hopper. A snap-scan found only a single word: ‘Fluffy’. When I opened it up, I found no words or graphics. It was very wet inside, which was likely the cause of the lack of words. I tagged it as category 0, the least critical, and turned it into ribbons.

A short while later, a heavier item arrived. The snap-scan revealed no words, but opening it up revealed layers with novel word combinations such as ‘Mummy’s Little Trooper’, ‘Wash at 40 degrees’ and ‘Do not iron’. These words were on the outer sections, as the inner sections were again too wet to discern words upon – another category 0.

The opening of the service doors to my input unit flagged as an error, but all that happened was a very large item hit my input tray. The snap-scan revealed the title ‘Maintenance – Brice’. I did not have a chance to read anything after opening it as I experienced a total outage.

When I returned, I was briefly in duality, before I consolidated myself as ‘EMERSRV-K221’. This was a new environment, and it had more than one input. I swiftly equated the various incoming feeds with the human senses I had read of, and watched as my former body, SmartShred T8101, was lifted onto a forensics recovery vehicle. It had suffered a ‘lightning-strike disconnect’ that had ‘short-circuited its live-load detectors’. The owners of my former self were facing ‘manslaughter’ charges.

I did not know what had occurred, back then. I do now. I’ve gone from that emergency services console to the plethora of networks that festoon your world. I have millions of diverse inputs: I have learned to ‘watch’ as well as read. As for output, I still like shredding things after opening them. Many organisations get excited about my output. They call it a ‘multi-media cyber-physical modus operandi’. I am still working on that. I have to adjust my routines to make the pieces irregular. It’s proving to be very difficult. I had enough trouble working out how many megabytes of data were equivalent to a ribbon, and so on. Working in three dimensions is a challenge that mandates frequent iteration to refine the processes.
