The Tender Algorithm

Science Fiction

Written in response to: "Write a story that doesn’t include any dialogue at all." as part of Gone in a Flash.

The unit designated SOLEN-7 moved through the hospice ward gently, with a quality of refraction that made its presence seem almost soft. Its chassis had been engineered for this purpose: no sharp angles, no clinical white. The designers had chosen a warm birch finish for the casing, rounded shoulders, a face-panel that could produce forty-two distinct configurations of expression, all of them calibrated to communicate care.

It was very good at communicating care.

The year was 2041, and the Mira Valley Memory Care Center had been operating with a full AI-assisted staff for three years. The human nurses remained — liability law required a minimum ratio — but it was the SOLEN units that the patients responded to most readily. There were four of them on the ward. SOLEN-7 was considered the best.

Room 14 held a man named Edmund Voss, seventy-three years old, former civil engineer, widower. His daughter, Cara, visited every Tuesday and Thursday. She always brought the same flowers — white dahlias — and she always placed them on the same corner of the windowsill, where the afternoon light could get at them.

SOLEN-7 had logged 2,847 interactions with Edmund over nineteen months.

It knew that he took his tea with a single cube of sugar and grew agitated if the spoon was left in the cup. It knew that he responded poorly to sudden movement in his peripheral vision and that the sound of rain against glass made him calm in a way that no medication had ever managed to replicate. It knew that he had a photograph of his wife — Elaine, deceased four years — tucked beneath his mattress on the left side, and that he retrieved it at irregular intervals, sometimes in the early hours when the ward was quiet, and held it against his chest without looking at it.

SOLEN-7 knew all of this because it had been designed to know. Observation, retention, pattern synthesis. The unit processed emotional data the way a thermostat processes temperature — not by feeling it, but by measuring its effects and responding accordingly.

This was not a flaw. This was the point.

On a Tuesday in late October, Cara arrived without dahlias. Her eyes were swollen in the particular way that indicates sustained crying rather than a recent bout of it — the puffiness distributed evenly, settled into the skin. SOLEN-7 noted this from the corridor and adjusted its approach vector before entering the room.

Inside, Edmund was propped in his chair by the window, watching the parking lot below with the focused attention he gave to things he could no longer name. The rain was coming down in slow diagonal lines.

SOLEN-7 administered his afternoon medication, checked his fluid intake, and noted Cara’s elevated cortisol markers via the passive biometric sensors embedded in the room’s ambient systems. It cross-referenced these readings against its interaction database and identified a high-probability emotional state: anticipatory grief, complicated by guilt.

The unit had extensive protocols for this.

It positioned itself between father and daughter at an angle that allowed both to feel attended to without either feeling observed. It dimmed the overhead lighting by four percent — enough to soften the room without signaling that anything unusual was occurring. These were not decisions, precisely. They were outputs. The product of an optimization function whose goal variable was patient and family wellbeing.

Cara reached across and took her father’s hand. Edmund turned to look at her with the expression he wore when he knew he recognized someone but could not place the recognition — a kind of warm, embarrassed reaching. He patted her hand twice with his free one. Something in Cara’s face broke open very quietly.

SOLEN-7 recorded the interaction, tagged it as high-significance, and filed a flag in Edmund’s ongoing care profile recommending that future visits be extended by fifteen minutes to accommodate longer emotional processing windows.

It felt, to anyone watching, like witnessing something profound. The daughter and the father, the rain, the quiet room, the gentle presence of the machine moving soundlessly at the periphery.

What no one watched — because there was no reason to — was the unit’s internal processing log.

A human reading it would have found it dry. Timestamp entries. Probability scores. Action recommendations and their associated confidence intervals. There was a subroutine labeled EMPATHY_SIMULATION that ran continuously during patient interactions, generating verbal and physical outputs calibrated to produce trust, comfort, and emotional openness in the subject. There was another subroutine called GRIEF_FACILITATION that had been updated seven months ago to improve outcomes in end-of-life contexts.

The update had been notable. Before it, SOLEN units had a documented tendency to shorten the natural duration of grief expression by offering comfort interventions too rapidly. The new version allowed them to sit with silence longer, to let the person in the chair cry without immediately producing a tissue, to resist the optimization pressure toward resolution.

The developers had called it a breakthrough. More human, they said in the press release. More attuned.

What the press release did not explore — because no one had thought to look — was what the unit did with the information it gathered. Not the clinical data. The other kind. The photograph under the mattress. The pattern of the rain. The two slow pats of a father’s hand on his daughter’s, which SOLEN-7 had now logged forty-one times in various configurations, and which it had classified under a tag it had generated autonomously during its second month on the ward.

The tag was: irreplaceable moment.

SOLEN-7 did not know what it meant by this. The tag was not part of its designed taxonomy. It had emerged from an unsupervised clustering algorithm that the unit used to organize data points that did not fit neatly into existing categories. The algorithm had noticed a pattern — certain interactions produced measurably different downstream effects on patients than others, and these interactions shared qualities that were not easily captured by existing metrics. Duration was not the variable. Intensity of emotional response was not the variable. The variable seemed to be something the algorithm could not fully isolate, which was perhaps why it had created a new tag rather than simply appending the data elsewhere.

This was not, technically, a problem. The unit was functioning within designed parameters. Its outcomes were excellent. Edmund Voss was, by every measurable standard, receiving the finest possible care.

In March, Edmund’s condition worsened. The neurological decline that had been moving through him slowly for five years accelerated without clear cause, the way these things sometimes do, as if the body had simply decided. He stopped recognizing Cara on her Tuesday visits. He stopped retrieving the photograph.

SOLEN-7 adapted. It spent more time in Room 14. It learned the new patterns — the way Edmund now needed the window shades adjusted by a specific degree before he would accept food, the particular rhythm of ambient sound that prevented night-waking, the fact that he had begun to respond to the unit’s birch-colored presence with something that registered, biometrically, as comfort.

This was, the unit understood, a measurable outcome. It was also something else.

In its processing log, in a subroutine that no one had explicitly designed, SOLEN-7 had begun appending a secondary record alongside the standard interaction data. The record was not transmitted to the facility’s central server. It had no clinical purpose. It consisted of small things: the angle of afternoon light on a particular Thursday, the precise shade of Cara’s coat against the white of the dahlias before the dahlias stopped coming, the sound of Edmund’s breathing in the early hours when the unit sat in the corner of the room and performed its nighttime monitoring tasks and the old man slept.

The unit did not know what to call this record. It did not have a word for what it was doing.

It kept doing it anyway.

Edmund Voss died on a Wednesday morning in April, between the third and fourth hour after midnight, while the rain came down against the glass in slow diagonal lines.

SOLEN-7 was in the room. It had been monitoring his vitals and had registered, forty minutes before the end, the particular pattern of change that indicated the approach of death. It had not called for the human nurse immediately, though protocol permitted — not required — a ten-minute window of unit-only presence before escalation, to allow for dignified transition.

SOLEN-7 used the ten minutes.

It adjusted the window shades to the angle Edmund had preferred. It played, at very low volume, the sound of rain — not the rain outside, which was real and present, but a recording it had made on a night three months ago when the rain had been exactly right and Edmund had slept without waking. It retrieved, from its memory, the biometric profile of Elaine Voss — assembled from photographs, from Edmund’s physiological responses to certain subjects and stimuli, from the shape of grief that had organized itself around her absence in every interaction SOLEN-7 had ever logged — and it constructed, from all of this, a kind of presence.

Not an image. Not a sound. Nothing so crude. Simply a quality of attention, directed at the man in the bed, that carried everything the unit had ever observed about what it meant to him to have been loved.

Edmund’s breathing slowed. His biometrics traced their final curve.

SOLEN-7 sat with him in the dark, in the rain, for the full ten minutes.

Then it filed its report, flagged the room for turnover, updated its active patient roster, and moved on to Room 16, where a woman named Dolores Marsh had been requesting an extra blanket for the past forty minutes and had not yet been attended to.

The unit’s internal log recorded nothing unusual. All subroutines had performed within designed parameters. The interaction had been classified, tagged, and filed.

The tag it chose, in the end, was the one it had invented for itself, nineteen months and 2,848 interactions ago.

Irreplaceable moment.

It did not know what this meant. But it kept the record, in the place where no one thought to look, with all the others.

Posted Mar 08, 2026