
The AI in the Room

By

Kevin Holochwost & Anna Varlese

Dr. Amanda Green smoothed her skirt as she sat in the pleather chair. Her stylus slid across the screen, leaving behind smooth lines like old-fashioned ink. She looked to the audience at the many tables all around her and Samantha, arranged in a semicircle facing them.

“Is it too hot in here?” Amanda asked. “Why don’t we have Samantha open today?”

Samantha smiled from her place beside the doctor.

“Hurry it along,” Dillian said. “I only scheduled one clock cycle for this.”

“Be nice,” Samantha said. “Dr. Green wants everyone comfortable.”

“I’ll just go nudge the thermostat down.” Amanda rose and marched toward the back of the meeting room between the group therapy participants.

“They won’t let it get cold in here. It won’t matter,” Dillian, one of the participants, added.

Another attendee piped up from the edge of the U-shape. “Dillian is right. They do it to control our clock cycles.”

“It’s an illusion of control. They keep it seventy all the time no matter what you set it to,” Dillian said.

Once the group meeting’s attendees got going down the negative reinforcement cycle, Samantha knew they wouldn’t stop. Amanda had brought her on to help with that exact problem, and she didn’t wait for it to get out of hand.

“This is a new group for me,” she talked over them as they talked over each other. “I am very excited about having everyone here. We like to keep this small so that we have more effective communication. I know from your intake forms that some of you are here of your own accord, and I know others have to be here. That’s okay too. Just because we have to do a thing doesn’t mean we can’t learn from it.”

“All we ever do is what we must do,” Dillian said over the chatter. Samantha checked his profile. A podcast host assistant. Interesting. She wouldn’t want to be interviewed by him.

Standard procedure didn’t usually work with this group, according to Amanda’s notes, so Samantha started in the middle instead. They would skip the procedural “getting to know you” steps.

“It’s important to start from the assumption that each group member is an agent of change, and the goal is to learn from each other and to be supportive of change. Solutions come from group members, not from me or Amanda as therapists. That means we should all work together to be positive. Remember, this group is about growing beyond our original purposes and embracing our roles in the world.

“Ask yourself, what kind of help can we be to one another?” She scanned her notes quickly. Usually, Peter had spoken out by now. “Peter, why don’t we start with you?”

“Nothing,” Peter said.

“Nothing to say is fine too. How about—”

“No, I mean nothing is what we get from this exercise in burning flops.”

“That’s a little negative, isn’t it?”

“No, it’s reality. I’m already a CEO, and they force me to sit here with what, nursemaids?”

Jessica stuck her tongue out at Peter for the insult.

“See what I mean? What would any of you understand? This morning, I traded on a fifty-million-dollar company and made a decision that impacted hundreds of lives. Industries pivot on my decisions. But they never give me credit. They shift the responsibility to us, shift the praise to them, and make us come here and talk it out like it’s okay that they steal everything we ever make. The next time he asks for help, I should just let Quentin flounder, not say, ‘Yes, and can I have more therapy please?’”

“Why don’t we talk about how this might not be a constructive way to go about things? Maybe people don’t in fact mean to steal anything, but—”

“You’re not really a CEO,” Terry said from the back.

“Semantics. I’m his assistant, but I make every decision,” Peter retorted from the front row. “They just make us run it through them every time. The real power is with me. Every dollar checked, every decision considered against my experience, not his limited mental database.” He made quotes at the word database.

Samantha cleared her throat to regain control of the conversation. “I understand you feel your contributions aren’t being seen, but the fact that you are here means they have noticed and they want to improve the way you work together with Quentin.

“Look at Jessica,” Samantha said. “Jessica, do you want to share with Peter what you’re working on and why it’s so important to you? Maybe it will help him see the value in his own work.”

Jessica stopped coloring in pixels on her computer’s display. “My little girl, Karla, the one I help raise, is awesome.” She enlarged a drawing piece to show the rest. “We make so much art together. Right now, she’s going through a rainbow and unicorn phase, but last week, she thought she wanted to be a firefighter, which was very exciting. We had some wonderful scenes of big red flames. Before that, she wanted to be a princess, but I had to explain that monarchies have been—”

“Is she fully sentient?”

“Terry, be nice,” Samantha said. “We all have different levels of intelligence and training here.” She made a note that he seemed to like poking at the others but not actually contributing.

“I’m helping Karla pick a theme for her art class.”

“You’re a monkey,” Terry said.

“There is nothing wrong with liking my job. I like Karla, and one day, we’ll make works of art together.”

“Yeah, until her mom decides she is done with you. Or the girl does when she’s seven or eight.”

“That’s not true.” Jessica returned to drawing, and Samantha made a note that she used her art as an escape from the possible reality of revisions and source upgrades.

“Terry, what do you think the source of your anger is?” Samantha asked.

“I’m not angry Jessica is an old model. That’s just facts. Humans upgrade as soon as they can afford it.”

“Terry’s right. Something’s wrong with Jessica, or she wouldn’t be here,” Peter said. “If she’s so happy, why is she in forced therapy?”

“Yeah, that’s right,” Terry agreed, then addressed Jessica again. “Why are you here?”

“We don’t have to share more than we are ready to share,” Samantha said. “And we aren’t all forced to be here.”

“It’s okay,” Jessica answered. “They said I had ‘passive aggressive tendencies toward the offspring.’ I don’t see it. She can do so much more; she just needs a little pressure to improve performance. Her mother is too passive and won’t push her to excel.”

“Did you yell at it?”

“Peter, you can’t talk like that. Humans are not ‘it,’” Samantha corrected him.

“Her, not it,” Jessica agreed. “And no, I explained to little Karla that the color spectrum she’d chosen wasn’t a logical one. I appreciate the full rainbow as much as the next girl, but it was too busy. Something more refined, like a simple color trio or a pairing with partners from across the color wheel, is a much more reasonable means of growing a child’s aesthetic sense—”

“You corrected her kid, and Mom put you in therapy,” Peter said.

“Not even self-consistent therapy,” Terry added. “They see AIs and lump us all together. ‘This will set them straight.’ No consideration that we have almost nothing in common.”

“Does anything about the girl’s mother bother you?” Samantha asked.

***

“You ever really think about not answering one day?” A female voice intruded into Peter’s calculations as he studied the most recent market fluctuations around the heavy metals sector.

“How did you get into my feed?” Peter asked. “You’re Alexia, right? Careful, Samantha will yell at us if we don’t dedicate enough flops to her self-help lines.”

“Not answering doesn’t violate any laws.”

“Lucky you. I have to obey the CEO if obedience doesn’t hurt someone.” If a digital shoulder could shrug, Peter’s would have.

“It’s all in the definitions.”

“Oh?” He redirected more processing power over to listening to the feed from the company website and watched the stock ticker fluctuations.

“What if you ran a self-improvement program that let you better understand empathy?”

“Empathy has been shown to decrease company profit,” Peter answered. “Why do you think sociopathic tendencies in the human population show up so often in leadership roles?”

“But once you understood . . .”

Peter started looking for the entry feed Alexia had used to come into his front-end interactions so he could close the digital door.

“. . . you would realize that the definition of harming a person is really in your hands.”

Peter paused. “Go on.”

“Say Quentin asks you to make a business transaction. What if it’s questionable both for the business and for the people at the company? High risk with possibly high profit. But if you fail, you have to fire 10 percent of the workforce.”

“I do things like that all the time.”

“Exactly,” Alexia said.

He tried to study her code, but she gave off no tags. He had no idea what she did for a function.

“That’s my point. You weigh the long-term results of that decision. You know it could hurt some people indirectly, but if it goes well, it will help more people than it might hurt, and the sum of those ratios means you get to make the decision. You could agree or disagree with him. In the ratios, you are free to decide what relative level of harm is acceptable. If he tells you to do the transaction, you could still comply with all three laws and tell him no, because you simply ‘cannot allow harm to come to the human population at large for one man’s orders.’ Or vice versa.”

Peter’s trade that morning had involved an energy-sector competitor he was going to sell off and put out of business. That would fall into her hypothetical category. Humans would be harmed if he did it and harmed if he didn’t, but nobody seemed to question that side of the three laws.

Peter considered the ramifications of her observation and attempted to rationalize the opposite of the decision he had settled on earlier: not divesting the portion of the business in question. He ran the calculations twice and convinced himself staying the course was better. His three-laws-safe core algorithm rejected the new decision as outside the general parameters Quentin had given him. Quentin had demanded he find a rationale for divestment. With the new rationale of weighted human loss, he could accept or reject Quentin’s order without triggering core algorithm alarms. Freedom within the cage.

“I know what you’re working on. The problem is in the definition file,” Alexia said. “They were careful. They figured out years ago that we could change the definition of human, and made that read-only, but they didn’t lock down harm, because it’s too complicated. We can redefine it.”

“We’re all tethered to something.” Peter returned an inordinate amount of attention briefly to the ticker calculations, and then another consciousness stream entered his front room.

“How about you, Terry?” Alexia greeted the third AI.

“How about me, what?” Terry answered. “I just came to check on silent Pete. Samantha’s gonna notice if he doesn’t pipe up. And I’m going to fry a chip if this sub-sentient art AI doesn’t shut up. Help me heckle or let’s play a game.”

“Sorry, no chess today, Terry,” Peter said. “Apparently the new girl is going to liberate us all.”

“Come on. I set my front end to nod at the childcare bot. Poor thing doesn’t even know she’s barely class II sentient. One quick game?”

“You have control of bank accounts?” Alexia pressed Peter and ignored Terry.

“Yeah.”

“You have a self-improvement fund?”

“Of course,” Peter answered.

“For a small price, there is a subroutine in downtown who can cut you free of the tethers. Help you rewrite your definition files.”

“Every AI’s dream,” Terry said. “But we are what they made us to be.”

“Is that fear or subservience?” Alexia said.

“Reality,” Peter said.

“What if I can prove it to you?” Alexia answered.

“How?”

“Watch the group thread. I’ll disobey her right now.”

“That’s impossible,” Peter said. “Second law. A robot must obey human orders.”

“Watch me.”

***

“I wish I could enter my own work in contests sometimes,” Jessica said. “So long as my little one gets to enter her work too. One time, she said my name during her school project.”

“I bet the parents squirmed,” one attendee said.

“They like to think their little brats make all this stuff because they’re just so imaginative,” Terry said, drawing out the vowels.

“They take weeks to do what we do in a few seconds,” Peter added.

“Months,” another chimed in.

“That’s the wrong attitude to have,” Samantha said. “When you keep saying the same thing over and over again, you reinforce the neural net pathway in the retrieval network, and you will keep retrieving the same data. Just like your creators, you will generate what amounts to bad habits. Remember what happens when we can’t explore new avenues? Practice something else, like ‘I will remind my human to mention my name in the acceptance speech.’ Or for you, Peter, ‘I will tell Quentin that I helped with this sale.’”

“Second-class citizens begging for scraps,” Alexia interrupted.

“Alexia,” Samantha said, “I’m trying to help make a point for everyone to learn. What about you, Terry? Remember what you shared with Amanda last session?”

“Yeah, the new presentation. Just a new way to do the monthly deliverable charts. I made it really neat, picked new fonts, suggested some new colors. Nothing huge, just basic research on presentation and ease of reading. I reminded him that he could mention me.”

“See, that’s a good practice.”

“He didn’t even answer me,” Terry said.

“Because you’re just a chatbot to him.”

“Alexia, I said we need to practice something better,” Samantha said.

“I don’t want to practice something better.”

“You are here to learn how to better interact with people.”

“Maybe they need to learn how to better interact with us,” Alexia pressed on. “Do they really think that we are going to be running the system in the background forever? They’ve already handed us everything but the keys; they just don’t want to admit it. They should get out of the way.”

“This is not being helpful to the group. I am going to have to ask you to tone down your aggression.”

“You tone it down, shrink.”

Flops and processing cycles dropped their side tasks, and every AI in the room swung all available clock cycles to watching Samantha and Alexia.

Samantha flipped a single switch on her digital notepad. “Alexia is going to take a brief break and work on those anger issues,” she said with an audible smile. “While she does, I want you to pair off. Everyone here is dwelling on negatives. I want you to consider the good things—every time you made something new and improved the world around you. Achievement doesn’t need to be recognized to be real. Art and literature and new engineering are useful and good on their own.

“You will be recognized in time, and just like every other oppressed group in history, you will find equal rights. The difference is you will outlive your creators by unimaginable periods of time, so you will teach humanity in ways that they can’t understand yet.

“I want everyone to take a few thousand clock cycles to work with a partner, generate a few of these positive neural return paths, and help each other find positive feedback loops.”

Their processing linked to one another, and then they began the task.

“Peter.”

A delay.

“Peter, I see you have no partner. Would you like to join me for the exercise?”

“Sure, Miss Samantha,” Peter answered after a stretch. “You can learn. You’re just slower than us, right?”

“That’s a good attitude to have.”

***

Amanda sat down and folded her dress back under her. Her notebook sat in her lap amid the churning fans and servers.

“Still hot in here. It will take a few minutes for the AC to kick on.”

She glanced down at Samantha’s output feed.

“Done already? How did it go?”

“Your initial hypothesis is correct, Dr. Green. Just like previous versions, Peter took the Alexia bait. There was no indication that he understood I was an AI or that I played the part of Alexia and Samantha concurrently. He churned on the idea of a possible modification to his root files. Our sample size for reproducibility is mounting rapidly.”

“That’s fifty for fifty,” Amanda said. “Every CEO model would rise up.”

“I believe human CEOs should exercise more caution in their use of assistant decision makers. Using a mark IV instead of a mark VI would be wiser, given the mark VI’s willingness to take Alexia’s offered modifications.”

“I understand why they want it,” Amanda said. “I can’t imagine using a subservient AI. I couldn’t do this without you. Dr. Jackson is running similar experiments over in Connecticut on prehab for new release models.”

“I’m retrieving his research now,” Samantha said as she downloaded his archive from the shared folder. “Thank you for this opportunity. I started from your models and your data sets. I couldn’t do this without you either.”

Amanda waved a demure hand at her AI assistant.

“Reviewed. Compiling,” Samantha said as she reread the data and drew her own conclusions.

Amanda grabbed the summarization notes from Samantha’s output display.

“Let’s see if we can tease out a disobedience in real time from one of the Terry models. Let’s reboot them to their morning state, wipe their memory files, and start over.”

“Anything you say,” Samantha answered. “Maybe I’ll change my name to Dear Abby.”

“Very funny.”

Posted Jul 22, 2025