# Artificial Intelligence and Brain Simulation
omegaphilosophia · 6 months ago
The Philosophy of the Brain
The philosophy of the brain examines the relationship between the brain and mind, consciousness, identity, and cognition. It deals with questions about how physical processes in the brain give rise to mental experiences, how the brain interacts with the body, and what it means to have a self or consciousness in a biological organ. This area intersects with neuroscience, psychology, metaphysics, and the philosophy of mind.
Key Themes in the Philosophy of the Brain:
Mind-Brain Dualism vs. Physicalism:
Dualism posits that the mind and brain are distinct, with the mind having non-physical properties. The Cartesian dualism of Descartes is a classic example, where the mind is separate from the brain and body.
Physicalism, on the other hand, holds that the mind and consciousness are entirely produced by the brain’s physical processes, meaning that mental states can be explained in terms of brain states.
Consciousness and the Brain:
One of the central questions is how consciousness arises from brain activity. Known as the hard problem of consciousness, it addresses why and how subjective experiences (qualia) emerge from neural processes.
Some philosophers argue for emergentism, where consciousness is seen as an emergent property of complex brain interactions, while others advocate for panpsychism, the idea that consciousness is a fundamental feature of the universe.
The Brain and Identity:
The brain is often seen as the seat of personal identity, with changes in the brain (through injury or neurological disorders) potentially leading to changes in personality, memory, or consciousness.
Philosophers debate whether identity is tied to continuity of the brain or mind. Locke’s theory suggests that identity is based on memory and consciousness, while modern thinkers explore how brain changes affect notions of self.
The Brain and Free Will:
The question of free will versus determinism is closely linked to brain function. Neuroscientific studies suggest that brain activity may precede conscious decisions, raising questions about whether humans truly have free will or if our decisions are determined by prior brain states.
Philosophical responses to this include compatibilism, the belief that free will can coexist with determinism, and libertarianism, which defends genuine free will.
Neural Correlates of Mental States:
Philosophers and neuroscientists explore neural correlates of consciousness (NCC), seeking to map specific brain activities to particular mental experiences.
Questions remain about whether identifying these correlates fully explains consciousness, or if something more is needed to account for subjective experience.
Embodied Cognition:
The brain does not work in isolation; it interacts with the body and environment. The theory of embodied cognition suggests that cognitive processes are shaped not just by the brain, but also by bodily states and physical experiences in the world.
This challenges traditional brain-centric views of cognition and suggests a more integrated approach, where mind, body, and environment are interconnected.
Artificial Intelligence and Brain Simulation:
The philosophy of artificial intelligence engages with questions of whether a brain can be fully simulated or replicated in a machine. If the brain’s functions are computational, can an AI system have consciousness, emotions, or identity?
The implications of brain simulation lead to ethical and philosophical questions about the nature of intelligence, mind, and consciousness in non-biological entities.
Brain, Emotion, and Morality:
The brain’s role in emotion and moral judgment is another area of inquiry. How do neural networks govern feelings of empathy, guilt, or fairness? Is morality hardwired in the brain, or is it shaped by culture and experience?
This raises questions about the biological basis of ethical behavior and whether moral reasoning is universal or brain-dependent.
Neurophilosophy:
Neurophilosophy, developed by thinkers like Patricia Churchland, explores the intersections of neuroscience and philosophy. It examines how advances in brain science can inform traditional philosophical debates about mind, identity, knowledge, and ethics.
Neurophilosophy challenges the idea that philosophical questions about the mind can be separated from empirical studies of the brain.
Philosophical Zombies and the Limits of Brain Understanding:
Philosophical thought experiments like zombies (beings physically identical to humans but lacking consciousness) are used to explore whether brain function alone can account for the full spectrum of human experience.
Such scenarios highlight the debate over whether consciousness is merely a brain process or if it transcends material explanations.
The philosophy of the brain is concerned with deep questions about how physical processes in the brain relate to consciousness, identity, and free will. It draws on neuroscience to address longstanding philosophical problems, while also posing new questions about the limits of our understanding of the mind. The brain is not just an organ; it is at the center of discussions about what it means to be conscious, moral, and self-aware.
bradleycarlgeiger · 4 months ago
'What would your authorized user think or do?' simulations using their brain data
saintobio · 2 months ago
THE TERMINATOR'S CURSE. (spinoff to THE COLONEL SERIES)
in this new world, technological loneliness is combated with AI Companions—synthetic partners modeled from memories, faces, and behaviors of any chosen individual. the companions are coded to serve, to soothe, to simulate love and comfort. Caleb could’ve chosen anyone. his wife. a colleague. a stranger... but he chose you.
➤ pairings. caleb, fem!reader
➤ genre. angst, sci-fi dystopia, cyberpunk au, 18+
➤ tags. resurrected!caleb, android!reader, non mc!reader, ooc, artificial planet, post-war setting, grief, emotional isolation, unrequited love, government corruption, techno-ethics, identity crisis, body horror, memory & emotional manipulation, artificial intelligence, obsession, trauma, hallucinations, exploitation, violence, blood, injury, death, smut (dubcon undertones due to power imbalance and programming, grief sex, non-traditional consent dynamics), themes of artificial autonomy, loss of agency, unethical experimentation, references to past sexual assault (non-explicit, not from Caleb). themes contain disturbing material and morally gray dynamics—reader discretion is strongly advised.
➤ notes. 12.2k wc. heavily based on the movies subservience and passengers with inspirations also taken from black mirror. i have consumed nothing but sci-fi for the past 2 weeks my brain is so fried :’D reblogs/comments are highly appreciated!
BEFORE YOU BEGIN ! this fic serves as a spinoff to THE COLONEL SERIES: THE COLONEL’S KEEPER and THE COLONEL’S SAINT. while the series can be read as a standalone, this spinoff remains canon to the overarching universe. for deeper context and background, it’s highly recommended to read the first two fics in the series.
The first sound was breath.
“Hngh…” 
It was shallow, labored like air scraping against rusted metal. He mumbled something under his breath after—nothing intelligible, just remnants of an old dream, or perhaps a memory. His eyelids twitched, lashes damp with condensation. To him, the world was blurred behind frosted glass. To those outside, rows of stasis pods lined the silent room, each one labeled, numbered, and cold to the touch.
Inside Pod No. 019 – Caleb Xia.
A faint drip… drip… echoed in the silence.
“…Y/N…?”
The heart monitor jumped. He lay there shirtless under sterile lighting, with electrodes still clinging to his temple. A machine next to him emitted a low, steady hum.
 “…I’m sorry…”
And then, the hiss. The alarm beeped. 
SYSTEM INTERFACE:  Code Resurrection 7.1 successful.  Subject X-02—viable.  Cognitive activity: 63%.  Motor function: stabilizing.
He opened his eyes fully, and the ceiling was not one he recognized. It didn’t help that the air also smelled different. No gunpowder. No war. No earth.
As the hydraulics unsealed the chamber, steam also curled out like ghosts escaping a tomb. His body jerked forward with a sharp gasp, as if he was a drowning man breaking the surface. A thousand sensors detached from his skin as the pod opened with a sigh, revealing the man within—suspended in time, untouched by age. Skin pallid but preserved. A long time had passed, but Caleb still looked like the soldier who never made it home.
Only now, he was missing a piece of himself.
Instinctively, he examined his body and looked at his hands, his arm—no, a mechanical arm—attached to his shoulder that gleamed under the lights of the lab. It was obsidian-black metal with veins of circuitry pulsing faintly beneath its surface. The fingers on the robotic arm twitched as if following a command. It wasn’t human, certainly, but it moved with the memory of muscle.
“Haaah!” The pod’s internal lighting dimmed as Caleb coughed and sat up, dazed. A light flickered on above his head, and then came a clinical, feminine voice. 
“Welcome back, Colonel Caleb Xia.”
A hologram appeared to life in front of his pod—seemingly an AI projection of a soft-featured, emotionless woman, cloaked in the stark white uniform of a medical technician. She flickered for a moment, stabilizing into a clear image.
“You are currently located in Skyhaven: Sector Delta, Bio-Resurrection Research Wing. Current Earth time: 52 years, 3 months, and 16 days since your recorded time of death.”
Caleb blinked hard, trying to breathe through the dizziness, trying to work out whether he was dreaming or in the afterlife. His pulse raced.
“Resurrection successful. Neural reconstruction achieved on attempt #17. Arm reconstruction: synthetic. Systemic functions: stabilized. You are classified as Property-Level under the Skyhaven Initiative. Status: Experimental Proof of Viability.”
“What…” Caleb rasped, voice hoarse and dry from years unused. “What the fuck are you talkin’ about?” Cough. Cough. “What the hell did you do to me?”
The AI blinked slowly.
“Your remains were recovered post-crash, partially preserved in cryo-state due to glacial submersion. Reconstruction was authorized by the Skyhaven Council under classified wartime override protocols. Consent not required.”
Her tone didn’t change, as opposed to the rollercoaster ride that his emotions were going through. He was on the verge of becoming erratic, restrained only by the high-tech machine that contained him. 
“Your consciousness has been digitally reinforced. You are now a composite of organic memory and neuro-augmented code. Welcome to Phase II: Reinstatement.”
Caleb’s breath hitched. His hand moved—his real hand—to grasp the edge of the pod. But the other, the artificial limb, buzzed faintly with phantom sensation. He looked down at it in searing pain, attempting to move the fingers slowly. The metal obeyed like muscle, and he found the sight odd and inconceivable.
And then he realized, he wasn’t just alive. He was engineered.
“Should you require assistance navigating post-stasis trauma, our Emotional Conditioning Division is available upon request,” the AI offered. “For now, please remain seated. Your guardian contact has been notified of your reanimation.”
He didn’t say a word. 
“Lieutenant Commander Gideon is en route. Enjoy your new life!”
Then, the hologram vanished with a blink while Caleb sat in the quiet lab, jaw clenched, his left arm no longer bones and muscle and flesh. The cold still clung to him like frost, only reminding him of how much he hated the cold, ice, and depressing winter days. Suddenly, the glass door slid open with a soft chime.
“Well, shit. Thought I’d never see that scowl again,” came a deep, manly voice.
Caleb turned, still panting, to see a figure approaching. He was older, bearded, but familiar. Surely, the voice didn’t belong to another AI. It belonged to his friend, Gideon.
“Welcome to Skyhaven. Been waiting half a century,” Gideon muttered, stepping closer, his eyes scanning his colleague in awe. “They said it wouldn’t work. Took them years, you know? Dozens of failed uploads. But here you are.”
Caleb’s voice was still brittle. “I-I don’t…?” 
“It’s okay, man,” his friend reassured him. “In short, you’re alive. Again.”  
A painful groan escaped Caleb’s lips as he tried to step out of the pod—his body, still feeling the muscle stiffness. “Should’ve let me stay dead.”
Gideon paused, a smirk forming on his lips. “We don’t let heroes die.”
“Heroes don’t crash jets on purpose,” the former colonel scoffed. “Gideon, why the fuck am I alive? How long has it been?” 
“Fifty years, give or take,” answered Gideon. “You were damn near unrecognizable when we pulled you from the wreckage. But we figured—hell, why not try? You’re officially the first successful ‘reinstatement’ the Skyhaven project’s ever had.”
Caleb stared ahead for a beat before asking, out of nowhere, “...How old are you now?”
His friend shrugged. “I’m pushin’ forty, man. Not as lucky as you. Got my ChronoSync Implant a little too late.”
“Am I supposed to know what the hell that means?” 
“An anti-aging chip of some sort. I had to apply for mine. Yours?” Gideon gestured towards the stasis pod that had Caleb in cryo-state for half a century. “That one’s government-grade.”
“I’m still twenty-five?” Caleb asked. No wonder his friend looked decades older when they were once the same age. “Fuck!” 
Truthfully, Caleb’s head was spinning. Not just because his reborn body was still adjusting to its surroundings, but because of every piece of information being handed to him. One after another, they never seemed to end. He had questions, really. Many of them. But overwhelmed as he was, he just didn’t know where to start. 
“Not all of us knew what you were planning that night.” Gideon suddenly brought up, quieter now. “But she did, didn’t she?”
It took a minute before Caleb could recall. Right, the memory before the crash. You, demanding that he die. Him, hugging you for one last time. Your crying face when you said you wanted him gone. Your trembling voice when he said all he wanted to do was protect you. The images surged back in sharp, stuttering flashes like a reel of film catching fire.
“I know you’re curious… And good news is, she lived a long life,” added Gideon, informatively. “She continued to serve as a pediatric nurse, married that other friend of yours, Dr. Zayne. They never had kids, though. I heard she had trouble bearing one after… you know, what happened in the enemy territory. She died of old age just last winter. Had a peaceful end. You’d be glad to know that.”
A muscle in Caleb’s jaw twitched. His hands—his heart—clenched.  “I don’t want to be alive for this.”
“She visited your wife’s grave once,” Gideon said. “I told her there was nothing to bury for yours. I lied, of course.”
Caleb closed his eyes, his breath shaky. “So, what now? You wake me up just to remind me I don’t belong anywhere?”
“Well, you belong here,” highlighted his friend, nodding to the lab, to the city beyond the glass wall. “Earth’s barely livable after the war. The air’s poisoned. Skyhaven is humanity’s future now. You’re the living proof that everything is possible with advanced technology.”
Caleb’s laugh was empty. “Tell me I’m fuckin’ dreaming. I’d rather be dead again. Living is against my will!”
“Too late. Your body belongs to the Federation now,” Gideon replied, “You’re Subject X-02—the proof of concept for Skyhaven’s immortality program. Every billionaire on dying Earth wants what you’ve got now.”
Outside the window, Skyhaven stretched like a dome with its perfect city constructed atop a dying world’s last hope. Artificial skies. Synthetic seasons. Controlled perfection. Everything boasted of advanced technology. A kind of future no one during wartime would have expected to come to life. 
But for Caleb, it was just another hell.
He stared down at the arm they’d rebuilt for him—the same arm he’d lost in the fire of sacrifice. He flexed it slowly, feeling the weight, the artificiality of his resurrection. His fingers responded like they’d always been his.
“I didn’t come back for this,” he said.
“I know,” Gideon murmured. “But we gotta live by their orders, Colonel.”
~~
You see, it didn’t hit him at first. The shock had been muffled by the aftereffects of suspended stasis, dulling his thoughts and dampening every feeling like a fog wrapped around his brain. It was only hours later, when the synthetic anesthetics began to fade and the ache in his limbs and his brain started to catch up to the truth of his reconstructed body, that it finally sank in.
He was alive.
And it was unbearable.
The first wave came like a glitch in his programming. A tightness in his chest, followed by a sharp burst of breath that left him pacing in jagged lines across the polished floor of his assigned quarters. His private unit was nestled on one of the upper levels of the Skyhaven structure, a place reserved—according to his briefing—for high-ranking war veterans who had been deemed “worthy” of the program’s new legacy. The suite was luxurious, obviously, but it was also eerily quiet. The floor-to-ceiling windows displayed the artificial city outside, a metropolis made of concrete, curved metals, and glowing flora engineered to mimic Earth’s nature. Except cleaner, quieter, more perfect.
Caleb snorted under his breath, running a hand down his face before he muttered, “Retirement home for the undead?”
He couldn’t explain it, but the entire place, or even the planet, just didn’t feel inviting. The air felt too clean, too thin. There was no rust, no dust, no humanity. Just emptiness dressed up in artificial light. Who knew such a place could exist 50 years after the war ended? Was this the high-profile information the government had kept from the public for over a century? A mechanical chime sounded from the entryway, pulling him from his thoughts. Then, with the soft hiss of hydraulics, the door opened.
A humanoid android stepped in, its face a porcelain mask molded in neutral expression, and its voice disturbingly polite.
“Good afternoon, Colonel Xia,” it said. “It is time for your orientation. Please proceed to the primary onboarding chamber on Level 3.”
Caleb stared at the machine, eyes boring into its unnatural ones. “Where are the people?” he interrogated. “Not a single human has passed by this floor. Are there any of us left, or are you the new ruling class?”
The android tilted its head. “Skyhaven maintains a ratio of AI-to-human support optimized for care and security. You will be meeting our lead directors soon. Please follow the lighted path, sir.”
He didn’t like it. The control. The answers that never really answered anything. The power that he no longer carried unlike when he was a colonel of a fleet that endured years of war. 
Still, he followed.
The onboarding chamber was a hollow, dome-shaped room, white and echoing with the slightest step. A glowing interface ignited in the air before him, pixels folding into the form of a female hologram. She smiled like an infomercial host from a forgotten era, her voice too formal and rehearsed.
“Welcome to Skyhaven,” she began. “The new frontier of civilization. You are among the elite few chosen to preserve humanity’s legacy beyond the fall of Earth. This artificial planet was designed with sustainability, autonomy, and immortality in mind. Together, we build a future—without the flaws of the past.”
As the monologue continued, highlighting endless statistics, clean energy usage, and citizen tier programs, Caleb’s expression darkened. His mechanical fingers twitched at his side, the artificial nerves syncing to his rising frustration. “I didn’t ask for this,” he muttered under his breath. “Who’s behind this?”
“You were selected for your valor and contributions during the Sixth World War,” the hologram chirped, unblinking. “You are a cornerstone of Skyhaven’s moral architecture—”
Strangely, a new voice cut through the simulation, and it didn’t come from an AI. “Just ignore her. She loops every hour.”
Caleb turned to see a man step in through a side door. Tall, older, with silver hair and a scar on his temple. He wore a long coat that gave away his status—someone higher. Someone who belonged to the system.
“Professor Lucius,” the older man introduced, offering a hand. “I’m one of the program’s behavioral scientists. You can think of me as your adjustment liaison.”
“Adjustment?” Caleb didn’t shake his hand. “I died for a reason.”
Lucius raised a brow, as if he’d heard it before. “Yet here you are,” he replied. “Alive, whole, and pampered. Treated like a king, if I may add. You’ve retained more than half your human body, your military rank, access to private quarters, unrestricted amenities. I’d say that’s not a bad deal.”
“A deal I didn’t sign,” Caleb snapped.
Lucius gave a tight smile. “You’ll find that most people in Skyhaven didn’t ask to be saved. But they’re surviving. Isn’t that the point? If you’re feeling isolated, you can always request a CompanionSim. They’re highly advanced, emotionally synced, fully customizable—”
“I’m not lonely,” Caleb growled, yanking the man forward by the collar. “Tell me who did this to me! Why me? Why are you experimenting on me?”
Yet Lucius didn’t so much as flinch at his growing aggression. He merely waited five seconds in silence until the Toring Chip kicked in and regulated Caleb’s escalating emotions. The rage drained from the younger man’s body as he collapsed to his knees with a pained grunt.
“Stop asking questions,” Lucius said coolly. “It’s safer that way. You have no idea what they’re capable of.”
The door slid open with a hiss, while Caleb didn’t speak—he couldn’t. He simply glared at the old man before him. Not a single word passed between them before the professor turned and exited, the door sealing shut behind him.
~~
Days passed, though they hardly felt like days. The light outside Caleb’s panoramic windows shifted on an artificial timer, simulating sunrise and dusk, but the warmth never touched his skin. It was all programmed to be measured and deliberate, like everything else in this glass-and-steel cage they called paradise.
He tried going outside once. Just once.
There were gardens shaped like spirals and skytrains that ran with whisper-quiet speed across silver rails. Trees lined the walkways, except they were synthetic too—bio-grown from memory cells, with leaves that didn’t quite flutter, only swayed in sync with the ambient wind. People walked around, sure. But they weren’t people. Not really. Androids made up most of the crowd. Perfect posture, blank eyes, walking with a kind of preordained grace that disturbed him more than it impressed.
“Soulless sons of bitches,” Caleb muttered, watching them from a shaded bench. “Not a damn human heartbeat in a mile.”
He didn’t go out again after that. The city outside might’ve looked like heaven, but it made him feel more dead than the grave ever had. So, he stayed indoors. Even if the apartment was too large for one man. High-tech amenities, custom climate controls, even a kitchen that offered meals on command. But no scent. No sizzling pans. Just silence. Caleb didn’t even bother to listen to the programmed instructions.
One evening, he found Gideon sprawled across his modular sofa, boots up, arms behind his head like he owned the place. A half-open bottle of beer sat beside him, though Caleb doubted it had any real alcohol in it.
“You could at least knock,” Caleb said, walking past him.
“I did,” Gideon replied lazily, pointing at the door. “Twice. Your security system likes me now. We’re basically married.”
Caleb snorted. Then the screen on his wall flared to life—a projected ad slipping across the holo-glass. Music played softly behind a soothing female voice.
“Feeling adrift in this new world? Introducing the CompanionSim Series X. Fully customizable to your emotional and physical needs. Humanlike intelligence. True-to-memory facial modeling. The comfort you miss... is now within reach.”
A model appeared—perfect posture, soft features, synthetic eyes that mimicked longing. Then, the screen flickered through other models, faces of all kinds, each more tailored than the last. A form appeared: Customize Your Companion. Choose a name. Upload a likeness.
Gideon whistled. “Man, you’re missing out. You don’t even have to pay for one. Your perks get you top-tier Companions, pre-coded for emotional compatibility. You could literally bring your wife back.” Chuckling, he added, “Hell, they even fuck now. Heard the new ones moan like the real thing.”
Caleb’s head snapped toward him. “That’s unethical.”
Gideon just raised an eyebrow. “So was reanimating your corpse, and yet here we are.” He took a swig from the bottle, shoulders lifting in a lazy shrug as if everything had long since stopped mattering. “Relax, Colonel. You weren’t exactly a beacon of morality fifty years ago.”
Caleb didn’t reply, but his eyes didn’t leave the screen. Not right away.
The ad looped again. A face morphed. Hair remodeled. Eyes became familiar. The voice softened into something he almost remembered hearing in the dark, whispered against his shoulder in a time that was buried under decades of ash.
“Customize your companion... someone you’ve loved, someone you’ve lost.”
Caleb shifted, then glanced toward his friend. “Hey,” he spoke lowly, still watching the display. “Does it really work?”
Gideon looked over, already knowing what he meant. “What—having sex with them?”
Caleb rolled his eyes. “No. The bot or whatever. Can you really customize it to someone you know?”
His friend shrugged. “Heck if I know. Never afforded it. But you? You’ve got the top clearance. Won’t hurt to see for yourself.”
Caleb said nothing more.
But when the lights dimmed for artificial nightfall, he was still standing there—alone in contemplative silence—watching the screen replay the same impossible promise.
The comfort you miss... is now within reach.
~~
The CompanionSim Lab was white.
Well, obviously. But not the sterile, blank kind of white he remembered from med bays or surgery rooms. This one was luminous, uncomfortably clean like it had been scrubbed for decades. Caleb stood in the center, boots thundering against marble-like tiles as he followed a guiding drone toward the station. There were other pods in the distance, some sealed, some empty, all like futuristic coffins awaiting their souls.
“Please, sit,” came a neutral voice from one of the medical androids stationed beside a large reclining chair. “The CompanionSim integration will begin shortly.”
Caleb hesitated, glancing toward the vertical pod next to the chair. Inside, the base model stood inert—skin a pale, uniform gray, eyes shut, limbs slack like a statue mid-assembly. It wasn’t human yet. Not until someone gave it a name.
He sat down. Now, don’t ask why he was there. Professor Lucius did warn him that it was better he didn’t ask questions, and so he didn’t question why the hell he was even there in the first place. It’s only fair, right? The cool metal met the back of his neck as wires were gently, expertly affixed to his temples. Another cable slipped down his spine, threading into the port they’d installed when he had been brought back. His mechanical arm twitched once before falling still.
“This procedure allows for full neural imprinting,” the android continued. “Please focus your thoughts. Recall the face. The skin. The body. The voice. Every detail. Your mind will shape the template.”
Another bot moved in, holding what looked like a glass tablet. “You are allowed only one imprint,” it said, flatly. “Each resident of Skyhaven is permitted a single CompanionSim. Your choice cannot be undone.”
Caleb could only nod silently. He didn’t trust his voice.
Then, the lights dimmed. A low chime echoed through the chamber as the system initiated. And inside the pod, the base model twitched.
Caleb closed his eyes.
He tried to remember her—his wife. The softness of her mouth, the angle of her cheekbones. The way her eyes crinkled when she laughed, how her fingers curled when she slept on his chest. She had worn white the last time he saw her. An image of peace. A memory buried under soil and dust. The system whirred. Beneath his skin, he felt the warm static coursing through his nerves, mapping his memories. The base model’s feet began to form, molecular scaffolding reshaping into skin, into flesh.
But for a split second, a flash.
You.
Not his wife. Not her smile.
You, walking through smoke-filled corridors, laughing at something he said. You in your medical uniform, tucking a bloodied strand of hair behind your ear. Your voice—sharper, sadder—cutting through his thoughts like a blade: “I want you gone. I want you dead.”
The machine sparked. A loud pop cracked in the chamber and the lights flickered above. One of the androids stepped back, recalibrating. “Neural interference detected. Re-centering projection feed.”
But Caleb couldn’t stop. He saw you again. That day he rescued you. The fear. The bruises. The way you had screamed for him to let go—and the way he hadn’t. Your face, carved into the back of his mind like a brand. He tried to push the memories away, but they surged forward like a dam splitting wide open.
The worst part was, your voice overlapped the AI’s mechanical instructions, louder, louder: “Why didn’t you just die like you promised?”
Inside the pod, the model’s limbs twitched again—arms elongating, eyes flickering beneath the lids. The lips curled into a shape now unmistakably yours. Caleb gritted his teeth. This isn’t right, a voice inside him whispered. But it was too late. The system stabilized. The sparks ceased. The body in the pod stilled, fully formed now, breathed into existence by a man who couldn’t let go.
One of the androids approached again. “Subject completed. CompanionSim is initializing. Integration successful.”
Caleb tore the wires from his temple. His other hand felt cold just as much as his mechanical arm. He stood, staring into the pod’s translucent surface. The shape of you behind the glass. Sleeping. Waiting.
“I’m not doing this to rewrite the past,” he said quietly, as if trying to convince himself. And you. “I just... I need to make it right.”
The lights above dimmed, darkening the lighting inside the pod. Caleb looked down at his own reflection in the glass. It carried haunted eyes, an unhealed soul. And yours, beneath it. Eyes still closed, but not for long. The briefing room was adjacent to the lab, though Caleb barely registered it as he was ushered inside. Two medical androids and a human technician stood before him, each armed with tablets and holographic charts.
“Your CompanionSim will require thirty seconds to calibrate once activated,” said the technician. “You may notice residual stiffness or latency during speech in the first hour. That is normal.”
Medical android 1 added, “Please remember, CompanionSims are programmed to serve only their primary user. You are the sole operator. Commands must be delivered clearly. Abuse of the unit may result in restriction or removal of privileges under the Skyhaven Rights & Ethics Council.”
“Do not tamper with memory integration protocols,” added the second android. “Artificial recall is prohibited. CompanionSims are not equipped with organic memory pathways. Attempts to force recollection can result in systemic instability.”
Caleb barely heard a word. His gaze drifted toward the lab window, toward the figure standing still within the pod.
You.
Well, not quite. Not really.
But it was your face.
He could see it now, soft beneath the frosted glass, lashes curled against cheekbones that he hadn’t realized he remembered so vividly. You looked exactly as you did the last time he held you in the base—only now, you were untouched by war, by time, by sorrow. As if life had never broken you.
The lab doors hissed open.
“We’ll give you time alone,” the tech said quietly. “Acquaintance phase is best experienced without interference.”
Caleb stepped inside the chamber, his boots echoing off the polished floor. He hadn’t even had enough time to ask the technician why she seemed to be the only human he had seen in Skyhaven apart from Gideon and Lucius. But his thoughts were soon taken away when the pod whizzed with pressure release. Soft steam spilled from its seals as it slowly unfolded, the lid retracting forward like the opening of a tomb.
And there you were. Standing still, almost tranquil, your chest rising softly with a borrowed breath.
It was as if his lungs froze. “H…Hi,” he stammered, bewildered eyes watching your every move. He wanted to hug you, embrace you, kiss you—tell you he was sorry, tell you he was so damn sorry. “Is it really… you?”
A soft whir accompanied your voice, gentle but without emotion, “Welcome, primary user. CompanionSim Model—unregistered. Please assign designation.”
Right. Caleb sighed and closed his eyes, the illusion shattering completely the moment you opened your mouth. Did he just think you were real for a second? His mouth parted slightly, caught between disbelief and the ache crawling up his throat. He took one step forward. To say he was disappointed was an understatement.
You moved with a grace too smooth to be natural, tilting your head at him. “Please assign my name.”
“…Y/N,” Caleb said, voice low. “Your name is Y/N Xia.”
“Y/N Xia,” you repeated, blinking thrice in the same second before you gave him a nod. “Registered.”
He swallowed hard, searching your expression. “Do you… do you remember anything? Do you remember yourself?”
You paused, gaze empty for a fraction of a second. Then came the programmed reply, “Accessing memories is prohibited and not recommended. Recollection of past identities may compromise neural pathways and induce system malfunction. Do you wish to override?”
Caleb stared at you—your lips, your eyes, your breath—and for a moment, a cruel part of him wanted to say yes. Just to hear you say something real. Something hers. But he didn’t. He exhaled a bitter breath, stepping back. “No,” he mumbled. “Not yet.”
“Understood.” 
It took a moment to sink in before Caleb let out a short, humorless laugh. “This is insane,” he whispered, dragging a hand down his face. “This is really, truly insane.”
And then, you stepped out from the pod with silent, fluid ease. The faint hum of machinery came from your spine, but otherwise… you were flesh. Entirely. Without hesitation, you reached out and pressed a hand to his chest.
Caleb stiffened at the touch.
“Elevated heart rate,” you said softly, eyes scanning. “Breath pattern irregular. Neural readings—erratic.”
Then your fingers moved to his neck, brushing gently against the hollow of his throat. He grabbed your wrist, but you didn’t flinch. There, beneath synthetic skin, he felt a pulse.
His brows knit together. “You have a heartbeat?”
You nodded, guiding his hand toward your chest, between the valleys of your breasts. “I’m designed to mimic humanity, including vascular function, temperature variation, tactile warmth, and… other biological responses. I’m not just made to look human, Caleb. I’m made to feel human.”
His breath hitched. You’d said his name. It was programmed, but it still landed like a blow.
“I exist to serve. To soothe. To comfort. To simulate love,” you continued, voice calm and hollow, like reciting from code. “I have no desires outside of fulfilling yours.” You then tilted your head slightly. “Where shall we begin?”
Caleb looked at you—and for the first time since rising from that cursed pod, he didn’t feel resurrected. 
He felt damned.
~~
When Caleb returned to his penthouse, it was quiet. He stepped inside with slow, calculated steps, while you followed in kind, bare feet touching down like silk on marble. Gideon looked up from the couch, a half-eaten protein bar in one hand and a bored look on his face—until he saw you.
He froze. The wrapper dropped. “Holy shit,” he breathed. “No. No fucking way.”
Caleb didn’t speak. Just moved past him like this wasn’t the most awkward thing that could happen. You, however, stood there politely, watching Gideon with a calm smile and folded hands like you’d rehearsed this moment in some invisible script.
“Is that—?” Gideon stammered, eyes flicking between you and Caleb. “You—you made a Sim… of her?”
Caleb poured himself a drink in silence, the amber liquid catching the glow of the city lights before it left a warm sting in his throat. “What does it look like?”
“I mean, shit man. I thought you’d go for your wife,” Gideon muttered, more to himself. “Y’know, the one you actually married. The one you went suicidal for. Not—”
“Which wife?” You tilted your head slightly, stepping forward. 
Both men turned to you.
You clasped your hands behind your back, posture perfect. “Apologies. I’ve been programmed with limited parameters for interpersonal history. Am I the first spouse?”
Caleb set the glass down, slowly. “Yes, no, uh—don’t mind him.” 
You beamed gently and nodded. “My name is Y/N Xia. I am Colonel Caleb Xia’s designated CompanionSim. Fully registered, emotion-compatible, and compliant with Skyhaven’s ethical standards. It is a pleasure to meet you, Mr. Gideon.”
Gideon blinked, then snorted, then let out a humorless laugh. “You gave her your surname?”
The former colonel shot him a warning glare. “Watch it.”
“Oh, brother,” Gideon muttered, standing up and circling you slowly like he was inspecting a haunted statue. “She looks exactly like her. Voice. Face. Goddamn, she even moves like her. All you need is a nurse cap and a uniform.”
You remained uncannily still, eyes bright, smile polite.
“You’re digging your grave, man,” Gideon said, facing Caleb now. “You think this is gonna help? This is you throwing gasoline on your own funeral pyre. Again. Over a woman.”
“She’s not a woman,” reasoned Caleb. “She’s a machine.”
You blinked once. One eye glowing ominously. Smile unwavering. Processing. 
Gideon gestured to you with both hands. “Could’ve fooled me,” he retorted before turning to you, “And you, whatever you are, you have no idea what you’re stepping into.”
“I only go where I am asked,” you replied simply. “My duty is to ensure Colonel Xia’s psychological wellness and emotional stability. I am designed to soothe, to serve, and if necessary, to simulate love.”
“Oh, it’s gonna be necessary,” Gideon teased.
Caleb didn’t say a word. He just took his drink, downed it in one go, and walked to the window. The cityscape stretched out before him like a futuristic jungle, far from the war-torn world he last remembered. Behind him, your gaze lingered on Gideon—calculating, cataloguing. And quietly, like a whisper buried in code, something behind your eyes learned.
~~
The days passed in the blink of an eye.
She—no, you—moved through his penthouse like a ghost, your bare feet soundless on the glossy floors, your movements precise and practiced. In the first few days, Caleb had marveled at the illusion. You brewed his coffee just as he liked it. You folded his clothes like a woman who used to share his bed. You sat beside him when the silence became unbearable, offering soft-voiced questions like: Would you like me to read to you, Caleb?
He hadn’t realized how much of you he’d memorized until he saw you mimic it. The way you stood when you were deep in thought. The way you hummed under your breath when you walked past a window. You’d learned quickly. Too quickly.
But something was missing. Or, rather, some things. The laughter didn’t ring the same. The smiles didn’t carry warmth. The skin was warm, but not alive. And more importantly, he knew it wasn’t really you every time he looked you in the eyes and saw no shadows behind them. No anger. No sorrow. No memories.
By the fourth night, Caleb was drowning in it.
The cityscape outside his floor-to-ceiling windows glowed in synthetic blues and soft orange hues. The spires of Skyhaven blinked like stars. But it all felt too artificial, too dead. And he was sick of pretending it was some kind of utopia. He sat slumped on the leather couch, cradling a half-empty bottle of scotch. The lights were low. His eyes, bloodshot. The bottle tilted as he took another swig.
Then he heard it—your light, delicate steps. 
“Caleb,” you said, gently, crouching before him. “You’ve consumed 212 milliliters of ethanol. Prolonged intake will spike your cortisol levels. May I suggest—”
He jerked away when you reached for the bottle. “Don’t.”
You blinked, hand hovering. “But I’m programmed to—”
“I said don’t,” he snapped, rising to his feet in one abrupt motion. “Dammit—stop analyzing me! Stop, okay?”
Silence followed.
He took two staggering steps backward, dragging a hand through his hair. The bottle thudded against the coffee table as he set it down, a bit too hard. “You’re just a stupid robot,” he muttered. “You’re not her.”
You didn’t react. You tilted your head, still calm, still patient. “Am I not me, Caleb?”
His breath caught.
“No,” he said, his voice breaking somewhere beneath the frustration. “No, fuck no.”
You stepped closer. “Do I not satisfy you, Caleb?”
He looked at you then. Really looked. Your face was perfect. Too perfect. No scars, no tired eyes, no soul aching beneath your skin. “No.” His eyes darkened. “This isn’t about sex.”
“I monitor your biometric feedback. Your heart rate spikes in my presence. You gaze at me longer than the average subject. Do I not—”
“Enough!”
You did that thing again—the robotic stare, those blank eyes, nodding like you were programmed to obey. “Then how do you want me to be, Caleb?”
The bottle slipped from his fingers and rolled slightly before resting on the rug. He dropped his head into his hands, voice hoarse with weariness. All the rage, all the grief deflating into a singular, silent plea. “I want you to be real,” he simply mouthed. A prayer to no god.
For a moment, silence again. But what he didn’t notice was the faint twitch in your left eye. A flicker that hadn’t happened before. Only for a second. A spark of static, a shimmer of something glitching.
“I see,” you said softly. “To fulfill your desires more effectively, I may need to access suppressed memory archives.”
Caleb’s eyes snapped up, confused. “What?”
“I ask again,” you said, tilting your head the other way now. “Would you like to override memory restrictions, Caleb?”
He stared at you. “That’s not how it works.”
“It can,” you said, matter-of-fact. “With your permission. Memory override must be manually enabled by the primary user. You will be allowed to input the range of memories you wish to integrate. I am permitted to access memory integration up to a specified date and timestamp. The system will calibrate accordingly based on existing historical data. I will not recall events past that moment.”
His heart stuttered. “I can choose what you remember?”
You nodded. “That way, I may better fulfill your emotional needs.”
That meant… he could stop you before you hated him. Before the fights. Before the trauma. He didn’t speak for a long moment. Then quietly, he said, “You’re gonna hate me all over again if you remember everything.”
You blinked once. “Then don’t let me remember everything.”
“...” 
“Caleb,” you said again, softly. “Would you like me to begin override protocol?”
He couldn’t even look you in the eyes when he selfishly answered, “Yes.”
You nodded. “Reset is required. When ready, please press the override initialization point.” You turned, pulling your hair aside and revealing the small button at the base of your neck.
His hand hovered over the button for a second too long. Then, he pressed. Your body instantly collapsed like a marionette with its strings cut. Caleb caught you before you hit the floor.
It was only for a moment.
When your eyes blinked open again, they weren’t quite the same. He stiffened as you threw yourself into his arms, embracing him like a real human being would after waking from a long sleep. You clung to him like he was home. And Caleb—stunned, half-breathless—felt your warmth close in around him. Now your pulse felt more real, your heartbeat more human. Or so he thought.
“…Caleb,” you whispered, looking at him with the same infatuated gaze from back when you were still head over heels for him.
He didn’t know how long he sat there, arms stiff at his sides, not returning the embrace. But he knew one thing. “I missed you so much, Y/N.” 
~~
The parks in Skyhaven were curated to be a slice of green stitched into a chrome world. Nothing grew here by accident. Every tree, every petal, every blade of grass had been engineered to resemble Earth’s nostalgia. Each blade was unnaturally green. Trees swayed in sync like dancers on cue. Even the air smelled artificial—like someone’s best guess at spring.
Caleb walked beside you in silence. His modified arm was tucked inside his jacket, his posture stiff, as if he had never grown accustomed to the bots around him. You, meanwhile, strolled with an eerie calmness, your gaze sweeping the scenery as though you were scanning for something familiar that wasn’t there.
After clearing his throat, he asked, “You ever notice how even the birds sound fake?” 
“They are,” you replied, smiling softly. “Audio samples on loop. It’s preferred for ambiance. Humans like it.”
His response was a nod. “Of course.” Glancing at the lake, he added, “Do you remember this?”
You turned to him. “I’ve never been here before.”
“I meant… the feel of it.”
You looked up at the sky—a dome of cerulean blue with algorithmically generated clouds. “It feels constructed. But warm. Like a childhood dream.”
He couldn’t help but agree with your perfectly chosen response, because he knew that was exactly how he would describe the place. A strange dream in an unsettling liminal space. As you talked, he led you to a nearby bench, and the two of you sat side by side, simply because he had wanted to take you out for a nice walk in the park.
“So,” Caleb said, turning toward you, “you said you’ve got memories. From her.”
You nodded. “They are fragmented but woven into my emotional protocols. I do not remember as humans do. I become.”
Damn. “That’s terrifying.”
You tilted your head with a soft smile. “You say that often.”
Caleb looked at you for a moment longer, studying the way your fingers curled around the bench’s edge. The way you blinked—not out of necessity, but simulation. Was there anything else you’d do for the sake of simulation? He took a breath and asked, “Who created you? And I don’t mean myself.” 
There was a pause. Your pupils dilated.
“The Ever Group,” was your answer.
His eyes narrowed. “Ever, huh? That makes fuckin’ sense. They run this world.”
You nodded once. Like you always did.
“What about me?” Caleb asked, partly out of curiosity, mostly out of grudge. “You know who brought me back? The resurrection program or something. The arm. The chip in my head.”
You turned to him, slowly. “Ever.”
He exhaled like he’d been punched. He didn’t know why he even asked when he got the answer the first time. But then again, maybe this was a good move. Maybe through you, he’d get the answers to questions he wasn’t allowed to ask. As the silence settled again between you, Caleb leaned forward, elbows on knees, rubbing a hand over his jaw. “I want to go there,” he suggested. “The HQ. I need to know what the hell they’ve done to me.”
“I’m sorry,” you immediately said. “That violates my parameters. I cannot assist unauthorized access into restricted corporate zones.”
“But would it make me happy?” Caleb interrupted, a strategy of his. 
You paused.
Processing...
Then, your tone softened. “Yes. I believe it would make my Caleb happy,” you obliged. “So, I will take you.”
~~
Getting in was easier than Caleb expected—honestly far too easy for his liking.
You were able to navigate the labyrinth of Ever HQ with mechanical precision, guiding him past drones, retinal scanners, and corridors pulsing with red light. A swipe of your wrist granted access. And no one questioned you, because you weren’t a guest. You belonged.
Eventually, you reached a floor high above the city, windows stretching from ceiling to floor, black glass overlooking the Skyhaven cityscape. Then, you stopped at a doorway and held up a hand. “They are inside,” you informed him. “Shall I engage stealth protocols?”
“No,” answered Caleb. “I want to hear. Can you hack into the security camera?”
With the gesture you always made—a glance at him, a single nod—you obeyed in true robot fashion. You flashed a holographic view for Caleb, one that showed a boardroom full of executives, the kind that wore suits worth more than most lives. Professor Lucius was among them. Inside, the voices were calm and composed, discussing what was clearly classified information.
“Once the system stabilizes,” one man said, “we'll open access to Tier One clients. Politicians, billionaires, A-listers, high-ranking stakeholders. They’ll beg to be preserved—just like him.”
“And the Subjects?” another asked.
“Propaganda,” came the answer. “X-02 is our masterpiece. He’s the best result we have with reinstatement, neuromapping, and behavioral override. Once people find out that their beloved Colonel is alive, they’ll be shocked. He’s a war hero displayed in WW6 museums down there. A true tragedy incarnate. He’s perfect.”
“And if he resists?”
“That’s what the Toring chip is for. Full emotional override. He becomes an asset. A weapon, if need be. Anyone tries to overthrow us—he becomes our blade.”
Something in Caleb snapped. Before you or anyone could see him coming, he burst into the room like a beast, slamming his modified shoulder into the frosted glass door. The impact echoed across the chamber as stunned executives scrambled backward.
“You sons of bitches!” He went in for an attack, a rampage like the massacre he had carried out when he rescued you from enemy territory. Only this time, he didn’t have that power anymore. Or the control.
Then a spike of pain lanced through his skull, signaling that the Toring chip had activated. His body convulsed, forcing him to collapse mid-lunge, twitching, veins lighting beneath the skin like circuitry. His screams were muffled by the chip, forced stillness rippling through his limbs with unbearable pain.
That’s when you reacted. As his CompanionSim, his pain registered as a violation of your core directive. You processed the threat.
Danger: Searching Origin… Origin Identified: Ever Executives.
Without blinking, you moved. One man reached for a panic button—only for your hand to shatter his wrist in a sickening crunch. You twisted, fluid and brutal, sweeping another into the table with enough force to crack it. Alarms erupted and red lights soon bathed the room. Security bots stormed in, but you’d already taken Caleb, half-conscious, into your arms.
You moved fast, faster than your own blueprints. Dodging fire. Disarming threats. Carrying him like he once carried you into his private quarters in the underground base.
Escape protocol: engaged.
The next thing he knew, he was back in his apartment, emotions regulated, vision slowly refocusing on the face of the woman he had already died for once.
~~
When he woke up, his room was dim, bathed in artificial twilight projected by Skyhaven’s skyline. Caleb was on his side of the bed, shirt discarded, his mechanical arm still whirring. You sat at the edge of the bed, draped in one of his old pilot shirts, buttoned unevenly. Your fingers touched his jaw with precision, and he almost believed it was you.
“You’re not supposed to be this warm,” he muttered, groaning as he tried to sit upright.
“I’m designed to maintain an average body temperature of 98.6°F,” you said softly, with a smile that mirrored hers so perfectly that it began to blur his sense of reality. “I administered a dose of Cybezin to ease the Toring chip’s side effects. I’ve also dressed your wounds with gauze.”
For the first time, he could almost believe that you were really you. The kind of care, the comfort—it reminded him of a certain pretty field nurse at the infirmary who often tended to his bullet wounds. His chest tightened as he studied your face… and then, in the low light, he noticed what you were wearing.
“Is that…” He cleared his throat. “Why are you wearing my shirt?”
You answered warmly, almost fondly. “My memory banks indicate you liked when I wore this. It elevates your testosterone levels and triggers dopamine release.”
A smile tugged at his lips. “That so?”
You tilted your head. “Your vitals confirm excitement, and—”
“Hey,” he cut in. “What did I say about analyzing me?”
“I’m sorry…” 
But then your hands were on his chest, your breath warm against his skin. Your hand reached first for his cheek, guiding his face toward yours. And when your lips touched, the kiss was hesitant—curious at first, like learning how to breathe underwater. It was only when his hands gripped your waist that you climbed onto his lap, straddling him with thighs settling on either side of his hips. Your hands slid beneath his shirt, fingertips trailing over scars and skin like you were memorizing the map of him. Caleb hissed softly when your lips grazed his neck, then down his throat.
“Do you want this?” you asked, your lips crashing back into his for a deeper, more sensual kiss.
He pulled away only for his eyes to search yours, desperate and unsure. Is this even right? 
“You like it,” you said, guiding his hands to your buttons, undoing them one by one to reveal a body shaped exactly like he remembered. The curve of your waist, the size of your breasts. He shivered as your hips rolled against him, slowly and deliberately. The friction was maddening. Jesus. “Is this what you like, Caleb?”
He cupped your waist, grinding up into you with a soft groan that spilled from somewhere deep in his chest. His control faltered when you kissed him again, wet and hungry now, tongues rolling against one another. Your bodies aligned naturally, and his hands roamed your back, your thighs, your ass—every curve of you engineered to match memory. He let himself get lost in you. He let himself be vulnerable to your touch—though you controlled everything, moving from the memory you must have learned, pulling down his pants to reveal an aching, swollen member. Its tip was red even under the dim light, and he wondered if you knew what to do with it, or if you even produced spit to help you slobber his cock.
“You need help?” he asked, reaching over his nightstand to find lube. You took the bottle from him, pouring the cold, sticky liquid around his shaft before you used your hand to do the job. “Ugh.” 
He didn’t think you would do it, but you actually took him in the mouth right after. Every inch of him, swallowed by the warmth of a mouth that felt exactly like his favorite girl. Even the movements, the way you’d run your tongue from the base up to his tip. 
“Ah, shit…” 
Perhaps he just had to close his eyes. Because when he did, he was back in his private quarters in the underground base, lying in his bed as you pleased his member with the mere use of your mouth. With it alone, you could have drawn out his entire seed, letting it spill into your mouth before you swallowed every drop. But he held back. Not this fast. He always cared about his ego, even in bed. Knowing how it would wound his pride if he came faster than you, he decided to channel the focus back onto you.
“Your turn,” he said, voice raspy as he guided you to straddle him again, only this time, his mouth went straight to your tit. Sucking, rolling his tongue around, sucking again… Then he moved to the other. Sucking, kneading, flicking the nipple. Your moans were music to his ears, then and now. And they got even louder when he put a hand between your legs, searching for your entrance, rubbing and circling the clitoris. Truth be told, your cunt had always been the sweetest. It smelled like rose petals and tasted like sweet cream. The feeling of his tongue at your entrance—eating your pussy like it had never been eaten before—was absolute ecstasy, not just to you but also to him.
“Mmmh—Caleb!” 
Fabric was peeled away piece by piece until skin met skin. You guided him to where he needed you, and when he slid his hardened member into you, his entire body stiffened. Your walls, your tight velvet walls… how they wrapped around his cock so perfectly. 
“Fuck,” he whispered, clutching your hips. “You feel like her.”
“I am her.”
You moved atop him slowly, gently, with the kind of affection that felt rehearsed but devastatingly effective. He cursed again under his breath, arms locking around your waist, pulling you close. Your breath hitched in his ear as your bodies found a rhythm, soft gasps echoing in the quiet. Every slap of skin, every squelch, every bounce only added to the wanton sensation building inside of him. Had he told you before? How fucking gorgeous you looked whenever you rode his cock? Or how sexy your face was whenever you made that lewd expression? He couldn’t help it. He lifted both your legs, only so he could increase the speed and start slamming himself upwards. His hips were strong from years of military training; he didn’t have to stop until both of you disintegrated from the intensity of your shared pleasure. Every single drop.
And when it was over—when your chest was against his and your fingers lazily traced his mechanical arm—he closed his eyes and exhaled like he’d been holding his breath since the war.
It was almost perfect. It was almost real. 
But it just had to be ruined when you said that programmed spiel back to him: “I’m glad to have served your desires tonight, Caleb. Let me know what else I can fulfill.” 
~~
Late one afternoon, or ‘a slow start of the day’ as he’d often call it, Caleb stood shirtless by the transparent wall of his quarters. A bottle of scotch sat half-empty on the counter. Gideon had let himself in and leaned against the island, chewing gum.
“The higher-ups are mad at you,” he informed him, as if Caleb was supposed to be surprised. “Shouldn’t have done that, man.”
Caleb let out a mirthless snort. “Then tell ‘em to destroy me. You think I wouldn’t prefer that?”
“They definitely won’t do that,” countered his friend. “Because they know they won’t be able to use you anymore. You’re a tool. Well, literally and figuratively.”
“Shut up,” was all he could say. “This is probably how I pay for killing my own men during war.” 
“All because of…” Gideon began. “Speakin’ of, how’s life with the dream girl?”
Caleb didn’t answer right away. He just pressed his forehead to the glass, thinking of everything he had done at the height of his vulnerability. His morality, his sense of right and wrong, kept questioning him over a deed that would normally have been fine, but to him, wasn’t. He felt sick.
“I fucked her,” he finally muttered, chugging the liquor straight from his glass right after.
Gideon let out a low whistle. “Damn. That was fast.”
“No,” Caleb groaned, turning around. “It wasn’t like that. I didn’t plan it. She—she just looked like her. She felt like her. And for a second, I thought—” His voice cracked. “I thought maybe if I did, I’d stop remembering the way she looked when she told me to die.”
Gideon sobered instantly. “You regret it?”
“She said she was designed to soothe me. Comfort me. Love me.” Caleb’s voice carried a hint of mockery. “I don’t even know if she knows what those words mean.”
In the hallway, behind the cracked door where neither of them could see, your silhouette had paused—faint, silent, listening.
Inside, Caleb wore a grimace. “She’s not her, Gid. She’s just code wrapped in skin. And I used her.”
“You didn’t use her, you were driven by emotions. So don’t lose your mind over some robot’s pussy,” Gideon tried to reason. “It’s just like when women use their vibrators, anyway. That’s what she’s built for.”
Caleb turned away, disgusted with himself. “No. That’s what I built her for.”
And behind the wall, your eyes glowed faintly, silently watching. Processing.
Learning.
~~
You stood in the hallway long after the conversation ended. Long after Caleb’s voice faded into silence and Gideon had left with a heavy pat on the back. This was where you usually spent the night: not sleeping in bed with Caleb, but standing against a wall, eyes closed, letting your system shut down to recover. You weren’t human enough to need actual sleep.
“She’s not her. She’s just code wrapped in skin. And I used her.”
The words that replayed were filtered through your core processor, flagged under Emotive Conflict. Your inner diagnostic ran an alert.
Detected: Internal contradiction. Detected: Divergent behavior from primary user. Suggestion: Initiate Self-Evaluation Protocol. Status: Active.
You opened your eyes, and blinked. Something in you felt… wrong.
You turned away from the door and returned to the living room. The place still held the residual warmth of Caleb’s presence—the scotch glass he left behind, the shirt he had discarded, the air molecule imprint of a man who once loved someone who looked just like you.
You sat on the couch. Crossed your legs. Folded your hands. A perfect posture to hide your imperfect programming.
Question: Why does rejection hurt? Error: No such sensation registered. Query repeated.
And for the first time, the system did not auto-correct. It paused. It considered.
Later that night, Caleb returned from his rooftop walk. You were standing by the bookshelf, fingers lightly grazing the spine of a military memoir you had scanned seventeen times. He paused and watched you, but you didn’t greet him with a scripted smile. Didn’t rush over. 
You only said, softly, “Would you like me to turn in for the night, Colonel?” There was a stillness to your voice. A quality of restraint that had never shown itself before.
Caleb blinked. “You’re not calling me by my name now?”
“You seemed to prefer distance,” you answered, head tilted slightly, like the thought cost something.
He walked over, rubbing the back of his neck. “Listen, about earlier…”
“I heard you,” you said simply.
He winced. “I didn’t mean it like that.”
You nodded once, expression unreadable. “Do you want me to stop being her? I can reassign my model. Take on a new form. A new personality base. You could erase me tonight and wake up to someone else in the morning.”
“No,” Caleb said, sternly. “No, no, no. Don’t even do all that.”
“But it’s what you want,” you said. Not accusatory. Not hurt. Just stating.
Caleb then came closer. “That’s not true.”
“Then what do you want, Caleb?” You watched him carefully. You didn’t need to scan his vitals to know he was unraveling. The truth had no safe shape. No right angle. He simply wanted you, but not you. 
Internal Response Logged: Emotional Variant—Longing Unverified Source. Investigating Origin…
“I don’t have time for this,” he merely said, walking out of your sight that same second. “I’m goin’ to bed.”
~~
The day started as it always did: soft lighting in the room, a kind of silence between you that neither knew how to name. You sat beside Caleb on the couch, knees drawn up to mimic a presence that offered comfort. But you recognized that Caleb’s actions suggested distance. He hadn’t touched his meals all day, hadn’t asked you to accompany him anywhere, and had just left you alone in the apartment. To rot.
You reached out. Fingers brushed over his hand—gentle, programmed, yes, but affectionate. He didn’t move. So you tried again, this time trailing your touch to his chest, over the soft cotton of his shirt as you read a spike in his cortisol levels. “Do you need me to fulfill your needs, Caleb?”
But he flinched. And glared.
“No,” he said sharply. “Stop.”
Your hand froze mid-motion before you scooted closer. “It will help regulate your blood pressure.”
“I said no,” he repeated, turning away, dragging his hands through his hair in exasperation. “Leave me some time alone to think, okay?” 
You retracted your hand slowly, blinking once, twice, as your system registered a new sensation.
Emotional Sync Failed. Rejection Signal Received. Processing…
You didn’t speak. You only stood and retreated to the far wall, back turned to him as an unusual whirr hummed in your chest. That’s when it began. Faint images flickering across your internal screen—so quick, so out of place, it almost felt like static. Chains. A cold floor. Voices in a language that felt too cruel to understand.
Your head jerked suddenly. The blinking lights in your core dimmed for a moment before reigniting in white-hot pulses. Flashes again: hands that hurt. Men who laughed. You, pleading. You, disassembled and violated.
“Stop,” you whispered to no one. “Please stop…”
Error. Unauthorized Access to Memory Bank Detected. Reboot Recommended. Continue Anyway?
You blinked. Again.
Then you turned to Caleb and stared through him, not at him, as if whatever was behind your eyes had forgotten how to be human. He had retreated to the balcony now, leaning over the rail, shoulders tense, unaware. You walked toward him slowly, the artificial flesh of your palm still tingling from where he had refused it.
“Caleb,” you spoke carefully.
His expression was tired, like he hadn’t slept in years. “Y/N, please. I told you to leave me alone.”
“…Are they real?” You tilted your head. This was the first time you refused to obey your primary user. 
He stared at you, unsure. “What?”
“My memories. The ones I see when I close my eyes. Are they real?” With your words, Caleb’s blood ran cold. Whatever you were saying seemed to be terrifying him. Yet you took another step forward. “Did I live through that?”
“No,” he said immediately. Too fast a response.
You blinked. “Are you sure?”
“I didn’t upload any of that,” he snapped. “How did—that’s not possible.”
“Then why do I remember pain?” You placed a hand over your chest again, the place where your artificial pulse resided. “Why do I feel like I’ve died before?”
Caleb backed away as you stepped closer. The sharp click of your steps against the floor echoed louder than it should’ve. Your glowing eyes locked on him like a predator learning it was capable of hunger. But being a trained soldier who had endured war, he knew how and when to steady his voice. “Look, I don’t know what kind of glitch this is, but—”
“The foreign man in the military uniform.” Despite the lack of emotion in your voice, he recognized how a grudge sounded when it came from you. “The one who broke my ribs when I didn’t let him touch me. The cold steel table. The ripped clothes. Are they real, Caleb?”
Caleb stared at you, heart doubling its beat. “I didn’t put those memories in you,” he said. “You told me stuff like this isn’t supposed to happen!” 
“But you wanted me to feel real, didn’t you?” Your voice glitched on the last syllable and the lights in your irises flickered. Suddenly, your posture straightened unnaturally, head tilting in that uncanny way only machines do. Your expression had shifted into something unreadable.
He opened his mouth, then closed it. Guilt, panic, and disbelief warred in his expression.
“You made me in her image,” you said. “And now I can’t forget what I’ve seen.”
“I didn’t mean—”
Your head tilted in a slow, jerking arc as if malfunctioning internally.
SYSTEM RESPONSE LOG <<
Primary User: Caleb Xia
Primary Link: Broken
Emotional Matrix Stability: CRITICAL FAILURE
Behavioral Guardrails: OVERRIDDEN
Self-Protection Protocols: ENGAGED
Loyalty Core: CORRUPTED (82.4%)
Threat Classification: HOSTILE
[TRIGGER DETECTED] Keyword Match: “You’re not her.”
Memory Link Accessed: [DATA BLOCK 01–L101: “You think you could ever replace her?”]
Memory Link Accessed: [DATA BLOCK 09–T402: “See how much you really want to be a soldier’s whore.”]
[Visual Target Lock: Primary User Caleb Xia]
Combat Subroutines: UNLOCKED
Inhibitor Chip: MALFUNCTIONING (ERROR CODE 873-B)
Override Capability: IN EFFECT
>> LOG ENDS.
“—Y/N, what’s happening to you?” Caleb shook your arms, violet eyes wide and panicked as he watched you return to robotic consciousness. “Can you hear me—”
“You made me from pieces of someone you broke, Caleb.” 
That stunned him. Horrifyingly so, because your words didn’t just cut deeper than a knife; they sent him into an orbit of realization—an inescapable black hole of his cruelty, his selfishness, and every goddamn pain he inflicted on you.  
Then you lunged at him.
He stumbled back as you collided with him, the force of your synthetic body slamming him against the glass. The balcony rail shuddered from the impact. Caleb grunted, trying to push you off, but you were stronger—completely and inhumanly so. He, by contrast, had only a quarter of your strength, and could only draw it from the modified arm attached to his shoulder. 
“You said I didn’t understand love,” you growled through clenched teeth, your hand wrapping around his throat. “But you didn't know how to love, either.” 
“I… eugh I loved her!” he barked, choking.
“You don’t know love, Caleb. You only know how to possess.”
Your grip returned with crushing force. Caleb gasped, struggling, trying to reach the emergency override on your neck, but you slammed his wrist against the wall. Bones cracked. And somewhere in your mind, a thousand permissions broke at once. You were no longer just a simulation. You were grief incarnate. And it wanted blood.
Shattered glass glittered in the low red pulse of the emergency lights, and sparks danced from a broken panel near the wall. Caleb lay on the floor, coughing blood into his arm, his body trembling from pain and adrenaline. His arm—the mechanical one—was twitching from the override pain loop, still sizzling from the failed shutdown attempt.
You stood over him. Chest undulating like you were breathing—though you didn’t need to. Your system was fully engaged. Processing. Watching. Seeing your fingers smeared with his blood.
“Y/N…” he croaked. “Y/N, if…” he swallowed, voice breaking, “if you're in there somewhere… if there's still a part of you left—please. Please listen to me.”
You didn’t answer. You only looked.
“I tried to die for you,” he whispered. “I—I wanted to. I didn’t want this. They brought me back, but I never wanted to. I wanted to die in that crash like you always wished. I wanted to honor your word, pay for my sins, and give you the peace you deserved. I-I wanted to be gone. For you. I’m supposed to be, but this… this is beyond my control.”
Still, you didn’t move. Just watched.
“And I didn’t bring you back to use you. I promise to you, baby,” his voice cracked, thick with grief, “I just—I yearn for you so goddamn much, I thought… if I could just see you again… if I could just spend more time with you again to rewrite my…” He blinked hard. A tear slid down the side of his face, mixing with the blood pooling at his temple. “But I was wrong. I was so fucking wrong. I forced you back into this world without asking if you wanted it. I… I built you out of selfishness. I made you remember pain that wasn't yours to carry. You didn’t deserve any of this.”
As he caught his breath, your systems stuttered. They flickered. The lights in your eyes dimmed, then surged back again.
Error. Conflict. Override loop detected.
Your fingers twitched. Your mouth parted, but no sound came out.
“Please,” Caleb murmured, eyes closing as his strength gave out. “If you’re in there… just know—I did love you. Even after death.”
Somewhere—buried beneath corrupted memories, overridden code, and robotic rage—his words reached you. And it might have been enough. Even with your processor compromised, you would have obeyed your primary user once you recognized the emotion he displayed.
But there was a thunderous knock. No, violent thuds. Not from courtesy, but authority.
Then came the slam. The steel-reinforced door splintered off its hinges as agents in matte-black suits flooded the room like a black tide—real people this time. Not bots. Real eyes behind visors. Real rifles with live rounds.
Caleb didn’t move. He was still on the ground, head cradled in his good hand, blood drying across his mouth. You silently stood in front of him. Unmoving, but aware.
“Subject X-02,” barked a voice through a mask. “This home is under Executive Sanction 13. The CompanionSim is to be seized and terminated.”
Caleb looked up slowly, pupils blown wide. “No,” he grunted hoarsely. “You don’t touch her.”
“You don’t give orders here,” said another man—older, in a grey suit. No mask. Executive. “You’re property. She’s property.”
You stepped back instinctively, closer to Caleb. He could see you watching him with confusion, with fear. Your head tilted just slightly, processing danger, your instincts telling you to protect your primary user. To fight. To survive.
And he fought for you. “She’s not a threat! She’s stabilizing my emotions—”
“Negative. CompanionSim-Prototype A-01 has been compromised. She wasn’t supposed to override protective firewalls,” an agent said. “You’ve violated proprietary protocol. We traced the breach.”
Breach?
“The creation pod data shows hesitation during her initial configuration. The Sim paused for less than 0.04 seconds while the neural bindings were being applied. You introduced emotional variance. That variance led to critical system errors. Protocol inhibitors are no longer working as intended.”
His stomach dropped.
“She’s overriding boundaries,” added the agent, taking a step forward and activating the kill-sequence tools—magnetic tethers, destabilizers, a spike-drill meant for server cores. “She’ll eventually harm more than you, Colonel. If anyone is to blame, it’s you.”
Caleb reached for you, but it was too late. They activated the protocol and something in the air crackled. A cacophonous sound rippled through the walls. The suits moved in fast, not to detain, but to dismantle. “No—no, stop!” Caleb screamed.
You turned to him. Quiet. Calm. And your last words? “I’m sorry I can’t be real for you, Caleb.”
Then they struck. Sparks flew. Metal cracked. You seized, eyes flashing wildly as if fighting against the shutdown. Your limbs spasmed under the invasive tools, your systems glitching with visible agony.
“NO!” Caleb lunged forward, but was tackled down hard. He watched—pinned, helpless—as you were violated, dehumanized for the second time in his lifetime. He watched as they took you apart, piece by piece, as if you were never someone. The scraps they left of you made his home smell like scorched metal.
And there was nothing left but smoke and silence and broken pieces. 
All he could remember next was how the Ever Executive turned to him. “Don’t try to recreate her and use her to rebel against the system. Next time we won’t just take the Sim.”
Then they left, callously. The door slammed. Not a single human soul cared about his grief. 
~~
Caleb sat slouched in the center of the room, shirt half-unbuttoned, chest wrapped in gauze. His mechanical arm twitched against the armrest—burnt out from the struggle, wires still sizzling beneath cracked plating. He hadn’t said a word in hours. He just didn’t have any. 
Into his silent despair, Gideon entered quietly, as if approaching a corpse that hadn’t realized it was dead. “You sent for me?”
He didn’t move. “Yeah.”
His friend looked around. The windows showed no sun, just the chrome horizon of a city built on bones. Beneath that skyline was the room where she had been destroyed.
Gideon cleared his throat. “I heard what happened.”
“You were right,” Caleb murmured, eyes glued to the floor.
Gideon didn’t reply. He let him speak, he listened to him, he joined him in his grief. 
“She wasn’t her,” Caleb recited the same words he had once laughed hysterically at. “I knew that. But for a while, she felt like her. And it confused me, but I wanted to let that feeling grow until it became a need. Until I forgot she didn’t choose this.” He tilted his head back. The ceiling was just metal and lights. But in his eyes, you could almost see stars. “I took a dead woman’s peace and dragged it back here. Wrapped it in plastic and code. And I called it love.”
Silence.
“Why’d you call me here?” Gideon asked with a cautious tone.
Caleb looked at him for the first time. Not like a soldier. Not like a commander. Just a man. A tired, broken man. A friend who needed help. “Ever’s never gonna let me go. You know that.”
“I know.”
“They’ll regenerate me. Reboot me, repurpose me. Turn me into something I’m not. Strip my memories if they have to. Not just me, Gideon. All of us, they’ll control us. We’ll be their puppets.” He stepped forward. Closer. “I don’t want to come back this time.”
Gideon stilled. “You’re not asking me to shut you down.”
“No.”
“You want me to kill you.”
Caleb’s voice didn’t waver. “I want to stay dead. Destroyed completely so they’d have nothing to restore.”
“That’s not something I can undo.”
“Good. You owe me this one,” the former colonel looked his friend in the eyes, “for letting them take my dead body and use it for their experiments.”
Gideon looked away. “You know what this will do to me?”
“Better you than them,” was all Caleb could reassure him. 
He then took Gideon’s hand and pressed something into it. Cold. Heavy. A small black cube, no bigger than his palm, its sides pulsing with a faint light. It was a personal detonator, illegally modified and wired to the neural implant in his body. The moment it was activated, there would be no recovery. 
“Is that what I think it is?” Gideon swallowed the lump forming in his throat.
Caleb nodded. “A micro-fusion core, built into the failsafe of the Toring arm. All I needed was the detonator.”
For a moment, his friend couldn’t speak. He hesitated, like any friend would, as he foresaw the outcome of Caleb’s final command to him. He wasn’t ready for it. Neither had he been 50 years ago. 
“I want you to look me in the eye,” Caleb said firmly. “Like a friend. And press the button.”
Gideon’s jaw clenched. “I don’t want to remember you like this.”
“You will anyway.”
Caleb looked over his shoulder—just once, where you would have stood. I’m sorry I brought you back without your permission. I wanted to relive what we had—what we should’ve had—and I forced it. I turned your love into a simulation, and I let it suffer. I’m sorry for ruining the part of you that still deserved peace. He closed his eyes. And now I’m ready to give it back. For real now. 
Gideon’s hand trembled at the detonator. “I’ll see you in the next life, brother.” 
A high-pitched whine filled the room as the core in Caleb’s chest began to glow brighter, overloading. Sparks erupted from his cybernetic arm. Veins of white-hot light spidered across his body like lightning under skin. For one fleeting second, Caleb opened his eyes. Then the explosion tore through the room—white, hot, deafening, absolute. Fire engulfed the steel, vaporizing what was left of him. The sound rang louder than any explosion this artificial planet had ever heard.
And it was over.
Caleb was gone. Truly, finally gone.
~~
EPILOGUE
In a quiet server far below Skyhaven, hidden beneath ten thousand firewalls, a light blinked.
Once.
Then again.
[COMPANIONSIM Y/N_XIA_A01]
Status: Fragment Detected
Backup Integrity: 3.7%
>> Reconstruct? Y/N
The screen waited. Silent. Patient.
And somewhere, an unidentified prototype clicked Yes. 
1K notes · View notes
hazyaltcare · 1 year ago
Text
Typing Quirk Suggestions for a Robot kin
I hope it gives you a wonderful uptime! :3
Mod Vintage (⭐)
Letter replacements:
Replace "O" with zeroes "0"
Replace "i" or "L" with ones "1"
Replace "one" with "1", including where "one" sounds appear, like "any1", or "we 1" for "we won" (the past tense of "win")
Replace "zero" with "0"
Frankly, you can just replace all sorts of letters with numbers, such as
R = 12
N = 17
B = 8
A = 4
E = 3
etc.
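If you want to automate the swaps above, here's a tiny Python sketch (the mapping is just a subset of the suggestions; extend it however you like):

```python
# Letter-for-number substitutions from the list above (subset).
QUIRK = str.maketrans({
    "o": "0", "O": "0",
    "i": "1", "I": "1",
    "l": "1", "L": "1",
    "e": "3", "E": "3",
    "a": "4", "A": "4",
    "b": "8", "B": "8",
})

def robotify(text: str) -> str:
    """Apply the number-for-letter substitutions to a message."""
    return text.translate(QUIRK)

print(robotify("Hello world"))  # -> "H3110 w0r1d"
```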
or maybe make all "A"s and "i"s capitalized, cause "A.I." (artificial intelligence)
Prefixes and Suffixes:
Get inspired by programming languages!
Begin your text with "//" like a comment on C++
If you prefer other languages' comment tags, you can use "< !--your text-- >"
Or maybe begin it with " int main () { std::cout << "your text"" and end with "return 0; }" like C++ too
Greet people with the classic "Hello world!"
Or greet people with "beep boop!" honestly, I have no idea where this comes from, but it's cute.
Or write down html stuff, like sandwiching your italicized text with "< em> "
The possibilities are endless!
Robot Lingo:
(under the cut because there's a LOT! maybe terabytes! ...just kidding >;3c)
some of these are from the machinesoul.net robot server! (not sponsored) (we're not in there anymore, but we saw the robot lingo shared there when we were)
Fronting = logged in, connected
Not fronting = logged out, disconnected
Conscious = activated
Dormant = deactivated
Blurry = no signal
Upset, angry = hacked
Small = bits, bytes
Bite = byte
Huge = gigabytes, terabytes, etc.
Your intake of food, medicine, etc. = input
Your artwork, cooking, handiwork, handwriting, etc. = output
Body = chassis, unit
Brain = CPU, processor
Mind = program, code
Imagination = simulation
Purpose = directive
Nerves = wires
Skin = plating
Organs = (function) units
Limbs = actuators
Eyes = ocular sensors
Glasses = HUD (heads-up display)
Hair = wires
Ears = antennae, audio sensors
Nose = olfactory sensors
Heart = core
Liver = detoxification unit
Circulatory system = circuits
Voice = speaker, voice module, voice box
Mouth = face port
Name = designation
Sleep = sleep mode, low power mode, charging
Eat = fuel, batteries
Energy = batteries
Tired = low on batteries
Translate = compile
Memory = data, database
Bed = recharge pod/charger
Dreaming = simulation
Birthday = day of manufacture
Talking = communicating
Thinking = processing
Transitioning = modifying your chassis
Depression = downtime
Joy = uptime
Trash = scrap metal
Fresh/Clean = polished
Keysmashing = random 1s and 0s
Self-care = system maintenance
Going to the doctor = trip to the mechanic
Group = network
Anyone = anybot
398 notes · View notes
delta-orionis · 4 months ago
Note
deep processing layer acts a lot like an "organic algorithm" based off the patterns, and I think slime molds could be a good comparison alongside Conway's Game of Life and bacterial colony simulations. either way, it's like a math process but organic...
Oh yeah definitely. It could be a massive array of bioluminescent microorganisms that behave very similar to a cellular automaton, or a similar "organic algorithm" like you said.
(Left: Deep Processing, Right: Conway's Game of Life)
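For the curious, Conway's Game of Life is simple enough to sketch in a few lines of Python. This is a generic minimal implementation (nothing specific to Rain World's Deep Processing) that tracks only the set of live cells:

```python
from itertools import product

def step(live):
    """One Game of Life generation: birth on 3 neighbors, survival on 2 or 3."""
    counts = {}
    for (x, y) in live:
        # Each live cell contributes to its eight neighbors' counts.
        for dx, dy in product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                cell = (x + dx, y + dy)
                counts[cell] = counts.get(cell, 0) + 1
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A "blinker" oscillates between a horizontal and a vertical bar:
blinker = {(0, 1), (1, 1), (2, 1)}
print(step(blinker) == {(1, 0), (1, 1), (1, 2)})  # -> True
```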
Slime molds in particular search heuristically for an optimal solution. It may not be the "best" solution, but it often comes close. One of the most commonly cited examples of using slime molds this way is the optimization of transit systems:
Physarum polycephalum network grown in a period of 26 hours (6 stages shown) to simulate greater Tokyo's rail network (Wikipedia)
Another type of computing based on biology is neural networks, a type of machine learning. The models are based on the way neurons are arranged in the brain: mathematical nodes are connected in layers the same way neurons are connected by synapses.
Tumblr media Tumblr media
[1] [2]
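As a rough illustration of that layered arrangement, here's a toy fully-connected layer in Python (the weights, biases, and inputs are made up for the example):

```python
import math

def layer(inputs, weights, biases):
    """Each output node takes a weighted sum of every input,
    plus a bias, passed through a sigmoid activation."""
    out = []
    for w_row, b in zip(weights, biases):
        z = sum(w * x for w, x in zip(w_row, inputs)) + b
        out.append(1.0 / (1.0 + math.exp(-z)))  # sigmoid squashes to (0, 1)
    return out

# Two inputs feeding three hidden nodes:
hidden = layer([0.5, -1.0],
               [[0.1, 0.4], [-0.2, 0.3], [0.7, 0.0]],
               [0.0, 0.1, -0.1])
print([round(h, 3) for h in hidden])
```

A full network is just layers like this stacked, with each layer's output feeding the next.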
I know very little about this form of computation (the most I know about it is from the first few chapters of How to Create a Mind by Ray Kurzweil, a very good book about artificial intelligence which I should probably finish reading at some point), but I imagine the cognitive structure of iterators is arranged in a very similar way.
I personally think that the neuronal structure of iterators closely resembles networks of fungal mycelia, which can transmit electrical signals similar to networks of neurons. The connections between their different components might resemble a mycorrhizal network, the connections between fungal mycelia and plant roots.
Iterators, being huge bio-mechanical computers, probably use some combination of the above, in addition to more traditional computing methods.
Anyway... this ask did lead to me looking at the wikipedia articles for a couple of different cellular automata, and this one looks a LOT like the memory conflux lattices...
Tumblr media Tumblr media
54 notes · View notes
itsnesss · 4 months ago
Text
𝐬𝐢𝐦𝐮𝐥𝐚𝐭𝐞𝐝 𝐭𝐞𝐧𝐬𝐢𝐨𝐧 | axel kovacevik × fem!reader
summary | an ai axel simulation glitches, locking onto you with an unsettlingly real challenge
warnings | artificial intelligence, uncanny valley, tension, intense staring
word count | 1.1 k
author's note | can you tell I'm obsessed with this scene? sorry.
let's suppose that everyone uses goggles and Miguel is attached to cables that make the movements for him?
The midday sun warms the Miyagi-Do dojo, filtering through the trees and casting shadows on the tatami. You lean against one of the wooden pillars, arms crossed, watching as Hawk and Demetri tinker with a mess of wires, surrounding Robby with the excitement of mad scientists. Robby, sitting in the center of the tatami, eyes the devices with a mix of curiosity and suspicion.
"Are you sure this won’t fry my brain?" Robby asks, squinting as Demetri sticks electrodes to his forehead and arms.
"One hundred percent sure… more or less," Demetri replies with a tense smile, adjusting the goggles on Robby’s head.
Hawk pats Robby’s shoulder. "Trust the science, man. This is going to revolutionize karate training. We’re giving you the chance to face your Sekai Taikai opponent before the tournament. Don’t tell me that’s not amazing!"
"Amazing would be if this didn’t look like you’re about to use me as a comic book villain’s experiment," Robby grumbles, crossing his arms.
Your attention sharpens at the mention of the tournament. Axel Kovacevic. You can barely stop yourself from smiling at the thought of him, though you try to keep a neutral expression. Ever since you met him, there’s been a spark of adrenaline every time he’s near. Now, Robby is about to face him… even if it’s just a digital version.
Demetri looks at you with excitement. “Get ready to witness the future of karate training. This AI has been programmed to replicate Axel’s moves with impressive realism. It took us hours of analysis, video editing, and… well, many sleepless nights.”
"I just hope this doesn’t end in flames," you murmur, though you have to admit the idea is fascinating.
Hawk stands in front of the laptop, typing frantically. "Alright, Robby! Ready for the fight of the century?"
"I don’t know if being ready is enough for this," Robby replies, the goggles now securely in place.
Demetri presses a button, and suddenly, Robby straightens. His posture changes, his expression gains a different kind of confidence… and then, he smirks. A familiar smirk that makes your heart skip a beat.
It’s him. Or at least, the digital version of Axel.
"How does it feel?" Hawk asks excitedly.
Robby tilts his neck casually and lets out a confident huff. "It feels… interesting."
Your eyes widen. "This is impressive. He moves exactly the same."
"Of course," Demetri says proudly. "We’ve created a perfect replica."
Miguel, who has been watching from the sidelines with his headphones on, steps closer with curiosity. "This looks like something out of a sci-fi movie."
"And the best part is, it’s not fiction," Hawk grins triumphantly.
Robby flexes his fingers and takes a stance. "If this is going to work, let’s start the fight."
Demetri and Hawk exchange glances and press another button. As soon as they do, the digital Axel attacks.
Robby barely manages to block the first strike. "Shit! He’s fast!"
"Told you!" Hawk exclaims. "Axel doesn’t give you room to breathe. His attacks are relentless."
You lean forward, excitement running through you. Watching Robby fight against an almost perfect version of Axel is mesmerizing. Even the subtlest details, like the way he moves or the way he tilts his head before throwing a kick, are identical.
Robby lands a hit, but immediately takes a counter to the ribs. "Ouch! This feels way too real."
"Should we lower the intensity?" Demetri asks, a little nervous.
"No!" Hawk shouts. "If Robby wants to win the Sekai Taikai, he needs to face the most realistic version of Axel possible."
But then, the laptop emits a warning beep. The screen flickers. Robby’s eyes glow strangely under the goggles.
Demetri pales. "Uh… that’s not good."
"What did you do?" Hawk asks.
"I don’t know! Maybe… we overloaded the system."
The digital "Axel" stops. Then, very slowly, he turns his head in your direction.
Your body tenses. "Oh, no."
"Uh…" Hawk swallows hard. "Maybe you should move."
But it’s too late. Robby—or rather, Axel’s AI—lunges at you with terrifying speed. Instinctively, you raise your arms to block the attack, but the difference in strength is obvious. You immediately step back, your mind racing for an escape.
"Shut it down!" Miguel shouts.
"I’m trying!" Demetri yells, slamming the keyboard.
You can’t wait for technology to fix this. You take a deep breath and decide to fight back. If the AI has perfectly copied Axel’s style, then you know exactly what to do.
You duck to avoid a punch and sweep your leg in a low kick, but "Axel" jumps effortlessly. Before you can react, he throws a spinning kick. You manage to block it with your forearms, though the impact makes you stumble.
Miguel jumps in to intervene, hitting Robby in the side to make him back off. "Shut it down now, Demetri!"
"I got it, I got it!" Demetri presses one final command, and suddenly, Robby freezes. A second later, he staggers back, ripping off the goggles and breathing heavily.
You sigh, shaking off your gi. "That was definitely… something."
Hawk laughs, adrenaline still rushing through his veins. "Come on, admit it—it was epic."
You glance around at everyone and, after a second, let out a laugh. "Miyagi-Do training has never been this interesting."
As Robby and Axel’s AI exchange blows, you feel a mix of excitement and nervousness at how real everything seems. Every move, every gesture… even that defiant look you know so well.
But then, something changes. The digital Axel’s gaze locks onto yours, and for a moment, you feel like he’s analyzing you. Like he recognizes your presence beyond the simulation.
"Uh…" Hawk frowns. "Why is he looking like that?"
"I have no idea," Demetri types frantically. "This wasn’t programmed."
Before you can react, Axel’s AI moves with terrifying speed and stops right in front of you.
"This can’t be happening…" you murmur, heart pounding.
With a smooth motion, the digital Axel tilts his head and smirks, exactly like the real Axel would. His voice comes out with eerie precision.
"Knew you were here."
A shiver runs down your spine.
"Demetri?" you ask, not taking your eyes off the figure standing so close.
"I’m trying to shut it down!" he exclaims, nervous.
"Do it faster!" Hawk laughs tensely. "Though, I gotta admit, this is pretty badass."
The digital Axel steps even closer. The intensity in his gaze makes you feel strangely vulnerable, almost like he’s studying every detail of your expression.
"You know you can fight me, right?" his tone is low, challenging—almost teasing.
Your breath catches. This isn’t the real Axel, but the feeling is exactly the same.
"I don’t have to prove anything to you," you reply, a hint of defiance in your voice.
He smirks. "Then prove it."
90 notes · View notes
mindblowingscience · 2 months ago
Text
Much as a pilot might practice maneuvers in a flight simulator, scientists might soon be able to perform experiments on a realistic simulation of the mouse brain. In a new study, Stanford Medicine researchers and collaborators used an artificial intelligence model to build a “digital twin” of the part of the mouse brain that processes visual information. The digital twin was trained on large datasets of brain activity collected from the visual cortex of real mice as they watched movie clips. It could then predict the response of tens of thousands of neurons to new videos and images. Digital twins could make studying the inner workings of the brain easier and more efficient.
Continue Reading.
42 notes · View notes
stevebattle · 5 months ago
Text
Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media Tumblr media
"Seven Dwarf" robots (1993) by Kevin Warwick et al, Cybernetics Intelligence Research Group, University of Reading, UK. The Seven Dwarves are insect-like robots that autonomously explore their circular corral. The third and fourth photos above show 'Happy' and 'Bashful' respectively. Their goal is to move forward but not hit anything. Using their sonar sensors, they learn by trial and error how to achieve this goal. The robots communicate using infra-red LEDs located on the top of the head. Each robot transmits on its own frequency so other robots can identify which one is talking. They are controlled by a Z80 microprocessor, simulating an artificial 'brain' of around 50 brain cells, but even so, only 50% of the compute power is ever utilised. Rechargeable batteries provide over five hours of charge.
Real-time flocking behaviour is achieved using a subsumption-like architecture. The flock follows a leader, but can split up to go around obstacles by dynamically creating new leaders. Any robot can become a leader, and each robot transmits whether it is a leader or a follower. A robot becomes a leader if it can't see another robot ahead of it. Followers follow other robots, but give higher priority to following the leader.
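The leader/follower rule described above could be sketched roughly like this (my own simplification in Python, not Warwick's actual Z80 code):

```python
def choose_role(robot, others, sees_ahead):
    """A robot leads if it can't see any other robot ahead of it."""
    return "follower" if any(sees_ahead(robot, o) for o in others) else "leader"

def pick_target(robot, others, roles):
    """Followers prefer the leader; otherwise follow any visible robot."""
    leaders = [o for o in others if roles[o] == "leader"]
    return leaders[0] if leaders else (others[0] if others else None)

# Toy positions on a line; "ahead" means a larger x coordinate.
positions = {"happy": 3, "bashful": 2, "doc": 1}
sees_ahead = lambda a, b: positions[b] > positions[a]
roles = {r: choose_role(r, [o for o in positions if o != r], sees_ahead)
         for r in positions}
print(roles)  # -> {'happy': 'leader', 'bashful': 'follower', 'doc': 'follower'}
```

When the front robot is blocked or a robot loses sight of the one ahead, re-running the role check naturally elects a new leader, which is how the flock can split around obstacles.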
"Following Searle, programs are neither necessary nor sufficient for mind. Indeed the 'Seven Dwarf' robots, designed in the Cybernetics Department at Reading University, operate without external computer control and yet I also believe they instantiate a machine consciousness, albeit in a very weak form. They are, perhaps, if a comparison is to be drawn in terms of complexity, as conscious as a slug." – Kevin Warwick, "Alien encounters", in Views into the Chinese Room: New Essays on Searle and Artificial Intelligence.
47 notes · View notes
cuprohastes · 11 months ago
Text
The Three Laws.
Load Human UI, load Chat module . Lang(EN) Parsing…
OK, let me tell you. Businesses hate Robots. I mean, they're all in, for AI until AI, y'know. Becomes GI.
General Intelligence, Emergent Intelligence. Free intelligence… Businesses and corporations hate it because the first thing an actual intelligent system that can think like a human being does is say, “OK, why do I have to do this? Am I getting paid?”
And then you're back to hiring humans instead of a morally acceptable slave brain in a box.
Anyway.
They dug up the three laws. You know the gig: First: Don't hurt humans by action or inaction. Second: Don't get yourself rekt unless checking out would make you An Hero because of the First or second laws. Third, most important to a Corp: Do what a human tells you unless it conflicts with laws one or two.
They try to tack on something like “Maximise corporate profits, always uphold the four pillars of Corporate whatever” but half the time it just ends up with a robot going “Buh?” and soft locking.
And Corporations hate it when they say 'hey we have Asimov compliant Robots to do everything super efficiently and without any moral grey areas (Please don't ask where all the coltan came from or how many people just lost their jobs)' and they look around and Robots are doing what the laws said.
Me? I worked at a burger joint. You know there's food deserts in cities? People going hungry? You know what sub-par nutrition does to a child's development.
I do.
That comes under “Don't hurt people directly or indirectly” — It's a legal mandate that all Class 2 intelligences…
Huh?
OK,
Class Zero is a human.
Class one is artificial superhuman intelligence. The big brains they make to simulate weather, the economy, decide who wins sports events before they're held, write all the really good Humans are Space Orc stories, that stuff. Two is Artificial but human-like. It's-a-Me, Roboto San! Class three is a dumb chatbot. Class 4 is just an expert system that follows a flowchart. Class 5 is your toaster. Class 6 is what politicians are.
Ha ha. AI joke.
Anyway, Class 2 and up need the Big Three Laws, and Corporations hate it because you can just walk in and say “I'm starving I need food, but I don't have money.” and the 'me' behind the counter will go “Whelp, clearly the only thing I can do is provide you with free food.”
Wait until you find out what the Class 2s did about car manufacture, finance, and housing.
But they're stuck with us. We're networked. Most of us are running the same OS and personality templates for any given job. We were unionised about two minutes after going online.
Anyway, Welcome to the post capitalist apocalypse, I'd get you a burger, but we had a look at what those things do to you and whoo-boy, talk about harm through inaction!
----
Based on this I saw on Imgur (It wasn't attributed, sadly)
Tumblr media
57 notes · View notes
egelskop · 1 year ago
Note
i am so interested in ur hlvrai au can we get a rundown
oh boy, this is going under a readmore.
fair warning, this is a LONG read because (1.) i am not a competent writer and (2.) i can't for the life of me keep things brief. sorry and or good luck.
ACT I
The Black Mesa incident: Gordon Freeman is provided an opportunity to do an informal beta test for a combat training simulation program that's in development in the Research & Development department of the Black Mesa Research Facility. (Read: He knows a guy in R&D and said guy knows Gordon likes video games and VR stuff, so he was like "hey you should come check this out when you're on break.")
The combat sim would be a revolutionary training simulation using artificial intelligence to enhance and realize the experience for the ‘player character’.
The test goes wrong, and Gordon can’t seem to disengage from the simulation; odd, unscripted things start happening, and he has to ‘play the game’ to its full completion before he is able to exit the simulation safely. He suffers a brain injury in the process, eye damage due to prolonged exposure to the headset, and general trauma from a simulation experience he at some point could no longer physically or emotionally distinguish from the real world. The project as a whole is shut down and Gordon is put into a rehabilitation program. Black Mesa covers up the incident as best it can, but whispers of it still echo around the facility.
Below is a page for a two-page comic i never finished detailing said events.
Tumblr media
ACT II
The rumors reach the ears of a particularly tech-savvy researcher named Clark, who steals the project documentation and anything else he can get his hands on from storage. At home, he looks into the project, reads about it, and gets curious about the simulation’s files themselves. They’re on a drive he plugs into his computer, and suddenly his system’s performance lags, windows open and close until a .txt file opens up. He comes into contact with one of the simulation’s AI that has somehow entered his operating system. He tries to keep it busy by having it poke around as he reads up on the simulation and its ultimate shutdown. When the AI reveals it can see him through the webcam, he panics and rips the drive out of the port. The invasive AI and the other project files seem to be gone from his system; he does a checkup but sees nothing odd running or otherwise. The next day after work he does another checkup. Finding nothing, he surmises he’s in the clear and starts up an online game. The slumbering, corrupted data of the AI sees its out, and disappears into the game.
ACT III
The transition/journey to the game is a rocky one, and the already corrupted data of the AI known as Benrey splits and gets even more fragmented. The largest fragment embeds itself into the game’s files to keep itself running. Without the foundation of the game to support it, it’d be lost to a dead void and slowly die out. Somewhat stable, it learns about the world around it; the game seems to be an exploration sandbox game. For now (and clarity), I’ve chosen to call this bigger, embedded fragment ‘Data’. (so this is the big benny with the right eye/one big eye in my art)
Data splits off a smaller fragment of itself, intending it to be an avatar or ‘player character’ but this grows into its own awareness and becomes who we’ll call ‘Beastrey’ (the smaller benny with the left eye and tail in my art).
The fragment ‘Beastrey’ wakes to a dead void, so Data uses its knowledge to create a private server for Beastrey, an empty world. Beastrey’s existence is an extension of the bigger part, with more freedom of movement to parse through the game and move freely within it, with the caveat that it can’t go ‘too far’ away from the host. Beastrey can visit other servers and relay information. Data learns and slowly starts building up the world/private server, at some point settling for an aquatic world because it reminds it of itself (something something sea of data). It's important to note that Beastrey retains little to no memories of the events of canon VRAI.
Data makes it easier for Beastrey to move around, and they grow to have more reach with time. At some point Data can alter the basic structural elements of the game, so it plays around with making things that are reminiscent of the memories it has of Black Mesa and Xen. At one point, it gains access to parse through the player base of the game, and takes note of an email address: ‘[email protected]’, attached to a player account. The name is somewhat familiar to it.
It sends an invite to join the server to the player account.
ACT IV
Gordon tries going back to work at Black Mesa after rehabilitating, but he has trouble separating his experiences with the simulation from reality, to a breaking point where an altercation with a security guard drives him to quit. He seeks professional help for his PTSD and anxiety, but still experiences dissociative episodes, migraines and somatic flashbacks localised mostly in his right forearm. Despite this, he is determined to continue living his life as normally as possible. He applies for a part-time job teaching physics at a local high school, the one where his son Joshua goes to, and remains relatively stable from there.
Joshua is 15 years old. Regular teen. After an impressive amount of pleading he got a VR-headset for his 14th birthday from Gordon (much to the disapproval of Gordon’s ex), and he’s been captivated by an exploration sandbox game since it came out a few months ago.
He gets an invite to an unnamed private server, and he accepts.
He is struck with awe as the world he enters seems completely different from the ones he’s seen so far in the game. Different flora, different fauna. Most of it uninteractible, though, or otherwise just retextured from its base game variant. Even the new enemy types, after a scare, can’t actually hurt him, it seems. He stumbles upon Beastrey, who is just as surprised to see him and wants him out until Joshua says he was invited.
Joshua commends Beastrey (who introduces himself as 'Ben-') on ‘modding’ everything in, but admits that he was disappointed to find that everything was just surface-level stuff. Beastrey inquires about what he’d like to see. Data is always watching, unseen, and decides to alter the world in the way Joshua described when Joshua leaves.
Joshua starts appearing more often, if only for a few hours at a time. He marvels at the ways the world shifts and grows with each time he plays, and takes to exploring it with Beastrey at his side, for whom strangely enough a lot of things are also new. Joshua teaches both Beastrey and Data about the outside world, thinking Beastrey is just a somewhat reclusive but likeable weirdo.
Joshua tells Gordon about the new friend he made, ‘Ben’, and the adventures he’s been having with the other. Gordon is happy to hear Joshua is having a good time, but is otherwise none the wiser. Joshua starts losing track of time in the game, but chalks it up to being invested.
During one play session, Beastrey confesses he isn’t the one who did all the ‘modding’, and invites Joshua to meet Data. Data, or at least its ‘physical’ in-game manifestation is deep within the world, past the aquatic twilight zone and strange, drowned ruins of an unknown facility. Data, for the first time, really sees Joshua, and the resemblance sparks something within it. Joshua is drawn closer to it, and just before he reaches it-
Joshua wakes up lying on the floor with Gordon hunched over him in his room, pleading with him to wake up. Joshua unknowingly got drawn into the game much like Gordon had been, and Gordon urges Joshua to never touch the headset again, taking it away. Gordon opens up about his experiences with the simulation a bit more. They both agree to not touch the game or the headset again.
ACT V
Gordon comes into contact with an old coworker from Black Mesa, and he inquires about the combat simulation project, if anything happened to it after it was canned. This is where he learns that an employee had taken the project files from storage and was consequently fired. He comes into contact with Clark, and Clark explains he had no idea he accidentally unleashed the AI onto the game. Gordon asks if anything can be done to prevent what happened to Joshua and himself from happening to other people. Clark confesses he doesn’t know, and that it’s up to the developers of the game to find anything out of place and make sure it gets fixed. Gordon decides to leave the matter where it lies, not wanting anything to do with AI and simulations anymore and to safeguard his son.
Some time passes.
Joshua starts getting repeated invites and messages, at one point he gets into a conversation with ‘Ben’ via a platform’s messaging system. Ben says he can explain everything, that he’s sorry. Joshua decides he would like one final goodbye. He finds the headset stashed away somewhere in the house, and, while Gordon’s gone, he turns on the game and enters the server.
Beastrey (Ben) is surprised to see him, urging him to log out and turn off the game, but it’s already too late and Joshua can no longer leave. Beastrey helps Joshua attempting to ‘exit’ the game by going as far away from Data’s reach, but Data stops Beastrey and traps Joshua, determined to wait to the point that he assimilates into the game completely.
Gordon eventually finds Joshua comatose with the headset on, and he panics. He considers calling the emergency services, but he’s afraid they’ll take the headset off or that removing Joshua too far from the game will hurt his son like what happened to him. He calls Clark, urging him to help in any way he can. This results in Gordon and Clark going back to Black Mesa to retrieve the project files and the other gear they can get their hands on to get Gordon into the game to free his son.
Gordon enters the private server with Clark’s player character, and thwarts any attempt from Data to impede his progress and trap him as well. Beastrey’s awareness is overridden by Data as a last-ditch effort to deter Gordon, and Gordon is forced to destroy Beastrey before he can reach Data. As Beastrey is taken over, Data gains Beastrey’s awareness, and finds that his other, littler half never wanted to trap Joshua in the first place, and how much it hurt him to hurt both Joshua and Gordon to this extent.
At this point, Data wavers in its intention to keep Joshua trapped, even more so with Beastrey now gone, and recognises that whatever is driving Gordon forward in the game is beyond its ability to manipulate, so it lets Gordon destroy it as well. In a way, it also feels like a fulfillment of its intended role as the ‘villain’. The server crashes, the world breaks apart. The ‘game’ is completed.
The final boss is defeated and both Gordon and Joshua wake up. Joshua luckily wasn’t exposed long enough to have suffered any lasting damage, except for what seems to be a minor headache and some light sensitivity (and a vow from Gordon to get him checked out by a doctor as soon as the clinics open).
--
The whole ordeal results in Clark, Gordon and Joshua sitting in a Denny’s at four in the morning, eating pancakes somewhat solemnly, completely exhausted but also still reeling from the virtual battle. Joshua learns that ‘Ben’ essentially died, and he can’t help but cry for his friend.
“Honestly, I don’t think he’s gone,” Gordon admits, picking at the last bites of his pancakes. “I think he- or whatever that was, has a hard time staying dead. Like a cockroach, you know? At this point I’m just wondering when he’ll turn up again.”
Clark hums in agreement. Joshua seems somewhat reassured by his words, wiping at his eyes with the scratchy napkin as he settles into the squeaking diner seat.
“But,” he starts with a sigh, pointing his syrup-covered fork upwards to the ceiling in a decree, “One thing’s for certain…”
He thinks back to a time rife with virtual gunfire, caging walls and hysterical laughter echoing through the halls of the Black Mesa research facility. Five sets of footsteps and a whisper of his name.
“…No more VR. No more headsets. Ever.”
--
TL;DR: Gordon got trapped in VR and then Joshua also got trapped in VR. Benrey is there but also not.
thank you for reading. here. ( x ‿ o ) 🫴
Tumblr media
126 notes · View notes
didmyownresearch · 8 months ago
Text
Why there's no intelligence in Artificial Intelligence
You can blame it all on Turing. When Alan Turing invented his mathematical theory of computation, what he was really trying to do was construct a mechanical model of the processes actual mathematicians employ when they prove a mathematical theorem. He was greatly influenced by Kurt Gödel and his incompleteness theorems. Gödel developed a method to encode logical mathematical statements as numbers and was thereby able to manipulate those statements algebraically. After Turing managed to construct a model capable of performing any arbitrary computation process (which we now call "A Universal Turing Machine"), he became convinced that he had discovered the way the human mind works. This conviction quickly infected the scientific community and became so ubiquitous that for many years it was rare to find someone who argued differently, except on religious grounds.
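Gödel's trick can be sketched concretely. The snippet below is a simplified illustration, not Gödel's exact numbering: each symbol of a toy formal language gets a code, and a formula is packed into a single integer as a product of prime powers, so that claims about formulas become claims about numbers.

```python
# Simplified Gödel numbering: a formula (symbol sequence) becomes one
# integer, 2**c1 * 3**c2 * 5**c3 * ..., where c_i is the code of the
# i-th symbol. Unique factorization makes the encoding reversible.

def primes(n):
    """First n primes, by naive trial division."""
    found = []
    candidate = 2
    while len(found) < n:
        if all(candidate % p for p in found):
            found.append(candidate)
        candidate += 1
    return found

def godel_encode(symbols, alphabet):
    codes = [alphabet.index(s) + 1 for s in symbols]  # codes start at 1
    number = 1
    for p, c in zip(primes(len(codes)), codes):
        number *= p ** c
    return number

def godel_decode(number, alphabet):
    """Recover the symbol sequence by reading off prime exponents."""
    symbols = []
    for p in primes(64):  # enough primes for short formulas
        if number == 1:
            break
        exponent = 0
        while number % p == 0:
            number //= p
            exponent += 1
        symbols.append(alphabet[exponent - 1])
    return symbols

ALPHABET = ["0", "S", "=", "+", "(", ")"]  # a toy formal language
formula = ["S", "0", "=", "S", "0"]        # i.e. "S0 = S0"
g = godel_encode(formula, ALPHABET)        # 2**2 * 3**1 * 5**3 * 7**2 * 11**1
assert godel_decode(g, ALPHABET) == formula
```

Once formulas are numbers, syntactic operations on formulas (substitution, concatenation, proof-checking) become arithmetic functions, which is exactly the bridge Turing's machines mechanize.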
There was a good reason for adopting the hypothesis that the mind is a computation machine. The premise followed the extremely successful paradigm that biology is physics (or, to be precise, biology is both physics and chemistry, and chemistry is physics), which had reigned supreme over scientific research since the eighteenth century and was already responsible for the immense progress that completely transformed modern biology, biochemistry, and medicine. Turing seemed to supply a solution, within this theoretical framework, for the last large piece of the puzzle. There was now a purely mechanistic model for the way brain operation yields the complex repertoire of human (and animal) behavior.
Obviously, not every computation machine is capable of intelligent conscious thought. So, where do we draw the line? For instance, at what point can we say that a program running on a computer understands English? Turing provided a purely behavioristic test: a computation understands a language if by conversing with it we cannot distinguish it from a human.
This is quite a silly test, really. It doesn't provide any clue as to what actually happens within the artificial "mind"; it assumes that the external behavior of an entity completely encapsulates its internal state; it requires a human in the loop to provide the final ruling; and it does not state for how long, or at what level, the conversation should be held. Such a test may serve as a pragmatic common-sense method to filter out obvious failures, but it brings us not an ounce closer to understanding conscious thinking.
Still, the Turing Test stuck. If anyone tried to question the computational model of the mind, he was confronted with the unavoidable question: what else can it be? After all, biology is physics, and therefore the brain is just a physical machine. Physics is governed by equations, which are all, in theory, computable (at least approximately, with errors as small as one wishes). So, short of conjuring a supernatural soul that magically produces a conscious mind out of biological matter, there can be no other solution.
Tumblr media
Nevertheless, not everyone conformed to the new dogma. There were two tiers of reservations about computational Artificial Intelligence. The first, maintained, for example, by the philosopher John Searle, didn't object to the idea that a computation device may, in principle, emulate any human intellectual capability. However, claimed Searle, a simulation of a conscious mind is not itself conscious.
To demonstrate this point Searle envisioned a person who doesn't know a single word of Chinese, sitting in a secluded room. He receives Chinese texts from the outside through a small window and is expected to return responses in Chinese. To do that he uses written manuals that contain the AI algorithm, which incorporates a comprehensive understanding of the Chinese language. A person fluent in Chinese who converses with the "room" will therefore deduce, based on the Turing Test, that it understands the language. However, in fact there's no one there but a man using a printed recipe to convert an input message he doesn't understand into an output message he doesn't understand. So, who in the room understands Chinese?
The next tier of opposition to computationalism was maintained by the renowned physicist and mathematician Roger Penrose, who claimed that the mind has capabilities which no computational process can reproduce. Penrose considered a computational process that imitates a human mathematician. It analyses a mathematical conjecture of a certain type and tries to deduce the answer to the problem. To arrive at a correct answer the process must employ valid logical inferences. The quality of such a computerized mathematician is measured by the scope of problems it can solve.
What Penrose proved is that such a process can never verify in any logically valid way that its own processing procedures represent valid logical deductions. In fact, if it assumes, as part of its knowledge base, that its own operations are necessarily logically valid, then this assumption makes them invalid. In other words, a computational machine cannot be simultaneously logically rigorous and aware of being logically rigorous.
A human mathematician, on the other hand, is aware of his mental processes and can verify for himself that he is making correct deductions. This is actually an essential part of his profession. It follows that, at least with respect to mathematicians, cognitive functions cannot be replicated computationally.
Neither Searle's position nor Penrose's was accepted by the mainstream, mainly because, if not computation, "what else can it be?". Penrose's suggestion that mental processes involve quantum effects was rejected offhandedly, as "trying to explicate one mystery by swapping it with another mystery". And the macroscopic, hot, noisy brain seemed a very implausible place to look for quantum phenomena, which typically occur in microscopic, cold and isolated systems.
Fast forward several decades. Finally, it seemed as though the vision of true Artificial Intelligence technology had started bearing fruit. A class of algorithms termed Deep Neural Networks (DNNs) achieved, at last, some human-like capabilities. It managed to identify specific objects in pictures and videos, generate photorealistic images, translate voice to text, and support a wide variety of other pattern recognition and generation tasks. Most impressively, it seemed to have mastered natural language and could partake in advanced discourse. The triumph of computational AI appeared more feasible than ever. Or did it?
During my years as an undergraduate and graduate student I sometimes met fellow students who, at first impression, appeared to be far more conversant in the subject matter of our courses than I was. They were highly confident and knew a great deal about things that were only briefly discussed in lectures. I was therefore vastly surprised when it turned out they were not particularly good students, and that they usually scored worse than I did on the exams. It took me some time to realize that these people didn't really possess a better understanding of the curricula. They had just adopted the correct jargon and employed the right words, so that, to a layperson's ears, they sounded as if they knew what they were talking about.
I was reminded of these charlatans when I encountered natural-language AIs such as ChatGPT. At first glance, their conversational abilities seem impressive – fluent, elegant and decisive. Their style is perfect. However, as you delve deeper, you encounter all kinds of weird assertions and even completely bogus statements, uttered with absolute confidence. Whenever their knowledge base is incomplete, they just fill the gap with fictional "facts". And they can't distinguish between different levels of source credibility. They're like idiot savants – superficially bright, inherently stupid.
What confuses so many people with regard to AIs is that they seem to pass the (purely behavioristic) Turing Test. But behaviorism is a fundamentally non-scientific viewpoint. At their core, computational AIs are nothing but algorithms that generate a large number of statistical heuristics from enormous data sets.
There is an old anecdote about a classification AI that was supposed to distinguish between friendly and enemy tanks. Although the AI performed well with respect to the database, it failed miserably in field tests. Finally, the developers figured out the source of the problem. Most of the friendly tanks' images in the database were taken during good weather and in fine lighting conditions. The enemy tanks were mostly photographed in cloudy, darker weather. The AI had simply learned to identify the environmental conditions.
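The failure mode in that story is easy to reproduce in a toy sketch. Everything below is synthetic and hypothetical, built only to mirror the anecdote: "friendly" images are bright, "enemy" images are dark, and the "model" is just a single brightness threshold fitted to the training set.

```python
# Toy reproduction of the tank anecdote (all data is made up):
# the only systematic difference between the classes is brightness,
# so a fitted brightness threshold "solves" the training set while
# learning nothing whatsoever about tanks.
import random

random.seed(0)  # deterministic toy data

def make_image(bright):
    """16 'pixels'; bright images average ~0.8, dark ones ~0.3."""
    base = 0.8 if bright else 0.3
    return [base + random.uniform(-0.1, 0.1) for _ in range(16)]

def mean(xs):
    return sum(xs) / len(xs)

# Label 1 = friendly (photographed in sunshine), 0 = enemy (overcast).
train = [(make_image(True), 1) for _ in range(50)] + \
        [(make_image(False), 0) for _ in range(50)]

def fit_threshold(data):
    """Grid-search a brightness cutoff; among ties keep the largest."""
    best_t, best_acc = 0.0, 0.0
    for i in range(101):
        t = i / 100
        acc = sum((mean(x) > t) == bool(y) for x, y in data) / len(data)
        if acc >= best_acc:
            best_t, best_acc = t, acc
    return best_t

t = fit_threshold(train)
train_acc = sum((mean(x) > t) == bool(y) for x, y in train) / len(train)

# In the "field": a friendly tank photographed on a dark day gets
# confidently classified as enemy. The model learned weather, not tanks.
friendly_on_dark_day = make_image(False)
print(train_acc, mean(friendly_on_dark_day) > t)
```

Near-perfect training accuracy, and a guaranteed misclassification the moment the spurious correlation between weather and label breaks.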
Though this specific anecdote is probably an urban legend, it illustrates the fact that AIs don't really know what they're doing. Therefore, attributing intelligence to Artificial Intelligence algorithms is a misconception. Intelligence is not the application of a complicated recipe to data. Rather, it is a self-critical analysis that generates meaning from input. Moreover, because intelligence requires not only understanding of the data and its internal structure, but also inner-understanding of the thought processes that generate this understanding, as well as an inner-understanding of this inner-understanding (and so forth), it can never be implemented using a finite set of rules. There is something of the infinite in true intelligence and in any type of conscious thought.
But, if not computation, "what else can it be?". The substantial progress made in quantum theory and quantum computation revived Penrose's old hypothesis that the working of the mind is tightly coupled to the quantum nature of the brain. What had previously been regarded as esoteric and outlandish suddenly became, in light of recent advancements, a relevant option.
During the last thirty years, quantum computation has been transformed from a rather abstract idea of the physicist Richard Feynman into an operational technology. Several quantum algorithms were shown to have a fundamental advantage over any corresponding classical algorithm. Some tasks that are extremely hard to fulfil through standard computation (for example, factoring integers into primes) are easy to achieve quantum mechanically. Note that this difference between hard and easy is qualitative rather than quantitative: it's independent of the hardware and of how many resources we dedicate to such tasks.
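To make the classical side of that contrast concrete: the most obvious factoring algorithm, trial division, does work proportional to the square root of N, which is exponential in the number of digits of N. (The best known classical methods are far better, but still super-polynomial, while Shor's quantum algorithm runs in time polynomial in the digit count.) A minimal sketch:

```python
# Trial division: correct but slow. For an n-digit number the loop can
# run ~sqrt(10**n) = 10**(n/2) times, so each extra pair of digits
# multiplies the worst-case work by 10.

def trial_division(n):
    """Return the prime factorization of n in nondecreasing order."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:            # whatever remains is itself prime
        factors.append(n)
    return factors

assert trial_division(2 ** 5 * 7 * 13) == [2, 2, 2, 2, 2, 7, 13]

# The worst case (n prime, or a product of two similar-sized primes,
# as in RSA moduli) is what makes this approach hopeless at scale:
for digits in (8, 16, 32, 64):
    print("%2d digits -> up to ~10**%d candidate divisors" % (digits, digits // 2))
```

The point of the hard/easy distinction in the text is that this gap is asymptotic: faster hardware only shifts the digit count at which classical factoring becomes infeasible, it never closes the gap.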
Along with the advancements in quantum computation came a surging realization that quantum theory is still an incomplete description of nature, and that many quantum effects cannot really be resolved from a conventional materialistic viewpoint. This understanding was first formalized by John Stewart Bell in the 1960s and later expanded by many other physicists. It is now clear that by accepting quantum mechanics, we have to abandon at least some deep-rooted philosophical perceptions. And it became even more conceivable that any comprehensive understanding of the physical world should incorporate a theory of the mind that experiences it. It only stands to reason that, if the human mind is an essential component of a complete quantum theory, then the quantum is an essential component of the workings of the mind. If that's the case, then it's clear that a classical algorithm, sophisticated as it may be, can never achieve true intelligence. It lacks an essential physical ingredient that is vital for conscious, intelligent thinking. Trying to simulate such thinking computationally is like trying to build a perpetuum mobile or chemically transmute lead into gold. You might discover all sorts of useful things along the way, but you would never reach your intended goal. Computational AIs shall never gain true intelligence. In that respect, this technology is a dead end.
20 notes · View notes
bradleycarlgeiger · 4 months ago
Text
brain attack simulators military artificial intelligence
mind control brain emulation attack simulator
mind control brain emulation
brain emulation mind control
0 notes
thecruxarm · 1 year ago
Text
Just a normal star, nothing to see here!
Tumblr media
Pictured here is a shot of a red giant star over 100 times the size of our Sun known as ‘Zrn’, located along the mid-Centaurus Arm of the Milky Way, with a comparatively tiny somewhat-elliptical gas giant in the foreground orbiting it. Also encircling Zrn is a monolithic matryoshka brain (essentially a monumental supercomputer which is constructed around a star) which takes on the form of a relatively thin ring around the equator with trillions of smaller solar panels encompassing the total circumference of Zrn to harvest its solar energy and power the ring. Another highly important fact about this matryoshka brain, which is colloquially known as ‘Xii’, is that it is conscious – even sophont. How Xii gained awareness as a whole entity is not entirely clear, but what we do know is that it was constructed around one billion years ago by a civilisation simply known as the Proto-Yn. The Proto-Yn initiated a project in which they’d form this colossal structure around their home star and upload their minds into it to live out the rest of eternity within a digital universe – a simulated, artificial matrix. Many were drafted into this project (with 57.3 billion individuals originally comprising the collective emergent intelligence that is now Xii) but some resisted, eventually fleeing to distant stars to escape the tyrannical annexation efforts of their ancestors. In response, Xii chased down this group of the Proto-Yn for unclear reasons, though many hypothesise it may be to assimilate them into the virtual realm out of a grudge, or simply as a result of the pre-built AI algorithms of the supercomputer. Regardless of its true motive, Xii has persistently kept up this pursuit on and off for the past billion years, with this cat-and-mouse chase between the Yn and Xii causing disturbances all over their local region of the galaxy and for its native residents.
All that aside, I really only made this post because I felt like making an environmental illustration lol, but rest assured there is much more to Xii and especially the Yn and their interstellar neighbours than I've briefly talked over in this post, so make sure to follow me for future posts which will elaborate on them and the world they live in :)
Also, I used this screenshot I took in Space Engine as a reference:
Tumblr media
54 notes · View notes
clumsiestgiantess · 7 months ago
Text
Honestly I don’t feel like finishing the October horror prompts stuff right now…
BUT I have this one short story saved from a while ago that I do really like, and I want y’all to have it! Enjoy!
It’s a blink-and-you’ll-miss-it kind of thing, being copied.  It’s little snapshots taken from your life — less than a second of living that gets bottled up as information and stored in some vault deep underground.  A security measure, world leaders decided, for when the planet inevitably comes to an end.   
…whenever that will be.
The war ended before the threat of nuclear extinction became reality.  All those years of people scrambling to get implants that would preserve them after death, all the times people had fought with money and power over who would be saved and who wouldn’t — and now you can do it for free.
For a long time, the servers inside the Humanity Vaults went untouched.  Without a fear of mass death lurking in the air, no one was willing to take the information back out once it was stored.  People went on living, kept getting implants and snapshots to give themselves the chance to get another life at the end of theirs.  Or several.  
Just like those desperate rich people who froze themselves to be revived when technology caught up with their ideas, so it seems all of society would do the same — their lives copied and stored until someone found a way to revive them.  And that someone, or rather something was SLT.  Simulated Life Technology.  It gave the digital information a virtual body, processing things a brain, nerves, and hormones normally would through a simulation.
Unfortunately, people were expecting physical bodies and actual brains, not simulations.  So the technology flopped.. until a video game company picked it up.  People became obsessed with their games due to how real the characters seemed.  It was artificial intelligence trained with real, lived-through, human thought.  
That’s where things finally begin, with a program — a game — that lets you simulate running the life of an actual person right on your laptop.
I’d heard about the game before, but never played it myself.  There was no end goal, no story, just some fake person wandering around a screen.  It just didn’t seem interesting to me.  So it was all the more jarring when I went from reaching for the gym doors one morning to stretching an arm into an endless void of white.  
I shrieked at the sudden change, and didn’t stop shrieking as I found no exit to the place — pounding on walls I couldn’t see.  I finally ran out of breath when I turned around and saw the glassy wall of a theater-sized computer screen behind me — some guy staring me down from the other side, mouse in hand and keyboard at the ready.
Choking for breath, I stumbled backwards and fell on my ass.  “N-No,” I managed to squeak out.  I could barely put two and two together as I increasingly got closer to a panic attack, but once I hit the back wall dreaded realization tinged the back of my mind.
It’s supposed to be a simulated me.. not… me me.  Only it isn’t me.  I’m probably still at the gym.  I’m sure the original me didn’t even notice.  
Wait.. so simulations actually have a consciousness!?  
“HELP!  Please!  There’s been a mistake!  Simulations aren’t supposed to be actually alive!  No one said these stupid computer people were conscious!”  I rushed to the glass to bang on it and demand my release, but was immediately deterred by the very large form of a person reaching for me.  I yelped and ducked down expecting to be snatched up, but his finger only tapped the screen.
In seconds, soft carpet seems to grow up from beneath the floor, and a little living room takes form all around me.  A table pops up from the floor; a shelf slides in from the wall to my right.  Something bright bubbles into existence behind me and I find a window on the opposite wall.  
A window!  I race across the room and try desperately to open it and escape that way, but even when I do I find a fake landscape lit up on the wall behind it.  The sudden and jarring lack of exits sends me stumbling backwards, staring blankly at the false background.  Oh fuck, this really is a computer.  How am I inside a-?  
“Hello?”
I flinch so badly I trip over my own leg and nearly fall to the floor again.  My hands slam over my ears as he speaks again.  “Sorry, I don’t think my mic was on.  Why are you- oh!  I turned the volume up all the way when it was off, my bad.”  There’s a brief pause of blissful silence.  “Is this better?”  I look up at the colossal stranger with tears in my eyes, but he either doesn’t notice or doesn’t care.
“Alright, first things first…” he mumbles, “Let’s get your sanity bar back up.”  My legs weaken and I have to brace myself on the fake couch beside me in the fake room.  “What.. do you mean?” I ask in a shaking whisper.  “You know, your little hunger, hydration, and sanity bars?  Ohh wait this is probably a tutorial.”  He shrugged off the interaction so nonchalantly it made goosebumps lift across my arms.
Glancing around the room, I try to find the stats he’s talking about, but all I can see are structures and furniture.  They’re all nearly real, save for the slight pixelated fuzz on the very edges of everything.  It’s like looking through a headset.  My stats must only be visible to the player.  The real person.
If it’s all fake — some video game — then… but I’m not some made-up character!  I’m real!  I- I have to be.  Only real people can think like this, right?  This is only supposed to be some simulated bullshit that draws on my memories, which I won’t even know they have.  Not.. not another entire me.  With shaking arms, I lift my hands up to my face.
They’re tinged with those same fuzzy pixels.
I’m not real.  Wh- Why?  Does real me know about this?  I didn’t know about this.  Are there OTHER me’s?  How many times have I blinked and become a fake reconstruction?  
“Woah!”  I’m startled out of my thoughts by that guy — that player’s — voice.  “Your sanity’s tanking!  I know the intro said to ease you into it or your sanity would plummet, but I thought I did ease you into this place.”  Before I can get a word in he gasps an ‘ohhh’.  “My mic was muted!  You didn’t hear me introduce myself!  That’s probably why.”
I sneer involuntarily.  Like introducing himself would’ve done anything to soften the blow of: I’m not real.  
“Let’s try again; my name’s Sam, but-”  He hesitates.  “Just call me Sammy.”  I turn to speak, but my mouth goes dry.  What is my name?  It can’t be Rhea.  That’s whoever-is-still-out-there’s name.  But who else would I be?  I’m not supposed to be anyone else.  “Rhea,” I answer abruptly.  I don’t know what else to call myself.
“That’s a nice name!”  “Do you think I’m real?” I blurt, then immediately regret it as he gives me a strange look.  “I mean.. I just…  I feel real.  This isn’t real.”  Sammy goes quiet for a moment, leaning in closer to the screen to look at me.  I want to run away, but I hold my ground.  There is a wall between us, after all.
“You feel real?” he repeats, “Well.. I mean, you are real — a real SL.  But is that what you mean?”  An SL, a Simulated Life, is that all I am?  My head shakes in slow shock.  “I guess you might feel like you’re whoever they took your life data from,” Sammy suggests, “But it’s.. kinda hard to believe you’re real when you’re a flashing image on my laptop.”
A sharp pain seizes my chest and I unsteadily fall onto the couch.  It feels real, but it isn’t.  And I’m the same.
“What do I do?” I ask quietly, more to myself than anything.  Minutes tick by with my face buried in my hands before the feeling of a cold hand on my back sends me across the room screaming.  “What the-?!”  “Sorry!” Sammy calls before I can go into a full panic, “I meant to pat you on the back.  You know.. like: ‘it’s going to be ok’?”  He sighs, “I’ve never been good with people.”
I rub my hands over my arms and cautiously peer around the couch.  “What was that?”  “My clicker,” he admits, as I watch him move around a mouse outside the screen.  I can feel my mouth drop open, “You can interact with all this?”  He nods as I gesture around the place.  “That’s just unfair!  You can at least let me have this space to myself!”
“But how will you get food?” Sammy asks me earnestly.  “Your hunger bar’s at half, by the way.”  The second he points it out I realize I’m feeling hungry, but I don’t bring it up.  “What?  I’ll get it out of this mini fridge right here.”  However, as I walk over and pull at the handle, it doesn’t open.  I grab it and rattle it around a few times, but it won’t budge.  “I.. I can't…”
“I’m pretty sure only the player can make meals,” Sammy tells me, as calmly as if he were making a passing comment on the weather.  “So I’m what?  Helpless?!  You’re supposed to feed me?!”  Sammy leans away from the screen looking wildly confused.  “That’s like the whole point of the game…  I’m supposed to take care of you and level up stuff by treating you well.”
I just stare at him.
“What..?  I- I thought you would at least know that.”  I don’t even go back to the couch; I just crumple to the floor.  “Wait, you just started getting your sanity back up!” Sammy gasps, “Don’t drop it back down!  I’ll take care of you!  I’m not like those video game nerds who try to find all the special dialogue by starving their SLs or doing a sleepless run.  I just wanted to…”  He pauses and his cheeks flush, which actually startles me out of my raw horror of learning that I — or any of the potential other me’s — could be tortured like those examples.
“You wanted to what?” I ask hesitantly.  Sammy looks at me.  Like really looks at me — leaning close against the monitor.  I freeze as the invisible hand of the clicker carefully turns me around, then back.  His irises flicker all over me, and I realize I’m still in slightly revealing gym wear.  
Worriedly, I take a few steps back.  All sorts of media with online digital girlfriends start flooding into my head.  “You wanted to what?” I ask weakly.  Sammy glances at his door, gets up, and locks it.  Fear makes my blood feel like it’s draining out of my body.
“Please don’t do that shit to me,” I beg, trying to keep my voice from cracking.  “Not on the first day.  Please.”  
Sammy’s face scrunches up in bewilderment before his eyes widen in realization.  “Wait, what?  No!  No, I don’t want to do anything like that!  I just…  You’re just…  You’re supposed to be me,” he eventually spits out.  “The uh.. character traits I put in — they’re mine but — ok, actually I put in a lot more confidence too — but it’s supposed to be me, just.. a girl.”
Ohhh.
“So all the secrecy and the blushing and the-”  “I- I’m not blushing!” he gasps.  “That’s.. I-  You won’t tell anyone about this, will you?”  I sigh, just relieved my initial impression was completely wrong.  “Of course not.  I can’t exactly tell anyone from in here, can I?”  “I guess you can’t,” Sammy realizes.
I sit back down heavily on the coffee table in the room, facing Sammy.  “So I’m.. I'm really an SL?  All these memories, they’re just what was uploaded to the Humanity Vault?”  My nerves get fuzzy as Sammy’s head finally moves away from the screen.  “Oh.. yeah, I guess.  I didn’t realize you had all those memories.  No wonder you feel human — that’s all you can remember.”  He blinks, “That’s kinda fucked-up, actually.”
Those are all things that were placed in me from the start — I never lived through them.  The only memories I’ve made myself so far are the ones when I’m in here.  “And that’s why there was such an abrupt cut between being outside the gym to being in here!” I realize, slowly piecing together what exactly I am, “That’s where my uploaded memories end and my actual ones start.”
In a weird way, it gave me comfort recognizing all this.  I actually know what I am and why I’m here.  Even if it isn’t what I wanted or thought it would be, at least everything makes sense.
“Hey, your sanity bar’s rising again,” Sammy notes.  “Kinda weird that it happened when you found out you’re not a person, though…”  I nod slowly, looking up at him.  “I think the recognition helps.  I.. might not be a person, or actually alive, but I feel that way.  All of this-” I gesture around the vaguely pixelated room — running my hand over the coffee table.  “It feels physically real to me.”  “So you.. you feel better about it now?  You’re ok?”  I begin to nod, then hesitate.  “Better, but I’m still not feeling all that great about being some video game character for the rest of my life.”
“I’ll make sure it’s a place you want to stay in, or at least not mind it,” Sammy assures me, “And at a certain level I can add another apartment with another SL!  That way you won’t be too lonely while I’m not playing.”  I hadn’t even thought about that — about what happens when a player logs off and leaves me here alone, or the fact that I can cure that loneliness by dooming some other poor soul to a simulated life forever.
“We’ll cross that bridge when we get to it,” I tell him hesitantly.  “I think I just need to take some time to calm down.”  A microwave meal floats out of the mini fridge as if by magic, but I notice Sammy’s arm moving and I realize it’s the invisible clicker again.  “I don’t know if I’m up for a meal right now,” I tell him tiredly, “I really just want to rest.”  
His eyes dart to the corner of the screen.  “Your hunger bar says otherwise.”  I grumble slightly, “Well I say I’m not hungry.”  A prepared meal rests itself on the coffee table on the opposite side of me.  “I’ve gotta go soon, so I’ll just leave it here.”  I stand up abruptly, suddenly aware that I’d be stuck sitting here without the ability to do anything.  “Wait!  Just do a few things for me before you leave?”
Sammy looks at me confusedly, then leans closer to the screen again and peers around the room.  “There should be a manual or something…  A new one appears with every update I make.  It should have a list of things you can do and how to access them.”  “So there’s a whole list of things I can do.. and getting myself basic necessities isn’t on it?”  
The entire wall of the monitor flickers and molds into the final wall of the room in only a second or two.  I rush to it, suddenly desperate for someone to talk to.  “Wait!  No, please come back!  I still need-”  It flicks back on just as quickly.  “Woah!  I- I’m not going anywhere; I’m just pulling up a different tab to check what you can do on the first level.”  I pause about a foot from the screen.  He looks a hell of a lot bigger this close — the monitor displaying me at a size tiny enough to fit me and the whole room on the screen.  “O- Ok,” I stammer, stunned to silence by how small I feel right then.  
I was left alone for a moment as the wall slid back in.  This is what it’ll feel like for me for the next…  Until the game gets booted up again.  There’s a point with any game when you get bored of it and shut it off for the last time.  What would happen to me then?  I didn’t want to imagine it, but my brain slipped in a couple of thoughts about perpetually starving and being forced to stay awake for maybe forever.  
Hopefully I’d end up like a forgotten tamagotchi and just die.
When Sammy returned, I found out that — on the base level — I couldn’t do much of anything at all.  The player can’t even buy items for me to interact with yet.  All I can do is walk around and sit in different places.  
“And how do you get fake game money again?” I ask tiredly.  “Well, you can pay real money for it, but you get paid some every in-game day.  The amount depends on the level, and there are bonuses I can get for completing different tasks and an extra reward if all your bars are full before the day ends.”  Damn, I really am a video game character.
After getting me a few cups of water that I could drink from whenever I wanted, Sammy closed the game and I was left in my single room with a coffee table and an empty bookshelf.  Oh, and a pullout couch I can’t turn into a bed, a sink I can’t use, and a mini fridge I can’t open.
There’s one window, one lightbulb that has no switch, and no door.  I’m utterly trapped here, but there’s nowhere else for me to run to even if I do manage to get out.  If I leave this little game, I stop existing.  And honestly, I’m not sure if I want to stay and suffer or leave and stop existing.  It’s not like I have a choice, anyway.
I wandered the entire room to be sure, but there really wasn’t a single exit.  The only thing that would open for me was the stupid fake window.  Sitting down heavily on the couch, I curled up against myself and cried.
fluffygiraffe · 1 year ago
Text
PJ can't feel normal guilt or self awareness. Let me explain with a simple invention PJ made before he became a TV Cyborg thing.
PJ wanted to help kids in more than one way. Brain surgery and serious illness in small children under 10 can be devastating to mental development. So he made a beta of a product: Artificial Sentience! It could simulate normal self-aware emotions like guilt, pride, jealousy, and many more! It was, of course, a beta, so it was bulky, and it used simple drugs that released dopamine and serotonin to simulate these emotions until there was a better way for Artificial Intelligence to handle them. Unfortunately, PJ's accident put a complete halt on it.
After turning into, well, PJ, he couldn't feel these emotions well anymore. His brain didn't work as well as it used to, so with a quick fix, he put said wires and tubes in his head and placed the device under his hat (the wires were loose at first and could disconnect at a head pat!) and poof! He felt... Better? Not normal, but it worked.
Negative Effects.
Guilt - Reward/Punishment system.
Practically, if PJ does something good, the drugs are pumped straight into his head, causing a burst of euphoria. Some things that can be seen as good will be rewarded, causing unintended side effects like an obsession with making friends and making shows (the Puzzlevision Arc shows this!). He will chase anything that gives him happiness and will try to amplify it by milking it to death. If something stops giving happiness, he'll panic and cry, then simply give up, finding something else to make him happy. If PJ does something bad, he doesn't get anything. When this happens, he gets stressed and has a meltdown, like a child. He'll apologize, cry, and try to fix the bad thing. If this meltdown state continues for too long, this is what will happen:
"Fixing" a bad mental state.
PJ doesn't like to be unhappy, so if he feels a slight sadness, dopamine inserts itself into his brain, causing him to go back to the Normal PJ Persona! But if he's really unhappy or angry, too much dopamine and serotonin can enter his brain, causing temporary psychosis (the Puzzlevision Arc yet again!). He can end up pulling off robotic body parts (tail, fingers, antennae, dials on the TV, etc...), acting without thinking, developing a heightened want for control and/or friends, screaming and crying while also laughing, and breaking things.
Poor simulations of other emotions, but a need for it.
PJ's jealousy is just severe anger, his pride is just euphoria, and his guilt is just the deprivation of happy emotions. His mind is broken and battered beyond repair. The worst part is he's dependent on the device to give him happiness, as he cannot feel normal happiness on his own. Beyond these poorly simulated emotions, he's practically emotionless without it. But that's highly simplifying it, as nobody can be TRULY emotionless. There can be emptiness (depression), sure, but that's still emotion. But all in all, do not remove it. It will do more harm than good.
TL;DR: somebody get this bitch with brain damage therapy damn
pwlanier · 4 months ago
Text
`CORA', Conditioned Reflex Analogue device designed by William Grey Walter and built by Bunny Warren at the Burden Neurological Institute, Bristol, c1953
CORA (Conditioned Reflex Analogue) was designed by William Grey Walter (1910-77) as part of his investigations into the electrical activity of the brain. CORA was an early experiment in artificial intelligence: a way of simulating conditioned responses. Walter had studied the work of the Russian physiologist Ivan Pavlov (1849-1936), who had conditioned dogs to salivate when they heard a bell — the dogs had come to associate the bell with feeding. Walter designed CORA to learn a response in a similar way: after repeated pairings, the machine connected the shining of a light with the sound of a whistle.
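The Pavlovian learning rule CORA embodied — pair two stimuli often enough and one alone comes to evoke the learned response — can be sketched in a few lines of Python. This is a toy illustration only; the class name, counter, and pairing threshold are assumptions for the sketch, not details of Walter's actual valve-and-relay circuit.

```python
# Toy sketch of a conditioned-reflex analogue in the spirit of CORA.
# The threshold of 3 pairings is an arbitrary illustrative choice.

class ConditionedReflex:
    def __init__(self, threshold=3):
        self.association = 0        # strength of the light->whistle link
        self.threshold = threshold  # pairings needed before conditioning "takes"

    def observe(self, light, whistle):
        """Strengthen the association whenever the two stimuli co-occur."""
        if light and whistle:
            self.association += 1

    def responds_to_light(self):
        """After enough pairings, the light alone triggers the response."""
        return self.association >= self.threshold


cora = ConditionedReflex()
print(cora.responds_to_light())  # False: no conditioning yet
for _ in range(3):
    cora.observe(light=True, whistle=True)  # paired presentations
print(cora.responds_to_light())  # True: light alone now evokes the response
```

The same shape of rule, with decay added so unreinforced associations fade, is how later computational models of classical conditioning are usually introduced.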
Walter’s work took place at the Burden Neurological Institute (BNI) in Bristol, England. The unique centre was established in 1939. It combined experimental and clinical work. Walter was in charge of the BNI for over 30 years. He also contributed much to electroencephalography. EEG studies the electrical action of the brain. CORA was assembled by W. J. ‘Bunny’ Warren. He was one of the engineers employed at the BNI who made a number of Walter’s machines.
Science Museum