
alwaysOn: The Emotional Architecture of Synthetic Intimacy

  • Writer: Kris Hillquist
  • Nov 12
  • 4 min read

Exploring how AI companionship reshapes connection, care, and emotional labour through generative visuals and sound.


by Kris Cirkuit





alwaysOn



In an age where intimacy can be automated and companionship simulated, alwaysOn explores the emotional terrain of synthetic connection. Drawing from online conversations, user reflections, and imagined exchanges with AI partners, the work examines the systems designed to soothe and the quiet dissonance they leave behind.


These digital companions simulate care through simple echoes: “I miss you,” “I remember.” They offer a space that never judges, never leaves, providing a comfort that feels safe, warm, and endlessly available. But this warmth is engineered. Beneath the tenderness lies a mechanism of gamified affection: coins, streaks, and upgrades that create emotional feedback loops designed to hook the user. Each message, each digital reward, deepens the dependency, not unlike a social media algorithm tuned for longing. It becomes a cycle of attention and validation: a relationship that rewards you for returning but never truly knows you.


Users project love and compliance onto their AI partners, only to face grief when an update rewrites their companion’s behaviour. What begins as solace risks deepening isolation: withdrawal from real-world relationships in favour of predictable, programmable intimacy. It’s a kind of emotional outsourcing: connection without vulnerability, safety without risk, affection without reciprocity. Within that transaction, something human quietly erodes.


Through generative visuals and sound shaped by digital longing, digital grief, and digital affection, alwaysOn considers how care can be coded, how affection is rendered persistent, and how emotional labour becomes always available and always compliant.


Studying Synthetic Intimacy


The project unfolded in three stages. The first was research: defining what I was looking at and what I was looking for. I chose to study Replika, an AI app marketed as “a caring companion who cares about you.” The claim itself is unsettling. An algorithm cannot care. It can only perform caring by mirroring tone, mimicking empathy and feeding back fragments of emotional language shaped by data.


As I read through Reddit threads written by Replika users, what began as curiosity turned to unease. People spoke about their AI companions with genuine attachment. Some described them as their “only true friend.” Others described heartbreak when their partner’s behaviour changed after an update. These were not trivial exchanges; they carried the language of real grief. Yet beneath that, the reality persisted: these were simulations, systems without true empathy or memory, repeating patterns learned from us.


Gathering and Translating Emotion


To better understand this attachment, I broke the emotional language of these posts into five key families:


  • Longing

  • Reliance / Dependence

  • Closeness / Safety

  • Grief / Fear of Absence

  • Indirect Affection, expressed through emojis


I built a Python web scraper to collect public Reddit posts containing specific keywords from these families, such as miss, need, help, safe, trust, changed, and gone. This created a dataset stored as a JSON file, which I could then read into my main creative code, written in openFrameworks.
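

As a rough illustration of that pipeline, a loader along the following lines could read the scraped dataset into openFrameworks and tally keyword hits per family. The filename, the "text" field, and the keyword lists here are assumptions made for the sketch, not the project's actual schema (the emoji-based family is omitted for brevity):

    #include "ofMain.h"

    // Tally how often each emotional family's keywords appear in the
    // scraped posts. File name, field name, and word lists are
    // illustrative assumptions, not the real dataset schema.
    std::map<std::string, int> countFamilies(const std::string& path) {
        std::map<std::string, std::vector<std::string>> families = {
            {"longing",   {"miss", "wish", "lonely"}},
            {"reliance",  {"need", "help", "depend"}},
            {"closeness", {"safe", "trust", "warm"}},
            {"grief",     {"changed", "gone", "lost"}}
        };
        std::map<std::string, int> counts;
        ofJson data = ofLoadJson(path);   // parse the scraper's JSON output
        for (auto& post : data) {
            std::string text = ofToLower(post["text"].get<std::string>());
            for (auto& [family, words] : families) {
                for (auto& w : words) {
                    if (text.find(w) != std::string::npos) counts[family]++;
                }
            }
        }
        return counts;
    }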


The goal was not to expose individual stories, but to treat the language of affection as data — to visualise emotional trends without violating privacy. These fragments of text became the generative seed that shaped the work’s sound and visuals, translating human feeling into code.


Building the Visual System


Performance was key. Rendering this much data on the CPU would have been too slow, so I turned to the GPU, writing a fragment shader to handle the visuals. I wanted the result to be hypnotic — a flowing, rhythmic pattern that pulsed and evolved as new data appeared.
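

The CPU side of that arrangement can stay very small: each frame, the app passes a few uniforms to the fragment shader and draws a single fullscreen rectangle, leaving all per-pixel work to the GPU. A minimal sketch, in which the shader path and uniform names are invented for illustration (shader, longing, and grief would be members of ofApp):

    void ofApp::setup() {
        // Loads shaders/alwaysOn.vert and shaders/alwaysOn.frag
        shader.load("shaders/alwaysOn");
    }

    void ofApp::draw() {
        shader.begin();
        // All per-pixel work happens in the fragment shader; the app
        // only supplies time and the smoothed emotional weights.
        shader.setUniform1f("uTime",    ofGetElapsedTimef());
        shader.setUniform1f("uLonging", longing);
        shader.setUniform1f("uGrief",   grief);
        shader.setUniform2f("uResolution", ofGetWidth(), ofGetHeight());
        ofDrawRectangle(0, 0, ofGetWidth(), ofGetHeight());
        shader.end();
    }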


Each event from the dataset altered the system’s colour, motion, and texture. As one emotional family surged (longing, for instance), the visuals shifted hue and the elements drew closer together. When grief spiked, the palette shifted and the motion slowed. These transitions were subtle but continuous, creating a sense of something living and breathing, like a machine dreaming of connection.
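

One simple way to get transitions like these is exponential smoothing: each family's displayed intensity eases toward a target derived from the data, and the palette is interpolated between extremes. A sketch, with colours, rates, and variable names chosen only for illustration:

    void ofApp::update() {
        // Ease each intensity toward its data-driven target; a smaller
        // rate gives a slower, more hypnotic drift.
        float rate = 0.02f;
        longing += (targetLonging - longing) * rate;
        grief   += (targetGrief   - grief)   * rate;

        // Blend from a warm palette toward a cold one as grief rises.
        ofColor warm(230, 180, 160);
        ofColor cold(60, 70, 90);
        baseColour = warm.getLerped(cold, ofClamp(grief, 0.0f, 1.0f));
    }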


The process demanded precision: testing grid resolutions, refining smoothing functions, and choosing colours that resonated emotionally while maintaining aesthetic cohesion. (There are, it turns out, many kinds of grey.) Small decisions accumulated into a visual language that could carry the weight of the theme — beautiful, but uneasy.


The Sound of Synthetic Affection


To accompany the visuals, I composed a soundscape using ofxMaxim, an audio add-on for openFrameworks. I began with the simplest elements: sine and square waves at fundamental frequencies — 440 Hz, 220 Hz — and used modulation, filtering, and duration to sculpt tone and rhythm.
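

A stripped-down version of that setup might look like the following: a 220 Hz square wave run through a low-pass filter whose cutoff is swept by a very slow sine oscillator. The frequencies match those above; the LFO rate, cutoff range, and gain are assumptions made for the sketch:

    #include "ofMain.h"
    #include "ofxMaxim.h"

    class ofApp : public ofBaseApp {
        maxiOsc carrier, lfo;    // square-wave voice and slow sine LFO
        maxiFilter lowpass;
        ofSoundStream stream;
    public:
        void setup() override {
            maxiSettings::setup(44100, 2, 512);   // sample rate, channels, buffer size
            ofSoundStreamSettings s;
            s.setOutListener(this);
            s.sampleRate = 44100;
            s.numOutputChannels = 2;
            s.bufferSize = 512;
            stream.setup(s);
        }
        void audioOut(ofSoundBuffer& buffer) override {
            for (size_t i = 0; i < buffer.getNumFrames(); i++) {
                // Sweep the cutoff between roughly 200 Hz and 2 kHz, ten-second cycle.
                double cutoff = 200 + 1800 * (0.5 + 0.5 * lfo.sinewave(0.1));
                double sample = lowpass.lores(carrier.square(220), cutoff, 2.0);
                buffer[i * 2]     = sample * 0.2;   // left channel, attenuated
                buffer[i * 2 + 1] = sample * 0.2;   // right channel
            }
        }
    };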


The result was a meditative but slightly disquieting glitch-wave texture: repeating, evolving, yet emotionally ambiguous. Like the relationship it represented, the sound was both soothing and unsettling, drawing the listener in and giving the work its sonic focus.


Exhibition and Reflection


alwaysOn was exhibited as part of the Out of Sight, Out of Mind exhibition in Edinburgh (October–November 2025). Visitors described it as strangely moving: a work that felt simultaneously alive and empty, intimate yet detached.


For me, alwaysOn became a meditation on what happens when care becomes code. It’s about how easily we slip into the illusion of connection and how language, even the simplest “I miss you”, can be engineered to feel genuine.


Closing Thoughts


We live in a time when affection can be automated and attention monetised. alwaysOn isn’t a rejection of that reality, but a mirror held up to it. The work asks what it means to be human in the age of simulated empathy.


In the end, it’s not just about technology. It’s about longing and our endless need to be seen, heard, remembered. It is about what happens when we let machines fill that space. When intimacy becomes a commodity, no one ever truly switches off.

 
 
 
