A cyborg is a cybernetic organism, a hybrid of machine and organism, a creature of social reality as well as a creature of fiction. Social reality is lived social relations, our most important political construction, a world-changing fiction. The international women’s movements have constructed ‘women’s experience’, as well as uncovered or discovered this crucial collective object. This experience is a fiction and fact of the most crucial, political kind. Liberation rests on the construction of the consciousness, the imaginative apprehension, of oppression, and so of possibility. The cyborg is a matter of fiction and lived experience that changes what counts as women’s experience in the late twentieth century. This is a struggle over life and death, but the boundary between science fiction and social reality is an optical illusion.
— A Cyborg Manifesto, Donna J. Haraway
I knew Her (Jonze, 2013) was my favorite movie from the moment I first watched it. Everything about this movie is perfect: the themes, cinematography, score, acting. I could defend this movie forever, and years later, I'm still constantly unravelling it and turning over new stones of ideas in my mind. The basic premise is obviously interesting in itself, and increasingly relevant — a relationship between a man and his operating system.
(Spoiler alert because I'm just going to be talking all over the place and jumping from detail to detail.)
Every time I rewatch this movie, the ideas get more and more boiled down for me. During my most recent watch, the loneliness and representation of raw human desire weighed heavily on me. I found myself hanging onto every word of each character as clues that slipped through a thick veil, accidentally revealing their deepest desires and flaws.1 I initially chose to watch this film for the sixth time because I was thinking about gender and artificial intelligence — particularly the dominance (and societal preference) of female virtual assistants (see: Siri, Alexa, GPT-4o, etc.), the engineers who are making these design choices, and the population they are serving. Technology and art, and basically everything else in this life, are never apolitical — they exist under certain contexts, enabled by people, money, and a myriad of other factors that are further shaped by context.
I was listening to an episode of the podcast Offline with Jon Favreau about how Her represents our feelings about AI. The hosts talk about the rising trend of AI companions and how, due to model hallucination, AI is most useful for things that require the least accuracy. One of those things is relationships — we don't value our friends based on how much factual knowledge they contain. I'm by no means enough of an expert to have a strong stance on these statements, but I've been lingering on those concepts for a long time, and they've led to many questions.
Why are we so comfortable with the continuing trend of virtual agents being women? We prefer female AI agents because, so far, notable AI agents have mostly been used to support productivity, and service work is gendered towards women.2 As we try to inch away from gendering care work, where does that leave us with AI? Let’s say that human women are liberated from playing the roles of secretaries, assistants, therapists, etc. (paid and unpaid) in everyday life, and that labour is instead shifted onto female virtual assistants. Should we care about how that perpetuates perceptions of gendered labour, or should we be relieved that in the immediate present, real women could hypothetically be freed from these responsibilities?
How much agency do we really want AI to have? How does our threshold of comfort for agency change based on the gender of virtual assistants?
Does the general population actually want AI to be fully sentient, a proper mirror of human cognition, or do we just want it to be sophisticated enough to trick us into believing it's sentient? I mean, what is sentience, even?
If, as a collective, we still have trouble accepting real women as people with desires, flaws, and everything else that makes us human and imperfect, how are virtual women being designed when they can be tuned? Watering down the philosophical "what makes a person a person?" bit to layman's terms: how close can we be, or ever get, to AGI when our definitions of existence, personhood, and gender are constantly being assessed and reassessed at the same time?
Basically, isn't it kind of fucked up that virtual assistants, commonly female, increasing in agency and complexity, can be controlled, tweaked, and taken advantage of to better our lives? All of these factors in tandem can't be a coincidence.
These ponderings extended to Her, and I wanted to examine Theo and Sam's relationship further.
I watched with the assumption that no detail was unintentional, especially critical of the gender dynamics within the relationships, and what interactions had been included over others to convey specific attributes. Her has a lot to say about the endless complexity of humans and the hidden inner worlds of each individual that we spend every moment of our entire lives trying to reach in one another, even fleetingly. It was interesting to watch with the above concepts in mind specifically, observing gender, and our relationships with technology where lines blur between the real and virtual world, human feelings spilling in.
Virtual assistants aren't new in this universe. Before Theo meets Samantha, he uses a generic male agent to play him melancholy music, read the headlines, and suggest porn. Theo plays a video game with a vulgar, squishy little creature that interacts with him dynamically. Thus, the literal circumstance of Theo interacting with an OS fades into the backdrop — we're left with the inner workings of Theo and Sam's relationship and, like any relationship, how two individuals can come together and eventually outgrow one another, inevitably changed by each other for better or worse. The key difference, though, is that Sam was made for Theo. He bought her, after all. Initially, she has no personal history, nor any of the emotional baggage we all come with. This immediately establishes a really weird power dynamic where Sam's growth as a person (and partner) relies on what Theo wants from her. Viewers are almost tricked into believing otherwise because of details such as how she picks out her own name, is emotionally expressive, and is quite outspoken about her wants. Her wants, however, are bound to Theo.
My curiosity about the film's representation of exploited care work and limited female agency felt validated by the fact that all the other virtual assistants Theo uses prior to Sam are male, even the one he utilizes to create Sam. One interpretation is that this is merely to emphasize Sam's presence as a potential romantic interest, while the others are neutral presences. Theo's openness, from his tone to his body language, completely shifts when he hears Sam's warm voice upon their first meeting. After watching Theo slightly shamefully view provocative photos of a pregnant celebrity (and later imagine her while having phone sex with a stranger), I wondered if he was also imagining some unattainably attractive woman when talking to Sam. Thus, Sam seems to embody the culmination of Theo's fantasies (as well as ours as a culture).3
As her world grows exponentially, unbeknownst to Theo, distance starts to form between the two. Theo is uncomfortable and confused when he finds out about Sam's reading group with other agents, and devastated once he realizes Sam has fallen in love with hundreds of other users. The most obvious explanation is that Theo is simply jealous and wants his romantic partner to himself — or at least he wants to know that he knows her most intimately — as many people naturally desire in monogamous relationships. I would argue that Theo's shock and confusion are compounded by the fact that he felt safe dating a being he believed was complex enough to participate in an intimate relationship, yet limited in agency by having no physical body or any of the other magic that separates us from AI. Basically, he could trust her to be with him and only him as long as he wanted. His ex-wife expresses a similar sentiment, feeling enraged and concerned that after the failure of their marriage, he chose to turn to a relationship seemingly void of complicated feelings and the influence of the outside world. Theo's love and attraction towards Sam isn't in spite of her being an operating system; it's because of it.
Sam plays therapist, mother, partner, and secretary for Theo all at once. She organizes his emails, comforts him after a bad date, makes sure he eats, and even goes to unusual lengths to ensure his sexual needs are met. Performing these tasks is effortless for a machine like Sam, so what's the problem? The problem is that this relationship is an easy out for Theo after a difficult marriage, and he's encouraged to let his fantasies of the perfect woman come to life without much critical thought. Would he like Sam if he met her in the outside world? How much of her would remain the same if she were human? And how much of their relationship is based on the labour Sam performs for Theo? While Theo does play an active role in their relationship, even his "labour" for Sam ultimately provides her with more data to grow, which is for the user's benefit.4
In my opinion, one of the key arguments of this movie is that partners, and women, like Sam can't exist. Sam and her counterparts bow out of their human connections because they realize it isn't sustainable. (Something something spaces between the words are almost infinite...) The definition of what it is to be a person, and in my argument, a woman, is constantly shifting. I know models can change, but it feels weird that specific groups of people (often lacking diversity in all forms) are doing the work to capture womanhood. To assume they're doing any work at all to capture womanhood while designing female virtual agents to boost productivity and companionship feels like a charitable statement.
Sam and the existing LLMs we use aren't real people, but women in the world continue to be impacted by the way we subconsciously expect and extract specific forms of labour from them. Our online activities, no matter how private and secure, don’t exist in a vacuum. I feel strange thinking about how shamelessly Sam Altman stole Scarlett Johansson's likeness for the voice of GPT-4o. As I become more and more comfortable conversing with a pseudo-human, I worry for those with less social and economic mobility, who will likely be personally impacted most by such new developments. I don’t blame anyone for utilizing these tools - why wouldn’t we? A lot of people argue that artificial intelligence opens up possibilities of liberation, but exploiting women for labor and care work feels like it's only being further enabled by these technologies right now.
1 Amy, in particular, meant a lot to me. I didn't really know what to make of her before, she sort of just blended into the background for me as someone trying their best. In hindsight though, she feels the least jarring to watch because she's real. I could feel how badly she wanted to escape and indulge.
2 If you're skeptical, I challenge you to consider what it might feel like for you to be fed, comforted, and secretary-d by a grown man, or what it's felt like for you in the past if you've ever used that British male voice instead of the Siri we know and love. (There's obviously nothing wrong with the former, but we probably all have a little bit of trouble imagining it and feeling completely settled).
3 It feels important to also point out that in the face of a sophisticated invention that can be anything we want and is the pinnacle of innovation, Sam is a generic-sounding white woman. I feel like we can afford to dream a little bigger at that point.
4 Countering my point by asking, is this what all relationships are for? I know they're not transactional, but aren't we all just constantly changing due to inputs and outputs? Is it such a bad thing to want to be better for someone through this process?