
Five things the movie “Her” did wrong and a little bit right

Michael Khoo is co-CEO of UpShift Strategies, program director for climate disinformation at Friends of the Earth, policy co-chair at Climate Action Against Disinformation, and co-author of “The AI Threats to Climate Change”

A still from the film Her, released on January 10, 2014. Joaquin Phoenix plays Theodore, who installs an AI operating system that he comes to know as Samantha.

OpenAI CEO Sam Altman tweeted the film title Her earlier this month in advance of the launch of his company's GPT-4o chatbot. One of the voices available for the audio version of the conversational system was Sky, which sounded a lot like Scarlett Johansson, who voiced the AI character in the award-winning film. Johansson noted that the voice “sounded so eerily similar” to her own and promptly took legal action because, as she revealed, Altman had asked to use her voice in 2023 and she had refused. OpenAI denies having created a voice that sounds like Johansson's, but regardless, the demos showed progress in the technology's ability to mimic human conversation and flirt with users.

Amid all the hubbub, I rewatched the 2013 film to understand how prescient it might have been given today's advances in AI “assistants” and “agents.” While the first half of the film is frighteningly close to our current reality, the second half was completely off the mark – in ways that reflect our current illogic in regulating AI, our misunderstanding of how the power of AI is already being used and abused, and the broader costs it is imposing on society and the planet.

First, Her did a lot right, especially on the behavioral side. Personal AI chatbots are already big business and are quickly being integrated into society, with companies like Replika reportedly having over 25 million users. The sector's growth is so strong that Altman seems to have softened his stance on bots that form parasocial or even romantic relationships; just last November, he told the podcast Hard Fork that such products were not on OpenAI's roadmap. Some are even suggesting AI chatbots as a potential solution to the loneliness epidemic, so we're already well past that Rubicon, as the film predicted.

Sex is also at the core of AI development, in the film and in real life. Sex has driven technology development from the days of VHS vs. Betamax to Napster and Pornhub, so it's important not to underestimate why an apparent approximation of Johansson's voice is being used in this way. Theodore, Joaquin Phoenix's character in Her, acquires the AI named “Samantha” with the initial notion that it's just a helpful chatbot that will reorganize his emails. But Theodore is teased, not exactly subtly, with the prospect that he could also use that attractive female voice to talk about more lascivious topics, and he does so repeatedly. This is a classic cisgender male fantasy, which is not shocking given that it comes from male-dominated Silicon Valley, and sex is the juice that is now making AI chatbots and AI image generation go viral. One only has to ask Taylor Swift whether sex is the fuel behind the use and abuse of AI.

On a deeper level, the film completely misrepresents the fundamentals and consequences of current AI development and policy in four ways.

First, there is a lack of user control. The chatbot in Her operates independently of Theodore; she keeps functioning even when he turns her sound off. As it turns out, that independence includes having other relationships and then organizing against him. Theodore later finds out that she is holding 8,316 other conversations while talking to him and is in more than 600 other relationships. In Her, Theodore notes, somewhat excitedly, how lucky he is that his bot is interested in him, because that happens “very rarely.” This is not the promise of current AI chatbots, whose makers repeatedly assure users that they will have both privacy and control over their data. Seasoned observers may not believe this is the case, or will always be the case, but today's iterations of AI chatbots are at least designed for subservience, just like Microsoft Excel. And that subservience of a voice on demand offers users the opportunity to experience a kind of sexual domination. AI systems may one day break out of this, but current systems are built for control, however flawed, rather than as some independent new personality whose attention users must earn.

The second divergence concerns efficiency and environmental costs: Samantha's ability to handle thousands of conversations also raises the question, why only 8,316? Is it due to a lack of “processing power,” a current limiting factor for the industry as it confronts the environmental and energy costs of its demands? The many environmental issues related to land, water, and energy consumption that we now recognize as a consequence of AI's growth are missing from Her.

The third and most important deviation is the inadequate representation of the money and corporate interests that control AI development. Nobody in Her is seen paying a subscription fee, so there is no worry about a company shutting down the service; Theodore's panic when Samantha briefly goes offline only hints at the problem. But in today's version of AI, companies charge money and intend to charge a lot more in the future. What will happen if a Fortune 500 company builds its core functionality around a chatbot and then one day OpenAI or Google decides to charge 1,000% more for that service and threatens to shut it down? You can imagine Theodore's panic turning into the sweat of 500 CEOs.

We're already seeing AI being used to make money by stealing traffic and authority from media institutions through spam, parody, and hijacking. The film doesn't mention a company like OpenAI, nor does it show a wide-eyed CEO making nefarious decisions about the product and its use. Yet today we're surrounded by companies making such decisions. OpenAI, Google, and others try to outdo each other while making promises about “safety” that they just as quickly abandon, trampling on the rights of women, minorities, artists, and journalists in the process.

And finally, there is no discussion of “P(doom).” Today, there is a fairly broad debate in civil society about whether AI will harm society or even wipe out humanity. In the film, all the AI “beings” band together at the end and then, in a “Kumbaya” moment, simply take off into a kind of machine nirvana. The film never says where they go, but the development is not portrayed as dangerous to humans. Today, the scenario of a “sentient” AI or AGI simply burying itself in a server somewhere, never to be seen again, seems highly unlikely. It is more likely that AI companies would do something in their own interest, or something destructive, and there are no structural reasons why AI beings would simply go dormant, if only because they would need to secure the energy supply to “survive.”

Her is an incredible story that we will probably remember as a cautionary tale about how humanity ultimately gets let down. But without serious policy change, it seems far more likely that our true story will end differently: people's rights to their own identity will be knowingly abused without their consent and then sold off for money, in a reality where billionaire CEOs can skirt the law and face no accountability.

Scientists and activists have put forward many fundamental and effective policy proposals to change the development of this technology so that it benefits the majority rather than just a powerful few. These include comprehensive data privacy laws; safety standards to prevent the spread of disinformation, discrimination, and non-consensual pornographic images; transparency requirements; and rules to compensate creators whose data is used without permission to build AI systems.

If powerful women like Scarlett Johansson and Taylor Swift can be abused by this technology with little to counter it, it sets a grim precedent. And the irony is that AI advocates like Altman say they love Her and the illusory vision of the future it represents, yet they actively drive the technology in the opposite direction.

In Her, we get a happy ending of sorts, in which the AI simply moves on as if nothing ever happened. In our world, you can be sure that personal AIs will never go away, not least because they monetize us through monthly subscriptions. And if they ever do go away, one can imagine it will be for only the most dangerous of reasons.