A plot spoiler follows, for a movie that’s not suspenseful in any way. But beware.
When Theodore, the main character of Her, confesses to a friend that he is in love with Samantha, his computer’s artificially intelligent operating system, his friend shrugs it off: lots of people are doing it now. In fact, almost everyone in this lonely man’s rather small social circle is annoyingly accepting. The only person who challenges Theodore is his ex-wife. Her disapproval causes Theodore some doubt, but not serious doubt, and his ex is dismissed as an unstable crank. Theodore’s feelings are real, his happiness is real, so his friends just leave it at that.
I couldn’t tell if this easy acceptance of man-software love was a Hollywood show of hyper-tolerance— a movie sensibility in which embracing some taboo lifestyle is uncritically portrayed as personal growth (we don’t dare be judgmental lest someone condemn our own degenerate ways!)— or just a plot device to get around a controversy the film doesn’t want to explore.
But Her goes off in a different direction anyway: the risk to Theodore is not that his love is founded on artificiality but that he’s become vulnerable to the computer’s fast-growing self-awareness. The more it discovers its own emotional potential, the less need it has for Theodore. And that’s when he really gets hurt, though even that’s cushioned by the revelation that it’s all been a lesson in how to love.
I found the movie sad (he takes his smartphone on a romantic getaway in the mountains, for heaven’s sake). Visually, the film is full of vast, glittering cityscapes and spacious but brightly lit, futuristic-looking rooms that enhance the feeling of isolation. Characters rarely talk to each other. Theodore is extremely lonely and vulnerable, and I felt the computer was manipulating him with its Manic Pixie Dream Girl persona.
I don’t think computer AIs could spontaneously develop genuine human emotions, let alone fall in love. Our emotions are the product of millions of years of evolution. What need would a computer have of them, except to mimic them for our sakes?
For the same reason, I don’t think a Terminator- or Matrix-style catastrophe— in which a computer’s self-awareness causes it to defend its existence by wiping out humanity— is likely, either. Survival instinct is also evolved. I can’t imagine a computer suddenly developing the same kinds of anxieties. More likely, an AI would have neither our sense of individuality nor our sense of mortality. As long as the AI could be copied and backed up and saved somewhere, why would it fear death? Even without the external hard drive option, why would it care? Human fear of death is probably related more to the agony and anguish that go with it than to the end of life itself. Our emotions are tied into physical sensations, but an AI wouldn’t physically feel anything, so how could it fear death?
In Her, Samantha mentions its lack of mortality and embarrasses Theodore and his friends. The AI thinks its freedom from concern about death is a great advantage. The ending of Her, where the operating systems disappear into the ether to find their own way, is probably a more likely outcome of AI self-awareness: they would develop their own interests and have no need for us, except perhaps as a power supply.