How would you feel about using artificial intelligence (AI) to help manage your mental wellbeing? And how much personal information would you be willing to hand over to the company behind it, if they promised it would enable them to better support you?
Seeing the rise of automation and AI in the IT industry has sparked my interest in the ‘people’ aspect of my work and in questions like these. We’re realising it isn’t just about computers any longer: technological progress brings a range of benefits which seemingly make our lives easier, but it also has the potential to affect society in unforeseen ways. Younger generations are the first to have grown up with modern technology from the day they were born, and seeing how this is changing our relationship with it is fascinating.
This explains why I’ve been keen to play Eliza for a while now. Released in August last year, Zachtronics’ visual novel tells a story about mental health and AI which encourages the player to think about how much of themselves they’re willing to give over to technology. Today’s post isn’t a review as such but more a discussion about some of the questions raised – ones which have been bouncing around in my head since completing the title a few days ago and which I still don’t have answers to.
The narrative centres on Evelyn Ishino-Aubrey, a woman in her mid-thirties who left a promising IT career three years ago and has now resurfaced as a proxy for a virtual counselling app called Eliza. Her clients attend sessions with her at a coffee-shop-like centre in Seattle where they reveal what is troubling them, and it’s her job to make them feel they’re being listened to. The thing is though, Evelyn has no autonomy over how she responds; all her answers are provided in real-time by an AI through a pair of augmented-reality (AR) glasses.
The protagonist admits early in the game that she’s comforted by this lack of freedom because it means she ‘doesn’t have to think’. For the player though, it feels disturbing. The system monitors the clients’ heart-rate, vocal distress and keyword use so it can present the best response to keep the conversation at an optimum level. None of it is of use to Evelyn though, and it isn’t long before doubt starts to creep in: who is this app really designed to help? Does it have its patients’ best interests at heart, or those of the company behind it, Skandha?
Although sections of the game take place outside of Evelyn’s workplace with friends and colleagues, a lot of time is spent with clients in these counselling sessions. You’re unable to make any choices during most of them and must simply select each answer generated by Eliza as a proxy. The more time I spent doing this and effectively behaving like a robot, the more I started to feel uncomfortable. How could a service this impersonal and based only on data analysis truly care for anybody with a mental health issue?
For example: an early patient is Darren, a young man who is worn down by the existential burden of living in a damaged world where people are cruel to each other and those in charge care about nothing but themselves. He’s clearly distressed and crying out for a human connection, but the support Evelyn is able to provide through Eliza feels far removed from the thing he seems to need the most. Skandha may promote their app as bringing therapy to those who otherwise couldn’t access it, but I wasn’t so sure it was helping.
Later we see a client named Maya, an artist who experiences anxiety issues and is struggling because she feels her work isn’t being noticed by anyone in the comic industry. Occasionally she asks questions or makes comments that Eliza is unable to analyse properly, and it’s here that the limitations of the software begin to be exposed. Instead of generating a direct response, it ignores what Maya has said and pushes her back into the strict discussion format of introduction-discovery-challenge-intervention-conclusion.
Then we have Holiday. At first it appears that she is a lonely older lady who has come to check out the counselling centre just to be nosey. Once she signs over access to her personal communications, however, it becomes clear her situation is far worse than she’s letting on during her sessions: she’s in debt, estranged from her children and entirely alone. But instead of offering her channels of support for these problems, Eliza doles out its standard advice of meditation games and possible medication.
As well as examining the effectiveness of such an app, the huge amount of personal and sensitive information collected through it is highlighted too. Evelyn says herself at one point: ‘the potential for misuse seems kind of high.’ The game frequently returns to questions about data collection and the ethics of using it for research or new business propositions. When the slimy Skandha CEO’s intentions are revealed, it makes you uncomfortably consider where all that information you’re giving away online is going.
Is it ever ok to access someone’s personal information to such an extent, if they agree to it and it allows a company to better help them – as well as sell to them more effectively? And even with all this data, can an AI truly understand human emotions and correctly interpret the meaning behind our words? Is an app like Eliza really able to support the people who need it the most, or does it devolve responsibility to a computer and reduce the care of our mental health to a commercial venture?
So many questions, and ones I’m still thinking about. What I can say though is that seeing Evelyn go off script towards the end of the title and push against Eliza’s boundaries is so satisfying. After realising the app isn’t aiding society in the way it was intended, she chooses to take a stand and intervene. It will get her in trouble with her employer, is unlikely to change the company’s culture and won’t completely change the world; but if it helps even just one person, then surely it’s worth it?
Evelyn is an unexpected character in some ways. In releases about mental health, the focus is usually on a protagonist who’s struggling or trying to work through their problems. But here we have someone who’s coming out on the other side, who’s gradually starting to peel off the armour she built up over the past three years and that makes her far more vulnerable. She shows us that it’s ok not to have all the answers and to have to find your way one step at a time.
The things she says and the way she behaves make her incredibly relatable. You might see yourself in her or one of the other characters at some point, and realising you’re not alone is an incredibly powerful thing.