I decided to write a review of Eliza, a 2019 visual novel by Zachtronics. Its premise: in the very near future, the field of individual psychotherapy, in an attempt to battle the ongoing “mental health epidemic”, becomes completely dominated by tech giants competing to automate counselling through artificial intelligence and machine learning. One of these companies, the maker of the titular psychotherapeutic tool Eliza (named after the 1960s MIT natural-language-processing program), puts an original spin on its creation: whatever questions, care and advice the computer program offers its clients are delivered in person by “proxies” – real humans (contract workers on minimum wage) who read prompts off their augmented-reality visors. You play as Evelyn, one of these proxies and a former engineer on Eliza itself, who, after a prolonged break from the tech world, comes back to see what her creation has turned into, what kinds of people use this intervention, and in what circumstances.
I badly wanted to like this game, and there were many good reasons to. It combines two very pertinent topics – mental health and big tech – and offers surprisingly profound insights into both. The people in charge of Eliza’s development couldn’t be further from the vision of universal, affordable mental health care their corporation supposedly stands for, and, even more importantly, they no longer have any mental health professionals in their midst: the chief psychologist, in private a sad and sleazy alcoholic, has recently left to found his own startup built around technologically induced lucid dreaming. The decisions engineers and managers make about the extremely personal information clients share with Eliza are profoundly unethical, and their responses when this is pointed out usually amount to “at least I’m a good and rational person; someone else would do something even worse”. The lackluster and laughably uniform recommendations Eliza offers (“do some breathing exercises and get a prescription for antidepressants from your doctor”) are fundamentally unsuited to the very real problems its clients hope it can resolve. Every one of these is a conversation worth having, and the game takes great pains not to entirely vilify any of the people representing these issues: you might disagree with their arguments, but you can still see where they are coming from.
Beyond thematic relevance, all of this is delivered not simply through dialogue but also through subtler environmental storytelling: which apps are installed on Evelyn’s phone and on those of others; what emails, receipts, news articles and notifications she periodically receives from colleagues and previous engagements; and what objects she finds noteworthy in her surroundings in the rare moments when she is left alone to ponder her situation. I like this (uniquely gamic) style of presenting a narrative – though, of course, this is far from the first game to do it this way; see, for example, well-known indie titles like Cibele, Her Story or Emily is Away. And, as the screenshots show, the art style is fittingly subdued but very pleasing.
But for all these good qualities, I’m still left with the feeling that something is missing. The most powerful sections of the game, for me, were the ones where Evelyn participates in actual Eliza sessions, listening to the (well-written and well-acted) problems of real people: an aspiring artist struggling, and failing, to turn her passion into a career; an expectant young father who is heavily hinted to have recently realized his bisexuality and who struggles to commit to his wife in the face of big life changes; an older woman trying to get out of debt who comes to Eliza simply seeking company. It’s profoundly upsetting to listen to their understandable problems and be unable to offer them even a sympathetic ear, let alone actionable advice or real help: both the player and the in-game Evelyn are strictly limited to the responses Eliza generates. It’s a powerful procedural argument about the failure of both big tech (“by abstracting enough, it’s possible to find a generalized solution that fits everyone”) and the contemporary, extremely individualized approach to mental health (“it’s on you that you are depressed, and you are personally responsible for getting better”) to treat problems created by structural, highly institutionalized means – it’s no coincidence that the very first client Evelyn encounters suffers from severe environmental grief.
But even when Evelyn leaves these sessions and returns to her own life, she experiences the same inability to make any impact on the world around her: even though the player is sometimes given conversation options to choose from, Evelyn (and thus the player) can never truly answer any of the arguments put before them, and the game’s narrative remains almost entirely linear, with the multiple endings decided by a single choice at the very end – return to big tech, become a licensed counselor, or leave it all behind. Because of this, I ended up seeing Eliza’s argument as individualistic and pessimistic at heart: the problems it raises are clearly structural and political in nature, and over the course of the game the player can plainly see that none of them can be solved alone. This is a lesson I appreciate Eliza teaching. What I don’t agree with is how it seems to stop just short of the last step: the fact that these issues cannot be solved alone doesn’t mean they can’t be solved at all, or that we should stop trying.
- Silicon Valley is Not Your Friend – Jonathan Taplin, New York Times (17 October, 2017)
- Big Data is so large, it’s raising privacy & ethical issues – Megan Ray Nichols, EuroScientist (15 November, 2018)
- Google cancels AI ethics board in response to outcry – Kelsey Piper, Vox (4 April, 2019)
- There Is No Cure for Existence: On the Medicalization of Psychological Distress – Dolores Albarracin, Alain Ducousso-Lacaze, David Cohen, François Gonon, Pascal Keller & Michel Minard, Ethical Human Psychology and Psychiatry, 17, 149–158 (2016). doi:10.1891/1559-4318.104.22.168
- ‘Climate grief’: The growing emotional toll of climate change – Avichai Scher, NBC News (24 December, 2018)