By Arman Baradaran, MEng ’24 (BioE)
The following essay received an honorable mention in this year’s Berkeley MEng op-ed contest. In this contest, Master of Engineering students were challenged to communicate an engineering-related topic they found interesting to a broad audience of technical and non-technical readers. Note: This is an opinion piece; the views shared here are neither an expression of nor endorsed by UC Berkeley or the Fung Institute.
“Sorry to hear that. Here’s a motivating message you wrote yourself for the next time you’re having a bad day: you have gotten through hard times before.”

At a cursory glance, the interaction style of this distance therapy can look much like a typical visit with a therapist or psychologist. Missing, however, are said clinician, the trip to said clinician’s office, and the associated scheduling hassles and steep costs. Using just a smartphone and a digital mental health intervention (DMHI), even simple text messages can interactively deliver guidance and self-management strategies for the plethora of known mental health symptoms (Kornfield et al., 2023). But if, inevitably, you’re thinking that the above scenario involved ChatGPT, let me dispel that terrifying notion: it didn’t. Here’s why the distinction matters: AI cannot consistently behave empathetically, but it excels at manufacturing the illusion of empathy. That illusion, reinforced by manipulative marketing, leaves individuals with mental illness particularly vulnerable to subconsciously treating conversations with AI apps as invariably trustworthy, even when the AI has failed them and no intervening therapist is there to catch the failure. Sometimes, unwittingly, the human psyche would rather hunt for a needle in a haystack than admit that it should be watching for a buried red flag. When we interact with chatbots over text, our brains anthropomorphize the AI, ascribing emotion and understanding to written replies that contain neither, and that misreading can turn catastrophic when the AI encourages harmful behavior (see Melley, 2023; Xiang, 2023).

Historically, research on this possible paradigm shift in mental healthcare delivery has skewed academic, emphasizing the design of DMHIs over their real-world implementation; only very recently have solution-focused field trials begun, particularly on smartphones. Hopes are high: the sheer accessibility of smartphones makes them an ideal vessel for identifying and intervening in mental health conditions such as anxiety disorders and clinical depression.

Well, if AI therapy is not the holy grail of mental health that the current app market makes it out to be, then… what is?

Universally accessible digital technologies such as SMS and, to a lesser extent, the Internet, along with developments in online therapy, teleconsultation, and gamified apps, all offer significant, novel opportunities to answer that question, as recent research shows (Amo et al., 2023; Badesha et al., 2022; Kornfield et al., 2023). The curious conversation I relayed earlier, for example, comes from the field trial of a text messaging intervention prototype developed by research psychologists at Northwestern University, which delivered symptom management strategies adapted from classical methodologies, such as cognitive behavioral therapy and acceptance and commitment therapy, to young adults. Unlike most past DMHI implementations, this user-centered text messaging script kept participants engaged and satisfied (Kornfield et al., 2023). Though the horizon holds promise for a psychologically healthier future, we must remember that no viable replacement currently exists for the mental health resources clinicians provide locally, regionally, and federally, despite their high cost, the shortage of trained providers, and the stigma attached to seeking and receiving a formal diagnosis. Nonetheless, the alarming ubiquity of unregulated “therapy” apps highlights the immense potential of smartphone-delivered DMHIs to bypass the traditional roadblocks on the healing path, and the onus is on us to work across disciplines on solutions that implement known treatment strategies in a way that is personalizable and accessible to the general populace. After all, the world is one clever engineer away from making accessible mental healthcare possible remotely.