
"For instance, effective psychedelic treatments for PTSD tend to include multiple followups over a long period, whereas narcosynthesis attempted what L. Ron Hubbard — who was, himself, a proponent of a modified form of narcosynthesis — once called “the one-shot clear.”

Today, AI therapy is poised to become the narcosynthesis of the 2020s — a supposedly faster and cheaper replacement for “real” therapy. Chatbots based on GPT-4 are already being proposed as a new short cut to healing."

These arguments against one-shot treatments are arguments for AI therapy. No one is proposing that you'll just talk to a chatbot for half an hour (and no one does that - whether it's today's GPT-4 bot or AI Dungeon 2 or Replika or Xiaoice or ELIZA back in the mists of time, pretty much all uses of chatbots for therapy-esque purposes report that many users will spend a *huge* amount of time on them). All proposed uses, like your SciAm link, assume that users will be talking to the therapy bots a great deal indefinitely - and that's why they work: a bot can talk to you for hours a day, anywhere, anytime, forever, for a few bucks of electricity a month (a cost that is rapidly decreasing even as the bots rapidly get better), while a trained psychotherapist would cost a fortune and then burn out and have to be replaced by a worse & more expensive therapist.
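To make the economics concrete, here is a rough back-of-envelope sketch; every figure in it is an illustrative assumption, not a measured cost:

```python
# Back-of-envelope cost comparison for the argument above.
# Every number here is an illustrative assumption, not a measured figure.

THERAPIST_USD_PER_HOUR = 150.0   # assumed private-practice hourly rate
CHATBOT_USD_PER_MONTH = 20.0     # assumed subscription/inference cost
CHATBOT_HOURS_PER_MONTH = 60.0   # e.g. roughly 2 hours a day of use

chatbot_usd_per_hour = CHATBOT_USD_PER_MONTH / CHATBOT_HOURS_PER_MONTH
ratio = THERAPIST_USD_PER_HOUR / chatbot_usd_per_hour

print(f"chatbot: ~${chatbot_usd_per_hour:.2f}/hour; "
      f"therapist: ${THERAPIST_USD_PER_HOUR:.0f}/hour "
      f"(~{ratio:.0f}x difference)")
```

Even if the assumed numbers are off by an order of magnitude, the per-hour gap is what drives the argument.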


Thanks for the thoughtful comment, Gwern. That's a fair point about AI therapy not being a "one-shot cure" in the manner of narcosynthesis. What I was trying to get at was the idea of it being a single modality, even if it's one that is used repeatedly. In other words, if someone is *exclusively* using an AI system as a therapist, they may be spending hundreds of hours with it, but unless they are supplementing this with the advice and guidance of a mental health professional (which would typically include multiple treatment modalities, from pharmacological interventions to CBT and the like), I'm deeply skeptical that it will be able to replace human experts. One thing we learn from the history of psychiatry, IMO, is to distrust claims that a shiny new technology is a shortcut to mental health, whether it's a fashionable new drug or an AI therapist.

Will some form of AI chatbot with a human expert in the loop play a prominent role in mainstream psychiatry in the 2020s and beyond? That seems overwhelmingly likely, since it is already doing so for early adopters. But studying the history of narcosynthesis and psychedelic therapy is going to be important as we navigate the outsized claims of efficacy and the profit-seeking that will inevitably come along with that.


Is being a single modality so bad? A large fraction of therapy is already 'single modality', as I understand it: if you look at all of the uses of things like gratitude journaling or CBT with workbooks or (non-AI) apps, they are not always accompanied by a pack of pills. So if pills are inappropriate for many patients given even a modest amount of talk/writing-based intervention, then the much more intensive interventions possible with LLM assistance ought to be able to handle at least that many patients. And then they can possibly handle even more: early intervention might help, and besides, 'single-modality' text-only LLMs are already something of a thing of the past and will be superseded by models capable of voice output and images and eventually video. (It's taken longer than I expected it would in 2020, because single-modality text LLMs truly exceeded almost all expectations for how good they could become, but it *is* happening, finally.)

> That seems overwhelmingly likely, since it already is doing so for early adopters

Certainly seems reasonable, especially if one thinks about using the AI to analyze the logs and highlight things for the human expert (as you demonstrate to some degree here). But if such a semi-automated loop works, it is also progressively replacing the expert - the expert's interventions are, after all, just another piece of text to model... (All roads lead to Rome.) One is reminded of the improper linear models & clinical judgment debate of yesteryear - it didn't take nearly as much to beat psychiatrists as psychiatrists furiously insisted it would.
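To make that semi-automated loop concrete, here is a minimal sketch of what "AI analyzes the logs and highlights things for the human expert" could look like; `call_llm` is a hypothetical placeholder, not any particular vendor's API:

```python
# Minimal sketch of the semi-automated loop: a model triages session
# transcripts and surfaces only flagged excerpts for a human clinician.
# `call_llm` is a hypothetical stand-in, not a real vendor API.

from typing import List, Tuple

FLAG_PROMPT = (
    "Review this therapy-chat excerpt. Reply 'FLAG: <reason>' if it contains "
    "risk indicators or themes a clinician should see, otherwise reply 'OK'."
)

def call_llm(prompt: str, text: str) -> str:
    """Placeholder for whatever model endpoint is actually used."""
    raise NotImplementedError("wire up a real model client here")

def triage(transcripts: List[str]) -> List[Tuple[str, str]]:
    """Return (transcript, reason) pairs for the human expert to review."""
    flagged = []
    for t in transcripts:
        verdict = call_llm(FLAG_PROMPT, t)
        if verdict.startswith("FLAG:"):
            flagged.append((t, verdict.removeprefix("FLAG:").strip()))
    return flagged  # the expert reviews only these, not every log
```

The sketch illustrates the structural point above: the expert's review of the flagged items is itself just more text, so each pass through the loop gives the model more of the expert's behavior to learn from.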


This is an interesting discussion and I'll think more on it (and maybe write a post inspired by it). I think the key place where our intuitions about this differ is encapsulated here:

> the expert's interventions are, after all, just another piece of text to model

Are they though? To me the key factor in therapy is the knowledge that another human being is in the room with you (or Zoom room, but you get what I mean). In that sense it's very different from, say, teaching: I can communicate ideas to 100 students and then get feedback from them in the form of assignments. Those interactions can potentially all be done via online learning tools, and TAs can do the grading. The therapist-patient relationship by contrast seems to me to be grounded in one-to-one feedback and the attendant body language, facial cues, etc. It can't be scaled in the same way.

I take your point about the multimodal part and AI video, which is where this is heading. But again I think our intuitions about what would be effective in therapy differ. After all, a theatrical performance is also at heart "just another piece of text to model," and yet I suspect people will still be going to the theater even after AI-generated interactive video becomes a thing. The impact of actually being in a room with another human experiencing the same thing will, I predict, be *greater* and *more* valuable in that world, not less. I suspect the same will hold true for therapy.

But then again, maybe not! I definitely think some form of AI-assisted therapy will be a thing, and it will surely help (is helping) some people. I've just learned from my research to be wary of magic bullets and big claims about new technology in the field of mental health.

Thanks for engaging and reading; I appreciate it.


AI therapy for sure feels like a step change, though no panacea - the human experience is knotty and complex, and so many facets of recovery remain unclear. This is also a difficult domain for science, given the lack of clear replicability in so many cases and the difficulty of arranging large-scale experiments due to the intimate nature of the data that needs to be gathered.

Personally I do think therapy in VR and with robots IRL will be effective AF. Experiments conducted by the VA have already shown huge promise. But we will need a lot of R&D to get there; I'd say we're a decade away. And it's only a decade because the VA is putting that kind of money and attention into this mode of therapy.

IMO, humans will be needed in the observation loop for a considerable time, since we are unlikely to trust AI entirely with oversight of our private human experience of healing.

My $0.02. Great article, great discussion. Thank you.


The VA is going big on VR + AI. They are the biggest and best-funded public health system in the world in this field, and the current administration has armed them with funds for this specific purpose.

https://www.innovation.va.gov/hil/views/immersive/immersive.html

Which is very exciting news IMO.
