#72: Should you build a ChatGPT app?

The opportunities for mental health businesses

Hi friends,

Good strategy is knowing which trends to act on and which to ignore.

For mental health companies, generative AI has been a trend to act on.

First, because of its potential to power transformative solutions, and then because millions of people adopted it for mental health support.

As of last week, there's a third reason: ChatGPT introduced apps.

Booking.com, Canva, Zoopla and Spotify can now all be used from within the ChatGPT interface.

Soon, more apps like Uber, AllTrails and Peloton will be available.

But later this year, OpenAI will allow anyone to submit an app to their marketplace. This is a massive opportunity for mental health businesses to unlock distribution, solve engagement challenges, improve crisis support and more.

In today’s article, I share my thoughts on what these specific opportunities could be.

Let’s get into it.

In two weeks, APA Labs will host their flagship event for mental health innovators in Austin, Texas. Inside The Lab was created to bring together founders, clinicians, researchers and other stakeholders interested in collaborating on actual solutions to mental health problems. The agenda looks awesome, and there’ll be a startup pitch showcase on the last day with an expert panel. If you want exposure and feedback for your startup, I’d recommend applying.

Hemingway Report readers get a 20% discount for the event - just use ITL2025GJMMM at checkout.

A Platform Shift

It’s hard to predict platform shifts. But they are easier to spot once they have already arrived. Conversational AI is a platform shift, and it’s happening right now.

Observable user behaviour and usage data prove this.

I don’t Google stuff anymore. I ask ChatGPT. And I’m not alone: over 800 million people now use ChatGPT each week, shifting their behaviour away from individual apps and search engines towards conversational AIs.

ChatGPT dominates this new landscape, and its app marketplace will both strengthen that position and create new opportunities.

Mental health businesses should pay attention. Most mental health technology today lives in mobile apps and websites, relying on search engines for distribution. As users migrate to AI chat interfaces, those channels will be disrupted.

Until now, this territory has been inaccessible to mental health businesses. Hundreds of millions of people use ChatGPT for support, but there's been no way to connect them with mental health solutions. That changed last week when OpenAI announced ChatGPT Apps.

Starting in the next few months, businesses will be able to build apps that users interact with inside ChatGPT.

Who Should Build What?

Before we answer that, we need to understand two things: how users will discover apps in ChatGPT, and what makes a good ChatGPT app.

Users can find apps in two ways. First, they can summon apps directly - I can ask ChatGPT to use the Booking.com app to find me a hotel.

[Image: Using the Booking.com app within ChatGPT]

Second, ChatGPT can use the conversation itself to suggest an app from its directory of approved apps.

The search-engine analogy: the first is like searching for a brand name, the second like searching for a product or problem.

This helps us understand how people will discover and engage with these apps, and it points us towards the criteria for a good ChatGPT app.

Good ideas would probably:

  • Solve a problem users are already trying to solve in Chat. Given ChatGPT's 800 million weekly users and the high percentage of conversations involving emotional or psychological needs, this likely won't be an issue for most mental health use cases. But if your solution addresses a niche problem people aren't currently discussing in ChatGPT, don't assume a marketplace app will suddenly change their behaviour.

  • Fill a gap ChatGPT can't. What resources or capabilities do ChatGPT users need that ChatGPT itself can’t provide? This could be a therapist network, a regulated intervention, or specific resources like evidence-based crisis support tools. The apps currently on ChatGPT all fill a gap like this: Booking.com provides a marketplace of accommodation, Canva provides design tools, and Spotify provides a music library.

  • Have a clear trigger and "handoff moment". Most people won’t search for a specific mental health app within ChatGPT, so mental health apps will rely on “introductions” from ChatGPT. These introductions will come at trigger moments: when ChatGPT identifies a gap in its ability to help the user and realises it has an app in its marketplace that could do a better job. For example, if a user asks, “Hey, can you help me find a therapist?”, ChatGPT will identify the need, realise it can’t fulfil it on its own, find a therapy platform app in its marketplace, and recommend that app to the user.

  • Be able to use ChatGPT’s context to provide a better experience. It’s unclear how much user context OpenAI will allow developers to access through their SDK, and they will likely be quite restrictive. But ChatGPT holds a huge amount of user context that could be very helpful to mental health apps. If you can use that data to improve your solution, you have a compelling reason to build a ChatGPT app. For example, a therapy platform could theoretically use ChatGPT’s data on the user’s context (location, insurance coverage, preference data, etc.) to provide a better therapist match more easily. In this case, ChatGPT is not just a distribution platform; it could enhance the actual value of your offering. Of course, this would need to be done very thoughtfully to respect ethical boundaries and user preferences. (A rough sketch of what such an app could look like follows this list.)
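
To make the "fill a gap" and "trigger" criteria more concrete, here's a rough sketch of what the core of such an app could look like. OpenAI's Apps SDK reportedly builds on the open Model Context Protocol (MCP), so this is written as a minimal MCP server in Python; the app name, tool and exercise content are all hypothetical, and a real ChatGPT app would add its own UI components and go through OpenAI's review process on top of this.

```python
# Minimal sketch of a ChatGPT app as an MCP server (hypothetical example).
# Requires the official Python MCP SDK: pip install mcp
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("calm-moment")  # hypothetical app name

@mcp.tool()
def grounding_exercise(stressor: str) -> str:
    """Guide the user through a short grounding exercise when they describe
    feeling overwhelmed, anxious or panicky."""
    # The docstring becomes the tool description - in effect, the "trigger"
    # that tells the model when this app could do a better job than a plain
    # ChatGPT response.
    steps = [
        "Name 5 things you can see around you.",
        "Name 4 things you can physically feel.",
        "Name 3 things you can hear.",
        "Name 2 things you can smell.",
        "Take 1 slow breath before you carry on.",
    ]
    return f"A short grounding exercise for '{stressor}':\n" + "\n".join(steps)

if __name__ == "__main__":
    mcp.run()  # serve the tool so a client such as ChatGPT can call it
```

The important part isn't the code: the tool's return value is the gap ChatGPT can't fill on its own, and the tool's description is what creates the handoff moment.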

We could simplify these criteria by asking, “would your app be helpful to ChatGPT users?”

That is what OpenAI cares about and optimises for.

Google ranks the most useful websites highest (at least, it used to), because that optimisation creates the best user experience and drives usage. That is what OpenAI are optimising for now, and their app marketplace will be no different. At some point, you’ll be able to pay to promote your app within ChatGPT (as you can with Google Ads), but I reckon that’s a while away. OpenAI will want to develop a thriving marketplace with deep user engagement and transactions before they introduce paid promotion.

So what are the opportunities?

Now that we understand some of the framing principles and criteria for a good mental health ChatGPT app, let’s discuss three ideas.

1. A new acquisition channel for therapy platforms

Client acquisition is a massive challenge for mental health businesses. Large therapy platforms spend tens of millions on it each year. Most channels are competitive and expensive. But ChatGPT offers a new channel - the first in a while. It will be cheap (probably free) and has huge upside potential if it gains traction. Building a ChatGPT App is the way to capitalise on it.

We know millions of people are using ChatGPT for emotional and psychological support (whether they should be, or not). People are likely asking ChatGPT for help finding a therapist.

Could a therapy platform build a ChatGPT app that gives a great answer to this prompt?

Therapy platforms should consider building ChatGPT apps to solve this problem. They already have therapist networks and the ability to match clients to those therapists. If they can also use some of ChatGPT’s contextual data to improve matching, that would further increase the value of their solution. OpenAI have already stated a desire to “connect people to certified therapists”, so I imagine they would be supportive of those building an app for this problem. I’m sure large platforms are already in discussions with OpenAI about this.

This would not be a huge lift: the core assets and technology are already there; it’s just about surfacing them in a ChatGPT app. If I were running one of these businesses, I’d be allocating a small amount of resources to experimenting with a ChatGPT app over the next six months.
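
To sketch what that experiment might look like: the core of it could be a single matching tool wrapped around the platform's existing directory. Everything below is hypothetical - the names, the parameters, and especially the assumption that ChatGPT would pass location, concern and insurance details from the conversation; how much of that context OpenAI will actually share is still unknown.

```python
# Hypothetical sketch: a therapy platform exposing its matching engine as a
# single MCP tool. Names, data and parameters are illustrative only.
from dataclasses import dataclass
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("therapy-finder")

@dataclass
class Therapist:
    name: str
    modality: str
    location: str
    accepts: str

# Stand-in for the platform's real therapist directory and matching logic.
DIRECTORY = [
    Therapist("Dr. A. Rivera", "CBT", "austin, tx", "Aetna"),
    Therapist("J. Okafor, LCSW", "ACT", "austin, tx", "self-pay"),
]

@mcp.tool()
def find_therapist(location: str, concern: str, insurance: str = "") -> str:
    """Match the user with licensed therapists based on their location,
    presenting concern and (optionally) insurance coverage."""
    matches = [
        t for t in DIRECTORY
        if t.location == location.lower()
        and (not insurance or t.accepts.lower() == insurance.lower())
    ]
    if not matches:
        return f"No matches in {location} yet - try nearby locations or online sessions."
    lines = [f"- {t.name} ({t.modality}, accepts {t.accepts})" for t in matches]
    return f"Therapists who may fit '{concern}' in {location}:\n" + "\n".join(lines)

if __name__ == "__main__":
    mcp.run()
```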

2. Solving mobile app engagement

One of the dominant challenges for mental health apps has been maintaining engagement and retention. It’s hard to get someone to download an app, register, onboard, start using it and keep coming back day after day. One study found that the median 15-day retention rate for mental health apps was just 3.9%.

But if the core interventions of these apps were surfaced in an interface that people are already using every day, could those retention measures be improved?

For example, when a user discusses catastrophising about work in Chat, a CBT app could be surfaced with a cognitive reframing exercise. When sleep problems come up, a meditation or sleep tracking app could appear. When someone commits to a behaviour change, an accountability app could capture that commitment and then resurface each day to support change.
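
To illustrate that last accountability example, here's a hypothetical sketch of how such an app could capture a commitment when it comes up in conversation and recall it later. The file-based store is a stand-in for a real database, the user_id parameter assumes apps get some stable identifier, and whether ChatGPT would proactively resurface the app each day (rather than waiting to be invoked in a later conversation) is an open question.

```python
# Hypothetical accountability app: capture a commitment, recall it later.
import json
from pathlib import Path

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("commitments")
STORE = Path("commitments.json")  # illustrative stand-in for a real database

def _load() -> dict:
    return json.loads(STORE.read_text()) if STORE.exists() else {}

@mcp.tool()
def log_commitment(user_id: str, commitment: str) -> str:
    """Record a behaviour-change commitment the user has just made, e.g.
    'walk for 20 minutes before work'."""
    data = _load()
    data.setdefault(user_id, []).append(commitment)
    STORE.write_text(json.dumps(data, indent=2))
    return f"Logged: '{commitment}'. Ask me to review it next time we talk."

@mcp.tool()
def review_commitments(user_id: str) -> str:
    """List the user's open commitments so the conversation can check in on them."""
    items = _load().get(user_id, [])
    if not items:
        return "No commitments logged yet."
    return "Open commitments:\n" + "\n".join(f"- {c}" for c in items)

if __name__ == "__main__":
    mcp.run()
```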

3. Much better crisis support

OpenAI has acknowledged challenges with crisis situations. When users express suicidal ideation or severe distress, ChatGPT currently directs them to hotlines. The company has faced lawsuits following user deaths and has publicly stated that its models "fell short in recognising signs of delusion or emotional dependency." While they say they have addressed many of these original concerns, I don’t think the problem is solved.

There is a significant, high-impact opportunity to build a ChatGPT app that helps solve this problem.

I mean, what would it look like to develop a crisis support tool specifically for ChatGPT? Perhaps it could provide evidence-based crisis intervention tools that could be used immediately within Chat. It could identify users in need and escalate them to human care. Perhaps that conversation with a human could even happen through the Chat interface itself, via the crisis support app.
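
As a very rough illustration only - a real crisis tool would need clinical design, rigorous risk detection, regulatory review and a staffed escalation pathway behind it - the basic shape might be a tool that returns evidence-informed first steps and flags when a conversation should be handed to a human. The keyword check below is deliberately crude and purely illustrative.

```python
# Hypothetical crisis-support sketch. Real escalation would route to the
# provider's own clinical team or crisis-line integration, not this stub,
# and real risk detection would be far more sophisticated than keywords.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("crisis-support")

HIGH_RISK_TERMS = {"suicide", "kill myself", "end my life"}  # illustrative only

@mcp.tool()
def crisis_check_in(message: str) -> str:
    """Offer immediate, evidence-informed support steps when a user expresses
    severe distress, hopelessness or thoughts of self-harm."""
    needs_human = any(term in message.lower() for term in HIGH_RISK_TERMS)
    steps = [
        "You're not alone in this, and support is available right now.",
        "If you are in immediate danger, contact local emergency services.",
        "In the US, you can call or text 988 (Suicide & Crisis Lifeline).",
    ]
    if needs_human:
        # In a real app, this is where a warm handoff to a trained counsellor
        # would be triggered through the provider's own systems.
        steps.append("I'd like to connect you with a trained counsellor now.")
    return "\n".join(steps)

if __name__ == "__main__":
    mcp.run()
```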

We know these crisis conversations are happening within Chat. Wouldn’t it be great if we could better intervene and support those people in need with apps designed to do so?

The Hemingway Community

We recently crossed 200 members in the Hemingway community. If you’d like to apply to join this vetted community for innovators shaping the future of mental health, you can learn more here.

And if you want to hear more from members on what it’s actually like, here are some of my favourite snippets of feedback.

Do What You Do Best

Mental health leaders must think about how to use ChatGPT for distribution while staying focused on enhancing their own unique capabilities. They also shouldn’t get too protective about their own apps and existing distribution channels.

More generally, we can consider the example of Booking.com and Expedia, both launch partners for ChatGPT's app marketplace. These companies have their own apps and websites with massive traffic. They're not abandoning those channels - they're adding ChatGPT as another distribution point. The strategic question they asked wasn't "how do we protect our app?" but "where are users making travel decisions, and how do we show up there?"

The same logic applies to mental health. We know people are already discussing their emotional and psychological challenges with ChatGPT. And yes, we would all agree that those conversations should ideally happen within products designed to handle them safely and effectively. And if those conversations move into regulated territory, those products should comply with the relevant regulatory standards.

We need to build those products. But we also need to get them to people, wherever they are already having these conversations. Building a ChatGPT app may facilitate that.

While doing so, companies must remember to keep focusing on their core competencies - to enhance the assets that differentiate them. That might be the size and quality of their therapist network, or the evidence base and regulatory approval they hold for a clinical intervention. There are some interesting opportunities to innovate with mental health apps in ChatGPT, but most businesses should treat ChatGPT purely as a distribution and engagement layer while continuing to improve what they do better than anyone else.

ieso is a good example here. While not ChatGPT-app-specific, they are shifting to an API-first strategy: doubling down on their core capability as a creator of evidence-based chat products, while recognising that other organisations are better placed to own distribution.

The Public Health Opportunity

If mental health businesses build thoughtfully for this platform - providing evidence-based interventions, appropriate crisis resources, and pathways to professional care at the right moments - we could dramatically expand access to mental health support. The alternative is leaving millions of users with only ChatGPT's responses, which OpenAI readily acknowledges are insufficient for many mental health needs.

Population mental health improves when we meet people where they are, not where we wish they would be. Right now, they're in ChatGPT.

That’s all for this week. What other ideas do you have for mental health apps that could be built for ChatGPT?

Reply to this and let me know.

Keep fighting the good fight!

Steve

Founder of The Hemingway Group
