Montreal researchers sound alarm about AI risks in suicide prevention
Posted December 9, 2025 10:09 am.
Last Updated December 9, 2025 10:22 am.
People experiencing suicidal thoughts will, in the same online search, look both for help and for ways to end their lives. Digital tools and artificial intelligence (AI) therefore present both advantages and dangers in suicide prevention.
“When we have to seek help on the internet rather than from our circle of acquaintances, it is either because we do not really have access to resources in our circle of acquaintances or because our situation has deteriorated to the point that we turn to the internet to seek help,” explains Louis-Philippe Côté, a researcher affiliated with the Centre for Research and Intervention on Suicide, Ethical Issues and End-of-Life Practices (CRISE).
He explained how digital technologies can help detect suicide risk and support intervention during the second annual Digital Mental Health Day, held on Monday and organized by the Centre of Expertise in Information Technology in Mental Health, Addiction and Homelessness (CETI-SMDI) of the CIUSSS de l’Est-de-l’Île-de-Montréal.
Cécile Bardon, a psychology professor at UQAM who also gave a presentation at the event, clarifies that the feeling of wanting to live and die simultaneously is not only expressed online. “In real life, we see it a lot too. There are people who will attempt suicide and call for help at the same time. Both are present at the same time, and the job of intervention is to ensure that the desire to live becomes increasingly important in the person’s thoughts and emotions so that the person engages with resources and something to help them get better,” she explains in an interview.
Information about suicide, such as methods for taking one’s own life, is described by researchers as “pro-suicide” content. This type of information contributes to increasing the risk of suicide, Côté, who is also the interim co-scientific director at CETI-SMDI, emphasized during his presentation.
It’s unclear whether accessing this content is easy. Some studies indicate that finding this kind of information is simple, but other studies say the opposite. “The key takeaway is that if you want to find it, you will,” warns Côté.
The example of video games
Bardon, also an associate director at CRISE, believes there is “great potential” to make interventions more effective, provided they are used correctly. “When people find information on suicide methods on social media, it’s a misuse. When self-management tools are used to compensate for a lack of resources and people are left to manage on their own with online resources, it’s a misuse,” she laments.
One positive use of digital technologies is the presence of helpers in video game chat rooms. On some popular platforms, such as Twitch, these participants post in the various chat channels to let users, especially young people, know they are available, and they interact with them.
“At first, we thought that developing digital tools would be perfect for reaching men, given that men are often more afraid to show vulnerability, to confide, etc. We thought that the feeling of anonymity that the screen would give would help to overcome this obstacle, but, in reality, we realize that the big consumers of digital intervention tools are mostly women,” points out Côté.
The role of web hosting providers
To help someone in crisis, the first step is to identify suicidal risks. Next, an assessment is necessary, followed by the implementation of a treatment plan. Most people will experience suicidal thoughts more than once in their lives. Therefore, it is important to have post-crisis management strategies and to work towards reducing recurrence.
According to Cécile Bardon, the triage stage, which is carried out in particular by decision-making algorithms on sites like suicide.ca, is effective. It allows a person to be prioritized if their condition requires it. “Identification is something we’re starting to get quite good at, but identification isn’t everything. It must be followed by a real needs assessment and real support. For now, we haven’t yet developed widespread tools to carry out these steps,” the researcher explains.
However, she maintains that there is potential. “There are artificial intelligence tools being developed right now to analyze the non-verbal components of language. For example, everything related to vocal energy, hesitation in words, things like that which can be associated with affective states of depression and emotional collapse. […] But these are not tools that we are ready to generalize at all, but developments like that can be interesting,” specifies Bardon.
For Côté, reducing access to suicide methods is essential for effective prevention. “Applying this to digital technologies translates into several actions. The first is all the legislation surrounding the prohibition of promoting suicide online. States need to pass laws to ban the promotion of suicide on the internet,” he says. However, this can be complicated, particularly because websites are hosted in numerous countries, and it’s impossible to enforce one country’s laws everywhere.
Côté believes, however, that if web hosting providers judged that hosting “pro-suicide” content carried too high a cost, in terms of reputation for example, they could change things.
—If you are thinking about suicide or are worried about a loved one, counselors are available at all times at 1 866 APPELLE (1 866 277-3553), by text (535353) or by chat at suicide.ca.
—The Canadian Press’s health coverage is supported by a partnership with the Canadian Medical Association. The Canadian Press is solely responsible for this journalistic content.
–This report by La Presse Canadienne was translated by CityNews