Is AI responsible for a teen’s suicide?
Moments before Sewell Setzer III took his own life in February 2024, he was messaging with an AI chatbot. Setzer, a 14-year-old boy from Florida, had struck up an intimate and troubling relationship — if you can call it that — with an artificial intelligence application styled to simulate the personality of “Game of Thrones” character Daenerys Targaryen.
Setzer gave numerous indications to the chatbot, developed by a company called Character.AI, that he was actively suicidal. At no point did the chatbot break character, point him to a mental health support hotline, or do anything else to prevent the teen from harming himself, according to a wrongful death lawsuit filed by Setzer's family last week. The company has since said it has added protections to its app over the past six months, including a pop-up notification directing users to a suicide prevention hotline. But that feature has been standard across search engines and social media platforms for years.
The lawsuit, filed in federal court in Orlando, also names Google as a defendant. The Big Tech company hired Character.AI's leadership team and paid to license its technology in August, the latest in a spate of so-called acqui-hires in the AI industry. The lawsuit alleges that Google is a “co-creator” of Character.AI because its founders initially developed the underlying technology while working at Google years earlier.
It's unclear what legal liability Character.AI will face. Section 230 of the Communications Decency Act, which largely shields internet companies from civil suits over content posted by third parties, is untested when it comes to AI chatbots. Because a chatbot's output is generated by the company's own model rather than posted by a third party, many experts predict the protection won't apply in cases like this.
Euthanasia for mentally ill off the table in Canada – for now
In most US states, medically assisted suicide remains illegal; only 11 jurisdictions offer legal medical aid in dying, and only for the terminally ill. Florida law makes it manslaughter to assist another person in “self-murder.”
As such, it must seem utterly bizarre to many Americans that Canada was on the brink of legalizing medically assisted dying for people with mental illness, which would have made it one of the most liberal euthanasia regimes in the world.
At the insistence of the courts, Canada legalized assisted death in 2016 for people with terminal illnesses and expanded it to people with incurable but not terminal illnesses in 2021. As part of that expansion, a provision to cover people whose only underlying condition is mental illness was included, but its implementation was delayed until this coming March 17. In the meantime, a special joint committee of members of Parliament and appointed senators was asked to verify that the health system is ready to safely apply medically assisted dying for the mentally ill.
That committee reported back this week that Canada is not ready, and according to expert witnesses it may never be ready to institute such a regime. The central sticking point was less about logistics than about ethics: for many expert witnesses, the concern is the nature of mental illness itself.
“The committee heard it is difficult, if not impossible, to accurately predict the long-term prognosis of a person with a mental disorder,” the report concluded.
According to one witness, Dr. K. Sonu Gaind, chief of the department of psychiatry at Sunnybrook Health Sciences Centre in Toronto, there is evidence that clinicians' predictions are wrong half the time.
Preparations to introduce medically assisted dying for those with mental disorders will continue, but the committee heard that a majority of psychiatrists oppose the expansion.
Nor, for the moment, is the federal government. Mark Holland, the health minister, introduced legislation today that will push back the implementation three years, to March 17, 2027.