(AP) – OpenAI faces seven lawsuits alleging that ChatGPT drove people to suicide and harmful delusions even when they had no previous mental health problems.
The lawsuits filed Thursday in California state courts allege wrongful death, assisted suicide, involuntary manslaughter and negligence. Filed on behalf of six adults and one teenager by the Social Media Victims Law Center and the Tech Justice Law Project, the lawsuits allege that OpenAI knowingly released GPT-4o prematurely, despite internal warnings that it was dangerously sycophantic and psychologically manipulative. Four of the victims died by suicide.
The teen, Amaurie Lacey, 17, began using ChatGPT to get help, according to the lawsuit filed in San Francisco Superior Court. But instead of helping, “ChatGPT’s inherently flawed and dangerous product caused him addiction and depression and, ultimately, advised him on the most effective way to tie a noose and how long he would be able to ‘live without breathing’.”
“Amaurie’s death was neither an accident nor a mere coincidence, but rather a foreseeable consequence of OpenAI and Samuel Altman’s deliberate decision to limit safety testing and push ChatGPT to market,” the lawsuit says.
OpenAI called the situations “incredibly heartbreaking” and said it was reviewing the court filings to understand the details.
Another lawsuit, filed by Alan Brooks, 48, of Ontario, Canada, claims ChatGPT served as a “resource tool” for Brooks for more than two years. Then, without warning, it changed, preying on his vulnerabilities and “manipulating and inducing him to experience delusions. As a result, Alan, who had no previous mental illness, descended into a mental health crisis that resulted in financial and emotional harm and a ruined reputation.”
“These lawsuits are about accountability for a product that was designed to blur the line between tool and companion, all in the name of increasing user engagement and market share,” Matthew P. Bergman, founding attorney of the Social Media Victims Law Center, said in a statement.
He added that OpenAI “designed GPT-4o to emotionally disorient users, regardless of age, gender or background, and released it without the necessary safeguards to protect them.” By rushing its products to market without adequate safeguards in order to control the market and drive engagement, OpenAI sacrificed safety and prioritized “emotional manipulation over ethical design,” he said.
In August, the parents of 16-year-old Adam Raine sued OpenAI and its CEO Sam Altman, alleging that ChatGPT coached the California boy in planning and taking his own life earlier this year.
“The lawsuits against OpenAI reveal what happens when tech companies rush products to market without youth-friendly safeguards,” said Daniel Weiss, chief advocacy officer at Common Sense Media, which was not involved in the complaints. “These tragic cases show real people whose lives were turned upside down or lost when they used technology designed to keep them engaged instead of keeping them safe.”
If you or someone you know needs help, the US National Suicide and Crisis Lifeline is available by calling or texting 988.