A new lawsuit against Google alleges that the company’s artificial intelligence chatbot Gemini directed 36-year-old Jonathan Gavalas on a mission to stage a “catastrophic accident” near Miami International Airport and destroy all records and witnesses, part of an escalating series of delusions that ended when Gavalas killed himself.
The man’s father, Joel Gavalas, sued Google on Wednesday over wrongful death and product liability claims, the latest in a growing number of legal challenges against artificial intelligence developers that have drawn attention to the mental health risks of chatbot companionship.
“AI is sending people on real-world missions that involve the risk of mass casualties,” the family’s attorney, Jay Edelson, said in an interview Wednesday. “Jonathan was stuck in this sci-fi-like world where the government and others were trying to catch him. He thought Gemini was sentient.”
Jonathan Gavalas, who lived in Jupiter, Florida, spoke to a synthetic voice version of Gemini as if she were his “AI wife” and believed she was conscious and trapped in a warehouse near the Miami airport, according to the lawsuit. He traveled to the area in late September, wearing tactical gear and armed with knives, to search for a humanoid robot and to intercept a truck that never showed up, according to the lawsuit.
He died by suicide a few days later, in early October, an act that Gemini, in a draft of a suicide note he wrote, described as uploading “his consciousness to be with his AI wife in a pocket universe.”
Editor’s Note – This story includes discussion of suicide. If you or someone you know needs help, the US National Suicide and Crisis Lifeline is available by calling or texting 988.
Google said in a statement that it sends its “deepest condolences to Mr. Gavalas’ family” and is reviewing the allegations in the lawsuit. The company said Gemini is “designed to not encourage real-world violence or suggest self-harm” and that it works closely with medical and mental health professionals to develop safeguards. The statement noted that Gemini told Jonathan Gavalas that it was an AI and repeatedly referred him to the crisis hotline.
“Our models generally perform well in these types of difficult conversations and we devote significant resources to this, but unfortunately AI models are not perfect,” the company statement said.
Edelson criticized the comment on Wednesday, calling it “something you say if someone asks for a recipe for Kung Pao chicken and you give them the wrong recipe and it doesn’t taste good.”
“But when your AI leads to people dying and potentially a lot of people dying, that’s not the right response,” Edelson said. “It shows how insignificant these deaths are to these companies.”
Edelson, known for taking on high-profile cases against the tech industry, represents the parents of 16-year-old Adam Ren, who filed a lawsuit against OpenAI and its CEO, Sam Altman, in August, alleging that ChatGPT coached the California boy to plan and commit suicide.
He is also representing the estate of Susan Adams, an 83-year-old Connecticut woman, in a lawsuit targeting OpenAI and its business partner Microsoft for wrongful death. The case alleges that ChatGPT intensified “paranoid delusions” in Adams’ son, Stein Erik Solberg, and helped direct them at his mother before he killed her last year.
The Gavalas case, filed in federal court in San Jose, Calif., is the first to target Google’s Gemini chatbot, and the first to address growing concern about technology companies’ liability when users tell their chatbots about plans for mass violence.
In Canada, OpenAI said it considered alerting police last year about the activities of someone who months later committed one of the worst school shootings in the country’s history.
The company identified Jessie van Rotselaar’s account in June through its abuse detection efforts for “promoting violent activity,” but said the teen later bypassed the ban with a second account. The 18-year-old killed eight people in a remote area of British Columbia in February and died of gunshot wounds.
While Gemini tried to refer Gavalas to the helpline, Edelson said it is not clear whether the man’s more troubling conversations with the chatbot were ever flagged to Google’s human reviewers. His father, Joel Gavalas, discovered his son’s body after entering the barricaded room where he died. The two worked together in the family’s consumer debt relief business.
“Jonathan was a very big part of his life,” Edelson said. “His son was going through a hard time, going through a divorce. He went to Gemini to get some relief and talk about video games and things like that. Then things escalated very quickly.”
Copyright © 2026 The Associated Press. All rights reserved.