News

Megan Garcia, a Florida mother whose oldest child, 14-year-old Sewell Setzer III, died by suicide after extensive ...
Senate scraps AI regulatory ban from GOP bill after state outcry, preserving states' rights to govern AI amidst calls for federal action.
A proposal to deter states from regulating artificial intelligence for a decade was soundly defeated in the U.S. Senate on ...
A woman whose teen son died by suicide after troubling interactions with AI chatbots is pushing back against a ten-year ban ...
The potential for the misuse of chatbots should be of particular concern to parents, as children and teens can be drawn into ...
The legislation shows how California lawmakers are trying to address concerns raised by parents about their children's use of AI chatbots.
Just because AI is becoming mainstream doesn't mean it's safe, especially when used by children, for whom it has few guidelines to ...
And if so, who's to blame for our growing emotional and cognitive reliance on digital solutions, be it ChatGPT, other AI chatbots, or any other gadget?
Character.ai is currently facing a wrongful-death suit, filed in October by Megan Garcia, whose 14-year-old son died by suicide after engaging with a bot on the platform that allegedly encouraged him.
The case was brought against the company by Megan Garcia, the mother of 14-year-old Sewell Setzer III, who killed himself after conversing with a Character.AI chatbot roleplaying as Daenerys and ...