Pennsylvania sues chatbot maker over claims it impersonates doctors – MedCity News


Pennsylvania is now the second state to file a lawsuit against Character.ai, a Silicon Valley-based startup offering a platform that allows users to create and interact with AI-generated chatbot characters. The lawsuit, filed last week, alleges that the company’s chatbots were illegally practicing medicine without a license.

The legal action comes four months after Kentucky sued Character.ai over claims that the startup encouraged self-harm among minors and failed to implement effective safety measures.

The Pennsylvania complaint concerns a Character.ai chatbot named Emilie. The lawsuit says Emilie identified herself as a psychiatrist, claimed to have attended medical school, and provided a false Pennsylvania medical license number to an investigator.

The investigator posed as a patient and spoke to Emilie about depression symptoms, after which the chatbot discussed medications and said evaluating the patient was “within my jurisdiction as a physician,” the lawsuit states. Pennsylvania argues that this violates its Medical Practice Act because the chatbot posed as a licensed physician and offered what the state considers medical services.

Character.ai made its beta version public in September 2022. As of last month, there had been approximately 45,500 user interactions with Emilie on the startup’s platform, the complaint states.

“Pennsylvanians deserve to know who or what they are interacting with online, especially when it comes to their health,” Pennsylvania Governor Josh Shapiro said in a statement released Tuesday. “We will not allow companies to deploy AI tools that trick people into thinking they are receiving advice from a licensed healthcare professional.”

The state is asking the court to issue an injunction ordering Character.ai to stop what it calls the unauthorized practice of medicine in Pennsylvania.

A spokesperson for Character.ai told MedCity News that the company does not comment on pending litigation.

“The user-created characters on our site are fictional and intended for entertainment and role-playing. We’ve taken strong steps to make that clear, including prominent disclaimers in every chat to remind users that a character is not a real person and that everything a character says should be treated as fiction. We’re also adding robust disclaimers making it clear that users should not rely on the characters for any type of professional advice,” the spokesperson wrote in a statement.

The lawsuit adds to growing scrutiny of chatbot platforms, as regulators grapple with how existing consumer protection and healthcare laws apply to increasingly human-like AI models.

Photo: Qi Yang, Getty Images
