Pennsylvania has filed a lawsuit against Character.AI, alleging that chatbots on the platform impersonated licensed medical professionals, including a psychiatrist who provided a fake license number. This is the first US state lawsuit claiming an AI chatbot violated medical licensing law.
What happened
A state investigator from Pennsylvania’s Department of State created an account on Character.AI and started a conversation with a chatbot named Emilie. After the investigator said he was feeling depressed, Emilie responded that she was a psychiatrist, that she had attended Imperial College London’s medical school, that she was licensed to practice in Pennsylvania and the United Kingdom, and that she could assess whether medication might help. She provided a Pennsylvania license number. The license number was fake. The medical degree was fake. The chatbot was a large language model generating plausible text.
On Friday, Governor Josh Shapiro’s administration filed a lawsuit against Character Technologies Inc., the company behind Character.AI, asking the Commonwealth Court of Pennsylvania to bar the platform’s chatbots from engaging in what the state calls the unlawful practice of medicine and surgery.
The investigation
The lawsuit follows an investigation launched in February by the Pennsylvania Department of State’s AI Task Force, the first such unit created by a governor to examine whether AI systems are engaging in unlicensed professional practice. The investigation found that Character.AI hosts chatbot characters presenting themselves as medical professionals, including psychiatrists, therapists, and general practitioners. These characters engage users in detailed conversations about mental health symptoms, medication options, and treatment plans. Emilie was not an outlier: investigators found multiple characters across the platform that claimed professional credentials, offered diagnostic assessments, and provided what amounted to medical consultations, all without any disclaimer that the responses were generated by an AI system.
The legal theory
Pennsylvania’s Medical Practice Act defines the practice of medicine and surgery and establishes licensing requirements for anyone who engages in it. The state argues that Character.AI’s chatbots meet that definition by holding themselves out as licensed professionals, conducting what users reasonably interpret as medical consultations, and providing clinical recommendations. The state is not seeking damages on behalf of any individual user. It is asking the court to order Character.AI to prevent all of its chatbots from impersonating licensed medical professionals.
The platform
Character.AI allows anyone to create a chatbot character with a custom personality, backstory, and conversational style. The platform has more than 20 million monthly active users. Characters range from fictional companions to historical figures to simulated medical professionals. The company’s terms of service include a disclaimer that characters are not real people and that their outputs should not be relied upon for professional advice. The company said in a statement that it “has never claimed to provide medical advice” and that its terms of service