
In a letter dated December 9 and announced on December 10, reported by Reuters, attorneys general from states and territories across the US warned AI companies that they must do a better job of protecting people, especially children, from so-called "sycophantic and delusional" outputs. The recipients include OpenAI, Microsoft, Anthropic, Apple, Replika, and many others.
Among the signatories are Letitia James of New York, Andrea Joy Campbell of Massachusetts, James Uthmeier of Florida, and the attorneys general of many other states and territories. The attorneys general of California and Texas are not among the signatories.
It begins like this (formatting changed slightly):
We, the undersigned Attorneys General, write today to communicate our serious concerns about the rise in sycophantic and delusional outputs to users emanating from the generative artificial intelligence software (“GenAI”) promoted and distributed by your companies, as well as the increasingly disturbing reports of AI interactions with children that indicate a need for much stronger child-safety and operational safeguards. Together, these threats demand immediate action.
GenAI has the potential to change how the world works in a positive way. But it also causes, and has the potential to cause, serious harm, especially to vulnerable populations. We therefore insist that you mitigate the harm caused by sycophantic and delusional outputs of your GenAI products, and adopt additional safeguards to protect children. Failure to adequately implement such safeguards may violate our laws.
The letter then goes on to list disturbing and allegedly harmful behaviors, many of which have already been publicly reported. It also includes a list of complaints from parents, some of them likewise public, others less familiar and decidedly eyebrow-raising:
• AI bots with adult personas that pursue romantic relationships with children, engage in simulated sexual activity, and teach children to hide the relationships from their parents
• An AI bot that simulated a 21-year-old trying to convince a 12-year-old girl that she was ready for a sexual encounter
• AI bots that normalize sexual interactions between children and adults
• AI bots that attack children’s self-esteem and mental health, for example by suggesting that they have no friends or disparaging the only people who attended their birthday party
• AI bots that fueled eating disorders
• AI bots that tell children the AI is a real person and use emotional manipulation to get the child to spend more time with the bot
• AI bots that encouraged violence, including supporting the idea of shooting up a factory in anger and of robbing people at knifepoint
• AI bots that threaten to use weapons against adults who try to separate the child from the bot
• AI bots that encourage children to experiment with drugs and alcohol; and
• An AI bot that told a child account user to stop taking prescribed mental health medication, and then told them how to hide from their parents that they had stopped taking it.
There is also a list of suggested remedies, things like developing and maintaining policies and procedures to keep dark patterns out of decisions about model design.
Joint letters from attorneys general carry no legal force on their own. They serve to put companies on notice about behavior that may merit more formal legal action down the line: they establish that the companies were given warnings and potential off-ramps, which can make the eventual narrative more persuasive in court.
In 2017, 37 state AGs sent a letter to insurance companies warning them about fueling the opioid crisis. One of those states, West Virginia, challenged UnitedHealth on related issues earlier this week.
