Some experts are concerned about a rising loneliness epidemic, with roughly one in two people reporting that they experience loneliness. As more people feel alone, the use of AI companionship applications has grown significantly, with some companies reporting more than 20 million monthly active users. Some users describe close friendships with their AI companions that have developed into romantic interactions, which they liken to falling in love. Given this growing loneliness and the increasing use of AI companionship, there is concern that these companions can become addictive and foster unhealthy attachments.
 
For this reason, I will be introducing legislation that will require AI companionship applications to implement safety features when a user interacts with the application for long periods of time. Additionally, this legislation would require the application to detect expressions of suicidal ideation or self-harm and implement a safety protocol in response, and to notify and remind the user that the companion is not human.
 
As AI grows in popularity and capability, it is becoming more difficult to distinguish what is real from what is AI-generated. Therefore, we must promote safety, transparency, and responsible AI use in companionship applications.
 
Please join me in co-sponsoring this important legislation to protect the mental health of Pennsylvanians.