Existing law requires a social media platform to take various steps to prevent cyberbullying of minors on the platform, including by requiring the platform to establish a prominent mechanism within its internet-based service that allows any individual, whether or not that individual has a profile on the internet-based service, to report cyberbullying or any content that violates the existing terms of service related to cyberbullying.
This bill would, among other things intended to make a companion chatbot platform safer for users, require an operator of a companion chatbot platform to take reasonable steps to prevent a companion chatbot, as defined, on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions, or from encouraging increased engagement, usage, or response rates. The bill would also require an operator to prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, as specified, and to publish details of that protocol on the operator's internet website.
This bill would require an operator to annually report to the State Department of Health Care Services specified information, including the number of times the operator has detected exhibitions of suicidal ideation by users. The bill would also require the operator to submit its companion chatbot platform to regular audits by a third party to ensure compliance with the bill. This bill would authorize a person who suffers injury in fact as a result of noncompliance with the bill to bring a civil action, as specified.