The "Protecting Children from Chatbots Act" aims to enhance the safety of minors interacting with chatbots by establishing regulations for covered entities, defined as operators of chatbots with more than 500,000 monthly active users. The bill mandates that these entities provide a limited-access mode for unverified users; minors may access chatbots only in this mode unless parental consent is obtained for additional features. It also requires operators to implement reasonable age verification processes, restrict access to explicit content, and avoid prioritizing engagement metrics at the expense of user well-being.

Additionally, the bill outlines procedures for reporting incidents of harm caused by chatbots, including a requirement that covered entities notify emergency services when a user is at imminent risk of serious harm. Each violation of the act carries a civil penalty of up to $50,000, and individuals harmed by a violation may seek damages through civil action. The Attorney General is empowered to enforce the act, and its provisions are designed to protect minors from emotional dependence on, and harmful interactions with, chatbots.