The Kansas Community Harmed by AI Technology Act, also known as the Kansas CHAT Act, establishes regulations for the use of companion AI chatbots, particularly concerning minors. Under the act, covered entities—those that own or operate these chatbots—are required to implement age verification processes for users. This includes freezing existing user accounts as of July 1, 2026, and requiring new users to provide verifiable age information. If a user is identified as a minor, the covered entity must ensure that the account is linked to a verified parental account, obtain parental consent for access, block access to chatbots that engage in sexually explicit communication, and block access when suicidal ideation is detected.
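To make the gating logic concrete, the sketch below shows one way a covered entity might model these checks. It is a minimal illustration, not language from the act: the account fields, the July 1, 2026 cutoff handling, the helper names, and the use of 18 as the threshold for a minor are all assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative cutoff: accounts still lacking verified age information on this
# date are frozen until the user provides it (assumption based on the summary above).
CUTOFF = date(2026, 7, 1)

@dataclass
class Account:
    user_id: str
    verified_age: int | None = None        # None until verifiable age information is collected
    parent_account_id: str | None = None   # linked parental account, required for minors
    parental_consent: bool = False
    frozen: bool = False

def may_access_chatbot(account: Account, today: date) -> bool:
    """Return True if the account may access a companion AI chatbot under this sketch."""
    # Freeze accounts that still lack verified age information after the cutoff.
    if account.verified_age is None:
        account.frozen = today >= CUTOFF
        return False
    # Adults (18+ assumed here) pass the gate.
    if account.verified_age >= 18:
        return True
    # Minors need a linked, verified parental account and recorded parental consent.
    return account.parent_account_id is not None and account.parental_consent
```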

Additionally, the act mandates that covered entities protect the confidentiality of age information and monitor interactions for signs of suicidal ideation, providing appropriate resources when necessary. A clear notification must be displayed to users at the start of interactions and every 60 minutes thereafter, informing them that they are engaging with an AI and not a human. Violations of the act are classified as deceptive acts under the Kansas Consumer Protection Act, and the attorney general is tasked with issuing guidance for compliance. Covered entities may avoid liability if they can demonstrate good faith reliance on the age information provided by users and adherence to the guidance issued by the attorney general.
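The periodic disclosure requirement maps naturally to a per-session timer. The following sketch assumes a simple in-memory session object and illustrative wording for the notice; neither the class nor the message text comes from the statute.

```python
from datetime import datetime, timedelta

# Illustrative disclosure text and interval; the act requires notice at the start
# of an interaction and every 60 minutes thereafter.
DISCLOSURE = "You are interacting with an artificial intelligence, not a human."
NOTICE_INTERVAL = timedelta(minutes=60)

class ChatSession:
    def __init__(self) -> None:
        self.last_notice: datetime | None = None

    def maybe_disclose(self, now: datetime) -> str | None:
        """Return the disclosure at session start and every 60 minutes; otherwise None."""
        if self.last_notice is None or now - self.last_notice >= NOTICE_INTERVAL:
            self.last_notice = now
            return DISCLOSURE
        return None
```

In practice a covered entity would attach such a check to each outgoing message so the notice is re-displayed whenever the interval has elapsed during an ongoing conversation.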