Senate Bill 963, the "Consumer Protections in Interactions with Artificial Intelligence Systems Act," would amend the South Carolina Code of Laws by adding Chapter 31 to Title 37. The legislation prohibits algorithmic discrimination resulting from high-risk artificial intelligence systems, defining such discrimination as unlawful differential treatment based on protected characteristics such as age, race, and disability. It places duties on both developers and deployers of these AI systems, requiring them to use reasonable care to protect consumers from foreseeable risks and to provide documentation of the systems' intended uses and potential risks.
Key provisions include mandates for developers to disclose information about their AI systems, such as their purpose, training data, and limitations. Deployers must implement risk management policies, conduct annual impact assessments, and retain records of those assessments for at least three years. The bill also requires that consumers be notified when a high-risk AI system is used to make a consequential decision about them, with opportunities to correct inaccurate data or appeal the decision. It includes exemptions for smaller deployers and certain categories of AI systems, and it authorizes the Attorney General to enforce compliance. Violations of the bill's requirements are classified as unfair trade practices, promoting consumer protection and transparency in AI interactions.