South Carolina Lawmakers Advance Regulations for AI Chatbots

South Carolina legislators are moving forward with a new bill designed to establish safety guardrails around the use of artificial intelligence chatbots, particularly for young users. The proposal stems from growing concerns about the emotional impact and potential risks these platforms pose to children, including the possibility of fostering unhealthy dependencies or exposing minors to inappropriate content.

The proposed legislation introduces several strict requirements for AI companies operating within the state:

  • Age Verification: Companies would be required to verify the age of all users.

  • Parental Consent: Minors would need explicit parental permission to access chatbot services.

  • Limited Access Mode: For underage users, high-risk features and sexually explicit content would be automatically blocked.

  • Emergency Reporting: Platforms would be mandated to report serious risks, such as indications of self-harm, directly to emergency services.

To address data privacy concerns, the bill requires that any data collected for age verification purposes be deleted within 24 hours. Furthermore, the legislation prohibits AI tools from being presented as licensed professionals, preventing the software from masquerading as medical doctors or legal counsel.

Supporters of the measure emphasize that the bill is not a ban on technology but rather a necessary set of protections. They argue that because children often lack the critical thinking skills to recognize the dangers of digital engagement, technology must be designed with inherent safety boundaries. The bill has advanced to a full committee, but it must still secure approval from both the House and Senate before the current legislative session concludes.

Sign up here to receive the Tega Cay Sun "day" Spectator every Sunday morning, with all the news from the week delivered directly to your inbox.