OpenAI developing age-verification system to identify under-18 ChatGPT users after teen's death

OpenAI, the company behind ChatGPT, is developing an age-verification system to identify users under the age of 18. The move follows legal action filed by the family of a 16-year-old who died by suicide in April, alleging that the teen had been engaging with the chatbot for months.

According to OpenAI CEO Sam Altman, if there is any doubt about a user's age, the system will default to the under-18 experience, prioritizing "safety ahead of privacy and freedom for teens." The company plans to restrict how ChatGPT responds to users it suspects are under 18 unless they provide identification or its age-estimation technology indicates they are adults. Altman emphasized the need for "significant protection" for minors using the platform.