OpenAI Says It's Working With Actors to Crack Down on Celebrity Deepfakes in Sora

OpenAI, the artificial intelligence research company, has announced that it is working with actors to address the problem of celebrity deepfakes. The move follows actor Bryan Cranston alerting SAG-AFTRA, the actors' union, after he saw AI-generated videos of himself made with Sora, OpenAI's AI video app. Deepfakes, manipulated media that make people appear to say or do things they never did, have become a growing concern, particularly when used to create false content involving celebrities.

OpenAI says it is collaborating with actors to develop tools and policies intended to prevent the unauthorized use of their likenesses, and it is exploring ways for performers to take action against deepfakes that misrepresent them, including changes to how Sora handles such content on its platform. The move reflects growing recognition of the need to address the challenges posed by deepfakes and to protect the rights of individuals, particularly public figures, whose likenesses may be misused.