Internet detectives are misusing AI to find Charlie Kirk’s alleged shooter

The article covers how internet users have misused AI tools to "enhance" blurry photos released by the FBI in connection with the shooting of right-wing activist Charlie Kirk. After the FBI shared two low-quality surveillance photos of a person of interest, numerous users quickly posted AI-generated, "enhanced" versions of the images, claiming they revealed more detail. The article cautions that AI upscaling tools are unreliable for recovering hidden details from fuzzy pictures: rather than uncovering real information, they infer and invent features that do not exist in the original image. It points to past incidents where AI upscaling produced inaccurate and misleading results, such as depicting President Obama as a white man and adding a nonexistent lump to President Trump's head. The article stresses that these AI-generated enhancements should not be treated as hard evidence in a criminal investigation, and that the public should rely on the original FBI photos during the ongoing manhunt.