YouTube has announced a “likeness detection” tool for videos, developed to combat deepfakes and other fake videos.
According to Engadget, the Likeness Detection feature is currently available to some members of the YouTube Partner Program. The tool currently covers only cases where a person’s face has been altered using artificial intelligence; in cases where a person’s voice has been altered by AI without their consent, the feature probably won’t be able to identify it.
YouTube fights deepfakes with Likeness Detection
To use the feature, YouTube users must provide the platform with an ID and a short selfie video so the system can verify that they are who they claim to be; this information also serves as a reference for identifying their likeness in uploaded videos.

The Likeness Detection feature works much like YouTube’s Content ID: it scans uploaded videos for matches or unauthorized uses of a person’s face. The affected person can review these videos and, if a match is found, request their removal from the platform.
With the spread of artificial intelligence tools across the internet, concerns about the rise of deepfakes and fake videos of people have also grown. In the last month, AI video-generation tools such as OpenAI’s Sora have been released, with a remarkable ability to produce convincing AI-generated videos.
The videos produced by Sora are so polished that they are very difficult to distinguish from real footage; it is therefore natural that, given these tools, both celebrities and ordinary people want more control over protecting their face and image.
Recently, some abuses of the Sora tool led OpenAI to ban the generation of videos using the face of Martin Luther King.