Recent data reveals that deepfake celebrity endorsement scams rose 22% last year alone, making them one of the fastest-growing types of scams.
In light of this, the experts at AI prompt management tool AIPRM have shared guidance on how to spot a celebrity impersonation scam, the money lost to these scams, and tips to keep you and your loved ones safe. Expert comment has also been provided by Christoph C. Cemper, founder of AIPRM.
What Are Deepfake Celebrity Impersonation Scams?
In a celebrity impersonation scam, the scammer typically uses advanced AI technology to manipulate existing footage or audio of a well-known celebrity, which can convince people that they are communicating with the real celebrity. Because people often feel a strong sense of affinity and closeness toward celebrities, victims are more likely to comply with the scammer’s requests.
How to Identify a Deepfake Celebrity Impersonation Scam
With these scams becoming more sophisticated and deepfakes harder to spot, it is more important than ever to stay vigilant. Knowing the key warning signs can help you avoid falling victim to a deepfake celebrity scam.
1. Examine images and video for suspicious features
Deepfakes often fail to perfect the fine details of a real image or video, so keep an eye out for distortion or blurriness, particularly around the edges of faces or objects. Inconsistent lighting or unnaturally placed reflections can also be a red flag. Pay attention to fine details that don’t look quite right: errors in hair, nails, or teeth can be a key indicator that an image is AI-generated. Examine the image closely before deciding whether it is legitimate.
When analyzing video and audio, always look for discrepancies between the audio and visual components. If the celebrity’s voice doesn’t quite match their lip movements, the video could be fraudulent. A robotic tone, or a voice that lacks the natural conversational flow of a human speaker, can indicate faked audio. Lastly, audio that sounds too perfect or unnaturally ‘smooth’ can be a sign of deepfake manipulation.
If you do spot any of these signs, it is best practice not to respond and to report the content to your local fraud and cybercrime reporting center.
2. AI-generated text
Some scams use text crafted to mimic the celebrity’s writing in an attempt to draw in the victim. In these instances, look out for unusual phrasing, grammar mistakes, or an inconsistent tone, all of which are indicators of AI-generated text that lacks a human touch. Ask yourself whether the text reflects how the celebrity actually writes or speaks, and check any claims against trusted online sources. As a general rule, anything that seems too good to be true most likely is.
3. Strong emotional triggers
A very common indicator of a celebrity deepfake scam is a call to action that creates a sense of urgency, playing on your emotions to pressure you into complying. If you feel rushed or emotionally manipulated, treat it as a red flag. It is always best to avoid engaging with the scammer’s request and to report it to your local fraud and cybercrime center.
“Deepfake impersonation scams are expected to grow this year, driven by the advancement of AI technologies,” said Christoph C. Cemper, founder of AIPRM. “That’s why understanding these warning signs is crucial in keeping you and your loved ones safe.
“If you spot any of the above scam indicators, it is best to report it as soon as possible to your local fraud and cybercrime center,” Cemper said. “As a rule, if you receive an out-of-the-blue message or video from a celebrity, especially one that seems too good to be true, the chances are it is a scam attempt. My advice is to ignore such content entirely.”