Several BBC presenters, including Sally Bundock and Matthew Amroliwala, along with the popular YouTuber MrBeast, have had their likenesses hijacked by deepfake technology, raising alarm about the misuse of these highly convincing but fraudulent videos. The deepfakes, which have appeared on various social media platforms, falsely claim that viewers can earn substantial returns through an investment project associated with Elon Musk, luring them into a financial scam.
One of the affected presenters, Sally Bundock, expressed shock at the lifelike video circulating on social media. The deepfake, created using artificial intelligence, imitates her appearance and voice in a fabricated news broadcast. Similar videos involving other presenters and MrBeast have also emerged.
Experts have advised vigilance when encountering suspicious content online. While deepfake technology is becoming increasingly sophisticated, there are subtle indicators that can help distinguish genuine footage from fake. Verbal cues, such as mispronunciations or odd phrasing, can be revealing, and visual glitches, such as unnaturally positioned body parts, are also common in deepfakes.
The scammers attempted to lend their fake videos credibility by embedding social media verification marks and account names. These efforts fell short, however, particularly on platforms such as TikTok, where the uploader's name is displayed automatically below the logo, revealing the account actually behind the video.
As deepfake technology becomes more prevalent, concerns about its ethical and legal implications are mounting. Law professor Lilian Edwards emphasised the complexities surrounding deepfakes, particularly given differing international laws: determining jurisdiction and which laws apply in cases involving deepfake scams remains a challenge. Experts also warned against blanket criminalisation of all deepfakes, stressing the need for more nuanced legal approaches.
In the face of these developments, viewers are urged to exercise caution online. The overarching lesson remains: scepticism is essential, and if an offer seems too good to be true, it probably is.