www.theguardian.com
2a04:4e42:400::367
Public Scan
Submitted URL: https://m.avanan.com/api/mailings/click/PMRGSZBCHI3DCMBSGMZTCLBCOVZGYIR2EJUHI5DQOM5C6L3XO53S45DIMVTXKYLSMRUWC3ROMNXW2...
Effective URL: https://www.theguardian.com/technology/article/2024/may/10/ceo-wpp-deepfake-scam
Submission: On December 02 via manual from GB — Scanned from GB
Form analysis
2 forms found in the DOM (both are identical copies of the Guardian's Google site-search form)
https://www.google.co.uk/search
<form action="https://www.google.co.uk/search" class="dcr-1jjnk9d"><label for="gu-search" class="dcr-0">
<div class="dcr-6v110l">Search input </div>
</label><input type="text" id="gu-search" aria-required="true" aria-invalid="false" aria-describedby="" required="" name="q" placeholder="Search the Guardian" data-link-name="header : search" tabindex="-1"
class="selectableMenuItem dcr-qcy5wo"><label for="gu-search" class="dcr-0">
<div class="dcr-6v110l">google-search </div>
<div class="dcr-radnun"><svg width="30" viewBox="-3 -3 30 30" xmlns="http://www.w3.org/2000/svg" aria-hidden="true">
<path fill-rule="evenodd" clip-rule="evenodd"
d="M9.273 2c4.023 0 7.25 3.295 7.25 7.273a7.226 7.226 0 0 1-7.25 7.25C5.25 16.523 2 13.296 2 9.273 2 5.295 5.25 2 9.273 2m0 1.84A5.403 5.403 0 0 0 3.84 9.274c0 3 2.409 5.454 5.432 5.454 3 0 5.454-2.454 5.454-5.454 0-3.023-2.454-5.432-5.454-5.432m7.295 10.887L22 20.16 20.16 22l-5.433-5.432v-.932l.91-.909z">
</path>
</svg><span class="dcr-1p0hins">Search</span></div>
</label><button type="submit" aria-live="polite" aria-label="Search with Google" data-link-name="header : search : submit" tabindex="-1" class="dcr-1gsboxi">
<div class="src-button-space"></div><svg width="30" viewBox="-3 -3 30 30" xmlns="http://www.w3.org/2000/svg" aria-hidden="true">
<path fill-rule="evenodd" clip-rule="evenodd" d="M1 12.956h18.274l-7.167 8.575.932.932L23 12.478v-.956l-9.96-9.985-.932.932 7.166 8.575H1z"></path>
</svg>
</button><input type="hidden" name="as_sitesearch" value="www.theguardian.com"></form>
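Functionally, both copies of the form implement the same thing: a Google search restricted to the Guardian's own domain. The visible text box is submitted as the q parameter, and the hidden as_sitesearch field scopes results to www.theguardian.com. As a rough sketch only, the form reduces to the markup below once the Guardian's dcr-* styling classes and decorative SVG icons are stripped away:

<!-- Illustrative minimal equivalent of the header search form above; classes and icons omitted -->
<form action="https://www.google.co.uk/search">
  <label for="gu-search">Search input</label>
  <!-- Visible query box, sent to Google as the q parameter -->
  <input type="text" id="gu-search" name="q" placeholder="Search the Guardian" required>
  <!-- Hidden field that restricts results to the Guardian's domain -->
  <input type="hidden" name="as_sitesearch" value="www.theguardian.com">
  <button type="submit">Search with Google</button>
</form>

Submitting it with a query such as "deepfake" (a hypothetical example) would issue a GET request to https://www.google.co.uk/search?q=deepfake&as_sitesearch=www.theguardian.com.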
Text Content
Mark Read, CEO of WPP, the largest global advertising and public relations agency. Photograph: Toby Melville/Reuters

Technology
This article is more than 6 months old

CEO of world's biggest ad firm targeted by deepfake scam

Exclusive: fraudsters impersonated WPP's CEO using a fake WhatsApp account, a voice clone and YouTube footage used in a virtual meet

Nick Robins-Early
Fri 10 May 2024 08.01 BST. Last modified on Fri 10 May 2024 17.31 BST

The head of the world's biggest advertising group was the target of an elaborate deepfake scam that involved an artificial intelligence voice clone.

The CEO of WPP, Mark Read, detailed the attempted fraud in a recent email to leadership, warning others at the company to look out for calls claiming to be from top executives.

Fraudsters created a WhatsApp account with a publicly available image of Read and used it to set up a Microsoft Teams meeting that appeared to be with him and another senior WPP executive, according to the email obtained by the Guardian. During the meeting, the impostors deployed a voice clone of the executive as well as YouTube footage of them. The scammers impersonated Read off-camera using the meeting's chat window.

The scam, which was unsuccessful, targeted an "agency leader", asking them to set up a new business in an attempt to solicit money and personal details.
"Fortunately the attackers were not successful," Read wrote in the email. "We all need to be vigilant to the techniques that go beyond emails to take advantage of virtual meetings, AI and deepfakes."

A WPP spokesperson confirmed in a statement that the phishing attempt bore no fruit: "Thanks to the vigilance of our people, including the executive concerned, the incident was prevented." WPP did not respond to questions on when the attack took place or which executives besides Read were involved.

Once primarily a concern related to online harassment, pornography and political disinformation, deepfake attacks in the corporate world have surged over the past year. AI voice clones have fooled banks, duped financial firms out of millions and put cybersecurity departments on alert. In one high-profile example, an executive of the defunct digital media startup Ozy pleaded guilty to fraud and identity theft after it was reported he used voice-faking software to impersonate a YouTube executive in an attempt to fool Goldman Sachs into investing $40m in 2021.

The attempted fraud on WPP likewise appeared to use generative AI for voice cloning, but also included simpler techniques like taking a publicly available image and using it as a contact display picture. The attack is representative of the many tools that scammers now have at their disposal to mimic legitimate corporate communications and imitate executives.

"We have seen increasing sophistication in the cyber-attacks on our colleagues, and those targeted at senior leaders in particular," Read said in the email.

Read's email listed a number of bullet points to look out for as red flags, including requests for passports, money transfers and any mention of a "secret acquisition, transaction or payment that no one else knows about".

"Just because the account has my photo doesn't mean it's me," Read said in the email.

WPP, a publicly traded company with a market cap of about $11.3bn, also stated on its website that it had been dealing with fake sites using its brand name and was working with relevant authorities to stop the fraud.

"Please be aware that WPP's name and those of its agencies have been fraudulently used by third parties – often communicating via messaging services – on unofficial websites and apps," a pop-up message on the company's contact page states.

Many companies are grappling with the boom of generative AI, pivoting resources toward the technology while simultaneously facing its potential harms. WPP announced last year that it was partnering with the chip-maker Nvidia to create advertisements with generative AI, touting it as a sea change in the industry.

"Generative AI is changing the world of marketing at incredible speed. This new technology will transform the way that brands create content for commercial use," Read said in a statement last May.

In recent years, low-cost audio deepfake technology has become widely available and far more convincing. Some AI models can generate realistic imitations of a person's voice using only a few minutes of audio, which is easily obtained from public figures, allowing scammers to create manipulated recordings of almost anyone.

The rise of deepfake audio has targeted political candidates around the world, but has also crept into other, less prominent targets.
A school principal in Baltimore was put on leave this year over audio recordings that sounded like he was making racist and antisemitic comments, only for it to turn out to be a deepfake perpetrated by one of his colleagues. Bots have impersonated Joe Biden and former presidential candidate Dean Phillips.

Explore more on these topics: Technology, Artificial intelligence (AI), Deepfake, WPP, Advertising