
‘Sue IMMEDIATELY’: Doctor says company used ‘AI clone’ of her voice for ad


Audra Schroeder

IRL

A doctor on TikTok claims that her voice was used without her permission to sell an ear-cleaning device.

Doctor Dana (@footdocdana), a podiatrist who has more than 2.2 million followers, typically posts a combination of lifestyle and medical content. But on Thursday, she posted a different video.

An older TikTok of hers was allegedly manipulated to make it sound like she’s recommending a product from a company called HearClear.

In the original TikTok, she advised followers not to use Q-tips to clean their ears because doing so could “damage the tympanic membrane.” Dana was responding to a video in which a man appears to stick a small black device into another man’s ear.

Then the video cuts to an image of the HearClear Pro, while Dana’s voice continues: “Instead, you should be using something like the HearClear, as it is far safer and more effective.” In a stitch, Dana watches the video in horror.

She says in the caption that the company “used an AI clone of my voice to pretend I recommend their product.” Many commenters advised her to take legal action.

@footdocdana This is going to be a HUGE problem in the future @hearclear @HearClear™ #fraud #ai #artificalintelligencenews #artificialintelligence #doctor #doctorreacts #doctorsoftiktok #learnsomethingnew #medical #surgery #medicalvideos #doctors #podiatry #podiatrist #footsurgeon #surgeon #surgeons #nurse #nurses #healthcare #scrublife #medicalhumor #scrubslife #podiatricmedicine ♬ original sound – Doctor Dana • FootDocDana

There is a @hearclear TikTok account, but it has no videos and only 58 followers. Another account, @hearclear.at, shows more videos of the device Dana is allegedly recommending, but posts in German. An email to a U.K. HearClear contact bounced back.

AI voice cloning is a growing problem, and the audio-centric TikTok is a prime target for people looking to scam and manipulate. Whereas AI image generators often struggle to replicate details like teeth and hands, cloned audio is far more seamless and accessible, and harder to spot as fake.

Earlier this year, TikTok attempted to crack down on the spread of AI-generated content and deepfakes by requiring creators to disclose when AI was used. In February, an AI-generated Joe Rogan ad made it appear he was promoting a product on his show.

There are many scammy companies and brands on TikTok that lift content without permission. Last year, a creator called out a dupe AirPods Max seller for making it look like she was promoting their headphones. And these scammy brands often disappear or scrub their videos.

We reached out to Dana for comment.

The Daily Dot