
Elon Musk’s lawyers try to claim his statements praising Tesla could be deepfakes to avoid liability in lawsuit

‘Their position is that because Mr. Musk is famous and might be more of a target for deep fakes, his public statements are immune.’


Mikael Thalen


Tesla CEO Elon Musk’s past remarks on the safety of self-driving technology shouldn’t be used in court given that they could be deepfakes, lawyers for the electric automaker have argued.


The claim was made in an attempt to keep Musk from being interviewed under oath as part of a lawsuit against his company from the family of Apple engineer Walter Huang, who died while driving a Tesla Model X in 2018.

Attorneys for Huang’s family say Tesla’s Autopilot software is to blame, while Tesla’s legal team alleges that Huang was playing games on his phone and ignoring warnings from the vehicle at the time of the fatal accident.

Musk has repeatedly touted the safety of Tesla’s self-driving capabilities, including, as noted by the Verge, a 2016 remark in which he said his vehicles could be driven “autonomously with greater safety than a person.” Even so, lawyers for the automaker suggested that his purported remarks should be disregarded given the existence of deepfake technology.


Specifically, the lawyers cited Musk’s celebrity status while noting that he has been “the subject of many ‘deepfake’ videos and audio recordings that purport to show him saying and doing things he never actually said or did.”

The judge in the case, however, has pushed back on Tesla’s remarks.

“Their position is that because Mr. Musk is famous and might be more of a target for deep fakes, his public statements are immune,” said Santa Clara County Superior Court Judge Evette D. Pennypacker. “In other words, Mr. Musk, and others in his position, can simply say whatever they like in the public domain, then hide behind the potential for their recorded statements being a deep fake to avoid taking ownership of what they did actually say and do.”

Such a scenario has become increasingly common in recent years and is known as the “liar’s dividend”: the idea that because something can be faked, anything can be dismissed as fake.


Synthetic media expert Henry Ajder told the Daily Dot that the emergence of advanced deepfakes “has made it possible for people to plausibly deny that authentic audio-visual is real.”

“Legal systems globally aren’t prepared for a world where hyperrealistic generative content is everywhere, making media provenance technologies increasingly important to the future of trust in the age of generative AI,” he said.

Judge Pennypacker has since tentatively ordered Musk to give a three-hour deposition regarding his past statements on Tesla’s Autopilot software. The order is expected to be finalized today, and the lawsuit is scheduled to go to trial on July 31.
