Former Justice Secretary in Deepfake General Election Warning

A former UK secretary of state for justice has warned of a “clear and present danger” to British democracy from deepfakes ahead of the upcoming general election.

Robert Buckland made the remarks on BBC Radio 4’s Today program yesterday, claiming the technology delivers a “liar’s dividend”: by undermining trust in the veracity of information, it could lead voters simply to stop trusting anything.

“It’s all about scale and accessibility. We’re in a world where even in your own bedroom you can allow generative AI to produce content that can easily be shared in a matter of moments, and on a scale that we’ve never seen before,” he said.

“That will have potentially a hugely corrosive effect on trust in information.”

The National Cyber Security Centre (NCSC) has previously warned that deepfake campaigns will ramp up ahead of the next UK general election, which must take place before January 2025.


Fake clips have already sprung up in recent months impersonating Labour Party leader Keir Starmer and London mayor Sadiq Khan. In Slovakia, the liberal Progressive Slovakia party recently lost an election to a pro-Russia populist party after a fake audio clip emerged of its leader, Michal Šimečka, apparently discussing how to rig the vote.

Microsoft has also issued a warning about the technology’s ability to spread disinformation ahead of the US presidential elections next year. The tech giant is launching several initiatives designed to mitigate the threat, including the creation of an “Election Communications Hub” to help global democracies build secure and resilient election processes.

Buckland is reportedly one of a group of Tory MPs who have written to science secretary Michelle Donelan, asking the government to provide social media firms with clearer guidelines on how to comply with new national security laws designed to tackle foreign interference in elections.

