Washington state bill would provide safeguards against ‘deepfake’ political ads


(OLYMPIA, Wash.) — Washington state legislators are looking to get ahead of technological tools used for political misinformation before, they say, those tools further disrupt public trust in government.

The state Senate passed a bill last month that would provide political candidates with legal safeguards in civil court against “deepfake” videos, audio and images that are used in political ads.

“Deepfakes” are media that have been altered or manipulated to misrepresent someone, typically by showing the person saying something that was never said.

Washington Secretary of State Steve Hobbs, who requested the bill, told ABC News that such altered videos haven’t been officially used in campaigns, but several groups around the world are using them with malicious intent.

“We’re trying to get ahead of it,” Hobbs told ABC News.

Hobbs cited a “deepfake” video released last year at the start of the Russia-Ukraine conflict that falsely showed Ukrainian President Volodymyr Zelenskyy calling on soldiers to surrender to the Russians. Although viewers and world leaders were able to see through the deception and the video was taken down from social media sites, Hobbs said that the public is vulnerable to similar types of manipulation.

“The last thing you want is political campaigns or political action committees to put ads out of the ‘deepfakes’ of the person they are trying to vote out,” he said. “Even if that ad was up a day or two before people realize it’s a ‘deepfake’ it can cause damage.”

The bill defines synthetic media in campaigns for elective office as “an image, an audio recording, or a video recording of an individual’s appearance, speech, or conduct that has been intentionally manipulated with the use of generative adversarial network techniques or other digital technology in a manner to create a realistic but false image, audio, or video that produces a depiction that to a reasonable individual is of a real individual in appearance, action, or speech that did not actually occur in reality.”

Under the bill, candidates who are the victim of a “deepfake” video could “seek injunctive or other equitable relief prohibiting the publication of such synthetic media.”

The bill passed the state Senate on Feb. 15 with a 35-15 vote.

The state House of Representatives held a public hearing in the State Government & Tribal Relations Committee on March 10, where some constituents expressed concern about the language of the bill.

Joshua Hardwick, who works in video, told the committee he opposed the bill because the current language didn’t clarify what distinguishes a “deepfake” from an image or video edited for clarity or artistic purposes.

“If I apply a filter, make a color image black and white or sepia, or I want to shorten a part of the content and some things that may not be considered synthetic would be included,” he testified.

Hobbs contended that the bill’s language clearly does not include photo or video edits in its definition. He added that it does not intend to infringe on people’s First Amendment rights.

“If you want through the First Amendment, go after someone with ads and you have a political stance, that’s your right. But to take someone’s image or video and change it to make it look like they’re making a speech that they didn’t say, that’s just wrong,” Hobbs said.

The timetable for votes on the bill in the House committee and the full chamber hasn’t been set. Hobbs said there will likely be tweaks to the language and amendments following the discourse, but stressed that “deepfake” campaigns are an issue that needs legislative action fast.

“We have to do something, we just can’t do nothing. We need other states to put up [safeguards] and the federal government to take action,” he said.

Copyright © 2023, ABC Audio. All rights reserved.