Campaigns that use artificial intelligence in an attempt to sway voters would soon be required to reveal that fact, under a bipartisan bill Oregon lawmakers are considering this year.
Senate Bill 1571 would require campaign materials, from physical fliers to online videos, to disclose any use of artificial intelligence to depict a person's voice or image. It's one of dozens of similar proposals that have emerged around the country as state and federal officials work to tamp down the use of emerging "deep fake" technology ahead of this year's elections.
"It's important that the state of Oregon keeps up with the times," said state Sen. Aaron Woods, D-Wilsonville, a retired Xerox executive who introduced the bill. "The bill will build awareness."
Oregon hasn鈥檛 seen high-profile instances of artificial intelligence in political communications yet. But as technology quickly improves, AI has become a factor on the national stage 鈥 including in using the faked voice of President Joe Biden in New Hampshire and a to mimic the voice of former President Donald Trump. The Federal Communications Commission last week.
Woods' bill, which was scheduled to receive a public hearing Tuesday afternoon, has support from Democratic and Republican lawmakers. Under an amendment Woods expects to move forward, the bill defines "synthetic media" as an image, audio recording or video of a person that "has been intentionally manipulated with the use of artificial intelligence techniques or similar digital technology" and which gives voters a false impression of events.
Campaigns using that material would be required to disclose it. Violations could result in a lawsuit from the Oregon Secretary of State and a maximum $10,000 fine. The bill includes exemptions for media organizations that report on campaign ads that feature AI.
The bill would take effect immediately, putting its provisions into place well ahead of Oregon's May primary election.
The regulations in the bill are not as detailed or strict as those in some other states. At least one state's law, for instance, requires that a message about the use of AI be written in the same-sized font as the majority of the text in print ads, and remain on screen for at least four seconds in video ads.
Public Citizen, a national advocacy group pushing AI disclosure rules around the country, has recommended more specific requirements, including that AI disclosures be written in the same-sized font as the largest writing in print communications, and be displayed for the duration of a video.
A growing number of states have introduced or passed bills to ban so-called "deep fake" videos in politics or to require disclosure when they are used, according to the organization. Those efforts have been broadly bipartisan.
It's not clear from SB 1571 exactly how Oregon campaigns would be required to reveal the use of AI if the bill passes, other than that they must state the material has "been manipulated." Woods said he had not considered specific regulations around how disclosures must be presented, saying those details could be hammered out in a future legislative session. But he characterized his proposal as popular among colleagues, no matter their party affiliation.
"It's bipartisan and bicameral, so if you are against this … I certainly want to hear why," he said in an interview. "But no, we've not had any opposition or negativity."
Among supporters is Secretary of State LaVonne Griffin-Valade, the state鈥檚 top election official, a spokesperson said Tuesday.
As of early afternoon, written testimony filed on the proposal was supportive or neutral, with "good government" groups like Common Cause Oregon and the League of Women Voters of Oregon voicing their support.