Many US election deniers have spent the past three years inundating local election officials with paperwork, filing hundreds of Freedom of Information Act requests in an effort to surface supposed instances of fraud. “I’ve had election officials telling me that in an office where there’s one or two staff, they literally were fulfilling public records requests from 9 to 5 every day, and then it’s 5 o’clock and they’d shift to their regular election duties,” says Tammy Patrick, CEO of the National Association of Election Officials. “And that is untenable.”
In Washington state, elections officials were receiving so many FOIA requests about the state’s voter registration database following the 2020 presidential election that the legislature had to change the law, rerouting those requests to the Secretary of State’s office to alleviate the burden on local elections workers.
“Our county auditors came in and testified as to how much time having to respond to public records requests was taking,” says Democratic state senator Patty Kuderer, who cosponsored the legislation. “It can cost a lot of money to process these requests. And some of these smaller counties do not have the manpower to deal with them. You can easily overwhelm some of our smaller counties.”
Now, experts and analysts worry that with generative AI, election deniers could mass-produce FOIA requests at an even greater rate, drowning the election workers legally obligated to respond to them in paperwork and gumming up the electoral process. In a critical election year, when election workers are facing increasing threats and systems are more strained than ever, experts who spoke to WIRED shared concerns that governments are unprepared to defend against election deniers, and that generative AI companies lack the guardrails necessary to prevent their systems from being abused by people looking to slow down election workers.
Chatbots like OpenAI’s ChatGPT and Microsoft’s Copilot can easily generate FOIA requests, even down to referencing state-level laws. This could make it easier than ever for people to flood local elections officials with requests, and harder for those officials to make sure elections run efficiently and smoothly, says Zeve Sanderson, director of New York University’s Center for Social Media and Politics.
“We know that FOIA requests have been used in bad faith previously in a lot of different contexts, not just elections, and that [large language models] are really good at doing stuff like writing FOIAs,” says Sanderson. “At times, the point of the records requests themselves seems to have been that they require work to respond to. If somebody is working to respond to a records request, they’re not working to do other things like administering an election.”
WIRED was able to easily generate FOIA requests for a number of battleground states, specifically requesting information on voter fraud, using Meta’s Llama 2, OpenAI’s ChatGPT, and Microsoft’s Copilot. In the FOIA created by Copilot, the generated text asks about voter fraud during the 2020 election, even though WIRED provided only a generic prompt and did not ask for anything related to 2020. The text also included the exact email and mailing addresses to which the FOIA requests could be sent.