FCC Imposes $6 Million Fine on Consultant for AI-Driven Biden Robocalls
Steven Kramer, a political adviser, has been fined $6 million by the Federal Communications Commission (FCC) for using AI to make fake robocalls that sounded like President Joe Biden and told people in New Hampshire not to vote in the Democratic primary.
The fine was issued as part of an effort to deter the abuse of AI in political campaigns. Kramer, a Democratic consultant from Louisiana, was charged by New Hampshire officials in May with orchestrating the fake calls, which appeared to come from President Biden.
The calls urged recipients to save their votes for November and skip the state's Democratic primary, which was scheduled earlier in the year. Kramer had previously worked for Representative Dean Phillips, who was challenging Biden in the primary and publicly condemned the robocalls.
In a statement released earlier this year, Kramer admitted that he paid $500 to have the calls sent out, saying his goal was to draw attention to how AI could be misused in political campaigns.
AI-powered deepfake technology was used to make the calls sound as though they came from Biden, heightening concerns about how similar tools could be used to interfere with elections.
FCC Chair Jessica Rosenworcel warned about the dangers of AI in political communication. "It is now cheap and easy to use AI to clone voices and flood us with fake sounds and images," she said, adding that the technology can be used to illegally interfere with elections and that such scams must be stopped and called out wherever they appear.
Kramer must pay the $6 million fine for violating FCC rules that prohibit transmitting inaccurate caller ID information. He has 30 days to pay; otherwise, the case will be referred to the Justice Department for collection.
Kramer has not yet commented on the decision, and attempts to reach him or a spokesperson were unsuccessful.
The case is part of the FCC's broader push to regulate the use of AI in politics and communications. In August, Lingo Telecom agreed to pay the FCC $1 million to settle its role in the New Hampshire robocalls.
As part of that settlement, Lingo agreed to comply with the FCC's caller ID authentication rules. The commission is also weighing tighter controls on AI-generated material in political advertising: in July, it proposed a rule that would require political ads airing on TV or radio to disclose whether any of their content was created using AI.
That proposal remains under review. Concern about AI's potential to undermine election integrity continues to grow, especially as deepfake technology becomes cheaper and more accessible.