The FCC wants to make robocalls that use AI-generated voices illegal

The rise of AI-generated voices mimicking celebrities and politicians could make it even harder for the Federal Communications Commission (FCC) to fight robocalls and prevent people from getting spammed and scammed. That’s why FCC Chairwoman Jessica Rosenworcel wants the commission to officially recognize calls that use AI-generated voices as “artificial,” which would make the use of voice cloning technologies in robocalls illegal. Under the Telephone Consumer Protection Act (TCPA), which the FCC enforces, calls to residences that use an artificial or prerecorded voice without the recipient’s consent are against the law. As TechCrunch notes, the FCC’s proposal would make it easier to go after and charge bad actors.

“AI-generated voice cloning and images are already sowing confusion by tricking consumers into thinking scams and frauds are legitimate,” Rosenworcel said in a statement. “No matter what celebrity or politician you favor, or what your relationship is with your kin when they call for help, it is possible we could all be a target of these faked calls.” If the FCC recognizes AI-generated voice calls as illegal under existing law, the agency can give state attorneys general offices across the country “new tools they can use to crack down on… scams and protect consumers.”

The FCC’s proposal comes shortly after some New Hampshire residents received a call impersonating President Joe Biden, telling them not to vote in their state’s primary. A security firm performed a thorough analysis of the call and determined that it was created using AI tools from a voice cloning startup called ElevenLabs. The company had reportedly banned the account responsible for the message mimicking the president, but the incident could end up being just one of many attempts to disrupt the upcoming US elections using AI-generated content.
