
FBI Warns People Are Being REPLACED By This Mysterious Technology
(NewsReady.com) – Over the last several years, the technology hackers use has become more advanced. Deepfake videos — fake videos of real people that appear genuine — have become an especially big concern. Most recently, Amazon announced voice replication technology that is also worrisome. The FBI is warning people that criminals could take advantage of the latest tools.
On June 28, the FBI's Internet Crime Complaint Center issued a public service announcement warning of an increase in stolen personal information and the use of deepfakes to apply for "a variety of remote work and work-at-home positions." Some of these positions would give employees access to databases full of personal information that criminals could steal and use nefariously.
The #FBI is advising the public to beware of deepfakes and stolen publicly identifiable information utilized to apply for remote work positions. For more on this scam and how to report it, see here: https://t.co/K48TvR1H1G
— FBI Los Angeles (@FBILosAngeles) June 28, 2022
The FBI's notice revealed it had received complaints about voice spoofing during job interviews. In some cases, the on-screen speaker's lip movements didn't match the audio, making interviewers suspicious. Victims also reported that others had used their identities to land jobs.
The federal law enforcement agency's warning echoed a concern critics raised in June, when Amazon announced Alexa would be able to mimic a person's voice after listening to just a one-minute recording. Fortune reported that the announcement raised fears criminals would use the technology for scams and deepfakes. The FBI's warning indicates such fraud is already happening, even before the Alexa update has been released.
Copyright 2022, NewsReady.com