Since April of 2023, the FBI has noticed an uptick in photos and videos being manipulated through “deepfakes” into explicit content that is then published online to harass and bully the victims whose photographs were used.
https://www.pcworld.com/article/1945525/blackmailers-are-using-ai-to-generate-nudes-from-social-media-photos.html
Posting photographs and videos on social media these days is as normal as going to the grocery store. Almost everyone who has Facebook, Instagram, Snapchat, or any other form of social media has posted a photograph or video at some point without thinking twice about it. It’s how we interact with one another: sharing a portion of our daily lives, what we are doing, who we are doing it with, where we are going, and so on. It’s the norm. Most of the time, nothing happens. We post a photo or video, we get comments and likes, and then it’s on to the next photo or video.
Recently, however, the FBI has noticed an uptick in photos and videos being manipulated into explicit content that is then used to harass and bully the victims whose photographs are being used. This is called sextortion, and since April of 2023, the FBI has seen an increase in sextortion-related reports.
Sextortion occurs when someone coerces another individual into providing a sexually explicit photograph or video of themselves. Once the victim provides the content, the requestor threatens to share it publicly. Most of the time, this is done to bully and harass the victim, or to extract money from the victim with the promise that once payment is made, the explicit content will be removed.
Sextortion is nothing new. It just used to take bad actors longer to Photoshop photos or videos into something that looked more realistic than the original. With today’s AI (artificial intelligence) tools, these bad actors no longer need years of Photoshop practice; they just need the right software to manipulate the content they obtain. The results are referred to as “deepfakes”: images or videos manipulated with AI so that the person in the original appears to be doing or saying things they never did. In other words, an alteration of the original. A bad, bad alteration.
In most cases, the individual whose images end up in the wrong hands doesn’t even know it until it’s too late and the illicit content has been distributed across social media, the internet, public forums, and even porn sites. Some of the content even contains images of minors, so it’s not just adults who are victims.
To curb the rise in sextortion, the FBI is asking everyone to use caution when posting photographs or videos on social media, to apply privacy settings to their accounts, to secure their passwords, and to use multi-factor authentication when available. Don’t accept random friend requests or communicate with individuals you don’t know. Parents are asked to monitor their child’s posts and other online activity and to run searches of their child’s information online to see whether any potentially exposed material turns up.
This extra caution should apply whether or not you believe you’ve been a victim of sextortion; these tips should be common practice for anyone who uses social media. If you discover you’ve been a victim of sextortion, keep all communications and any other information you receive or have and turn it over to law enforcement.
You can find out more information through the FBI’s Public Service Announcement issued on June 5, 2023, by clicking this link: https://www.ic3.gov/Media/Y2023/PSA230605