The Patriot Post® · Deepfake Porn Exploitation
Artificial Intelligence (AI) is a powerful tool, and like all tools in the wrong hands, it can be used as a weapon of mass destruction. In the case of deepfake pornography, the victims are many and range from the girl next door to big-name celebrities like Taylor Swift.
Swift’s image was stolen by evil-minded people who used her picture to create AI-generated “nudes.” These images circulated quickly across the Internet and were viewed by millions of users before social media platforms shut them down. Swift is hardly the first victim of deepfake porn; this problem has been hurting ordinary people all over the world.
In New Jersey, 30 female students had their images snatched and perverted into pornographic material by some of their male classmates. In London, a young girl took her own life after facing a similar situation.
All it takes is one picture; AI does the rest, ruining someone’s image and reputation and titillating perverts.
Swift’s experience seems to have catalyzed Congress, which may now push forward legislation to deal with this menace. The Senate has put forward a bipartisan bill, the DEFIANCE Act, that would let victims of deepfake porn sue those who produce and distribute it.
One of the architects of this bill, Senator Josh Hawley (R-MO), said: “Nobody — neither celebrities nor ordinary Americans — should ever have to find themselves featured in AI pornography. Innocent people have a right to defend their reputations and hold perpetrators accountable in court.”
Congress is also looking at rewriting Section 230 of the Communications Decency Act, which shields Internet platforms like social media sites from liability for content users upload. People’s images and reputations should be protected by law, and those who exploit them should face severe criminal consequences. (Social media outlets seem to have no trouble censoring certain political opinions; porn shouldn’t elude them.)
Ten states already have laws on the books protecting innocents from the use of their personal image in such a way. Congress is — shocker — a little late to the game.
However, this will never be enough until we address the root of the problem: the porn industry.
How about we outlaw porn?
Our culture is obsessed with obscenity and with pushing the boundaries of acceptable attire. Anyone who watched the Grammys would have noticed that many of the female celebrities wore clothing that exposed their nipples and otherwise left little to the imagination. What is the point? Female empowerment?
Well, it’s not.
Their indifference to curating a respectable image is part of the more significant problem: the Left’s destigmatization of porn as an industry.
Perhaps the root of the problem isn’t that people are taking advantage of a technology that produces such images cheaply and easily; perhaps the problem is the deeper rot of allowing and accepting the role of pornography in our economy and culture.
There are obscenity laws on the books, but apparently they are no longer enforced (much like our laws at the southern border).
Thankfully, some states have taken a stand against these mega porn companies by requiring age verification via ID, thereby ensuring that users aren’t minors. But porn really is at the root of a lot of societal evil. It is exploitative and addictive. Moreover, habitual consumers spiral into more and more illicit material involving ever-increasing violence, abuse of minors, and sexual perversion (LGBTQ+ stuff). Even people who work for these porn companies admit that they are starting to push transgender porn in order to nudge their consumers in that direction.
Let’s criminalize porn as well as create and enforce legislation that protects private images.
How do we teach AI to monitor itself?
Let’s assume that laws are put in place and that platforms are more motivated to keep deepfake porn off their sites. Furthermore, let’s assume that law enforcement is able to track down the perpetrators who exploit someone else’s image and damage his or her reputation.
As this tech becomes more sophisticated, it will be harder and harder to prove that AI-generated images are fake, and perhaps even harder to prove that the victim’s images were used non-consensually.
Is there a way to teach AI to detect its own handiwork? Are there ways that images can be protected through an AI database? Is there a way to thwart someone’s use of an image taken without consent? These are questions far above this author’s pay grade. However, since AI is a tool, and a highly sophisticated one at that, could it not be used to outsmart these perverts?
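For what it’s worth, some rudimentary building blocks already exist. One is perceptual hashing, which lets a platform check whether an uploaded picture appears to be derived from a known, protected image. The Python sketch below, using the open-source imagehash library with hypothetical file names, illustrates the idea under the assumption that a victim (or a parent) has registered an original photo; it is no defense against a sophisticated deepfake generator, but it shows the tooling question is not hopeless.

```python
# A minimal sketch of perceptual hashing: flag uploads that appear to be
# derived from a registered, protected image.
# Requires: pip install pillow imagehash
# The file names below are hypothetical placeholders.
from PIL import Image
import imagehash

def is_derived(protected_path: str, upload_path: str, max_distance: int = 10) -> bool:
    """Return True if the upload is perceptually close to the protected image.

    pHash survives resizing and mild edits, but NOT the heavy transformations
    a deepfake generator applies; real systems layer many signals on top
    (watermarks, provenance metadata, dedicated ML classifiers).
    """
    protected_hash = imagehash.phash(Image.open(protected_path))
    upload_hash = imagehash.phash(Image.open(upload_path))
    # Subtracting two ImageHash objects yields the Hamming distance in bits.
    return (protected_hash - upload_hash) <= max_distance

if __name__ == "__main__":
    # Hypothetical files: a photo a parent registered, and a new upload.
    print(is_derived("registered_photo.jpg", "suspicious_upload.jpg"))
```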
How do we protect our children’s image?
We live in a world where a picture or a video can be taken on a phone with the tap of a touchscreen. We also live in the age of social media, where bad actors have easy access to someone’s image. This has been a problem since the advent of social media: child predators have used parents’ social media accounts to track potential victims. Now we have perverts who can exploit our children with nothing but a picture and a Wi-Fi connection.
As parents, how do we protect our children’s images? For starters, perhaps refraining from posting their pictures online would help.
Conclusion
Anyone can become a victim of this depraved new use of AI. As we navigate this new tech, laws and safeguards need to be put in place. But as parents of children who will grow up in the world of AI, it is also our responsibility to protect them as much as we can from being exploited by this industry. And it is perhaps time to address the core issue: the porn industry itself.