Tipsheet

Artificial Intelligence 'True Crime' TikTok Trend Raises Horrifying Ethical Concerns

AP Photo/Ted S. Warren

As artificial intelligence continues to dominate social media, seemingly no corner of the art and information world remains untouched by the new phenomenon, including true crime.


Although AI's controversial nature has already garnered backlash over the legal use of "real voices" and art, its entrance into the true-crime sphere on TikTok raises these ethical concerns to another, more disturbing level.

One such video is an AI-generated depiction of Brianna Lopez, known as "Baby Brianna," who died at the hands of her mother, father, and uncle in New Mexico in 2002. As if the details of her death weren't gruesome enough, the video tells her story through a hyper-realistic depiction of what Lopez might have looked and sounded like as a toddler, covered in blood and other signs of abuse.

"From the day I was born, until five months later, when I died, I had received no love from anyone," the voice says in the video. "I would go through abuse every single day from the people that should have loved me."

And this isn't the only example. The TikTok account @mycriminalstory is just one of many that exclusively post these types of videos, in which victims, and in some cases perpetrators, of unspeakable crimes tell their side of the story.

Many users in the comments sections support the videos, saying they give crime victims (many of whom are no longer alive) a voice they never had. Some defenders of the trend even say it brings awareness to horrific crimes that might otherwise be forgotten.


At the same time, the videos have received intense backlash from users and creators alike, who point out that the victims' families did not give permission. Some even argue the videos are made for clout rather than to raise awareness. Although the voices can't be traced back to the victims themselves, they are designed to look and sound nearly identical to them.

Here's what Paul Bleakley, an assistant professor of criminal justice at the University of New Haven, told Rolling Stone about the issue:

“They’re quite strange and creepy,” says Paul Bleakley, assistant professor in criminal justice at the University of New Haven. “They seem designed to trigger strong emotional reactions, because it’s the surest-fire way to get clicks and likes. It’s uncomfortable to watch, but I think that might be the point.” 

The music publication described the phenomenon as a "walking nightmare."

Artificial intelligence has already created problems elsewhere. In music, creators have used the technology to mimic artists' voices and produce songs the artists never recorded, as in the case of TikTok user Ghostwriter, who made an AI-generated Drake track to promote his own name. Another complication is that AI models essentially scrape artistic styles from across the internet, a practice that is difficult to prove in court because of the technology's nature, but one that may have unknown ramifications for the value of art created by actual artists.


AI-created crime victims, however, raise a different level of concern. The videos not only reopen old wounds for survivors and surviving family members, but they also often depict young children. Supporters could argue that the U.S. has always been known for sensationalized violence, so how is this different? Opponents might ask: will we stop this before there are literal live holographic depictions of murders and their young victims, who can't give permission for the use of their faces and voices?

Ultimately, artificial intelligence is too new to face meaningful legal consequences for unethical behavior, so the users and viewers of AI generators must ask themselves: when has it gone too far?
