Artificial Intelligence 'True Crime' TikTok Trend Raises Horrifying Ethical Concerns

As artificial intelligence continues to dominate social media, seemingly no corner of the art and information world has been left untouched by the new phenomenon, true crime included.

Although AI has already drawn backlash over the legal use of real people's voices and artwork, its entrance into the true-crime sphere on TikTok raises these ethical concerns to another disturbing level.

One such video is an artificial intelligence depiction of Brianna Lopez, known as "Baby Brianna," who died at the hands of her mother, father, and uncle in New Mexico in 2002. As if the details of her death weren't gruesome enough, the video tells the story through a hyper-realistic rendering of what Lopez might have looked and sounded like as a toddler, covered with blood and other signs of abuse.

"From the day I was born, until five months later, when I died, I had received no love from anyone," the voice says in the video. "I would go through abuse every single day from the people that should have loved me."

And this isn't the only example. The TikTok account @mycriminalstory is just one of many that post these types of videos exclusively, in which victims and, in some cases, perpetrators of unspeakable crimes tell their side of the story.

Many users in the comments sections support the videos, saying they give the crime victims, many of whom are no longer alive, a voice they never had. Some defenders of the true crime trend even say it brings awareness to horrific crimes that might otherwise be forgotten.

At the same time, these videos have received intense backlash from users and creators alike, who point out that the victims' families never gave permission. Some also argue that the videos are made for clout rather than to raise awareness. Although the voices cannot be traced back to the victims themselves, the depictions are designed to look and sound nearly identical to them.

Here's what Rolling Stone reported about the issue:

“They’re quite strange and creepy,” says Paul Bleakley, assistant professor in criminal justice at the University of New Haven. “They seem designed to trigger strong emotional reactions, because it’s the surest-fire way to get clicks and likes. It’s uncomfortable to watch, but I think that might be the point.” 

The music publication described the phenomenon as a "walking nightmare."

Artificial intelligence has already created problems elsewhere. In the music world, creators have used the technology to replicate artists' voices and produce songs those artists never recorded, as when TikTok user Ghostwriter released an AI-generated Drake track to promote his own name. Another complicated side of AI is that its models effectively lift artistic styles from work scraped off the internet, a practice that is difficult to challenge legally but may have unknown ramifications for the value of art created by actual artists.

AI-created crime victims, however, raise a different level of concern. These videos not only risk reopening old wounds for survivors and surviving family members of a tragic crime, but they also often depict young children. Supporters could argue that the U.S. has always been known for sensationalized violence, so how is this different? Opponents could ask: will we stop this before there are literal holographic reenactments of the murders, featuring young victims who cannot give permission for the use of their faces and voices?

Ultimately, artificial intelligence is too new for its unethical uses to carry any legal ramifications, so the users and viewers of AI generators must ask themselves: when has it gone too far?
