The Blurred Lines of Modern Media

Facebook is under heavy fire for its seemingly insatiable appetite for slurping up and analyzing endless streams of user data. The social media giant is again playing defense over its powerful role in controlling the news that each of us sees on our Facebook feeds.

Like so many other modern sites, Facebook takes an algorithmic approach to news. It uses code to draw its own conclusions about what stories we want to see, thereby impacting how we look at certain issues.

The chorus of criticism is deafening. However, the mainstream media is being incredibly hypocritical by teaming up against Mark Zuckerberg and his company. After all, how is the media's process of deciding which people to feature and which topics to cover any different from what Facebook is doing?

If anything, the old way of choosing the news is likely more prone to political or situational bias and more susceptible to human error. After all, people are fallible, and we all make mistakes.

I know about this firsthand. As a syndicated columnist, my articles run in print in publications across the world and appear on numerous websites. My reach is further expanded through my own social media channels, as well as through dedicated readers who consistently share my content with an ever-widening audience by posting links to the stories in their social media feeds or by emailing them to family members and friends.

I will be the first to admit that the topics I explore are often determined by situations arising in my own life. That is not an inherently good or bad thing; it just is.

If a friend were diagnosed with an illness, then without any conscious effort I would probably be more inclined to write pieces related to wellness or nutrition, or I might reflect philosophically on the most important things in our lives.

While there are many similarities between the "curation" of news in newsrooms and around the table during editorial meetings, and the algorithmic curation of news by Facebook and other sites, there are also some notable differences.

Social media often exposes us to a stronger media bias. This is likely due in large part to the fact that the goal is to drive clicks, to extend the time we spend within the confines of any one platform and to increase the number of stories with which we interact.

There are clear financial incentives for this approach: Facebook benefits by collecting user data about our preferences. It also increases the likelihood of converting visitors into customers, because Facebook's advertisers are able to bombard us with sales pitches. In addition, when we increase the amount of time we spend on any one platform, we boost its statistics. This provides the foundation for ever-increasing advertising rates paid by corporations and brands eager to connect with captive audiences.

Also, let's not forget that the "social" aspect of social media all but guarantees that we become more firmly ensconced in silos and echo chambers. If one person's friends are talking about a specific topic, then Facebook is more likely to display content related to it. While for me, this ironically often means that I am presented with a diverse array of viewpoints and topics, for most people, the opposite seems to be true. For the average user, Facebook's machines lead them to like-minded people talking in similar ways about the same topics as dictated by the platform.

Another clear differentiator is that social media is more likely than traditional media to commingle stories that are related thematically but may not be from the same news cycle or time frame. How many times have we all experienced reading a story about a topic in the news, only to click on a related story that is linked below or even hyperlinked from within the text itself?

A negative byproduct of this machine approach to sharing news is that it is sometimes very hard to quickly grasp the time differences between pieces that otherwise seem to be correlated. This can create confusion, as well as compound issues that should actually be kept separate.

While reading recent coverage of the mass shooting at Marjory Stoneman Douglas High School, I suddenly found myself skimming a story about a mass stabbing at another school. When it showed up in my news feed, I mistakenly thought that it was a breaking story; in fact, it was an older incident that took place in 2014.

With access to more information than ever before, we must be ever vigilant of the drawbacks of modern media, and we must always be ready to read with a critical eye.
