Facebook Claims Ability to Discern “Fake News”

With roughly two billion users worldwide, Facebook, once merely a social media and networking platform, has evolved into a channel through which news spreads throughout the United States as citizens exercise their rights to freedom of speech and expression. However, CNBC reported on October 5 that Facebook would immediately begin letting users tap a button to learn whether a news story is “fake” or not. Facebook Newsroom confirmed the announcement the same day.

Facebook stated, “This new feature is designed to provide people some of the tools they need to make an informed decision about which stories to read, share, and trust.” Essentially, as part of the Facebook Journalism Project, an endeavor to tie Facebook to the news industry, Facebook will allow users to access additional contextual information virtually without leaving the site itself. This information is pulled from Facebook, the publisher’s Wikipedia entry, and trending or related articles. Given Facebook’s history of censorship, especially against conservative news sources, users should be wary of trusting Facebook’s ability to discern the validity of a story.


It is no secret that Facebook’s CEO and founder, Mark Zuckerberg, holds a liberal ideology, especially on social issues. He has been a longtime supporter of LGBT causes, aligning his company with the politically correct side. He has also been a firm backer of “Black Lives Matter,” a movement founded on the belief that society treats black lives as if they do not, in fact, matter. Facebook and its employees have recently taken heat for censoring conservative news throughout the last election cycle by blocking users and support pages for various conservative spokespersons and movements.

Granted, Facebook has removed those blocks, but only because of backlash from users. Michael Nunez, a writer for Gizmodo, reported that Facebook employees routinely suppressed stories of special interest to conservative readers in the “trending” news section. After speaking with individuals informed on the matter, Nunez wrote, “Workers prevented stories about the right-wing CPAC gathering, Mitt Romney, Rand Paul, and other conservative topics from appearing in the highly-influential section, even though they were organically trending among the site’s users.” Facebook’s news curators, as they are called, “were instructed to artificially ‘inject’ selected stories into the trending news module, even if they weren’t popular enough to warrant inclusion — or in some cases weren’t trending at all.” Among the most heavily suppressed stories were those covering the circumstances surrounding Navy SEAL Chris Kyle’s murder and the well-founded accusation that former IRS official Lois Lerner targeted conservative political groups by denying them the non-profit tax status that would have reduced their taxes. Facebook’s news section reflects the biases of its employees and the agenda of the company itself.

This censoring of news comes at a sensitive time, after America has suffered a devastating loss of life in Las Vegas, which Josh Constine, a writer for TechCrunch, argued gives all the more reason for adding a “fake news” button. He stated, “Getting this right [the “fake news” button] is especially important after the fiasco this week … for the tragic Las Vegas mass-shooting pointed people to fake news.” However, with nearly all early reports about what happened in Las Vegas relying on speculation and eyewitness testimony (which has a history of being inaccurate), any attempt by Facebook to limit the dissemination of articles with which it disagrees would likely mean that anyone researching the shooting would have access to fewer news stories, including some that contained truth. There is no doubt that unanswered questions remain about the shooting and that stories circulating on the Web are still being revised. The addition of the “fake news” button merely reduces citizens’ opportunity to discern truth from lies for themselves.

Theoretically, knowing which news is “fake” could benefit society, since there is often a fine line between truth and falsehood. But in this case, as in so many, truth is best determined by an objective bystander, and that is definitely not what Facebook or its employees are.