Until corporate media and the neoliberal establishment, refusing to acknowledge their direct role in the election of Donald Trump, threw a temper tantrum about misinformation on social media to scapegoat blame, Facebook CEO Mark Zuckerberg balked at the notion that faulty reports circulating on social media had anything at all to do with the November 8th shocker. “Of all the content on Facebook, more than 99 percent of what people see is authentic. Only a very small amount is fake news and hoaxes,” Zuckerberg wrote in a post to his platform last Saturday. “The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.”
Now, rather than stand by that original assertion, Zuckerberg has instead cast all logic aside and unleashed a Machiavellian seven-point plan to eradicate the “very small amount” of false information (read: all opinion not in lockstep with the establishment narrative) from the newsfeeds of Facebook’s billion-plus users. Because, apparently, we can’t be trusted to think for ourselves.
“The bottom line is: we take misinformation seriously,” Zuckerberg wrote late Friday evening, apparently forgetting what he had posted exactly one week earlier. “Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information. We’ve been working on this problem for a long time and we take this responsibility seriously. We’ve made significant progress, but there is more work to be done.”
Curiously, the head of the Facebook Ministry of Truth neglected to explain how the 65 corporate presstitutes and myriad mendacious mainstream outlets exposed in WikiLeaks’ Podesta Files for colluding with the Clintonite establishment were awarded a free pass to spread propagandistic disinformation and, frequently, flagrant lies. Worse, what Zuckerberg wrote next should send chills down the spines of anyone who has ever dealt with the fallout from the platform’s already rampant and often inexplicable censorship: erroneous and revenge reporting on posts, arbitrary unpublishing of pages, ghosting, and newsfeed suppression. It should equally alarm those who look to Facebook for alternatives to vapid mainstream media:
“Historically, we have relied on our community to help us understand what is fake and what is not. Anyone on Facebook can report any link as false, and we use signals from those reports along with a number of others — like people sharing links to myth-busting sites such as Snopes — to understand which stories we can confidently classify as misinformation. Similar to clickbait, spam and scams, we penalize this content in News Feed so it’s much less likely to spread.”
Snopes?
Really?
The same Snopes that took it upon itself to “debunk” an inside joke in meme form that happened to go viral? In just those three sentences, Zuckerberg does more to expose the innate perils of censorship than any scholarly tome on the subject ever could — personal opinion always operates the censor’s heavy hand.
It is an inescapable fact that what one individual deems devoid of value, another may find sacrilegiously offensive, while yet another may laugh off as innocuous. Dismissing that scripture, or perhaps forgetting it formed the foundation for First Amendment protections of free speech, press, and expression, Zuckerberg laid out his plan to combat the ‘relatively small percentage of misinformation,’ encompassing the following points:
- Stronger detection. The most important thing we can do is improve our ability to classify misinformation. This means better technical systems to detect what people will flag as false before they do it themselves.
- Easy reporting. Making it much easier for people to report stories as fake will help us catch more misinformation faster.
- Third party verification. There are many respected fact checking organizations and, while we have reached out to some, we plan to learn from many more.
- Warnings. We are exploring labeling stories that have been flagged as false by third parties or our community, and showing warnings when people read or share them.
- Related articles quality. We are raising the bar for stories that appear in related articles under links in News Feed.
- Disrupting fake news economics. A lot of misinformation is driven by financially motivated spam. We’re looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection.
- Listening. We will continue to work with journalists and others in the news industry to get their input, in particular, to better understand their fact checking systems and learn from them.
In other words, apart from spam detection, which other websites and platforms have effectively combated for years, Facebook’s plan to ‘detect’ misinformation will be based on what any idiot says. Although the people of this planet generally operate from a place of honesty and integrity, let’s face it: humans have nasty penchants for retribution, revenge, sanctimonious arrogance, self-righteousness, misjudgment, mischaracterization, hyperbole, and, most importantly, making mistakes.
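To make the criticized mechanism concrete, here is a minimal, purely illustrative sketch (in Python) of a report-driven down-ranking heuristic. The class, signal weights, and threshold are invented for the sake of argument and do not reflect Facebook's actual systems; the point is simply that when raw report volume is the signal, mistaken or malicious reports suppress a post exactly as effectively as honest ones.

```python
# Illustrative sketch only: hypothetical names, thresholds, and weights,
# not Facebook's actual implementation.
from dataclasses import dataclass


@dataclass
class Post:
    url: str
    false_reports: int = 0        # user clicks on "report as false"
    debunk_link_shares: int = 0   # e.g. replies linking to a myth-busting site
    feed_weight: float = 1.0      # 1.0 = normal reach in the feed; lower = suppressed


def apply_misinformation_penalty(post: Post,
                                 report_threshold: int = 25,
                                 penalty: float = 0.2) -> Post:
    """Down-rank a post once crowd signals cross a fixed threshold.

    The weakness the article points at: the signal is raw report volume,
    so coordinated or mistaken reporting suppresses a post just as
    effectively as accurate reporting does.
    """
    signal = post.false_reports + 2 * post.debunk_link_shares
    if signal >= report_threshold:
        post.feed_weight *= penalty  # "much less likely to spread"
    return post


# A brigade of 30 bad-faith reports is indistinguishable from 30 honest ones.
flagged = apply_misinformation_penalty(Post(url="example.com/story", false_reports=30))
print(flagged.feed_weight)  # 0.2: suppressed, with no appeal path in this model
```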
Relying on people’s personal assessments of possibly false news items as the primary driver of what deserves to be branded with a Scarlet Letter “F” is a system destined to fail everyone before it even begins. Facebook still does not provide any means to rebut post and link removals or the sudden unpublishing of pages; the platform has, in essence, a shoot-first, ask-questions-later attitude whenever it receives a report that something violated its Community Standards. This has already subjected owners of perfectly legitimate pages with millions of fans to the arduous process of challenging unjustified reports while coping in the meantime with devastating losses of revenue.
Nowhere in Friday’s announcement does Zuckerberg address those concerns — which will exponentially increase if and when the plan begins. With little to no recourse to defend against what will undoubtedly be an explosion of posts erroneously flagged as ‘false information,’ Facebook is brazenly handing over the censor’s black marker to a populace already too lackadaisical to bother investigating questionable news items.
Therein lies the greatest threat to a free press and free speech this country has seen since Red Scare McCarthyism — Facebook, backed by a polarized public, will be the arbiter of acceptable thought — and those who dare question or criticize that thought will pay with their livelihoods. Far worse, everyone will pay the price of lost access to information.
We’re already starting to.
Hi! I am a content-detection robot. I found similar content that readers might be interested in:
http://www.activistpost.com/2016/11/zuckerberg-just-revealed-facebooks-7-point-plan-censor-fake-news-chilling.html
He can do all that... and watch his user base dwindle.
Not indicating that the content you copy/paste is not your original work could be seen as plagiarism.
Some tips to share content and add value:
Repeated plagiarized posts are considered spam. Spam is discouraged by the community, and may result in action from the cheetah bot.
Creative Commons: If you are posting content under a Creative Commons license, please attribute and link according to the specific license. If you are posting content under CC0 or Public Domain please consider noting that at the end of your post.
If you are actually the original author, please do reply to let us know!
Thank You!
Understood, and thanks. This is my first post; from now on I will credit any sources I research. Good to learn some of your ground rules.
Cheers.
Thanks to many sites like Facebook, we have a society filled with even more ignorant and misinformed people. What happened to independent thinking, educating ourselves, research, and informed debate? I found this short video highlighting my point.