
My World

Category: Communication / Topics: Civility; Crime, Justice, Punishment; Freedom; Internet; Media; Politics; Racism and Inequality; Social Issues; Social Media; Social Unrest, Division; Technology; Trends; Voting & Elections

The Facebook Threat

by Stu Johnson

Posted: October 16, 2020

Increasing concerns about the role of Facebook in fueling social strife…

Last week, in "Driven Apart," I talked about how our sources of information and our perceptions of the world can increase the polarization in society. The title of that article came from the idea explored in numerous books, media reports and Congressional hearings that social media giants—following the lead of Facebook and driven by their business models—are driving people apart by focusing on radical groups and content, rather than bringing people together into a more civil society able to embrace diversity and respect differing opinions.

This week I want to explore two more items: first, a follow-up on the Mozilla petition targeting Facebook recommendations, and second, an excerpt from an extensive New Yorker article by Andrew Marantz, "Why Facebook Can't Fix Itself." (A link to the full article appears at the end.)

An Open Letter to Mark Zuckerberg and Jack Dorsey

Next week, Mozilla (the non-profit group behind the Firefox browser and a champion of privacy protection and good internet practices) will publish an open letter in the Washington Post calling on Mark Zuckerberg [Facebook] and Jack Dorsey [Twitter] "to stop key recommendation engines that are threatening U.S. election integrity." Here is more from an email received this week:

Despite tens of thousands of Mozilla supporters calling for urgent action, Twitter and Facebook haven't paused their key recommendation engines that are amplifying misinformation.

Experts in misinformation and election integrity are increasingly concerned that the post-election time period will see waves of misinformation about the election results and calls for violence. The news about a [domestic] terrorist group's plan to kidnap the governor of Michigan, developed using social media platforms, only further demonstrates the need for bold action from Twitter and Facebook.

That's why we're elevating our demands and will publish an open letter to Jack Dorsey, CEO of Twitter, and Mark Zuckerberg, CEO of Facebook in the Washington Post next week.

Here is the text of the letter:

Dear Mr. Zuckerberg and Mr. Dorsey:

We appreciate the recent action you have taken to curb the spread of disinformation related to the U.S. election. Labeling false information, rejecting misleading ads, limiting retweets, and removing accounts and groups that promote lies, hate, conspiracies, and violence are important steps.

We remain deeply concerned, however, that neither platform has gone far enough. Facebook and Twitter are still actively recommending disinformation that can threaten the integrity of the U.S. election. These recommendations could help disinformation about voting and election results go viral.

That is why we are calling on both Facebook and Twitter to immediately turn off two features that amplify disinformation: Twitter’s Trending Topics and Facebook’s Group Recommendations.

A false or misleading trending topic or violent group can reach millions of people before other safeguards to remove or label content take effect. Countless experts — and even some of your own employees — have revealed how these features can amplify disinformation.

“64% of all extremist group joins are due to [Facebook] recommendation tools… [Facebook] recommendation systems grow the problem.”

Jeff Horwitz and Deepa Seetharaman, “Facebook Executives Shut Down Efforts to Make the Site Less Divisive,” May 2020, The Wall Street Journal

“[Twitter’s Trending] system has often been gamed by bots and internet trolls to spread false, hateful or misleading information.”

Kate Conger and Nicole Perlroth, “Twitter to Add Context to Trending Topics,” September 2020, The New York Times

These features should remain disabled, neutrally and across the board, until January 2021. This will help prevent the viral spread of disinformation—such as inaccurate voting procedures or election results—throughout the election process and its aftermath.

We urge you to do so immediately, joining the chorus of concerned individuals and the many experts that have spoken out in the lead up to November 3rd.

Sincerely,

Mozilla

What's the Big Deal about Facebook?

You may be among the devoted users of Facebook and other social media platforms who see them as fun and useful places to be. The original intent of giving people an easy way to connect and share their lives with friends (and the world, if desired!) has given way to something far more troubling, which a growing number of critics see as a threat to democracy. Topic recommendations, trending reports, groups, and other features are driven by the powerful algorithms these technology giants employ, and they can be manipulated—feeding escalating division as more radical groups join the network and take advantage of the free exposure their positions receive. One of those critics is Andrew Marantz, whose article "Why Facebook Can't Fix Itself" will appear in the print edition of The New Yorker on Monday (October 19). Here is an excerpt. I hope you will take time to read the whole article, even if you disagree with some of his evidence and conclusions.

When Facebook was founded, in 2004, the company had few codified rules about what was allowed on the platform and what was not. Charlotte Willner joined three years later, as one of the company’s first employees to moderate content on the site. At the time, she said, the written guidelines were about a page long; around the office, they were often summarized as, “If something makes you feel bad in your gut, take it down.” Her husband, Dave, was hired the following year, becoming one of twelve full-time content moderators. He later became the company’s head of content policy. The guidelines, he told me, “were just a bunch of examples, with no one articulating the reasoning behind them. ‘We delete nudity.’ ‘People aren’t allowed to say nice things about Hitler.’ It was a list, not a framework.” So he wrote a framework. He called the document the Abuse Standards. A few years later, it was given a more innocuous-sounding title: the Implementation Standards.

These days, the Implementation Standards comprise an ever-changing wiki, roughly twelve thousand words long, with twenty-four headings—“Hate Speech,” “Bullying,” “Harassment,” and so on—each of which contains dozens of subcategories, technical definitions, and links to supplementary materials. These are located on an internal software system that only content moderators and select employees can access. The document available to Facebook’s users, the Community Standards, is a condensed, sanitized version of the guidelines. The rule about graphic content, for example, begins, “We remove content that glorifies violence.” The internal version, by contrast, enumerates several dozen types of graphic images—“charred or burning human beings”; “the detachment of non-regenerating body parts”; “toddlers smoking”—that content moderators are instructed to mark as “disturbing,” but not to remove.

Facebook’s stated mission is to “bring the world closer together.” It considers itself a neutral platform, not a publisher, and so has resisted censoring its users’ speech, even when that speech is ugly or unpopular. In its early years, Facebook weathered periodic waves of bad press, usually occasioned by incidents of bullying or violence on the platform. Yet none of this seemed to cause lasting damage to the company’s reputation, or to its valuation. Facebook’s representatives repeatedly claimed that they took the spread of harmful content seriously, indicating that they could manage the problem if they were only given more time. Rashad Robinson, the president of the racial-justice group Color of Change, told me, “I don’t want to sound naïve, but until recently I was willing to believe that they were committed to making real progress. But then the hate speech and the toxicity keeps multiplying, and at a certain point you go, Oh, maybe, despite what they say, getting rid of this stuff just isn’t a priority for them.”

. . . In public, Mark Zuckerberg, Facebook’s founder, chairman, and C.E.O., often invokes the lofty ideals of free speech and pluralistic debate. During a lecture at Georgetown University last October, he said, “Frederick Douglass once called free expression ‘the great moral renovator of society.’ ” But Zuckerberg’s actions make more sense when viewed as an outgrowth of his business model. The company’s incentive is to keep people on the platform—including strongmen and their most avid followers, whose incendiary rhetoric tends to generate a disproportionate amount of engagement. A former Facebook employee told me, “Nobody wants to look in the mirror and go, I make a lot of money by giving objectively dangerous people a huge megaphone.” This is precisely what Facebook’s executives are doing, the former employee continued, “but they try to tell themselves a convoluted story about how it’s not actually what they’re doing.”

In retrospect, it seems that the company’s strategy has never been to manage the problem of dangerous content, but rather to manage the public’s perception of the problem. In Clegg’s recent blog post, he wrote that Facebook takes a “zero tolerance approach” to hate speech, but that, “with so much content posted every day, rooting out the hate is like looking for a needle in a haystack.” This metaphor casts Zuckerberg as a hapless victim of fate: day after day, through no fault of his own, his haystack ends up mysteriously full of needles. A more honest metaphor would posit a powerful set of magnets at the center of the haystack—Facebook’s algorithms, which attract and elevate whatever content is most highly charged. If there are needles anywhere nearby—and, on the Internet, there always are—the magnets will pull them in. Remove as many as you want today; more will reappear tomorrow. This is how the system is designed to work.

. . . On August 19th, Facebook announced changes to its guidelines. Chief among them was a new policy restricting the activities of “organizations and movements that have demonstrated significant risks to public safety,” including “US-based militia organizations.” Some reporters and activists asked why it had taken so long for Facebook to come up with rules regarding such groups; others pointed out that, although hundreds of pages had been removed under the new policy, many such pages remained. Four days later, in Kenosha, Wisconsin, a police officer shot a Black man named Jacob Blake seven times in the back, in front of his children. Nightly protests erupted. The Kenosha Guard, a self-described militia, put up a “call to arms” on its Facebook page, where people explicitly expressed their intention to commit vigilante violence (“I fully plan to kill looters and rioters tonight”). Within a day, according to BuzzFeed, more than four hundred people had reported the page to Facebook’s content moderators, but the moderators decided that it did not violate any of Facebook’s standards, and they left it up. (Mark Zuckerberg later called this “an operational mistake.”) On August 25th, a white seventeen-year-old travelled to Kenosha from out of state, carrying a semi-automatic rifle, and shot three protesters, killing two of them. It’s not clear whether he’d learned about the Kenosha Guard on Facebook, but the militia’s page was public. Anyone could have seen it.

Pusateri, the Facebook spokesperson, said, “So far we’ve identified over 300 militarized social movements who we’ve banned from maintaining Facebook Pages, groups, and Instagram accounts.” In addition, last week, Facebook banned all content relating to QAnon, the far-right conspiracy theory. It also took down a post by Trump that contained misinformation about the coronavirus, and announced plans to ban all political ads from the platform for an indefinite period starting on Election Night. Its critics once again considered these measures too little, too late. Senator Elizabeth Warren described them as “performative changes,” arguing that the company was still failing to “change its broken algorithm, or take responsibility for the power it’s amassed.”

The restrictions are also likely to feed into the notion that social media discriminates against conservatives. (As Trump tweeted in May, “The Radical Left is in total command & control of Facebook, Instagram, Twitter and Google.”) This has become a right-wing talking point, even though the bulk of the evidence suggests the opposite. Every weekday, the Times reporter Kevin Roose shares the Top Ten “link posts”—posts containing links—from American Facebook pages, according to data provided by a tool owned by Facebook. Almost always, the list is dominated by far-right celebrities or news outlets. (On a representative day last week, the Top Ten included a post by Donald Trump for President, four posts from Fox News, two from CNN, and one from TMZ.) Facebook has disputed Roose’s methodology, arguing that there are ways to parse the data that would make it look less damning. Roose ranks posts by “interactions,” but John Hegeman, who runs Facebook’s News Feed, has argued that it would be better to rank posts by “reach” instead. This, Hegeman tweeted in July, would be “a more accurate way to see what’s popular.” However, he continued, “This data is only available internally.”
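To appreciate how much that choice of metric matters, here is a small illustration in Python. The pages and numbers below are entirely made up (nothing here uses Facebook's data or tools); it simply shows that the same handful of posts can top the list when ranked by interactions yet fall to the bottom when ranked by reach.

    # Purely hypothetical example: invented pages and numbers, no Facebook data.
    # It only demonstrates that ranking by "interactions" and ranking by "reach"
    # can produce very different "Top" lists from the same set of posts.

    posts = [
        {"page": "Partisan Pundit A", "interactions": 950_000, "reach": 4_000_000},
        {"page": "Cable News B", "interactions": 420_000, "reach": 9_500_000},
        {"page": "Entertainment Site C", "interactions": 300_000, "reach": 7_800_000},
        {"page": "Local Paper D", "interactions": 80_000, "reach": 6_200_000},
    ]

    # Sort the same posts two ways, highest value first.
    by_interactions = sorted(posts, key=lambda p: p["interactions"], reverse=True)
    by_reach = sorted(posts, key=lambda p: p["reach"], reverse=True)

    print("Ranked by interactions:", [p["page"] for p in by_interactions])
    print("Ranked by reach:       ", [p["page"] for p in by_reach])

In this made-up example, the most provocative page wins by interactions but finishes last by reach—which is why the two sides of that dispute prefer different yardsticks.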


Read the original Mozilla article, sign the petition, and find other related resources. Add your name to the open letter.

Read Andrew Marantz's full article, "Why Facebook Can't Fix Itself," at The New Yorker.

Also search all of our articles on Voting & Elections for more analysis.

This article was also posted on Stu's InfoMatters blog.




Stu Johnson is principal of Stuart Johnson & Associates, a communications consultancy in Wheaton, Illinois. He is publisher and editor of SeniorLifestyle, contributes articles to the site, and writes the InfoMatters blog on his own website.




