
Under fire from within Facebook, Zuckerberg defends inaction on Trump posts


This Oct. 25, 2019 file photo shows Facebook CEO Mark Zuckerberg speaking at the Paley Center in New York. If you want a gauge for what the future of office work will look like, watch how the biggest tech companies are preparing for a post-pandemic world. During an employee town hall Thursday, Facebook CEO Mark Zuckerberg said, "We want to make sure we move forward in a measured way." Facebook, which has nearly 45,000 employees, is looking five to 10 years down the line as it plans for more remote work, even when COVID-19 is no longer a threat that keeps its employees working from home. (AP Photo/Mark Lennihan)

Facebook founder Mark Zuckerberg on Tuesday defended his decision not to moderate controversial posts by President Donald Trump amid growing internal dissent over the company’s inaction and a simmering political controversy over Trump’s efforts to target social media platforms for perceived bias against conservatives.

After hundreds of Facebook workers staged a virtual walkout Monday and several publicly resigned or threatened to do so, Zuckerberg held an online video call with 22,000 employees Tuesday, facing repeated questions about his response to Trump’s posts and his commitment to past promises to police violent speech. A recording of the town hall was leaked to the press.

Last week, Twitter took unprecedented action to flag two Trump tweets alleging widespread voter fraud as “potentially misleading” and to add a warning to a tweet about protests over George Floyd’s death that the president of the United States was “glorifying violence.” Facebook took no action over either post.

The posts about the protests appeared to be a breaking point for many within Facebook. As unrest spread across the country, at times resulting in rioting and looting, Trump broadcast a demand for a more aggressive response by law enforcement, adding, “when the looting starts, the shooting starts.”

The historically charged phrase was widely read as encouraging authorities to shoot looters, though Trump later insisted that was not his intent. Zuckerberg said Tuesday he found the post “troubling,” but he believes Facebook was right to leave it up because he did not consider it an incitement to violence.

“We basically concluded after the research and after everything I’ve read and all the different folks that I’ve talked to that the reference is clearly to aggressive policing — maybe excessive policing — but it has no history of being read as a dog whistle for vigilante supporters to take justice into their own hands,” he said.

According to Zuckerberg, the decision not to moderate the post was made by a small group of senior executives, only one of whom was black. However, he insisted there are lines Trump and other leaders are not allowed to cross, and he suggested the site might add labels to posts perceived as inciting violence in the future.

“This isn’t a case where he is allowed to say anything he wants, or that we let government officials or policymakers say anything they want,” he said.

During the call, Zuckerberg indicated Facebook would reexamine policies on threats of force by governments, increase the company’s focus on diversity and racial justice, consider new labels for content that is objectionable but not in violation of community standards, and communicate better about how policy decisions are made. He acknowledged speaking with President Trump personally about his posts but maintained the decision to leave them up was made before they talked.

“If Silicon Valley continues to be led by a majority of white men, we will continue to see technology that does not take into account the privacy and security of women, people of color, and those in the LGBT community,” said Lindsay Hoffman, an associate professor of communication at the University of Delaware who studies media technology and politics.

Facebook executives also spoke with civil rights groups this week about their handling of Trump’s post. In a statement, the heads of Color of Change, The Leadership Conference on Civil and Human Rights, and the NAACP Legal Defense and Educational Fund declared they were “disappointed and stunned” by Zuckerberg’s explanation.

“He did not demonstrate understanding of historic or modern-day voter suppression and he refuses to acknowledge how Facebook is facilitating Trump’s call for violence against protesters. Mark is setting a very dangerous precedent for other voices who would say similar harmful things on Facebook,” the civil rights leaders said.

President Trump was enraged by Twitter’s actions last week, threatening to impose new regulations or even shut the company down for supposedly censoring conservative views. He also signed an executive order aimed at reassessing liability protections afforded to social media platforms under Section 230 of the Communications Decency Act and penalizing companies that impose content policies unfairly.

The order had few immediate legal ramifications and any changes to Section 230 would need to be made by Congress, but critics have alleged it was intended to intimidate companies like Facebook into leaving inflammatory posts alone. The Center for Democracy and Technology filed a lawsuit Tuesday warning the directive could “chill future online speech.”

"The Executive Order is designed to deter social media services from fighting misinformation, voter suppression, and the stoking of violence on their platforms," CDT director Alexandra Givens said in a statement.

Despite the threat posed by the executive order, Snapchat announced Wednesday it would no longer promote President Trump’s content in its Discover section, stating it will not “amplify voices who incite racial violence.” Trump’s campaign issued a furious response, claiming Snapchat is trying to rig the presidential election and does not want conservatives using its platform.

Under fire from the left and right, Twitter has struggled to explain its decisions to moderate these specific Trump tweets, as well. While the president’s supporters point to posts by other world leaders encouraging violence that had not been flagged, Democrats argue many of Trump’s tweets contain misinformation or glorify violence and deserve the same treatment.

To justify the fact-checking of Trump’s tweets about mail ballot fraud, Twitter CEO Jack Dorsey suggested the concern was not that Trump was undermining the integrity of elections but that his false claim that California officials were sending ballots to all residents could discourage people from registering to vote. A disclaimer was added to the tweets pointing users to information on the security of voting by mail.

On Tuesday, Twitter attempted to clarify its content policies in response to criticism, asserting its focus is on “providing context, not fact-checking.” Dorsey said the goal is to promote healthy discourse, decrease the potential for harm, and increase public accountability.

“We are NOT attempting to address all misinformation. Instead, we prioritize based on the highest potential for harm, focusing on manipulated media, civic integrity, and COVID-19. Likelihood, severity and type of potential harm — along with reach and scale — factor into this,” a thread posted by Twitter Safety stated.

Experts say social media platforms undoubtedly have the right to establish their own content policies, but with the ethical and legal landscape surrounding the relatively new technology still taking shape, these sorts of conflicts are inevitable.

“Are social media platforms like the town square, where everyone gets a turn to speak? Like a privately owned newspaper, subject to the editor or publisher’s judgment? Like a broadcast network, which the federal government regulates to some extent?” said David Greenberg, a professor at Rutgers University and author of “Republic of Spin: An Inside History of the American Presidency.” “Because we lack clear legal and social consensus on these questions, we need to accept that different social media platforms will follow different policies.”

Given the political sensitivity of content decisions and the potential dangers of allowing violent speech, hate speech, and disinformation to run rampant, Hoffman said tech companies would be better served by developing clearer, more consistent policies upfront with input from marginalized groups than dealing with these crises as they arise.


“These platforms are acting retroactively rather than being proactive and evaluating scenarios before they become real problems,” Hoffman said. “It's a cat-and-mouse game, leaving them ever unprepared for the next chaos-inducing post.”
