
Is Facebook really ready for the 2020 election?



Ever since Russian agents and other opportunists abused its platform in an attempt to manipulate the 2016 U.S. presidential election, Facebook has repeatedly insisted that it has learned its lesson and is no longer a conduit for misinformation, voter suppression and election disruption.

But it has been a long and halting journey for the social network. Critical outsiders, as well as some of Facebook’s own employees, say the company’s efforts to revise its rules and tighten its safeguards remain inadequate to the task, despite it having spent billions on the project. As for why, they point to the company’s persistent unwillingness to act decisively over much of that time.

“Am I worried about the election? I’m scared,” said Roger McNamee, a Silicon Valley venture capitalist and early Facebook investor who has become a vocal critic. “At the company’s current scale, it is a clear and present danger to democracy and national security.”

The company’s rhetoric has certainly gotten an update. CEO Mark Zuckerberg now casually refers to possible outcomes that were unthinkable in 2016 – among them civil unrest and a contested election that Facebook could easily make even worse – as challenges the platform now faces.

“This election is not going to be business as usual,” Zuckerberg wrote in a September Facebook post in which he outlined Facebook’s efforts to encourage voting and remove misinformation from its service. “We all have a responsibility to protect our democracy.”

Yet for years, Facebook’s leaders appeared to be caught off guard whenever their platform – created to connect the world – was used for malicious purposes. Zuckerberg has offered multiple apologies over the years, as if no one could have predicted that people would use Facebook to live-stream murders and suicides, incite ethnic cleansing, promote fake cancer cures or try to steal elections.

While other platforms like Twitter and YouTube have also struggled to address misinformation and hateful content, Facebook stands out for its reach and scale and, compared with many other platforms, for its slower response to the challenges identified in 2016.

In the immediate aftermath of President Donald Trump’s election, Zuckerberg offered a remarkably tone-deaf quip about the notion that “fake news” spread on Facebook could have influenced the 2016 election, calling it “a pretty crazy idea.” A week later, he walked back the comment.

Since then, Facebook has issued a stream of mea culpas for its slowness to act against threats to the 2016 election and promised to do better. “I don’t think they’ve gotten better at listening,” said David Kirkpatrick, author of a book about the rise of Facebook. “What has changed is that more people have told them they need to do something.”

The company has hired outside fact-checkers, added restrictions – then more restrictions – on political advertisements, and taken down thousands of accounts, pages and groups it found to be engaged in “coordinated inauthentic behavior.” That’s Facebook’s term for fake accounts and groups that maliciously target political discourse in countries ranging from Albania to Zimbabwe.

It has also begun adding warning labels to posts that contain misinformation about voting and has, at times, taken steps to limit the circulation of misleading posts. In recent weeks, the platform also banned posts denying the Holocaust and joined Twitter in limiting the spread of an unverified political story about Hunter Biden, son of Democratic presidential candidate Joe Biden, published by the conservative New York Post.

All of this undoubtedly puts Facebook in a better position than it was four years ago. But that does not mean it is fully prepared. Despite stricter rules banning them, violent militias still use the platform to organize. Recently, this included a foiled plot to kidnap the governor of Michigan.

In the four years since the last election, Facebook’s profits and user numbers have kept growing. This year, analysts expect the company to earn a profit of $23.2 billion on revenue of $80 billion, according to FactSet. It currently boasts 2.7 billion users worldwide, up from 1.8 billion at this point in 2016.

Facebook is facing a series of government investigations into its size and market power, including an antitrust probe by the U.S. Federal Trade Commission. An earlier FTC investigation ended with a $5 billion fine but demanded no further changes.

“Their No. 1 priority is growth, not harm reduction,” Kirkpatrick said. “And that probably won’t change.”

Part of the problem: Zuckerberg maintains an iron grip on the company but does not take criticism of himself or his creation seriously, charges social media expert Jennifer Grygiel, a communications professor at Syracuse University. But the public knows what’s going on, she said. “They see COVID misinformation. They see how Donald Trump is exploiting it. They can’t unsee it.”

Facebook insists it takes the challenge of misinformation seriously – especially when it comes to elections.

“Elections have changed since 2016, and so has Facebook,” the company said in a statement laying out its election and voting policies. “We have more people and better technology to protect our platforms, and we’ve improved our content policies and enforcement.”

Grygiel says such comments are par for the course. “This company uses PR in place of an ethical business model,” she said.

Kirkpatrick notes that board members and executives who have pushed back against the CEO – a group that includes the founders of Instagram and WhatsApp – have left the company.

“He’s so certain that Facebook’s overall impact on the world is positive” and that critics don’t give him enough credit for it, Kirkpatrick said of Zuckerberg. As a result, the Facebook CEO is not inclined to take constructive feedback. “He doesn’t have to do anything he doesn’t want to do. He has no oversight,” Kirkpatrick said.

The federal government has so far left Facebook to its own devices, a lack of accountability that has only emboldened the company, according to U.S. Rep. Pramila Jayapal, a Washington Democrat who grilled Zuckerberg during a hearing on Capitol Hill in July.

Warning labels are of limited value if the algorithms underlying the platform are designed to push polarizing material at users, she said. “I think Facebook has done some things to indicate that it understands its role. But it has been, in my opinion, far too little, too late.”
