It’s the evening of November 3rd. Election Day 2020. The polls have closed, and in-person vote totals are being reported, but millions of mail-in ballots, which skew heavily Democratic, won’t be counted for days or weeks. Donald Trump, unsurprisingly, doesn’t care to wait for that to happen. He’s leading the in-person vote in the decisive swing states. He takes to Facebook to declare premature victory and insist that ballots stop being counted.
This hypothetical chain of events has come up a lot recently, as an unprecedented number of Americans prepare to vote by mail. The Democratic data firm Hawkfish calls it the “red mirage”: an apparent Trump landslide on election night, leading to a fight over the millions of outstanding ballots that makes Bush v. Gore look like a tea party. Which raises an important question: How will the social media platforms where so many Americans get their news respond?
On Wednesday morning, we got some answers to that question. In a blog post, Mark Zuckerberg laid out Facebook’s latest election-related policies, including its plan to deal with the possibility that a winner won’t be officially declared on Election Day. The company plans to use its new Voting Information Center “to prepare people for the possibility that it may take a while to get official results.” On Election Day, the information center will include authoritative information from Reuters and the National Election Pool. And if a candidate claims victory prematurely, Zuckerberg says Facebook will “add a label to their post educating that official results are not yet in and directing people to the official results.” (Posts that could trick people out of having their vote counted—or use Covid-19 scaremongering to deter them from voting—will be subject to removal.)
These are good ideas, in theory. The question, as with every Facebook policy announcement, is how well they will be executed. “We’ve already strengthened our enforcement against militias,” Zuckerberg’s blog post notes, less than a week after The Verge reported that Facebook failed to act on multiple user warnings about militia-related events prior to the shooting in Kenosha, Wisconsin, that left two people dead. The new policies leave similar room for uncertainty. Will a false claim of victory by a politician be clearly and decisively debunked? Or will misinformation simply be presented next to a vague link to “Get Voting Information”? The latter is what initially happened with Trump’s strange Wednesday post attempting to retroactively clean up his suggestion that North Carolina Republicans illegally vote twice. Facebook later updated the post with a different label that says, “Voting by mail has a long history of trustworthiness in the US and the same is predicted this year. (Source: Bipartisan Policy Center.)” That’s a shade more helpful—but the change underscores how unpredictable the implementation of these policies can be. The generic label remains on other posts in Trump’s feed, as well as on posts by Joe Biden that discuss election issues.
That disclaimer, meanwhile, links to Facebook’s Voting Information Center, which is at the heart of the company’s ambitious plan to register 4 million new voters and which provides lots of helpful links to things like voter registration, mail ballot applications, and—in a particularly inspired move, given the barriers to in-person voting—ways to volunteer as a poll worker. But will all that authoritative information actually make its way to people’s eyeballs? Facebook has emphasized that the Voting Information Center will appear at the top of people’s News Feeds, but three weeks after its rollout, I still don’t see it in my feed on Facebook’s desktop site. To be fair, it does appear on mobile, which more people use, but in my experience it takes a few seconds to pop up—by which point I’ve already scrolled past it, down to where Facebook’s recommendation algorithm is suggesting new QAnon groups for me to join. (I recently joined a few for research purposes.)