While we won’t know who will form Canada’s next government for nine more days, I’m ready to declare a winner: Facebook.
As the federal election campaign enters its fifth week, there appears to be no substantive evidence of foreign influence through the platform. Facebook has been responsive in removing or down-ranking questionable posts, and its data-sharing practices have allowed researchers and reporters to see what's going on inside the platform for the first time.
It’s an impressive turn of events for a company that has faced a lot of warranted criticism, including in this publication.
You'll recall that in January, the federal government established a task force composed of some of its most senior civil servants to monitor the election for substantial threats and alert party leaders, Elections Canada and the public if it detected any skulduggery.
I asked several officials in Ottawa if they’d detected any suspicious foreign influence in the campaign to date. Those who responded all answered with a resounding “no.”
“Generally speaking, we have seen a largely clean election, with much less of a kind of misrepresented advertiser [or] misinformation kinds of content that we have seen in the U.S. and the U.K.,” Laura Edelson told The Canadian Press this week. Edelson is a researcher with New York University’s Online Political Transparency Project, which has partnered with the Digital Democracy Project to monitor Facebook ads during the campaign. “If disinformation ads come out, my suspicion is they will come out [in] the last week.”
It’s not that there hasn’t been any fake news during the campaign. But when it has appeared—for example, in a Buffalo Chronicle item earlier this week—Facebook has taken measures to reduce its spread.
McGill University analyzed the fake Chronicle item and found that while it reached 24 million people across all social platforms, its reach on Facebook was dramatically lower. This suggests that the company's suppression protocols worked.
In The Logic’s own tests during the campaign and in the weeks leading up to it, Facebook flagged and blocked all ads that violated its political advertising policies, often within minutes of the content being submitted for approval.
Facebook's transparency and advertising disclosures have also helped educate voters about what's taking place on its platform. Countless column inches have been filled with reporting on disinformation, ad spending and awareness, and researchers have been given unprecedented access to Facebook's "firehose"—the hundreds of terabytes tracking all the likes and shares on the platform.
Much remains out of sight for researchers—all advertisements not labelled as political, all private posts, comments and messages, and all private groups—despite Facebook initially promising to provide much of that. But with what Facebook has made available, researchers have been able to provide detailed updates to voters.
All of this transparency has revealed that most of the misinformation on Facebook in this election has come not from foreign actors abusing the platform but from the campaigns and their surrogates. That’s the sad truth underlying election manipulation in this campaign.
To quote the great philosopher Pogo, we have met the enemy and it is not Facebook; it is us.
As The Verge reporter Casey Newton wrote last week, “People seem to be holding Facebook responsible for politicians’ lies when we could be holding the politicians responsible instead.”
I’m not letting Facebook off the hook. It took market and regulatory pressure to force the company to work on a problem that it had long ignored. But its sincerity isn’t really relevant if what it’s doing appears to be working.
When a politician starts a campaign with low expectations and exceeds them, they’re declared a winner. On that premise, Facebook can take a victory lap.