By Tony Romm
The changes include new requirements for owners of Facebook pages, which must more clearly disclose the organizations that run them and whether they’re tied to a country’s state-owned media, along with more prominent labels on debunked news and tougher rules to prevent voter suppression. The announcements seek to remedy some vulnerabilities that malicious actors have exploited in recent months to spread false or misleading posts, photos and videos.
But the suite of policy and product fixes is unlikely to end the growing dispute over Facebook’s handling of political ads, particularly from President Trump, that contain falsehoods. Facebook’s decision against fact-checking or blocking those ads — which Mark Zuckerberg, the company’s chief executive, defended in an interview with The Washington Post last week — has drawn sharp rebukes from Democratic presidential candidates, who have accused Facebook of profiting from lies.
Facebook’s announcements arrive two days before Zuckerberg is set to appear on Capitol Hill, where lawmakers are likely to press him on the company’s work to safeguard U.S. elections from foreign manipulation. During the 2016 presidential race, Russian agents weaponized the site to spread falsehoods and stoke social and political unrest, aiming to boost Trump and damage Democratic presidential candidate Hillary Clinton.
Zuckerberg last week told The Post the company is in a “much better place now” to stop such disinformation campaigns more than a year before voters head to the polls, citing the company’s investments in staff and artificial intelligence and its successes in other elections, including in Europe. Still, Zuckerberg cautioned the threat is never “going to go away,” pointing to recent disinformation campaigns from countries including Iran and China.
On Monday, Facebook said it had removed four separate networks of accounts — three originating in Iran, and one in Russia — that violated the company’s rules around “inauthentic behavior.” The networks targeted the U.S. and other countries, but Facebook did not provide additional details about its enforcement actions.
Going forward, certain Facebook pages with large U.S. followings must disclose the organizations behind them, part of an effort to crack down on instances in which owners obscure their origins “as a way to make people think that a page is run independently.”
Facebook’s pages feature has long been a source of trouble. An investigation by The Post found that a page known as “Vets for Trump,” which shared conservative memes with more than 100,000 followers, had been hijacked by a North Macedonian businessman. In September, Facebook, under pressure, removed a page called “I Love America” that featured patriotic themes and pro-Trump memes — but was actually run by Ukrainians. Facebook said the page, which boasted more than 1 million followers, violated its policies on spam and fake accounts.
Facebook also tightened its policy around voter suppression, prohibiting ads that suggest voting is useless or meaningless. The company long has banned posts, photos and videos that deceive users about how and when to vote, and it sends some voting-related content to fact-checkers, who can determine, for example, if claims about long lines at polling stations are false. Still, Facebook long has fielded criticism for adopting a narrow view of what constitutes voter suppression.
And Facebook said it would more prominently label content that its fact-checkers have deemed false on its main social-networking app as well as Instagram. It plans to append new labels to pages run by state-owned media outlets, a decision that comes in the wake of a number of enforcement actions by the tech giant against groups in Iran and elsewhere that violated its policies.
Alongside those changes, Facebook said Monday it would expand a program meant to protect candidates and their staffs from security threats. While it isn’t altering its rules around ads, the company does plan to provide more information about candidates’ ad spending. And Facebook announced it would set aside $2 million for new media literacy programs.