Facebook Inc acknowledged on Thursday that it has become a battleground for governments seeking to manipulate public opinion in other countries and outlined new measures it is taking to combat what it calls “information operations” that go well beyond the phenomenon known as fake news.
In a report and summary of response plans published on its website on Thursday, Facebook described well-funded and subtle efforts by nations and other organisations to spread misleading information and falsehoods in pursuit of geopolitical goals.
These initiatives go much further than posting fake news stories; they include amplification – essentially widening the circulation of posts through a variety of means – carried out by government employees or paid professionals, often using fake accounts.
Reuters reviewed an advance copy of the 13-page report, which was written by two veteran security analysts who joined Facebook from cyber-security firms FireEye Inc and Dell SecureWorks, along with Facebook’s chief security officer.
Facebook said its security team would now fight information operations, which it regards as a more complex problem than traditional hackers and scammers, by suspending or deleting false accounts after identifying them with a combination of machine learning and intelligence agency-level analysis.
The new efforts build on the company’s recently expanded campaigns to identify fake news and crack down on automated profile pages that post commercial or political spam. Facebook suspended 30,000 accounts in France ahead of last Sunday’s first-round presidential election.
In addressing the US presidential election as a “case study,” the Facebook team said fake Facebook personas had spread stolen emails and other documents as part of a coordinated effort, which US intelligence agencies have attributed to Russia. Other false personas pushed stories that expanded on that material.
“From there, organic proliferation of the messaging and data through authentic peer groups and networks was inevitable,” Facebook said. It said its data “does not contradict” the US director of national intelligence’s conclusion that Russia was behind efforts to interfere with the US election. The report does not name any other countries.
‘False amplification’
Facebook has faced pressure to clamp down on fake news, and has begun warning about suspected hoax stories. In its latest report, Facebook focused on how it will fight “false amplification” and targeted data collection, carried out through methods such as imposter accounts and password-collection schemes.
Facebook employees said the information operations the company had seen included techniques such as carefully crafted friend requests sent under the appropriated names of real people. If those requests are accepted, the false friends can glean more information about the target.
That information in turn can be used to send convincing web links leading to malicious software or to map the social networks of the targets for further spying.
Facebook said it would go after amplifier accounts based on behavioral analysis that shows signs of inauthenticity, such as sudden bursts of activity or repeated posting of the same material, without regard to the politics of the content.
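The report does not describe Facebook's detection systems in any technical detail. Purely as an illustration of the kind of content-agnostic behavioural signals mentioned above – sudden bursts of activity and repeated posting of the same material – the sketch below flags accounts matching such heuristics. The data layout, thresholds, and function names are hypothetical assumptions, not anything drawn from Facebook's report.

```python
# Illustrative sketch only: flag accounts whose posting behaviour shows
# signs of inauthenticity (sudden bursts of activity, repeated identical posts).
# Data layout, thresholds, and names are hypothetical assumptions.
from collections import Counter
from datetime import datetime, timedelta
from typing import List, NamedTuple, Set


class Post(NamedTuple):
    account_id: str
    text: str
    timestamp: datetime


def flag_suspicious_accounts(posts: List[Post],
                             burst_window: timedelta = timedelta(minutes=10),
                             burst_threshold: int = 20,
                             repeat_threshold: int = 5) -> Set[str]:
    """Return account IDs matching simple behavioural inauthenticity heuristics."""
    flagged: Set[str] = set()
    by_account = {}
    for post in posts:
        by_account.setdefault(post.account_id, []).append(post)

    for account_id, account_posts in by_account.items():
        # Heuristic 1: sudden burst of activity - many posts inside a short window.
        times = sorted(p.timestamp for p in account_posts)
        for i in range(len(times)):
            j = i
            while j < len(times) and times[j] - times[i] <= burst_window:
                j += 1
            if j - i >= burst_threshold:
                flagged.add(account_id)
                break

        # Heuristic 2: repeated posting of the same material, regardless of content.
        text_counts = Counter(p.text for p in account_posts)
        if any(count >= repeat_threshold for count in text_counts.values()):
            flagged.add(account_id)

    return flagged
```

Note that the sketch looks only at posting behaviour, not at what the posts say, mirroring the report's point that enforcement is applied without regard to the politics of the content.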
Facebook said that other amplification techniques it had discovered include coordinated “likes” to boost the prominence of key postings, the creation of groups that camouflage propaganda by including legitimate items, and the spread of inflammatory and racist material.
Most of the false amplification is driven by people with local language skills and a basic knowledge of the relevant political situation, the report said.
Though the goal is often to promote one cause or candidate or to denigrate another, a further objective appears to be sowing distrust and confusion in general, the authors wrote.
In some cases, they said, the same fake accounts engaged with both sides of an issue “with the apparent intent of increasing tensions between supporters.”
Facebook’s new crackdown reflects a striking change in perspective from November, when Chief Executive Mark Zuckerberg dismissed the argument that fake stories on Facebook could have influenced the US presidential election “in any way” as “a pretty crazy idea.”