The company launches notifications and an identification format to halt the spread of rumors.
"We want the information shared on Facebook to be true, serious," says Adam Mosseri, vice president of the news feed , the service news feeder, which all users see as they enter their page on the world's largest social network . Facebook has announced that it will launch a filter to detect false rumors, false news and advertising from Friday to alert the user about the low veracity of the content seen and to curb its disclosure . It will initially be available in 14 countries.
He expects the impact to be great: "It will be seen by millions of users." Facebook has more than 1.7 billion active profiles worldwide. Germany was the first country where the new system was tested.
In Menlo Park, the company's headquarters, there is an obsession with doing exactly the opposite of what is expected on Facebook. "Our nature is to share. In these cases we ask that this not be done, that the spread be stopped," says Mosseri by telephone.
The executive admits that fake news is a scourge for his company, though with mitigating factors: he does not think it is anything new, or that all the responsibility is Facebook's. The company focuses on three axes to deactivate such content: eliminating the economic incentives for publishing it, creating new products that curb its spread, and helping society make information-based decisions.
Mosseri believes the first point is key to making it unattractive to fill the network and users' walls with false information. "We realize that the incentive behind these pages is almost always economic rather than ideological. They almost always repeat the same scheme: three or four initial sentences, which is what usually appears in the News Feed summary, and the rest, 80% or 90% of the article, is advertising. When someone clicks on it, the page owner profits," he explains.
In addition to a human team dedicated to verifying facts, special attention will be paid to those who buy ads to give fake news more visibility, and a layer of artificial intelligence will be added to better detect behavioral patterns. He adds that they will not hesitate to suspend and cancel accounts that originate this type of content.
Mosseri insists they will work to identify and reduce this content. "We cannot be judges of truth, but we work with organizations that help us make better decisions and give us context when evaluating."
Involving users in reporting suspicious information is one of the options Facebook wants to encourage, with a button to flag posts as dubious. The social network will then check them, and even if it does not censor a post, from the moment it is deemed unverified it will carry a visible mark, and users will receive a warning before sharing it again.
"We will improve the way we decide what appears on the mural. For example, we note that often when someone reads an article does not share it. And vice versa, often the most shared is not read. We want the fake not to be released without control, even if it is not clicked, "he says, explaining the changes in his algorithm.
Finally, in an effort to provide users with higher-quality information, they want to work more closely with the media. The Facebook Journalism Project was created for this purpose. "We are committed to building better tools, services and products together, so that citizens have better information. We are approaching experts such as the Walter Cronkite School of Journalism in Arizona, in collaboration with the News Literacy Project, a nonprofit initiative that helps users better discern information on Facebook," he says. In addition, the company joined the News Integrity Initiative, a 25-member effort that brings together communications companies as well as institutions and foundations such as Mozilla.