After announcing last week that it will prioritize posts from friends and family over brands and publishers, the social media site will also begin emphasizing news from "trusted sources". Facebook says it surveyed a broad, representative sample of its community to shape the change, a measure intended, among other things, to prevent gaming or abuse of the rankings, and that the update is not meant to target any specific groups of publishers based on their size or ideological leanings. "It will only shift the balance of news you see towards sources that are determined to be trusted by the community".
As part of our ongoing quality surveys, we will now ask people whether they're familiar with a news source and, if so, whether they trust that source.
"You could find music; you could find news; you could find information, but you couldn't find and connect with the people that you cared about, which as people is actually the most important thing".
As for how to achieve this goal of high-quality news, Facebook will look for content that is trustworthy, informative, and local. As a piece published in Fast Company notes, limiting users' exposure to information outside of their own group makes them more prone to xenophobia and less open to the views of others. While the combination of these changes will apparently only reduce the share of news in feeds from five percent to four percent, Facebook's stated aim is to curb "sensationalism, misinformation and polarization".
Facebook wants its users to decide which media outlets they trust most. If the answer to the survey question is no, the company takes that as evidence people are trusting too many untrustworthy sources. Publications that do not score highly as trusted by the community may see a decrease in distribution.
The change will affect not only links posted by news outlets but also news stories that individuals share, Facebook said.
HOLY FUCK, MARK. If people cannot tell truth from bullshit, why are those same people being used to rank publications on a scale of trustworthiness?
The social network has long struggled with fake or misleading news on its platform. Is this latest change, letting users rank news sources' trustworthiness, a good approach?