Earlier this week, President Trump accused Google and Facebook of prioritizing unfavorable coverage about him in their respective news feeds. The accusation comes at a time when both platforms are under increasing pressure to be more transparent about how they decide what content to prioritize.
Last week, several Facebook employees called out a similar discrepancy on the company's internal message board. In a post titled, “We Have a Problem With Political Diversity,” Brian Amerige, a senior Facebook engineer, wrote, “We are a political monoculture that’s intolerant of different views. We claim to welcome all perspectives, but are quick to attack — often in mobs — anyone who presents a view that appears to be in opposition to left-leaning ideology.”
Facebook has at least appeared transparent in explaining its ranking process, particularly after its recent algorithm change, when the platform produced this explainer video:
This explainer doesn't address how paid content gets prioritized above organic content, though, which any social media marketer will tell you is a crucial consideration in their ad spend. Nor does it address the deluge of controversies that have flourished around the legitimacy of news on Facebook, including but not limited to: the aforementioned internal accusation that conservative views are underrepresented; Russian-influenced fake news stories; and the absence of a clearly communicated policy around purveyors of patently false, sensationalist, conspiracy-laden content like Alex Jones (will his banishment from the platform set a precedent?).
Expect to hear more on this next week, when Facebook's chief operating officer, Sheryl Sandberg, is set to testify in a Senate hearing about social media's role in election manipulation. "A team helping Ms. Sandberg get ready for the hearing next Wednesday has warned her that some Republican lawmakers may raise questions about Facebook and biases, according to two people involved in the preparations," writes the New York Times.
Google's search rankings, meanwhile, work as a hybrid of AI and human efforts. Google's AI uses web crawlers that collect data based on relevant keywords (hello, SEO!) and generate page results, in part, based on that data. Google's search also chooses which stories have "authority" according to a set of judgments formed about the site publishing them. While Trump may mistake those judgments for bias, they are in fact compiled by a group of 10,000-plus employees identified as "search quality raters." This team's sole responsibility is to assign authority based on the recommendations of professional societies, accolades like Pulitzer Prizes and the legitimacy of the reporting.
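That hybrid model, crawler-derived keyword relevance weighted by a human-assigned authority rating, can be sketched as a toy scoring function. Everything here (field names, the scoring formula, the sample data) is an illustrative assumption, not Google's actual algorithm:

```python
# Toy sketch of hybrid ranking: keyword relevance from crawled text,
# multiplied by a human-assigned authority rating. All names, weights,
# and data are illustrative assumptions, not Google's real formula.

def relevance(page_text: str, query: str) -> float:
    """Fraction of query terms that appear in the page text."""
    terms = query.lower().split()
    words = set(page_text.lower().split())
    return sum(t in words for t in terms) / len(terms)

def rank(pages: list, query: str) -> list:
    """Sort pages by relevance weighted by rater-assigned authority."""
    return sorted(
        pages,
        key=lambda p: relevance(p["text"], query) * p["authority"],
        reverse=True,
    )

pages = [
    {"url": "a.example", "text": "election news coverage today", "authority": 0.9},
    {"url": "b.example", "text": "election news coverage today", "authority": 0.3},
]
results = rank(pages, "election news")
# Both pages match the query equally, so the rater-assigned
# authority score breaks the tie in favor of a.example.
```

The point of the sketch is the separation of concerns: the relevance term is fully automated, while the authority term is a slow-moving human judgment folded into the machine ranking.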
Ben Spangler, head of SEO on the Performics Practices team at Spark Foundry, says that Facebook's manual updates to its algorithm are necessary, as the platform's tech isn't yet at the point where the AI intuitively understands how to root out bad actors. As many articles have noted this week, the bad actors on Facebook didn't hack a single thing; they simply took advantage of the algorithm's vulnerabilities.
"What they've done over the past few months is manually make updates to discourage the visibility of a lot of these different sites," Spangler says. "I still don't understand how they're going to incorporate that into the AI portion."
Spangler thinks that Facebook's next move will involve incorporating those manual, contextual updates into the next algorithm change. "Google became great at that," he says. "Every time Google's done an algorithm update it's usually revolved around trying to get nonsense like spam sites or link farms, sites that are not there to give great content. Facebook could probably take a page out of Google's book in updating the algorithm to look for contextual clues, which is what we do every day in SEO."
Follow Justin: @Joffaloff