Last week, Facebook lost about $120 billion in value. Though the plunge shocked many, the reasons behind it did not—between the platform's Cambridge Analytica scandal and confirmed reports of Russian election interference, users continue jumping ship.
During next week's Facebook for Communicators Boot Camp at PR News' Social Media Summit on Aug. 9, James Nickerson, lead digital marketing instructor at continuing education school General Assembly, will explain why communicators are in a unique position to heal the growing disconnect between brands' trust in the platform and the distrust felt by end users.
Nickerson will also explain why regulation might not be a bad thing for Facebook at this stage in the game—an increasingly popular sentiment, despite the challenges that such a task would present.
Sen. Mark Warner of Virginia, the top Democrat on the Senate Intelligence Committee, has released a policy paper proposing 20 different paths toward regulation. In the paper, published July 30 by Axios, Warner divides his proposals into three categories: combating disinformation, protecting user privacy and promoting competition in the tech space.
Here are three key points from the paper for communicators to be aware of.
Social platforms should be liable for failing to remove slanderous, illegal or otherwise harmful content. Warner writes that the onus is currently on victims to find and report harmful content to platforms "who frequently take months to respond and who are under no obligation thereafter to proactively prevent such content from being re-uploaded in the future."
Last weekend Facebook temporarily banned Alex Jones, founder and administrator of the controversial website Infowars, for uploading content that violated its terms. This raised the question, yet again, of whether Facebook is a platform, a publisher or both. Facebook has publicly defined itself as a platform even while claiming in court that it's a publisher. The company must make that distinction before communicators can know for certain what role they play in vetting the legitimacy of their own content.
A public initiative driving media literacy is needed. Warner writes that recognizing disinformation will ultimately require "an informed and discerning population of citizens who are both alert to the threat but also armed with the critical thinking skills necessary to protect against malicious influence." He proposes a public initiative, bolstered by federal funding but run by state and local educational institutions, to build media literacy from an early age.
This could create a public that's more adept at sniffing out manipulative or hollow content. While it's no revelation that being honest, real and transparent is a best practice in PR, a public with media literacy training would have a much lower tolerance for clickbait. Should this idea gain traction, communicators will need to continue to raise the bar when it comes to the substance of their messaging.
Algorithmic audits would make it harder for social platforms to play favorites. "The federal government could set mandatory standards for algorithms to be auditable," writes Warner, "both so the outputs of algorithms are evaluated for efficacy/fairness as well as for potential hidden bias."
And if those audits were to extend to platforms' financials, Facebook's advertising model, which prioritizes marketing partners with the deepest pockets, could be in danger. Brands with big social advertising budgets on Facebook are safe for now, but charting the progress of this proposal, and the others outlined in Warner's paper, should be a priority for any communicator currently investing in paid social.
Follow Justin: @Joffaloff