You know when a trend in digital media — or news for that matter — hits an inflection point. It’s when a publication runs a headline that says something like “Why This XYZ Is Different.”
This week’s example? The Drum’s 8 July 2020 opinion column by Samuel Scott, entitled “Why This Facebook Advertising Boycott is Different.” It’s an excellent column. (I do disagree with his point that the COVID pandemic will only temporarily change marketing tactics. That’s a topic for another post.)
Boycotts Are a Start . . .
What is on-point is this: as much as many might want to see government regulation, it’s going to be advertisers that drive the change they want to see in Facebook’s approach to policing intentionally polarizing, often fact-free political content — to say nothing of what many consider outright hate speech.
While a collective push by advertisers to make Facebook — and other social media or content outlets like YouTube — continually police their content may seem fanciful, it’s an example of a market-based approach. Newspapers and television (pre-Web) did this for decades. Their readerships expected these outlets to adhere to a general sense of “community standards.” Did an article help residents understand a particular issue or event, so that the majority had both an individual and a shared understanding of that event or a connected series of events? Did stories — whether covering a single event or the weekly or bi-weekly city council and state assembly sessions — provide enough objective information for residents to make informed decisions?
It’s preposterous to expect any single company, even one with the reach and frequency of human engagement Facebook has, to solve social ills or rein in politicians who trade in divisive, hateful, coercive (including pseudo-scientific) content.
Technology and People
Ad-supported social media should be able to provide, at a minimum, a guarantee to advertisers that their ads will not run against content that fails to meet basic community standards. (Consider Google’s revision of Google Preferred into YouTube Select as one approach to the problem.)
Can advertisers expect Facebook — or any other social media or ad-supported content site that lets consumers share content — to stop all polarizing, factually inaccurate, or hateful content from being posted and shared? No. But Facebook’s enforcement can’t focus exclusively on preventing such content from appearing; it must also refine processes that honor advertiser and user requirements and enable rapid responses to reports of content that runs afoul of basic standards.
Amazing technological advances contributed to our current position. But human behavior, combined with a global pandemic, exacerbated an already unbalanced relationship between social media platforms, social networks, and advertisers. It’s going to take processes and people — not just code — to pull us back to normal.