Meta is Why the Platform Accountability and Transparency Act Should Become Law

By Darin Stewart | January 06, 2022

Last month, a small bipartisan group of senators introduced the Platform Accountability and Transparency Act (PATA) to compel social media giants to share data with researchers. It should become law as soon as possible, though I have my doubts it will.

PATA would establish an office within the FTC to facilitate researchers' access to platform data while ensuring privacy and security standards are met. University-affiliated researchers would submit study proposals to the National Science Foundation for review. If approved, relevant platforms would be required to make the requested data available to the researchers. In most cases, data would be anonymized and encrypted, but in some situations researchers would be required to physically work within secure areas on controlled systems.

PATA would apply only to major social media platforms, defined as those with at least 25 million unique monthly visitors. As you might expect, these companies, including Meta, Google, TikTok and others, are not excited about the prospect of having their "black boxes" forced open for the world to see. While they have not yet objected to the legislation forcefully, at least not publicly, they argue that it is unnecessary. These platforms are quick to point out that they already provide vast amounts of data about their platforms' inner workings. These claims are disingenuous at best.

Social media platforms feign a commitment to transparency while fighting to maintain opacity. Mark Zuckerberg boasted that Facebook "established an industry-leading standard for transparency and reporting." At roughly the same time, his company dismantled its content-tracking product CrowdTangle because, according to Facebook's Vice President of Global Affairs, it was exposing "the wrong narrative." Instead, as technology journalist Kevin Roose reports, Facebook prefers to "selectively disclose data in the form of carefully curated reports, rather than handing outsiders the tools to discover it themselves."

These "carefully curated reports" are woefully inadequate for researchers wishing to understand how social media impacts our society. That is by design. For example, Facebook maintains an archive of political ads that it makes available to researchers and journalists. Researchers at N.Y.U.'s Tandon School of Engineering discovered that the archive is missing over 100,000 ads. The ads that are available in the archive paint a picture of Facebook as a much fairer and more balanced environment than it actually is. Once the N.Y.U. researchers made their findings public, Facebook disabled their accounts.

Meta is obsessed with managing public perception of its products rather than addressing problems caused by its platform. Brian Boland, a former vice president in charge of partnerships strategy and an advocate for more transparency, resigned from Facebook in November over this issue. "One of the main reasons that I left Facebook is that the most senior leadership in the company does not want to invest in understanding the impact of its core products," he told the New York Times. "And it doesn't want to make the data available for others to do the hard work and hold them accountable."

At times Meta is forced to confront its faults, as when internal engineers raised concerns about Instagram's toxicity for teens and about how tweaks to algorithms amplify incendiary and divisive posts. These alarms are routinely discounted, spun and ultimately buried. As the Wall Street Journal's Facebook Files and whistleblower Frances Haugen's testimony demonstrate, transparency is unfailingly subordinated to image management. Meta knows much more about its impact on society than it shares. In this way, the giant social media platforms have taken their place alongside Big Tobacco and Big Oil as organizations that hide the damage they cause by hiding the data they produce. This is what makes passage of the Platform Accountability and Transparency Act essential.

Social media platforms have repeatedly demonstrated that they will release only information that supports their preferred narrative and promotes their desired image. PATA will not eliminate this dynamic, but it will provide a means to rationally and empirically examine what is actually happening in the social media ecosystems that now host so much human interaction. It will enable a data-driven examination of the impact of social media on society. Sadly, this is also the biggest obstacle to PATA becoming law.

Big tech in general and social media in particular are popular punching bags for politicians and pundits. Conservatives believe that their viewpoints are censored and suppressed. Progressives accuse social media of spreading misinformation that foments hatred and hyper-partisanship. With full access to platform data and rigorous, peer-reviewed analysis, these questions could be settled.

As Stanford's Nathaniel Persily, who helped draft the PATA legislation, testified to Congress, transparency legislation would allow researchers "other than those tied to the profit-maximizing mission of the firms, to get access to the data that will shed light on the most pressing questions related to the effects of social media on society."

Therein lies the rub. Despite numerous studies of the social impact of social media platforms and exposés of their inner workings, their conclusions are largely circumstantial and inferential rather than empirical and definitive. That means any useful claim can be made without fear of conclusive rebuttal. Politicians will be reluctant to let go of this fund-raising cash cow and pass PATA into law. Providing unbiased, data-derived, definitive answers to thorny questions is not in the best interests of the social media giants or the politicians charged with regulating them. It is in the best interests of society.
