The Hypocrisy of Facebook: Facebook’s Failure to Comply with Its Own Proposal


Section 230 of the Communications Decency Act (“CDA”) immunizes social media platforms, like Facebook, from liability for third-party content.1 In other words, Facebook can’t be held liable for defamatory or offensive content posted by its users. In many ways, this is dangerous to the public, as Facebook has been accused of directly contributing to the January 6, 2021 Capitol riot by relaxing its security safeguards too soon after the 2020 presidential election.2 Evidence indicates that “numerous Facebook groups and accounts, public and private, were used to help organize the protest.”3

Recently, as social media has continued to evolve, debate has grown over Section 230 and the sweeping immunity it provides.4 This twenty-five-year-old law was enacted during a “long-gone age of naïve technological optimism and primitive technological capabilities,” leading many to push for a revision of the protections the law provides to internet platforms.5

Interestingly enough, Facebook proposed its own amendments to Section 230 at a March 25, 2021 hearing before the United States House of Representatives Energy and Commerce Committee.6 This may come as a surprise, given that Facebook continues to benefit from the law’s immunity, seemingly escaping liability despite allowing offensive content to remain on its platform. The proposal has been described as “vague and ill-defined” because it conditions “platforms’ intermediary liability protection” on whether platforms can “demonstrate that they have systems in place for identifying unlawful content and removing it.”7 So, under Facebook’s proposal, if “a particular piece of content evades its detection,” platforms that have adequate systems in place would not be held liable.8 The rationale behind the proposal is that it is “impractical for platforms with billions of posts per day” to detect all offending content.9

This is a standard that few social media platforms can satisfy, as smaller internet service providers lack the resources to build an adequate system for “identifying and removing user-generated content.”10 Thus, the proposal would insulate only a handful of social media giants from liability for offending content that slipped through the cracks.

In light of this, it is worth pointing out certain revelations made by the recent Facebook whistleblower, Frances Haugen, in her testimony before a Senate subcommittee.11 Haugen, a former Facebook employee who served as a data scientist during her time with the company,12 released internal research and documents from the company that “indicat[ed] the company was aware of various problems caused by its apps,” including harm to teens and children.13 Haugen explained that Mark Zuckerberg, CEO of Facebook, “has not shown any readiness to protect the public from the harm his company is causing.”14 She further explained that “[Zuckerberg]… has not demonstrated that he is willing to govern the company at the level that is necessary for public safety.”15 In support of these assertions, the documents she disclosed show that “Facebook AI systems only catch a very tiny minority of offending content.”16 She also explained that the company is “understaffed,” contributing to platform-wide problems that Facebook has continued to struggle to tackle.17

Given the information that Haugen has provided, it seems baffling that Zuckerberg would propose an amendment conditioning Section 230 immunity on platforms having adequate systems in place to moderate offending content. It is clear from Haugen’s testimony that Facebook itself has continuously failed to implement any such system. The company’s unwillingness to ensure public safety on its platform is noteworthy. Facebook is pushing for Section 230 immunity solely for platforms that have systems in place for identifying unlawful content. Yet the platform likely would not even be covered under its own proposal since, as Haugen revealed, it has failed to demonstrate that it has an effective system in place for moderating content. So, under its own proposal, Facebook would essentially be responsible for all offending content posted on its site.

Viewed in isolation, Facebook’s proposal may paint the company in a good light, as a willing participant in the legislative scheme that regulates it. But, as the whistleblower’s testimony reveals, Facebook is far from complying with the very standards it has proposed. If Facebook is to be taken seriously in its proposal, the company needs to reexamine its policies and regulations and take action by implementing adequate systems for moderating content.

Caitlin Muraca is a Second Year Law Student at the Benjamin N. Cardozo School of Law and a Staff Editor for the Cardozo Arts & Entertainment Law Journal. Caitlin is interested in music law. Caitlin is also a member of the Entertainment Law Society and is currently an extern at the Acker Law Group, a music business law firm.

  1. 47 U.S.C. § 230 (1996).
  2. See Mike Isaac, Whistle-Blower to Accuse Facebook of Contributing to Jan. 6 Riot, Memo Says, N.Y. Times (Oct. 2, 2021), https://www.nytimes.com/2021/10/02/technology/whistle-blower-facebook-memo.html [https://perma.cc/V4E7-97HA].
  3. Thomas Brewster, Facebook Gives FBI Private Messages Of Users Discussing Capitol Hill Riot, Forbes (Jan. 21, 2021, 5:59 AM), https://www.forbes.com/sites/thomasbrewster/2021/01/21/facebook-gives-fbi-private-messages-of-users-discussing-capitol-hill-riot/?sh=2d6c037961ed.
  4. See Ashley Johnson & Daniel Castro, Fact-Checking the Critiques of Section 230: What Are the Real Problems?, ITIF (Feb. 22, 2021), https://itif.org/publications/2021/02/22/fact-checking-critiques-section-230-what-are-real-problems [https://perma.cc/7YAF-JJAR].
  5. Michael D. Smith & Marshall Van Alstyne, It’s Time to Update Section 230, Harvard Business Review (Aug. 12, 2021), https://hbr.org/2021/08/its-time-to-update-section-230 [https://perma.cc/XEC2-PBSH].
  6. See Aaron Mackey, Facebook’s Pitch to Congress: Section 230 for Me, But not for Thee, EFF (Mar. 24, 2021), https://www.eff.org/deeplinks/2021/03/facebooks-pitch-congress-section-230-me-not-thee [https://perma.cc/Z3ME-PL2N].
  7. Id.; Hearing Before the U.S. H.R. Comm. on Energy and Com. Subcomms. on Consumer Prot. & Com. and Commc’ns & Tech. 7 (2021) [hereinafter Hearing] (testimony of Mark Zuckerberg, CEO of Facebook).
  8. Hearing, supra note 7, at 7 (testimony of Mark Zuckerberg, CEO of Facebook).
  9. Id.
  10. Mackey, supra note 6.
  11. See Key Things the Facebook Whistleblower Told Congress, CNN Business (Oct. 5, 2021, 6:07 PM), https://www.cnn.com/business/live-news/facebook-senate-hearing-10-05-21/index.html [https://perma.cc/S5PW-T7BD].
  12. See Bobby Allyn, Here Are 4 Key Points From the Facebook Whistleblower’s Testimony on Capitol Hill, NPR (Oct. 5, 2021, 9:30 PM), https://www.npr.org/2021/10/05/1043377310/facebook-whistleblower-frances-haugen-congress [https://perma.cc/6BSF-7FWB].
  13. Samantha Murphy Kelly & Clare Duffy, Facebook Whistleblower Testifies Company ‘Is Operating in the Shadows, Hiding its Research From Public Scrutiny’, CNN Business (Oct. 6, 2021, 8:16 AM), https://www.cnn.com/2021/10/05/tech/facebook-whistleblower-testify/index.html [https://perma.cc/XS2D-DC77].
  14. Dan Milmo, Facebook Boss ‘Not Willing to Protect Public From Harm’, The Guardian (Oct. 23, 2021, 9:02 PM), https://www.theguardian.com/technology/2021/oct/24/facebook-boss-not-willing-to-protect-public-from-harm [https://perma.cc/3PG7-UFRY].
  15. Dan Milmo, Frances Haugen to Testify to MPs About Facebook and Online Harm, The Guardian (Oct. 25, 2021, 1:00 AM), https://www.theguardian.com/technology/2021/oct/25/frances-haugen-to-testify-to-mps-about-facebook-and-online-harm [https://perma.cc/9NBH-TXUK].
  16. Aditi Sangal, Whistleblower: Facebook’s Artificial Intelligence Systems Only Catch “Very Tiny Minority” of Offending Content, CNN Business (Oct. 5, 2021, 11:45 AM), https://www.cnn.com/business/live-news/facebook-senate-hearing-10-05-21/h_f9830a104aa3d2dc7033aea882358538 [https://perma.cc/D8N5-G53C].
  17. Brian Stelter, Facebook Struggles to Tackle Problems Because It’s “Understaffed,” Whistleblower Says, CNN Business (Oct. 5, 2021, 11:57 AM), https://www.cnn.com/business/live-news/facebook-senate-hearing-10-05-21/h_f9830a104aa3d2dc7033aea882358538 [https://perma.cc/D8N5-G53C].