Platforms Have Failed Us on Abortion Content. Here's How They Can Fix It.

This is the eighth installment in a blog series documenting EFF's findings from the Stop Censoring Abortion campaign. You can read additional posts here.

In our Stop Censoring Abortion series, we’ve documented the many ways that reproductive rights advocates have faced arbitrary censorship on Meta platforms. Since social media is the primary—and sometimes the only—way that providers, advocates, and communities can safely and effectively share timely and accurate information about abortion, it’s vitally important that platforms take steps to proactively protect this speech. Yet, even though Meta says its moderation policies allow abortion-related speech, its enforcement of those policies tells a different story. Posts are being wrongfully flagged, accounts are disappearing without warning, and important information is being removed without clear justification.

So what explains the gap between Meta’s public commitments and its actions? And how can we push platforms to be better—to, dare we say, #StopCensoringAbortion? After reviewing nearly one hundred submissions and speaking with Meta to clarify its moderation practices, here’s what we’ve learned.

Platforms’ Editorial Freedom to Moderate User Content

First, given the current landscape—with some states trying to criminalize speech about abortion—you may be wondering how much leeway platforms like Facebook and Instagram have to choose their own content moderation policies. In other words, can social media companies proactively commit to stop censoring abortion?

The answer is yes. Social media companies, including Meta, TikTok, and X, have a constitutionally protected First Amendment right to moderate user content however they see fit. They can take down posts, suspend accounts, or suppress content for virtually any reason. The Supreme Court explicitly affirmed this right in 2024 in Moody v. NetChoice, holding that social media platforms, like newspapers, bookstores, and art galleries before them, have the First Amendment right to edit the user speech that they host and deliver to other users on their platforms. The Court also established that the government has a very limited role in dictating what social media platforms must (or must not) publish. This editorial discretion, whether granted to individuals, traditional press, or online platforms, is meant to protect these institutions from government interference and to safeguard the diversity of the public sphere—so that important conversations and movements like this one have the space to flourish.

Meta’s Broken Promises

Unfortunately, Meta is failing to meet even these basic standards. Again and again, its policies say one thing while its actual enforcement says another. Meta has stated its intent to allow conversations about abortion to take place on its platforms. In fact, as we’ve written previously in this series, Meta has publicly insisted that posts with educational content about abortion access should not be censored, even admitting in

Original reporting: Deeplinks
