
People Over Platforms Worldwide

60,000 Supporters Call for Greater Transparency in Digital Platforms

A Growing Call for Fair Access and Platform Transparency

Access to digital platforms increasingly shapes how people communicate, earn income, and participate in modern life.


For many individuals and small businesses, social platforms serve as essential tools for work, connection, education, and community building. When access to those platforms is suddenly removed or restricted, the consequences can be significant.

Automation and platform enforcement decisions increasingly shape how people work, communicate, and participate online. As 60,000 supporters join the call for change, concerns about transparency and accountability continue to grow.

Wrongfully disabled accounts increasingly affect how individuals and small businesses access essential digital tools

People Over Platforms Worldwide has been documenting the experiences of individuals who report losing access to accounts without clear explanations or effective appeal pathways.


The organization’s petition calling for stronger transparency and fairer platform processes has now surpassed 60,000 supporters worldwide.


This milestone reflects a growing call for transparency and accountability in how digital platforms manage user access, moderation systems, and appeals processes.


When access disappears, people’s rights can be harmed.



Digital Platforms and Modern Public Life


Large digital platforms now play a central role in everyday activities.

For many users, these platforms are used to:

  • communicate with family and communities

  • operate businesses and reach customers

  • share creative work

  • organize events and advocacy

  • participate in public discussions


Because of this influence, decisions about account access, moderation systems, and algorithmic visibility can have real-world consequences.


As digital platforms continue to evolve, policymakers and researchers are increasingly examining how these systems operate and how users can be better protected.



Growing International Attention to Platform Governance


Throughout 2026, governments and regulators around the world have continued discussing how major technology platforms should be governed.


Policy conversations have focused on issues such as:

  • algorithmic transparency

  • user appeal systems

  • competition in digital markets

  • online safety protections

  • artificial intelligence oversight


Several regions, including the European Union, the United Kingdom, and the United States, are currently examining how existing regulations apply to large digital platforms and whether additional safeguards may be necessary.


“Artificial intelligence rivals will be allowed on WhatsApp for a year, Meta Platforms said, aiming to head off possible action from EU antitrust regulators.” (Source: Reuters)

Artificial Intelligence and the Future of Platforms


Artificial intelligence has become one of the fastest-growing areas of investment across the technology industry.


Major companies are rapidly expanding research and infrastructure related to:

  • machine learning models

  • automated content systems

  • recommendation algorithms

  • AI-powered communication tools


Artificial intelligence is becoming a central focus for technology companies, prompting new discussions about oversight, transparency, and responsible development across digital platforms. (Source: Reuters)


While these technologies offer significant opportunities, researchers and policymakers have also raised questions about transparency, accountability, and user protections as AI becomes more integrated into online platforms.



Why Platform Transparency Matters


As platforms grow larger and more influential, the systems governing them also become more complex.


Many users report that they struggle to understand:

  • why an account was disabled

  • how automated moderation systems work

  • how appeals are reviewed

  • how algorithmic systems determine visibility


These questions have led researchers, policymakers, and advocacy groups to call for clearer communication and stronger procedural safeguards for users.


Transparency helps build trust between platforms and the communities that rely on them.



60,000 Supporters and a Growing Conversation


The petition reaching 60,000 supporters reflects a growing global conversation about digital rights and platform accountability.


For supporters, the issue is not simply about individual accounts.


It is about ensuring that the systems shaping modern communication and commerce operate with fairness, transparency, and responsible oversight.


“Digital platforms have become essential infrastructure for modern life, and the systems governing them must be transparent and accountable.”

Recent Media Coverage Highlights Growing Concerns


Recent investigative reporting has continued to highlight concerns about automated platform enforcement and the challenges users face when accounts are disabled without clear explanations or accessible appeals.


In a March 9, 2026 investigation by CBS Baltimore, several users reported that their Facebook and Instagram accounts were suddenly disabled after automated systems flagged their profiles for potential child exploitation violations.


“It sort of feels like David and Goliath because you just kind of feel like it's impossible,” said one affected user while describing the difficulty of trying to resolve the issue without being able to reach a human representative. (Source: CBS Baltimore)

The report described multiple families who said their accounts were disabled without warning and that they were unable to obtain meaningful responses while attempting to navigate automated appeals systems.


Another user told investigators that when accounts are permanently disabled without explanation, the lack of communication can leave people feeling powerless.

“Something that drastic, there needs to be some way to communicate with a human,” one affected user explained while describing the experience of trying to restore access to their account. (Source: CBS Baltimore)

Advocacy Efforts Draw Attention


The investigation also referenced advocacy efforts aimed at improving transparency and appeal processes when accounts are disabled.


The non-profit People Over Platforms Worldwide launched a petition calling for clearer enforcement policies and more accessible human review when users lose access to their accounts.


“We’re calling for transparency, accessible human support, and a fair appeals process when errors happen,” said People Over Platforms founder Brittany Watson-Smith during the investigation.

“When access disappears without recourse, the impact is serious, and people should be able to contact somebody.” (Source: CBS Baltimore)

As of March 2026, the petition has attracted nearly 60,000 supporters, reflecting growing concern among users about how digital platforms enforce policies and manage appeals.



Oversight and Legal Scrutiny Continue


The issue is emerging alongside broader conversations about digital platform accountability and safety enforcement.


According to reporting referenced in the investigation, the Maryland Attorney General’s Office has received hundreds of complaints related to Facebook and Instagram since 2025, although those complaints involve a range of platform-related issues.


At the same time, ongoing legal proceedings and policy discussions across the United States continue to examine how large technology companies manage automated moderation systems, safety enforcement, and user protections online.



Why These Conversations Matter


For many individuals, creators, and small businesses, access to digital platforms increasingly shapes how people communicate, work, and participate in modern life.


When accounts are disabled without clear explanations or accessible appeals, the consequences can extend far beyond social media access, affecting communication, professional opportunities, and personal connections.


As 60,000 supporters join the growing call for transparency, discussions around digital access, accountability, and fair review processes continue to gain attention among users, policymakers, and media outlets.



Looking Ahead


The conversation around digital rights, platform governance, and user protections is continuing to evolve.


As policymakers, researchers, journalists, and users examine these issues, discussions about accountability, transparency, and fair processes are likely to remain central.

For organizations like People Over Platforms Worldwide, the goal remains the same: to help ensure that people who rely on digital platforms are heard, supported, and treated fairly.


When access disappears, people’s rights are harmed.


Share Your Story With Us Through Our Support Form


Sign Our Petition To Support Fair Digital Access


Help Us Continue The Work By Supporting Our Mission




Editorial Note

People Over Platforms Worldwide is an independent non-profit advocacy organization focused on digital rights, transparency, and user access to essential online services.

The information presented in this article reflects publicly reported developments and ongoing policy discussions about digital platforms and technology governance.


The purpose of this article is to inform readers and contribute to broader discussions about digital rights and platform accountability.


People Over Platforms Worldwide advocates for transparency, fair access, and responsible governance across the digital ecosystem.
