1. Introduction
In a landmark ruling, the Gauteng High Court in Johannesburg has ordered that Meta, the parent company of WhatsApp and Instagram, permanently deactivate multiple anonymous accounts and channels disseminating pornographic and explicit child sexual abuse material (CSAM) involving South African schoolchildren. The court’s decision, handed down on Monday, 14 July 2025, follows an urgent application by The Digital Law Company (DLC), which sought an interdict to halt the distribution of sexually explicit content on WhatsApp channels and Instagram profiles. As of 17 July 2025, the situation remains unresolved, with Meta facing potential contempt of court proceedings.
2. Background and Court Ruling
On 14 July 2025, the Gauteng High Court in Johannesburg rendered a significant ruling after an expedited application by The Digital Law Company, represented by social media law specialist Emma Sadleir. The court mandated Meta, the parent company of WhatsApp and Instagram, to immediately and permanently shut down more than 30 Instagram handles and WhatsApp channel accounts distributing graphic child pornography involving South African schoolchildren. The order, detailed in sources such as News24 and BusinessLive, required Meta to remove the accounts and provide all information regarding the identity of the creators by midday on 15 July 2025. This deadline was critical, reflecting the urgency of protecting children from online exploitation.
The judgment, delivered by Judge Mudunwazi Makamu, was founded on submissions by advocate Ben Winks, representing The Digital Law Company, that children were being exploited through the dissemination of explicit content. This legal action underscores the court’s role in enforcing platform accountability, as noted in reports from EWN and IOL, which emphasised the scale of the issue: over 1,000 photos and videos shared across the platforms.
3. The Johannesburg High Court Ruling: Unpacking the Interdict
The urgent application for an interdict was lodged on Monday, 14 July 2025, specifically aimed at halting the distribution of sexually explicit material through a particular WhatsApp channel, and Judge Mudunwazi Makamu of the Gauteng High Court delivered a definitive order against Meta the same day. Judge Makamu, who assumed his position in the Gauteng High Court in October 2024, has a distinguished judicial background, having presided over serious criminal cases, including those involving rape and murder. This background underscores the judiciary’s serious approach to offences of this nature.
The court’s order mandated Meta to undertake several critical actions:
- The immediate cessation and shutdown of anonymous Instagram accounts and WhatsApp channels identified as distributing explicit material involving schoolchildren.
- The permanent disabling of the perpetrator’s ability to establish any future accounts on Meta’s platforms.
- The provision of comprehensive identifying information regarding the perpetrator to The Digital Law Company by 15 July 2025.
The ruling unequivocally targeted content characterised as “sexual in nature that involves school children” and “explicit Child Sexual Abuse Material (CSAM)”.
Legal Foundations: The Films and Publications Act and Its Application to Online CSAM
The legal underpinning of this judgment is firmly established within South African jurisprudence, primarily drawing from the Films and Publications Act. This Act provides a clear statutory definition of child pornography as “any image, however created, or any description or representation, that visually depicts a child engaged in sexual activity”. Furthermore, the Act explicitly criminalises the production, distribution, possession, or exposure of children to child pornography, prescribing severe penalties that include imprisonment and substantial fines. This robust legal framework is further reinforced by the provisions of the Cybercrimes Act and the Films and Publications Amendment Act, which collectively aim to comprehensively address digital offences and online content regulation.
Jurisdictional Dispute
A jurisdictional dispute also arose: Meta indicated that the court order needed to be formally served at its headquarters in California, despite the applicants having dealt with its South African office during the application. As a result, the offending channels remained active, highlighting a practical challenge in enforcing the ruling internationally.
4. The Role of AI in Combating Illicit Content
AI could play a pivotal role in preventing the distribution of CSAM on platforms like WhatsApp and Instagram. Advanced AI algorithms, capable of detecting and flagging explicit content in real time, could proactively identify and block illegal material before it reaches large audiences. The Digital Law Company underscored the urgency of leveraging such technology, noting that the perpetrator(s) were creating new accounts “every few minutes” to continue their “terror campaign”, and stressed that Meta, as the platform operator, has the technological capacity to implement AI-driven solutions to disable offending accounts and prevent further uploads.
The use of AI in content moderation has been a growing topic in tech litigation globally. While Meta has faced scrutiny in other jurisdictions for its handling of copyrighted material in AI training models, the Johannesburg case underscores the need for AI to address immediate threats like CSAM. Meta’s failure to deploy robust AI tools to monitor and disable offending accounts exacerbates the harm caused to vulnerable children.
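To make the detection approach concrete: one widely used building block (alongside AI classifiers) is matching uploads against databases of hashes of previously identified abusive material. The sketch below is a minimal, hypothetical illustration of that idea in Python; the sample data and the `should_block` helper are invented for demonstration, and real systems use perceptual hashes (such as PhotoDNA or PDQ) that survive re-encoding, rather than the exact cryptographic match shown here.

```python
import hashlib

# Hypothetical stand-in for a vetted industry hash list of known
# abusive material. Real lists are maintained by trusted bodies and
# use perceptual hashes; SHA-256 here only illustrates the lookup.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-abusive-sample-1").hexdigest(),
    hashlib.sha256(b"known-abusive-sample-2").hexdigest(),
}

def should_block(upload_bytes: bytes) -> bool:
    """Return True if the upload matches a known-bad hash and must be blocked."""
    digest = hashlib.sha256(upload_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# A benign upload passes; a byte-identical copy of flagged material is caught.
print(should_block(b"holiday-photo"))           # False
print(should_block(b"known-abusive-sample-1"))  # True
```

The limitation of exact hashing is precisely why the “arms race” framing matters: trivially re-encoded or AI-generated material defeats it, which is where classifier-based detection of the kind the court expects Meta to deploy becomes necessary.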
5. Government and Public Response
The South African Department of Communications and Digital Technologies, along with the Film and Publication Board, welcomed the court’s ruling on 16 July 2025. Deputy Minister Mondli Gungubele emphasised the psychological impact on victims and the need for swift justice, stating, “With the development and expansion of digital technologies, the crime of online child exploitation and abuse has grown exponentially and has become the most insidious form of global, modern and borderless cybercrime.”
Public reaction, as reflected in media reports, has been one of concern and support for stronger measures to protect children online. Emma Sadleir’s statement that Meta “has blood on its hands” captured the severity of the issue and the perceived responsibility of tech giants in moderating content.
Sadleir urged parents to monitor their children’s device use closely, particularly at night, noting that much of the harmful activity occurs during these hours. She also called for Meta to take decisive action to identify and permanently disable the perpetrator(s), stating, “Meta is the only one who can stop him.” The case has ignited extensive discourse regarding the obligation of technology conglomerates to safeguard users, particularly minors, from damaging content.
6. Broader Implications and Global Context
The Johannesburg High Court ruling, while significant in its own right, is not an isolated incident but rather a component of an intensifying pattern of legal and regulatory scrutiny confronting Meta Platforms Inc. across the Global South. In Kenya, for instance, Meta has faced substantial legal challenges, including a landmark High Court ruling in April 2025 affirming its jurisdiction over a case alleging Meta’s role in promoting content that contributed to ethnic violence in Ethiopia. This Kenyan court’s assertion of jurisdiction, despite Meta’s attempts to restrict claims to U.S. courts, marks a pivotal moment in holding major tech companies accountable in developing nations. Furthermore, Meta has been embroiled in lawsuits in Kenya concerning the working conditions and alleged unfair dismissal of its content moderators, with Amnesty International noting that these cases represent the “first time that Meta Platforms Inc will be significantly subjected to a court of law in the global south”. These legal battles highlight critical issues surrounding labour rights, mental health support for moderators, and the outsourcing model of content moderation.
Adding to this trend, the Nigerian Competition Tribunal in April 2025 upheld a substantial $220 million fine against Meta and WhatsApp. The fine was levied after an investigation by the Federal Competition and Consumer Protection Commission (FCCPC) into claims of discriminatory and exploitative practices against Nigerian consumers, with the Tribunal confirming that Meta’s privacy policies contravened Nigerian law. These cases collectively demonstrate a growing assertiveness from national judiciaries and regulatory bodies in African nations. They are increasingly compelling technology giants to adhere to local legal frameworks and human rights standards, rather than allowing them to rely solely on their internal terms of service or on arguments for jurisdiction in their home countries. This represents a notable shift in global digital governance, where countries in Africa are progressively asserting their sovereignty over online content and platform conduct, potentially establishing precedents that could influence other developing nations. This contrasts with Meta’s legal victory in the U.S. in May 2025, where it won $168 million in damages from Israeli cyberintelligence firm NSO Group over a WhatsApp spyware attack, a case focused on user privacy against external threats rather than platform liability for content.
7. The Evolving Landscape of Digital Content Regulation
The cumulative effect of these rulings, particularly the Johannesburg judgment, signals a significant evolution in the global landscape of digital content regulation. The era of tech companies operating with minimal oversight, largely relying on self-regulation and broad terms of service, appears to be drawing to a close. Governments, particularly in the Global South, are demonstrating a clear resolve to intervene actively to protect their citizens and enforce national laws in the digital sphere. This development suggests a future where platforms will face increasing pressure to adapt their global operations to diverse local legal and ethical standards, rather than dictating terms from a centralised authority.
The success of the Johannesburg High Court ruling also underscores the critical and increasingly effective role played by local civil society organisations, such as Women For Change, and specialised legal entities like The Digital Law Company. These local actors are instrumental in advocating for and achieving accountability from global technology platforms. They possess the unique ability to navigate complex national legal frameworks, mobilise public pressure, and directly engage with the judiciary to enforce local laws and safeguard vulnerable populations. Their active involvement demonstrates a powerful and effective model for grassroots digital rights advocacy and legal action, often filling gaps where international regulatory bodies might lack direct enforcement power.
Conclusion
The Johannesburg High Court’s ruling against Meta Platforms Inc. marks a pivotal moment in the global effort to combat online Child Sexual Abuse Material and redefine platform accountability. The immediate interdict, mandating content removal, perpetrator disablement, and information disclosure to The Digital Law Company, establishes a robust precedent for judicial intervention in digital content governance in South Africa. This action reflects a clear assertion of national digital sovereignty, moving beyond traditional self-regulatory models and demanding direct adherence to local laws, particularly the Films and Publications Act.
The case vividly illustrates the complex and often paradoxical role of Artificial Intelligence in content moderation. While Meta leverages AI for proactive detection and claims high success rates in identifying illicit material, the proliferation of sophisticated AI-generated CSAM presents an escalating challenge, creating a technological arms race between detection and generation. This is further complicated by the inherent tension between robust end-to-end encryption, vital for user privacy, and the desire for comprehensive content scanning to identify harmful material.
Furthermore, this ruling is indicative of a broader, intensifying trend of legal and regulatory scrutiny faced by Meta and other major tech platforms across the Global South. Concurrent legal battles in Kenya concerning human rights and labour conditions for content moderators, alongside significant fines in Nigeria for competition and privacy violations, collectively signal a global shift. National judiciaries and regulators are increasingly assertive in enforcing local laws and human rights standards, challenging platforms’ traditional arguments for extraterritorial jurisdiction. This evolving landscape necessitates a fundamental re-evaluation by global tech companies of their operational frameworks and content governance strategies to align with the diverse legal and ethical expectations of the jurisdictions in which they operate. The collective efforts of civil society and local legal entities are proving instrumental in driving this accountability.
This case marks a critical moment in the fight against online child exploitation, with the Johannesburg High Court setting a precedent for platform accountability. As of 17 July 2025, the outcome remains pending, with a contempt hearing scheduled for 18 July 2025 that is likely to shape future legal action.
By: Natascha Miller, LLB, BA [Forensics]