A long-running dispute between Google and Ulrich Richter is nearing a decision at Mexico’s Supreme Court. On the surface, the case is about an offensive blog and a claim for moral damages. In practice, it asks a larger question: should a platform be punished for leaving up user content before a court decides whether that content is illegal? The answer could affect how search engines, hosting services, and social platforms respond to complaints, and how easily speech disappears from the internet in Mexico.
The court, known in Mexico as the SCJN, agreed to review a conflict that began when Ulrich Richter sought the removal of content published on Blogger, a Google-owned service. Lower courts ruled against Google after it declined to take the material down before a judge had decided whether it was lawful. That turned a personal dispute into a broader fight over who decides what stays online.
At the center is a simple but important question. Can a platform be punished for leaving up user content after a complaint, even without a prior court order? If the answer is yes, critics say companies will have a strong reason to remove contested speech first and review it later. That is why the case matters beyond a single plaintiff and a single blog. It could reshape how online complaints are handled across Mexico.
Why the case reached the Supreme Court
In Mexico, an amparo is a legal tool used to challenge alleged constitutional violations. The matter now before the court is Direct Amparo 8/2023. The Supreme Court took the case because it saw broader constitutional issues, not just a private damages fight. The justices are weighing how the law should distinguish among the author of speech, the service that hosts it, and the tools that help users find it. They are also examining whether a platform can be compelled to act like a private judge when someone demands a takedown.
The lawsuit itself is a civil claim for moral damages, a Mexican legal concept tied to harm to honor, reputation, or dignity. The lower rulings did not stop with the alleged author. They also treated Google as partly responsible for keeping the blog available. Google argues that the approach conflates the roles of a platform and a publisher. That distinction matters because platforms host huge volumes of content they did not write.
How the dispute started
The conflict dates to 2014, when a blog appeared on Blogger with serious accusations against Richter and his family. The posts linked him to criminal conduct and other wrongdoing. Richter sought removal and later pursued damages in civil court. The case was admitted in 2015. A trial court ruled in 2021 against both the blog author and Google. An appeals court upheld that approach in 2022 and ordered the content removed.
That appeal ruling is one reason the case drew national attention. It signaled that a platform could face significant liability for failing to remove content after a private complaint. Google says the damages exposure exceeded 5 billion pesos. Whether or not that figure survives in the end, the message was clear. A company that leaves disputed speech online could face extraordinary risk. That is the pressure critics say can distort online moderation.
Why critics call it private censorship
The phrase “private censorship” does not mean that every takedown is unlawful. Courts can order content removed when it violates the law. The deeper concern is who makes the first call. If a platform must remove speech to avoid heavy liability, the practical decision shifts away from a judge. It moves to a company employee, a complaint form, or an internal policy. That can happen without a full hearing, clear standards, or an effective chance to contest the claim.
Another concern is the legal basis the lower courts relied on. Critics say the ruling treated Google’s internal policies as if they created a binding legal duty to remove the blog. If that logic stands, a platform’s moderation rules could become the basis for civil liability. That would give private corporate policies a much larger role in deciding what can remain online. For free speech advocates, that is a central problem in the case.
The dispute also raises hard questions about public figures and speech on matters of public interest. Google argues that Richter is a public figure and that Mexican free expression doctrine demands a higher threshold before such speech can draw civil punishment. It also says the disputed blog involved satire and commentary, not only factual claims. The Supreme Court may reject some of those arguments. But it still must answer two questions. Where does criticism end and unlawful harm begin? And who should draw that line first?
The shadow of the right to be forgotten
The case is not formally about a right to be forgotten. Still, that debate hangs over it. Mexico’s Supreme Court has already rejected a broad version of that doctrine. The court has said Mexico does not recognize an automatic right to erase lawful information from public view simply because it is old, harmful, or uncomfortable. Critics now warn that a ruling against Google could create a similar result through the back door. Instead of calling it erasure, the system would pressure platforms to remove content before a judge decides it is illegal.
That is why this dispute matters beyond defamation law. It could shape the rules for search engines, blog hosts, social platforms, and other intermediaries. A broad liability rule would not only affect clearly unlawful content. It would also affect borderline content, satire, political criticism, and contested reporting. When the penalties are high enough, platforms often choose safety over speech. Lawful material can disappear for one simple reason: keeping it online is too risky.
What the ruling could change
If the Supreme Court backs the lower courts, people with money, influence, or strong legal teams may gain a more powerful tool. They could pressure platforms to delete material quickly, even before the facts are tested in court. That would not guarantee censorship in every case. But it would change incentives across the internet. Companies tend to respond to legal risk at scale, with broad rules and rapid removals. Smaller publishers and independent speakers often feel that pressure first.
If the court draws a firmer line in Google’s favor, the ruling could still leave room for removals. It would simply say that serious restrictions need stronger legal safeguards. A judge, not a platform, would remain the main decision-maker in close cases. That would not prevent victims of defamation from suing. It would clarify that platform liability cannot be used as a shortcut around due process. For readers in Mexico, including foreigners who rely on digital platforms for news, business, and public debate, that is the real stake in the case.
The coming ruling will not settle every dispute over online harm. But it should answer a question that keeps getting harder in the digital age. When speech is challenged, who decides what disappears first? In Mexico, the Supreme Court now has a chance to set that boundary in a way that will matter far beyond this one case.