When Connection Becomes Control: Facebook's Dark Side in Việt Nam
In an increasingly digitized world, social media platforms like Facebook often present themselves as champions of open communication and connectivity. However, a recent in-depth report, "Community-led Assessment of Rights in the Tech Industry: Facebook’s Operations in Vietnam," published by Legal Initiatives for Vietnam (LIV) on May 31, 2025, paints a far less optimistic picture of Facebook's presence in the Southeast Asian nation.
The report reveals a deeply concerning pattern of complicity, suggesting that Facebook is not just navigating Việt Nam’s authoritarian landscape but actively aiding the government in stifling dissent and controlling information.
For many in Việt Nam, the internet and social media have become an essential part of their daily lives. With a staggering 73.3% of the population engaged in various social media platforms as of early 2024, Facebook has become a critical space for civic engagement, independent media, and human rights advocacy in a country where physical spaces for such activities are highly suppressed.
Yet, it is precisely this pervasive influence that makes Facebook’s acquiescence to governmental demands so potent and, frankly, disturbing.
LIV, a Taipei-based nonprofit dedicated to promoting human rights, democracy, and rule of law in Việt Nam, undertook this comprehensive Human Rights Impact Assessment (HRIA) to provide critical feedback to Meta/Facebook on its real-world impact on vulnerable user groups. Their findings should serve as a wake-up call for anyone who believes in the principles of free expression and human rights.
Censorship, Moderation, and Algorithmic Bias
The report meticulously details three primary areas where Facebook's operations in Việt Nam undermine human rights: government censorship, Facebook’s own content moderation policies, and its content promotion algorithms.
Regarding government censorship, Việt Nam operates under strict laws, such as the Cybersecurity Law, that give the government the power to demand the removal of "toxic" content—a broad term that often encompasses criticism of the state—within 24 hours.
The report gives Facebook a Salience Score of 70/100 and a Company Risk Score of 50/100 for this aspect. While any company operating in an authoritarian state faces immense pressure, the report questions the extent of Facebook's resistance, or rather, lack thereof.
The report recommends that Facebook challenge these unlawful state requests by demanding a three-part test—detailed information, a legal basis, and court approval—and by advocating against user identity-verification requirements. That such a path exists, and that Facebook, according to the report, has chosen not to pursue it, suggests a willingness to comply with rather than challenge the state's overreach.
The report also scrutinizes Facebook's content moderation policies. Here, the findings are even more damning, earning a Salience Score of 100/100 and a Company Risk Score of 67/100.
The report highlights a problematic dual standard: both under-enforcement and over-enforcement. On the one hand, dangerous and fraudulent content, including instances related to human trafficking, persists due to insufficient moderation. This directly harms users, leading to "health and safety violations, including human trafficking, torture, and psychological distress."
The report chillingly notes that victims have died due to abuse facilitated by content that Facebook failed to remove, highlighting the severe, irreversible consequences of its inaction.
On the other hand, the report accuses Facebook of over-enforcement, meaning that legitimate content—particularly from human rights defenders, independent journalists, and civil society groups—is wrongly restricted or removed.
This effectively silences criticism and shrinks the country's already limited civic space. The issue is amplified by a lack of transparency and ineffective appeal mechanisms that leave affected users with little recourse. This suggests a systemic failure within Facebook's moderation framework that disproportionately impacts those speaking truth to power.
Finally, and perhaps most disturbingly, the report examines the impact of Facebook's algorithms in promoting content and advertising. This area received a Salience Score of 70/100 and a remarkably high Company Risk Score of 83/100.
The report asserts that Facebook actively prioritizes "government propaganda and sensational content while limiting political and civic content." This algorithmic bias affects over 70% of the Vietnamese population and leads to "relationship damage for regular users" and "severe discrimination, mistreatment, and persecution for vulnerable groups, with potential for torture or imprisonment."
It also notes that Facebook itself admits to restricting political content while promoting government propaganda, essentially acknowledging its role in shaping public discourse in favor of the authoritarian regime. Remediation in such severe cases of discrimination and persecution is described as hardly possible, underscoring the long-term harm caused by this algorithmic manipulation.
A Convenient Partnership
In an authoritarian state like Việt Nam, where the physical space for dissent is suppressed, social media platforms become crucial battlegrounds for information and human rights.
Facebook, with its immense reach and influence, has the potential to stand as a bulwark against state censorship. Instead, the report strongly suggests that it has become a willing, or at least highly compliant, partner in the Vietnamese government's efforts to control its citizens and public discourse.
The report details how Việt Nam’s legal framework, including laws requiring tech companies to store user data domestically and remove content within 24 hours, creates an environment ripe for complicity.
While Facebook claims to have human rights policies, the reality experienced by users tells a different story. The consistent issues of wrongful moderation, over/under-enforcement, promotion of harmful content, and limited visibility for nonprofit organizations' pages on the platform highlight a significant gap between policy and practice.
Recommendations and the Path Forward
Accompanying these findings, LIV offers a comprehensive set of recommendations for Meta/Facebook to improve its human rights practices and resist unlawful government demands. These recommendations fall into two main categories: demanding government accountability and improving internal company practices.
For demanding government accountability, LIV suggests that Meta/Facebook should:
- Require government requests for content removal to be specific, legally justified, and have court approval.
- Advocate for the abolition of user identity-verification requirements, which can be used to unmask dissidents.
For improving its own operations, Meta/Facebook is advised to:
- Inform users about their rights and appeal processes, and publicly disclose government content removal requests.
- Collaborate more effectively with local civil society organizations.
- Label state-controlled media pages, improve detection of inauthentic behavior, and expand its network of "trusted flaggers."
- Increase transparency on visibility limits for content and cease reducing the visibility of legitimate nonprofit content.
- Strengthen appeal processes, ensure human review of moderation decisions, and avoid over-censorship.
- Train its AI systems in Vietnamese to better understand local nuances and context.
- Provide users with an opt-out from algorithmic curation to empower them with more control over their feed.
- Establish an independent board to review content-moderation decisions.
These recommendations are not radical; they are standard practices for upholding human rights online. That they need to be explicitly articulated to a company of Facebook's stature in a "community-led" assessment reveals how far the company's current practice falls short.
The Uncomfortable Truth
The LIV report serves as a stark reminder that in the complex geopolitical landscape of authoritarian states, technology companies face a crucial choice: to uphold the values of open communication and user freedom, or to become enablers of state repression. Facebook's significant presence and influence in Việt Nam indicate that it is not merely a passive bystander.
The findings of this report present a stark choice: if Facebook truly wishes to live up to its claims of connecting the world and fostering a global community, it must urgently address the glaring human rights violations identified in the report. Otherwise, its operations in Việt Nam will continue to be viewed not as a bridge to connectivity, but as a tool of suppression, silencing dissent and eroding fundamental freedoms.
This report does not just urge Facebook to do better; it challenges the very premise of its role in an unfree and authoritarian society. And for those who believe in a truly open internet, this is an uncomfortable truth that cannot be ignored.
—
LIV’s full report can be accessed here.