Nepal, Disinformation, and AI: Why This Moment Demands Governance, Not Just Awareness
The rapid evolution of artificial intelligence and digital technology has brought Nepal to a critical juncture—one where we can no longer remain passive observers of a global revolution. As we navigate this complex digital landscape, the challenge is not merely to adopt new tools, but to architect a future that remains unmistakably and beautifully human. This transition requires more than technical skill; it demands a moral compass and a governance architecture that ensures innovation serves as a bridge to progress rather than a catalyst for inequity. To secure a prosperous 'New Nepal,' we must now take up the responsibility to lead with vision, ethics, and a commitment to inclusive growth. As outlined in the Digital Nepal Framework 2.0 (2025) and the newly approved National AI Policy 2082 (2025), the state is beginning to recognize that "awareness" alone cannot buffer against the scale of AI-driven disinformation.
The approval of the National AI Policy 2082 (2025) marks a historic milestone. However, the successful implementation of the Digital Nepal Framework (DNF) 2.0 depends on moving beyond regulatory "documentation" of harm to a proactive governance architecture—one capable of carrying Nepal toward its stated goal of ranking in the top 50 of the Global AI Readiness Index.
For over 40 years, I have advocated granting Institutes of National Importance (INI) status to Nepal’s premier institutions, including IOE, IOM, and NAST. This is not symbolic reform. Globally, INI status has proven that institutional autonomy, stable funding, and insulation from political interference transform academic institutions into engines of national development. Nepal must act now: grant INI status to these institutions through a Special Act and encourage public-private partnerships for innovation.
Nepal is facing a critical moment. Artificial intelligence and advanced technologies are evolving faster than our laws, institutions, and public understanding. Waiting until harm occurs is not protection—it is merely documentation. Yet too many decision makers at the policy level remain unaware of the magnitude of the damage already under way. Nepal stands at a crossroads where the digital and the cultural converge: the Digital Nepal Framework provides the map, but our values provide the compass. Ultimately, our future depends not on the technology we adopt, but on the values we embed within it. By leveraging the DNF to transition from passive witnesses to proactive architects of our own digital landscape, we can ensure that innovation serves as a bridge to progress rather than a barrier to equity, and that the "New Nepal" is as technologically advanced as it is ethically grounded. Innovation must proceed at the speed of wisdom, guided by a national roadmap that prioritizes people over platforms.
I’ve written before that waiting for harm before regulating powerful technologies is not protection—it is documentation. Nowhere is this clearer than in Nepal’s rapidly deteriorating information ecosystem. For many Nepalis today, “truth” arrives not through deliberation or verification, but through speed—via Facebook posts, TikTok videos, forwarded voice notes, and increasingly, content generated or amplified by artificial intelligence. This is not a neutral technological shift. It is a governance failure unfolding in real time. Across Asia, disinformation has reached crisis levels. In Nepal, we don’t need studies to confirm this. We have already experienced it during elections, public health emergencies, migration crises, and natural disasters.
Private Sector Partnership
In Nepal’s resource-constrained context, public institutions cannot innovate in isolation. Industry collaboration is essential: joint research labs and innovation hubs, structured internship and faculty exchange programs, and pipelines for technology transfer and commercialization. This ensures that research is relevant, employable, and globally competitive while anchoring innovation within Nepal’s economy.
Elections: When AI Exploits Institutional Gaps
Nepal’s elections have become increasingly vulnerable to misinformation. False claims about candidates, recycled images presented as breaking news, and rumors undermining trust in the Election Commission spread rapidly online.
What concerns me now is not just misinformation—but AI-enabled disinformation. Deepfake audio or video in fluent Nepali or local languages could fabricate speeches, incite communal tensions, or falsely signal electoral manipulation. In a system where institutional trust is already fragile, such tools could cause damage long before fact-checkers or regulators respond. This is precisely why I have argued for anticipatory governance. Once harm becomes visible, it is already too late.
Public Health: The Cost of Reactive Policy
During COVID-19, Nepal witnessed how health misinformation erodes trust and costs lives. Vaccine myths, conspiracy theories, and false cures circulated widely, particularly where institutional communication was weakest. Generative AI now makes health misinformation more convincing and scalable. Fake experts, fabricated studies, and emotionally tailored narratives can spread faster than public health responses. In earlier writing, I’ve emphasized that safety emerges when institutions are empowered before crises. Public broadcasters, trusted radio, and ethical journalism are not optional services—they are public safety infrastructure.
Migration: When AI Amplifies Exploitation
Labor migration sustains Nepal’s economy, yet migrants remain among the most exploited groups in the digital ecosystem. Fake recruitment ads, AI-generated testimonials, and misleading visa guarantees prey on hope and desperation. This is not just a technology problem; it is an institutional one. Weak oversight, fragmented regulation, and under-resourced journalism create a vacuum that AI-enabled scams easily fill. Governance is about closing these gaps before technology exploits them.
Disasters: Speed Without Authority Is Dangerous
Nepal’s exposure to earthquakes, floods, and landslides makes reliable information essential. Yet disasters consistently trigger rumors, panic, and misinformation—now increasingly enhanced by AI-generated images and videos. I’ve previously argued that societies normalize risk before they understand it. In disasters, this normalization becomes lethal. Legacy media, when trusted and resourced, slows information flows just enough to replace panic with coordination. Speed alone does not save lives. Authority and credibility do.
AI in Newsrooms: Efficiency Is Not Neutral
AI tools are already entering Nepali newsrooms—translating, summarizing, generating content. This is understandable in a resource-constrained environment. But efficiency without ethics reproduces bias at scale. AI systems reflect the inequalities embedded in their training data—gender, caste, ethnicity, language, and geography. Without strong editorial oversight, AI risks amplifying exactly the voices and narratives that governance should protect against. As I have written elsewhere, human control over AI is not a technical detail—it is a democratic requirement.
If Nepal continues to respond only after a crisis—be it an election deepfake or a migration scam—we are merely documenting our failure. By granting autonomy to our top institutions and embedding ethical guardrails today, we ensure that the "New Nepal" is governed by its values, not just its algorithms.
Institutions, Education, and the Governance Gap
Regulation alone will not solve this. Nepal needs institutional capacity—independent media, universities, public broadcasters, and civil society—that can engage AI critically and ethically. Media and AI literacy must become core civic skills, not optional add-ons. Legacy media can and should be used in classrooms and public forums to demonstrate verification, accountability, and ethical judgment in practice. This aligns with my long-standing argument for strengthening institutions before crises force reform upon us.
Organizations such as Rotary, Lions Clubs, NGOs, local government bodies, and other concerned stakeholders must act as the bridge between experts and the community, as outlined in the DNF’s vision for an inclusive digital society. These institutions must raise awareness, promote ethical standards, and connect experts, policymakers, and communities. There is an urgent need to upskill and reskill their human resources, and to launch AI safety and awareness programs nationwide, aligning institutional growth with the United Nations Sustainable Development Goals.
The target of training 5,000 AI professionals must be met by partnering with civil society (Rotary, Lions, NGOs) to upskill the frontline: teachers, health workers, and local government officials. Alongside this, a "Code of Ethics for AI in Journalism" should be developed so that newsrooms using AI for efficiency do not inadvertently reproduce caste or gender biases.
A Closing Reflection
The choice Nepal faces is not whether to adopt AI. That decision has already been made by global markets and technological momentum. The real question is whether we govern AI—or allow it to govern us. Disinformation thrives where institutions are weak, regulation is slow, and public understanding is limited. Legacy media, supported by anticipatory AI governance, remains one of the few tools we have to restore trust, context, and accountability. If Nepal waits for the first major AI-driven information crisis to act, it will already be too late. Governance is not about resisting the future. It is about shaping it—before harm becomes irreversible.
Note:
This article is part of my broader work on AI governance, institutional autonomy, and anticipatory regulation, and aligns closely with the DNF Agenda Nepal—which emphasizes democratic resilience, national interest, and future readiness.
Across my writing, I have argued that waiting for harm before acting is not protection; it is documentation. The risks posed by AI-driven disinformation—to elections, public health, migration, and disaster response—demonstrate why Nepal must move from reactive responses to forward-looking governance. Strengthening legacy media, embedding ethical AI use, and building media and AI literacy are not peripheral concerns; they are central to safeguarding democracy and social cohesion.
Through the lens of the DNF Agenda Nepal, this piece situates disinformation as not merely a technological issue, but a governance challenge—one that requires strong institutions, informed citizens, and policy frameworks that anticipate risk rather than respond to crisis. My hope is that this article contributes to a wider public conversation on how Nepal can shape technology in service of democratic values and the national interest.