In response to the escalating risks posed by social networks, particularly to susceptible demographics such as children and vulnerable adults, the Government of Nepal (the “GoN”) has introduced the “Directive for Regulating the Use of Social Networks, 2023 (2080)” (hereinafter referred to as the “Directive”). The Directive aims not only to regulate social network usage but also to establish a robust mechanism that empowers affected individuals to report distressing online activities and ensures the prompt removal of harmful content from the platforms concerned. The Directive has been issued by the GoN by virtue of the power provided under section 79 of the Electronic Transaction Act 2008 (2063).
The key provisions of the Directive have been set out below:
Definition of Social Networks and Social Network Platforms
As per the Directive, Social Networks refer to facilities provided by Social Network Platforms through electronic communication mediums like computers or the internet. This includes services that facilitate interactive communication among individuals, groups, or organizations. This also encompasses the facility to disseminate content created by users, including groups, blogs, apps and other networks.
Similarly, Social Network Platforms refer to platforms operated publicly on the basis of the internet or information technology that enable individuals or organizations to exchange ideas or information and that offer the capacity to disseminate user-created content. These include platforms such as Facebook, TikTok, Twitter, Viber, Pinterest, WhatsApp, Messenger, Instagram, YouTube, LinkedIn, WeChat, etc.
Licensing requirements
The Directive provides that whosoever wishes to operate a social network platform must be listed with the Ministry of Communications and Information Technology (the “Ministry”). Any social network entity operating prior to the commencement of the Directive must be listed with the Ministry within three months of the date of commencement. Listed social networks must update their information every three years.
Platforms dedicated exclusively to civic education and social empowerment are not subject to these listing requirements.
Prohibited Activities for Users
Under the Directive, users are forbidden from creating anonymized or disguised identities (fake IDs) to produce, comment on, or share content. Targeting individuals or groups on the basis of gender, religion, age, social class or similar grounds is not allowed, whether through hate speech (defined as posts, shares, or comments whose content can lead to violence among individuals, groups, or communities, disrupt social harmony, or result in other negative consequences, including any form of expression such as voice, words, pictures, and videos capable of causing such outcomes), through the unauthorized publication of photographs, or through trolling and the publication of memes targeting such individuals or groups. The Directive also prohibits encouraging activities that are illegal, including but not limited to child labor and human trafficking.
Abusive language and hate speech, whether in words, audiovisuals, or images, are strictly banned. Users cannot distort images of individuals using technologies such as animation or montage, nor publish private photos or videos without the requisite permission. The promotion of obscene content, whether in words, pictures, or videos, is also forbidden.
Additionally, the Directive safeguards children against harmful material, including sexual exploitation and abuse. Spreading false or misleading information is not allowed, and neither is any form of cyberbullying. Users are also prohibited from engaging in activities related to narcotics, terrorism, or any actions that violate individual privacy. Hacking, phishing, and impersonation using social networks are also prohibited, as is the publication of obscene content and the advertising or trading of illegal items. Lastly, replicating and sharing any activities that are prohibited by law is equally forbidden.
The Directive imposes further responsibilities on social network users, providing that users shall not inspire or conspire to spread hate and malice on the basis of class, caste, or religion; function in a way that adversely affects Nepal’s sovereignty, geographical integrity, national security, independence, self-respect, or national interest; or act in a way that affects good relations between the different levels of government. It further provides that a user cannot intentionally like, repost, broadcast, tag, mention, comment on, or subscribe to content involving any of the aforementioned acts.
Classification of Social Network Platforms
The Directive distinguishes between social network platforms on the basis of the number of users, classifying those with fewer than 100,000 users as “small” platforms and those with more than 100,000 users as “large” platforms. This classification has significant implications, particularly for the larger platforms. Specifically, large social network platforms are mandated to appoint a resident “point of contact” official and an official to inspect compliance with their self-regulation obligations.
Point of Contact and its responsibilities:
Pursuant to the Directive, all social network operators are required to establish a dedicated ‘point of contact’ within Nepal. This mandate is specifically aimed at facilitating the resolution of complaints and issues relating to the use and management of social network platforms.
The point of contact is responsible for identifying material communicated on social networks that is contrary to the Directive, temporarily or permanently removing such content, reporting such activities to the ‘Social Network Management Unit’ and other concerned authorities, and periodically publishing information regarding the responsible use of social network platforms.
Responsibilities of social network operators:
Under the Directive, social network operators have various responsibilities. These include developing algorithms and other measures to prevent the publication and transmission of information, advertisements, and content that contravene prevailing laws. Operators are required to promptly, within a 24-hour window, identify and assess the legality of content that is subject to complaints, removing any material found to be in violation of the prescribed user conduct guidelines (see the section on ‘Prohibited Activities’ above). Additionally, if content that contradicts the provisions of the Directive has been posted or is about to be posted, operators must, under the instructions of the social network management committee or related body, ensure its removal within a 24-hour timeframe.
Furthermore, social network operators are obliged to establish robust measures for the protection of individual privacy, explicitly prohibiting the unauthorized publication or sharing of personal data. They are also tasked with the development and dissemination of educational and informative content that caters to the safety and interests of social network users, as necessary. Maintaining a comprehensive record of all complaints received, along with the subsequent actions taken, forms a crucial part of these responsibilities.
Operators are also required to actively prevent the circulation of content on social network platforms that could potentially undermine national integrity and independence, among other sensitive matters. The management of social network usage should align with international principles and standards.
Lastly, the Directive provides that any transactions through social network platforms must occur through the banking system.
The Social Network Management Unit
The Social Network Management Unit, under the Ministry of Communications and Information Technology, is tasked with hearing complaints and issues not addressed by the point of contact or by the social network operator itself. Its responsibilities include registering complaints arising from social network use (for example, by verifying screenshots submitted by purported victims), enhancing the capacity of the human resources employed in the unit, and organizing coordination meetings on the systematic use and regulation of social networks with the participation of the relevant authorities. The Unit is also tasked with sending written notice to the point of contact of a social network platform requiring the immediate removal of any material published or disseminated contrary to the Directive.
Consequences of non-compliance
The Directive does not explicitly state the consequences for users of social networks who fail to comply with its provisions. It can be inferred that non-compliance could result in consequences under the Electronic Transaction Act 2063 (2008), as the Directive has been issued by virtue of the power provided under section 79 of that Act.