This module is a resource for lecturers


Content-related offences


As the title implies, the cybercrimes included in this section involve illegal content. A prime example of illegal content is child sexual abuse material. The term child sexual abuse material should be used over child pornography because the term child pornography minimizes the seriousness of the offence: what the person is viewing is not sexual activity between a child and an adult, but the sexual abuse of a child. Nevertheless, international, regional, and national laws use the term child pornography instead of child sexual abuse material. Article 9 of the Council of Europe Cybercrime Convention criminalizes offences related to child pornography, which is conceptualized as including visual depictions of "a minor engaged in sexually explicit conduct … [,] a person appearing to be a minor engaged in sexually explicit conduct … [, and/or] realistic images representing a minor engaged in sexually explicit conduct." This conception of child pornography is not universally agreed upon; some states criminalize non-realistic images of child pornography, such as cartoons and drawings (e.g., Brazil, Costa Rica, Dominican Republic, Guatemala, Mexico, Nicaragua, Panama, and Uruguay), while others only criminalize images involving real children (e.g., Argentina, Bolivia, Chile, Colombia, Ecuador, El Salvador, Honduras, Paraguay, Peru, and Venezuela) (ICMEC and UNICEF, 2016).

A person commits an offence under Article 9 of the Council of Europe Cybercrime Convention if the person "intentionally and without right … produc[es] child pornography for the purpose of its distribution through a computer system [,] … offer[s] or mak[es] available child pornography through a computer system [,] … distribut[es] or transmit[s] child pornography through a computer system [,] … procur[es] child pornography through a computer system for oneself or for another person [, and/or] … possesses child pornography in a computer system or on a computer-data storage medium." Article 29(3)(a-d) of the African Union Convention on Cyber Security and Personal Data Protection also proscribes the production, procurement, possession, and facilitation of child pornography.

Matthew Falder, a notorious paedophile (i.e., a person sexually aroused by children) and UK academic with a PhD from Cambridge University, was charged in 2017 with over 137 offences committed against 46 individuals, which included intentionally encouraging rape and sexual activity with a child family member, inciting sexual exploitation of a child, and the possession and distribution of child pornography, among other offences (Dennison, 2018; Vernalls and McMenemy, 2018). He blackmailed his victims into committing humiliating, despicable, degrading and abusive acts against themselves (e.g., self-harming and licking a soiled toilet brush) and against others, and recorded these acts in images and videos (Davies, 2018; Dennison, 2018). He then shared the images and videos on hurtcore sites (like Hurt 2 the Core, now defunct), which specialize in rape, murder, sadism, torture, and paedophilic content (McMenemy, 2018).

Did you know?

Anatomically correct child sex dolls are being sold online. These dolls can be pre-made or made to order, and are primarily shipped from China and Japan.

Want to learn more?

Read: Maras, Marie-Helen and Lauren Renee Shapiro. (2017). Child Sex Dolls and Robots: More Than Just an Uncanny Valley. Journal of Internet Law 21(6), 3-21.

Commercial sexual exploitation of children is a term used to describe a range of activities and crimes that involve the sexual abuse of children for some kind of remuneration of any monetary or non-monetary value (e.g., shelter, food). An example of commercial sexual exploitation of children is live streaming of child sexual abuse, which involves the real-time transmission and broadcasting of child sexual abuse, whereby viewers can be passive or active (i.e., they can watch and/or interact with the victim, or ask for certain acts to be performed by the child alone or for adults to perform certain acts against a child) (UNODC, 2015). These and other forms of commercial sexual exploitation of children, such as child sex trafficking, which involves "inducing, recruiting, harbouring, transporting, providing, or obtaining a child under the age of eighteen for the purposes of commercial sex" (Maras, 2016, p. 310), are explored in greater detail in Cybercrime Module 12 on Interpersonal Cybercrime and the UNODC Teaching Module Series on Trafficking in Persons.

Apart from child sexual abuse material and live streaming of child sexual abuse, other content included in this category is not considered universally illegal. An example is "racist and xenophobic material," which refers to "any written material, any image or any other representation of ideas or theories, which advocates, promotes or incites hatred, discrimination or violence, against any individual or group of individuals, based on race, colour, descent or national or ethnic origin, as well as religion if used as a pretext for any of these factors," proscribed in Article 2(1) of the Additional Protocol to the Council of Europe Convention on Cybercrime, concerning the criminalization of acts of a racist and xenophobic nature committed through computer systems of 2003. This content is also prohibited in regional and international law under, for example, Article 29(3)(e-f) of the African Union Convention on Cyber Security and Personal Data Protection, and Article 20(2) of the International Covenant on Civil and Political Rights of 1966, which prohibits "any advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence" (see Cybercrime Module 3 on Legal Frameworks and Human Rights for further information about the criminalization of this and other forms of expression pursuant to national laws and the lack of protection of these forms of expression under international human rights law).

The publication of false information is also considered a crime in various countries. In Tanzania, Section 16 of the Cybercrimes Act of 2015 prohibits the publication of "information or data presented in a picture, text, symbol or any other form in a computer system knowing that such information or data is false, deceptive, misleading or inaccurate, and with intent to defame, threaten, abuse, insult, or otherwise deceive or mislead the public or counselling commission of an offence." Kenya's Computer Misuse and Cybercrimes Act of 2018 also criminalizes the "knowing … publi[cation of] information that is false in print, broadcast, data or over a computer system, that is calculated or results in panic, chaos, or violence among citizens of the Republic, or which is likely to discredit the reputation of a person" (Section 23). Nevertheless, according to the UNODC 2013 Draft Comprehensive Study on Cybercrime, "countries report varying boundaries to expression, including with respect to defamation, contempt, threats, incitement to hatred, insult to religious feelings, obscene material, and undermining the state" (UNODC, 2013, p. xxi). In some cases, governments' removal of Internet content relating to those forms of expression raised human rights concerns (UNODC, 2013, p. 25; for more information about these and other concerns regarding the restriction of the freedom of expression, see Cybercrime Module 3 on Legal Frameworks and Human Rights).

In 2005, the United Nations Security Council adopted resolution 1624, which (inter alia) calls upon Member States to "adopt such measures as may be necessary and appropriate and in accordance with their obligations under international law to … prohibit by law incitement to commit a terrorist act or acts … and to … prevent such conduct" (UNSCR 1624 (2005)). The measures that Member States might take in achieving this objective include the criminalization of incitement to terrorism.

Other bodies have also called upon States to take steps to address incitement to terrorism within their national legal systems. For example, Article 3 of the Council of the European Union Framework Decision 2008/919/JHA of 28 November 2008 amending Framework Decision 2002/475/JHA on combating terrorism, and Article 5 of the Council of Europe Convention on the Prevention of Terrorism of 2005, oblige the respective Member States of each instrument to criminalize acts or statements constituting incitement to commit acts of terrorism. Moreover, the Council of Europe Convention on the Prevention of Terrorism imposes an obligation on member States to criminalize "public provocation to commit a terrorist offence," as well as both recruitment and training for terrorism (UNODC, 2012, pp. 39-40).

While there is currently no binding universal obligation on States under international law to criminalize the incitement of terrorism, many States do have legal and criminal justice approaches to address such conduct. For example, the United States has used 18 U.S.C. § 373(a), which proscribes solicitation and conspiracy, to successfully prosecute acts of incitement to terrorism (e.g., United States of America v. Emerson Winfield Begolly; UNODC, 2012, pp. 39-41), while the United Kingdom uses Section 1 of the Terrorism Act 2006, which criminalizes the "encouragement of terrorism" as follows:

A person commits an offence if -

  • (a) he publishes a statement to which this section applies or causes another to publish such a statement; and
  • (b) at the time he publishes it or causes it to be published, he -
    • (i) intends members of the public to be directly or indirectly encouraged or otherwise induced by the statement to commit, prepare or instigate acts of terrorism or Convention offences; or
    • (ii) is reckless as to whether members of the public will be directly or indirectly encouraged or otherwise induced by the statement to commit, prepare or instigate such acts or offences.

Authorities in the United Kingdom have also previously successfully prosecuted incitement to terrorism under the Terrorism Act 2000. See the case of Younes Tsouli and others, who were convicted under this Act for inciting terrorism overseas through material posted on online sites and chat rooms they created, administered, and controlled (R v. Tsouli, 2007; UNODC, 2012, para. 114).

Notwithstanding the absence of any universal, legally binding obligation on States under international law to implement measures addressing incitement to terrorism, many States have adopted such measures at the national level. However, several factors continue to present challenges in reaching an internationally agreed approach to the issue, including the absence of a universally agreed definition of terrorism and different national constitutional and legal approaches to fundamental rights, such as the rights to freedom of expression and association, and privacy. Therefore, the adoption and implementation of national approaches that target online content inciting violent acts of terrorism without having a "chilling effect" on the legitimate and lawful expression of particular political or ideological views continues to be a challenge for legislators, law enforcement, and criminal justice institutions of all States (see UNODC, 2012, pp. 39-41).

The conflict between the criminalization of online content and the exercise of certain human rights is further explored in Cybercrime Module 3 on Legal Frameworks and Human Rights as well as Cybercrime Module 10 on Privacy and Data Protection. For more information on the incitement of terrorism see UNODC Counter-Terrorism Legal Training Curriculum Module 2 and Module 4, as well as Modules 2 and 4 of the Teaching Module Series on Counter-Terrorism.
