This module is a resource for lecturers
Online child sexual exploitation and abuse
Online child sexual abuse and online child sexual exploitation involve the use of information and communication technology as a means to sexually abuse and/or sexually exploit children (Interagency Working Group, 2016, pp. 23 and 28). The United Nations Economic and Social Commission for Asia and the Pacific (UN ESCAP) (1999) defines child sexual abuse "as contacts or interactions between a child and an older or more knowledgeable child or adult (stranger, sibling or person in a position of authority such as a parent or caretaker) when the child is being used as an object for the older child's or adult's sexual needs. These contacts or interactions are carried out against the child using force, trickery, bribes, threats or pressure." Child sexual exploitation involves child sexual abuse and/or other sexualized acts using children that involve an exchange of some kind (e.g., affection, food, drugs, and shelter) (UNODC, 2015) (see also Cybercrime Module 2 and TIP Modules 12 and 14). Perpetrators of this crime commit abuse or attempt to abuse "a position of vulnerability, differential power, or trust for sexual purposes" for monetary or other benefit (e.g., sexual gratification) (Interagency Working Group, 2016, p. 25). In practice, it is often difficult to distinguish between child sexual abuse and child sexual exploitation because "there is considerable overlap between them" (Interagency Working Group, 2016, p. 25).
International conventions, such as the Convention on the Rights of the Child of 1989 and the Optional Protocol to the Convention on the Rights of the Child on the sale of children, child prostitution and child pornography of 2000, enumerate children's rights and clarify the obligation of states to protect children from sexual exploitation and sexual abuse. Furthermore, regional conventions, such as the Council of Europe's Convention on the Protection of Children against Sexual Exploitation and Sexual Abuse of 2007 (a.k.a. the Lanzarote Convention), which entered into force on July 1, 2010, seek to prevent child sexual exploitation and abuse, protect victims, prosecute offenders, and promote national and international cooperation in the identification, investigation, prosecution, and prevention of these crimes (Secretariat of the Lanzarote Committee, 2018).
Age of a child
Article 1 of the Convention on the Rights of the Child of 1989 defines a child as "every human being below the age of eighteen years unless under the law applicable to the child, majority is attained earlier." Age thresholds vary by state. These differences may hamper cross-border cooperation in investigations of child sexual exploitation and child sexual abuse (ICMEC, 2018, p. 7).
Types of online child sexual exploitation and abuse
While online child sexual exploitation and child sexual abuse are prohibited by national, regional, and international laws (discussed in Cybercrime Module 2), and represent a serious form of violence against children, the types of crimes considered to be child sexual exploitation and child sexual abuse vary within these legal instruments. Examples of the crimes proscribed by law (to varying degrees) are online grooming, child sexual abuse material/child sexual exploitation material, and live streaming of child sexual abuse.
Child grooming (a.k.a. enticement of children or solicitation of children for sexual purposes) "can be described as a practice by means of which an adult 'befriends' a child (often online, but offline grooming also exists and should not be neglected) with the intention of sexually abusing her/him" (Interagency Working Group, 2016, p. 49). Research and available data show that grooming is predominantly perpetrated by males; to a lesser extent, women solicit children for sexual purposes and/or groom them (Altamura, 2017).
In typical cases, the grooming process proceeds in stages, beginning with victim selection (Winters and Jeglic, 2017). Online, children participate in a variety of social media platforms and communication apps that perpetrators can utilize to gain access to children's accounts. Perpetrators choose a victim based on the victim's "appeal/attractiveness" (determined by the perpetrators' desires), "ease of access" (e.g., based on whether the privacy settings on the websites, platforms, and apps children use are disabled or inadequately set), and/or "vulnerabilities" (e.g., children post about isolation or feeling misunderstood) (Lanning, 2010; Mooney and Ost, 2013; Winters and Jeglic, 2017). After victim selection, the perpetrator contacts the victim to gain access to him or her (Winters and Jeglic, 2017). The perpetrator then seeks to form a friendship with the victim (O'Connell, 2003). The perpetrator can glean information about the victim from online sources and use this information to deceive the victim by, for example, feigning common interests and hobbies and similar family and social situations, in order to relate to the victim, build rapport, and establish trust. The perpetrator's objective is to further develop the friendship into a relationship (O'Connell, 2003; Aitken, Gaskell, and Hodkinson, 2018). Before the sexual exploitation or abuse, the offender assesses the risk of being detected (e.g., asks the victim if parents or others monitor the child's accounts and/or digital devices), communicates the exclusivity of the relationship and the need for secrecy, and isolates the child (O'Connell, 2003; Aitken, Gaskell, and Hodkinson, 2018). However, there may be exceptions to such approaches.
Research has shown that online grooming does not happen through a linear process (Black et al., 2015; Elliot, 2017); it happens through a dynamic process driven by the motivation and capabilities of the offender and the offender's ability to manipulate and control the victim (Aitken, Gaskell, and Hodkinson, 2018). The end goal of online grooming is to sexually exploit or abuse the victim online (e.g., by manipulating or coercing the victim to take a sexually explicit image or video and send it to the perpetrator) or offline (e.g., by meeting with the victim in person to sexually abuse him or her).
In contrast to other international and regional instruments (e.g., the Optional Protocol to the Convention on the Rights of the Child on the Sale of Children, Child Prostitution, and Child Pornography of 2000), the Lanzarote Convention and Directive 2011/92/EU of the European Parliament and of the Council of 13 December 2011 on Combating the Sexual Abuse and Sexual Exploitation of Children and Child Pornography, which replaced Council Framework Decision 2004/68/JHA of 22 December 2003 on Combating the Sexual Exploitation of Children, including Child Pornography (hereafter Directive 2011/92/EU), explicitly criminalize child grooming.
Article 23 of the Lanzarote Convention prohibits the "solicitation of children for sexual purposes," by criminalizing "the intentional proposal, through information and communication technologies, of an adult to meet a child...for the purpose of committing …[sexual abuse or producing child pornography]…, where this proposal has been followed by material acts leading to such a meeting." Like the Lanzarote Convention, Article 6 of Directive 2011/92/EU prohibits the "solicitation of children for sexual purposes." Article 6(1) of the Directive criminalizes "the proposal, by means of information and communication technology, by an adult to meet a child who has not reached the age of sexual consent, for the purpose of committing… [sexual abuse and producing child pornography], where that proposal was followed by material acts leading to such a meeting" (e.g., traveling to an agreed upon meeting place). Article 6(2) of the Directive proscribes "an attempt, by means of information and communication technology, to commit …[child pornography offences]…by an adult soliciting a child who has not reached the age of sexual consent to provide child pornography depicting that child."
The "material acts leading to such a meeting" requirement in both the Lanzarote Convention and Directive 2011/92/EU is problematic as a physical meeting with a child does not need to be arranged or take place for child sexual exploitation and abuse to occur (Interagency Working Group, 2016, p. 50). Realizing this, the Lanzarote Committee (2015) published an opinion and explanatory note concerning Article 23 of the Convention, which stated that "[t]he solicitation of children through information and communication technologies does not necessarily result in a meeting in person. It may remain online and nonetheless cause serious harm to the child" (Interagency Working Group, 2016, p. 50).
Many countries do not have legislation specifically criminalizing online grooming (International Centre for Missing and Exploited Children, 2017, p. 7; this ICMEC publication contains information about the countries that have these laws). For those countries that have national laws criminalizing online grooming, the provisions of these laws vary (International Centre for Missing and Exploited Children, 2017, p. 7). For instance, similarly to the Lanzarote Convention and Directive 2011/92/EU, some countries criminalize online grooming only if there is intent to meet with the child in person (International Centre for Missing and Exploited Children, 2017, p. 14). In the United Kingdom, a perpetrator was charged and convicted pursuant to the Sexual Offences Act of 2003 for meeting a minor after grooming her online via Internet relay chat (IRC) and performing sexual acts on her (R v. Costi, 2006). Other countries, like Argentina, Brazil, Canada, Italy, and Portugal (to name a few), "criminalize online grooming regardless of the intent to meet the child" in person (International Centre for Missing and Exploited Children, 2017, p. 14; for more information about these and other countries that criminalize online grooming, see pp. 39-56).
Child sexual abuse material/exploitation material
The "representation, by whatever means, of a child engaged in real or simulated explicit sexual activities or representation of the sexual parts of a child for primarily sexual purposes" (Article 2, UN Optional Protocol to the Convention on the Rights of the Child on the Sale of Children, Child Prostitution, and Child Pornography of 2000), "as well as the use of a child to create such a representation," is known as child pornography (International Centre for Missing and Exploited Children, 2016, p. vii). Given that what is being depicted in the material is the sexual abuse of a child and not sexual activities, the terms child sexual abuse material or child sexual exploitation material are preferred to remove any connotations that can surround the use of the term pornography (Frangež et al., 2015; Interagency Working Group, 2016, p. 39). For material that depicts child sexual abuse, the term "child sexual abuse material" is used, which is a form of "child sexual exploitation material," and for "all other sexualised material depicting children" the term "child sexual exploitation material" is used (Interagency Working Group, 2016, pp. 39-40). Regional laws relating to the sexual exploitation of children have separated child sexual abuse material from child sexual exploitation (e.g., Directive 2011/92/EU and Article 27 of the African Charter on the Rights and Welfare of the Child of 1990).
The use of the term "child pornography" has been rejected by international organizations, law enforcement agencies, academics, and child protection professionals because it minimizes the serious form of violence against children it represents, it can place blame on the child rather than on the perpetrator of the offense, and it risks conveying that what is occurring is consensual (see Cybercrime Module 2). Despite this rejection, the term "child pornography" features prominently in older legal instruments (e.g., the Convention on the Rights of the Child of 1989; the Optional Protocol to the Convention on the Rights of the Child on the Sale of Children, Child Prostitution and Child Pornography of 2000; the Council of Europe's Convention on Cybercrime of 2001; and the Lanzarote Convention of 2007, to name a few; see Cybercrime Module 2 for further information).
National, regional and international laws vary with respect to their definitions of child sexual abuse material. In certain countries, only the depiction of real children in the material is considered a form of child sexual abuse material (ICMEC and UNICEF, 2016; see Cybercrime Module 2). Specifically, countries vary with respect to whether they proscribe "computer-generated child sexual abuse material," which refers to "the production, through digital media, of child sexual abuse material and other wholly or partly artificially or digitally created sexualised images of children." While this material is prohibited under international, regional, and certain national laws (e.g., Council of Europe's Convention on Cybercrime; Directive 2011/92/EU; Togo Law No. 2007-017 of 6 July 2007; Timor Leste Penal Code; UK Protection of Children Act of 1978; and Brazil Child Protection Statute), this prohibition is by no means universal (International Centre for Missing and Exploited Children, 2016, p. 40; Interagency Working Group, 2016, p. 40). What is more, national laws differ in regard to their criminalization of the possession, production, provision, procurement, and/or distribution or otherwise making available of child sexual abuse material. Many countries criminalize possession of child sexual abuse material only if there is intent to distribute the material (International Centre for Missing and Exploited Children, 2016, p. vi; see this ICMEC publication for more information about these countries and their laws). Many countries also do not have provisions in law that specifically criminalize online child sexual abuse material (International Centre for Missing and Exploited Children, 2016, p. vi; see this ICMEC publication for more information about these countries).
Child sexual exploitation and abuse material is distributed via email, text message, instant messaging, chat rooms, peer-to-peer file sharing networks (e.g., eDonkey, BitTorrent, and Gigatribe), social media platforms, and unencrypted and encrypted communication apps (e.g., Skype, Telegram and WhatsApp) (Maras, 2016; Europol, 2018, p. 32). Child sexual exploitation and abuse material is also traded on password-protected sites, bulletin boards, and forums. A case in point was "Dreamboard". Individuals seeking to join Dreamboard were required to upload an image of child sexual abuse material depicting a minor younger than twelve years old with their application (US Department of Justice, 2012). If the image was accepted as valid, the individual was granted limited access to content on the site and membership could only be maintained if the user continued to upload child sexual abuse material to the site. If the user wanted to obtain greater access to the content, he or she would have to produce child sexual abuse material and upload it to the site, submit child sexual abuse material "that had never been seen before," and/or upload a significant amount of child sexual abuse material (US Department of Justice, 2012). Members were encouraged to use encryption to prevent access to content and to evade detection by law enforcement agencies (US Department of Justice, 2012). While child sexual abuse material can be found on the visible web, Europol (2018) reports that the Darknet (an area of the Deep Web known for the illicit activities that occur within that space; for more information about Darknet and the Deep Web, see Cybercrime Module 5 on Cybercrime Investigation) is increasingly being used to distribute child sexual exploitation and abuse material and more extreme forms of this material (p. 32).
Live streaming of child sexual abuse
Live streaming of child sexual abuse involves the real-time broadcasting of child sexual abuse to viewers in remote locations (see Cybercrime Module 2). While live streaming of child sexual abuse frequently involves transmission across national borders over the internet, it is important to note that some countries have reported instances of domestic live streaming of child sexual abuse (Europol, 2018, p. 35; Promchertchoo, 2018a).
Live streaming of child sexual abuse occurs on online chat rooms, social media platforms, and communication apps (with video chat features) (Europol, 2018, p. 35). Viewers of live streaming child sexual abuse can be passive (i.e., pay to watch) or active by communicating with the child, the sexual abuser, and/or facilitator of the child sexual abuse and requesting specific physical acts (e.g., choking) and/or sexual acts to be performed on and/or performed by the child. Active participation on the part of the viewer is known as child sexual abuse to order, and can occur before or during the live streaming of child sexual abuse (UNODC, 2015; Interagency Working Group, 2016, p. 47). Ian Watkins, a disgraced Lostprophets singer, was convicted of child sexual abuse, among other charges, for encouraging a mother to sexually abuse her child via Skype sessions (The Queen v. Ian Watkins and others, 2013). This case illustrates that live streaming child sexual abuse does not only occur for payment; it can occur to gratify love interests or sexual partners, to satisfy sexual abusers' and viewers' desires, and/or in the context of other abusive relationships (e.g., where the abuser may be responding to the direction of someone they are also abused by).
Live streaming child sexual abuse is not explicitly mentioned in international, regional and national legal instruments. This type of act, however, may be criminalized under the sections of these instruments that prohibit "participation of a child in pornographic performances." Article 2(e) of Directive 2011/92/EU defines "pornographic performance" as "a live exhibition aimed at an audience, including by means of information and communication technology, of…a child engaged in real or simulated sexually explicit conduct…or the sexual organs of a child for primarily sexual purposes." Notably, Article 21(1) of the Lanzarote Convention criminalizes "recruiting a child into participating in pornographic performances or causing a child to participate in such performances; …[the] coercing [of] a child into participating in pornographic performances or profiting from or otherwise exploiting a child for such purposes; …[and] knowingly attending pornographic performances involving the participation of children." In the Philippines, the Anti-Child Pornography Act of 2009 not only criminalizes child sexual abuse material, but also can be used to prosecute those involved in live streaming child sexual abuse by making it unlawful for anyone "[t]o hire, employ, use, persuade, induce or coerce a child to perform in the creation or production of any form of child pornography…[,]… [t]o produce, direct, manufacture or create any form of child pornography…[, and]…[to] publish, offer, transmit, sell, distribute, broadcast, advertise, promote, export or import any form of child pornography" (Article 4).
Those engaging or participating in live streaming child sexual abuse could also be charged with the production of child sexual abuse material or possession of child sexual abuse material if the act is recorded (Interagency Working Group, 2016, p. 46). Software can be used to record the live streaming of child sexual abuse and/or still images can be captured of the child sexual abuse during its broadcast (Internet Watch Foundation, 2018). This recording and/or these still images could be kept by the perpetrators for their collection and/or shared with others. While copies can be made by the facilitators or viewers of live streaming child sexual abuse, often such copies are not available, making the identification of victims and offenders and the prosecution of viewers, sexual abusers and facilitators difficult (Interagency Working Group, 2016, p. 47).
Digital currencies, cryptocurrencies, money transfers, online payment services, deposits to bank accounts, debit cards, and credit cards have been used to pay for live streaming child sexual abuse (European Banking Authority, 2014; ECPAT International, 2016, p. 3; Nouwen, 2017; see Cybercrime Module 13 for further information on digital currencies and cryptocurrencies). Money transfers and other financial transactions can be used as evidence of live streaming child sexual abuse, if obfuscation tactics (e.g., pre-paid Internet access and the use of anti-forensics techniques, discussed in Cybercrime Module 4 on Introduction to Digital Forensics) have not been used to make the identification of the perpetrators difficult (Varrella, 2017, p. 49). Europol (2018) reported that "online payment services, money transfer services and local payment centres" are the preferred payment methods and that the use of debit and credit cards for the purchase of live streaming child sexual abuse "has considerably decreased" "[f]ollowing successful interventions by financial coalitions" (p. 35). Europol also identified the use "of Informal Value Transfer System (IVTS) - where money can be collected with only a mobile phone number and a reference number, registration or identification -" as "a popular emerging payment method" (Europol, 2018, p. 35).
In countries such as the Philippines, Romania, the United Kingdom, and the United States, cases of live streaming child sexual abuse have involved women forcing children to perform sex acts or performing sex acts on children (Altamura, 2017, pp. 34 and 43-45; Europol, 2018, p. 35). For example, a US investigation revealed that a Romanian woman was sexually abusing her one-year-old daughter and three-year-old son via Skype for payment (Europol, 2018, p. 35). While evidence primarily points to male involvement in live streaming child sexual abuse, women's involvement in this cybercrime should not be discounted.
Economic imbalances in countries, such as high levels of poverty, unemployment, and job instability, have been identified as drivers of live streaming child sexual abuse (Varrella, 2017; Internet Watch Foundation, 2018; Terre des Hommes, 2018). Live streaming child sexual abuse has occurred in regions such as South East Asia, where families' forced use of children to perform sex acts in order to financially support families is not considered a taboo or contrary to cultural and social norms (Varrella, 2017; Europol, 2018; Internet Watch Foundation, 2018; Terre des Hommes, 2018). In these cases, children are often "forced by facilitators (commonly a family or community member) to appear in front of a webcam to engage in sexual behaviour or be sexually abused" (Internet Watch Foundation, 2018, p. 1). In the Philippines, the facilitator justifies the sexual abuse of the child as a contribution to the family, whereby the money received can be used to feed the family, including younger children (e.g., buying milk for a baby). Children rescued from these situations usually carry with them the guilt that they did not act as told (Promchertchoo, 2018b).
These cases, however, are not the most common cases of live streaming child sexual abuse encountered by the Internet Watch Foundation. The Internet Watch Foundation (2018) has more commonly encountered live streaming child sexual abuse "involving white girls...from relatively affluent Western backgrounds… who are physically alone in a home setting, often their own bedrooms" (p. 1).
Countering online child sexual exploitation and abuse
Law enforcement investigations are among the most prominent means to combat online child sexual exploitation and abuse. National, regional, and international law enforcement agencies investigate online child sexual exploitation and abuse and cooperate in the investigation of these cybercrimes. For example, in Operation Tantalio, INTERPOL, Europol, and law enforcement agencies from 15 countries in Europe, Central America and South America cooperated in the investigation of child sexual abuse material distributed via WhatsApp (INTERPOL, 2017a). The existence of harmonized national laws, international cooperation in criminal matters, such as mutual legal assistance and extradition, bilateral, regional and multilateral conventions and agreements on child sexual exploitation and abuse, and the effective enforcement of these laws, treaties, conventions, and agreements, enables coordination and cooperation between agencies in international investigations of child sexual exploitation (for general information on harmonization of legal instruments and international cooperation on cybercrime matters, see Cybercrime Modules 3 and 7).
Undercover law enforcement investigations have also been conducted to identify, investigate, and prosecute perpetrators of online child sexual exploitation and abuse. A case in point is the undercover operation of the Kids the Light of Our Lives Internet chatroom, which served as a platform for live streaming child sexual abuse and to upload and share child sexual exploitation and abuse material (Laville, 2007). Undercover law enforcement officers, who were members of the Virtual Global Taskforce (a taskforce comprised of various law enforcement agencies around the world, whose overall objective is to develop partnerships with other non-member law enforcement agencies, non-government organizations, and the private sector to counter online child sexual exploitation and abuse), were able to infiltrate the chatroom and collect vital evidence which was used to successfully prosecute the host of the chatroom and individuals who used the site (Baines, 2008).
Cooperation between the private sector and government agencies is also essential in countering online child sexual exploitation and abuse. This cooperation has involved the "blocking" of registered child sex offenders' access to platforms frequented and used by children. A case in point is the 2012 Operation Game Over, where "Microsoft, Apple, Blizzard Entertainment, Electronic Arts, Disney Interactive Media Group, Warner Brothers and Sony" took down "more than 3,500 accounts of New York registered sex offenders" from online video game platforms (e.g., Xbox Live and PlayStation) (New York State Office of the Attorney General, 2012). Private companies (e.g., Thorn, Facebook, Google and others) have also worked together to create an Industry Hash Sharing Platform ("a cloud-based hash sharing tool") in order to harmonize takedown practices of child sexual exploitation and abuse material from online platforms (Thorn, n.d.).
Databases have also been created where child sexual abuse material can be uploaded for investigative purposes, such as INTERPOL's International Child Sexual Exploitation (ICSE) database, to counter online child sexual exploitation and abuse. These databases not only aid in the identification of child victims of sexual exploitation and abuse, but also aid in the identification and investigation of perpetrators. For instance, the leader of a network of sexual abusers in Japan was identified when authorities in Denmark and Australia uploaded videos of an unknown child sexual abuse victim to ICSE (INTERPOL, 2017b). In the United States, the National Center for Missing and Exploited Children's (NCMEC) Child Victim Identification Program serves as a central repository for child sexual abuse material. Like the ICSE database, the material in this database is used to identify victims and perpetrators of child sexual exploitation and abuse, and to investigate child sexual abusers, consumers of child sexual abuse material, and facilitators of child sexual exploitation and abuse.
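The general technique behind such repositories is hash matching: known material is reduced to a compact digital fingerprint, and newly encountered files are flagged when their fingerprints appear in the database, without investigators needing to view every file. The sketch below is a minimal, hypothetical illustration using a plain cryptographic hash (SHA-256) from the Python standard library; the function names and the in-memory "database" are invented for illustration, and operational systems typically rely on perceptual hashing (e.g., Microsoft's PhotoDNA), which, unlike a cryptographic hash, still matches after re-encoding or resizing.

```python
import hashlib


def sha256_of_file(path):
    """Return the SHA-256 hex digest of a file, read in fixed-size chunks
    so arbitrarily large files can be fingerprinted without loading them
    fully into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def match_against_database(paths, known_hashes):
    """Return the paths whose fingerprints appear in the set of known
    hashes -- i.e., files that match previously catalogued material."""
    return [p for p in paths if sha256_of_file(p) in known_hashes]
```

A set lookup keeps matching constant-time per file, which matters given the volumes of material such repositories process; the trade-off of an exact cryptographic hash is that any single-bit change to a file defeats the match, which is precisely why perceptual hashes are used in practice.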
The sheer size of the Internet and the number of online platforms, apps, and digital technologies on the market make it easy for perpetrators to hide in plain sight. Given the volume of data and number of online sites, traditional techniques for investigating child sexual exploitation and abuse do not suffice. New technological solutions can reduce the amount of time it takes to identify perpetrators and victims, and they can proactively remove child sexual exploitation and abuse material. For example, Terre des Hommes, an international children's rights organization, in collaboration with partner institutions, created "Sweetie," a virtual ten-year-old Filipina girl designed to find and communicate with sexual predators online, for the purpose of publicly exposing them and informing appropriate law enforcement agencies (Terre des Hommes, n.d.).
Web crawlers (i.e., "[a]n application that systematically and continuously traverses the World Wide Web for some purpose;" Butterfield and Ngondi, 2016) and data mining (i.e., "[e]xtraction of useful information from large data sets;" Black, Hashimzade, and Myles, 2017) tools have also been used to proactively identify online child sexual exploitation and abuse. Cases in point are tools that are part of the Defense Advanced Research Projects Agency's (DARPA's) Memex project, such as DIG and TellFinder, which comb online advertisements, download content, identify links in downloaded content, and add the information gleaned to a database that is query-enabled (DARPA, n.d.). The purpose of these tools is to identify victims of sexual exploitation and perpetrators of these crimes. Another tool, Traffic Jam, created by Marinus Analytics, identifies patterns in online content and uses facial recognition technology to identify victims (DARPA, n.d.). Other tools focus on the identification of victims depicted in online child sexual exploitation and abuse material by focusing on the background and surroundings of the victim in the material, in an effort to identify an item that could provide information about the victim's location (e.g., the "Stop Child Abuse - Trace an Object" campaign of Europol).
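A crawler of the kind defined above works by fetching a page, extracting its hyperlinks, and queueing those links for further traversal; the database-building tools mentioned then mine the downloaded content. The fragment below is a minimal, hypothetical sketch of the link-extraction step only, using just the Python standard library; it is not the implementation of DIG, TellFinder, or any agency's actual tooling, and the class and function names are invented for illustration.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collect absolute URLs from <a href="..."> tags in an HTML page.
    Relative links are resolved against the page's own URL, so they can
    be queued for the next round of crawling."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    """Return every outgoing link found in an HTML document, in order."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links
```

In a full crawler, the returned links would be filtered (e.g., against already-visited URLs and a domain scope), queued, and fetched in turn, with the page content handed off to the indexing or data-mining stage.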
Some technological measures taken by law enforcement agencies to conduct investigations of child sexual exploitation and abuse are considered controversial. For example, in the United States, law enforcement agencies can "rely upon known [software] vulnerabilities…, or …[can] develop tools to detect and use previously unknown and undisclosed vulnerabilities (or otherwise acquire exploits for these zero-day vulnerabilities) that it can then leverage" to gain access to perpetrators' digital devices and the data housed within them (Finklea, 2017, p. 1). These techniques, known as network investigative techniques (NITs), are "specially designed exploits or malware" and have been utilized in several visible web and Darknet investigations of child sexual predators and child sexual abuse material (Finklea, 2017, pp. 1-2). For example, in Operation Playpen, which targeted (at the time) one of the largest Darknet sites containing child sexual abuse material, "the NIT used by the government…[was] malware that was surreptitiously disseminated through a Tor hidden service. The malware was designed to pierce the anonymity provided by the Tor network by (apparently) exploiting a vulnerability in the Firefox web browser (running as part of the Tor Browser) to place computer code on users' computers that would transmit private information back to a law enforcement server outside of the Tor network" (Electronic Frontier Foundation, n.d.; see Cybercrime Module 5 for more information about Tor).
The reality is that a multifaceted approach is required to effectively counter online child sexual exploitation and abuse, which includes not only law enforcement tactics, but also laws, regulations, and policies, the coordination of services provided to victims of child sexual exploitation and abuse, cooperation among all institutions involved in child sexual exploitation and abuse cases, and education programmes and awareness campaigns addressing these crimes and Internet safety (for more information about this multifaceted approach, see Module 13 on Violence against Children and Module 12 on Justice for Children of the E4J University Modules Series on Crime Prevention and Criminal Justice).
Did you know?
The Council of Europe created an information and education campaign for adults and children in sports to draw attention to the risks to children of sexual abuse.
Want to learn more?
For more information, see here.