Research article
A R T I C L E  I N F O

Keywords:
Online child sexual exploitation and abuse
Online content-sharing services
Child sexual abuse material
Metrics

A B S T R A C T

Background: A critical step in improving the response to and monitoring of online child sexual exploitation and abuse (OCSEA) is the need to standardize the data that are collected, stored, and analyzed so that they effectively measure change in the frequency, nature and risk of OCSEA over time.
Objective: The objective of the content analysis was to investigate the metrics used by online content-sharing platforms in their efforts to combat OCSEA.
Methods: A content analysis was undertaken on 19 online content-sharing services' transparency reports on their metrics related to OCSEA.
Results: From the 19 transparency reports, 132 data points relating to OCSEA were identified, comprising 22 distinct metrics. Findings revealed a disparity in the metrics and reporting mechanisms employed; in particular, there is a lack of standardized approaches to metrics reporting and an absence of time-related measures. Furthermore, very few online content-sharing services disclosed metadata on the data reported and its capture methodology.
Conclusion: This study highlights the critical need for standardized metrics reporting to enable comparability across services. Without such an evidence base, there are no objective measures to assess progress and effectiveness in addressing OCSEA.
1. Introduction
1.1. Background
Online Child Sexual Exploitation and Abuse (OCSEA) has been seen both as a public health problem and criminal justice problem
(Ali et al., 2021). OCSEA is defined as crimes that include production, dissemination and possession of child sexual abuse material
(CSAM), online grooming of children for sexual purposes, sexting, sexual extortion of children, revenge pornography, commercial
sexual exploitation of children, exploitation of children through online prostitution, and live streaming of sexual abuse (Quayle, 2016).
Furthermore, a range of behaviors and definitions may fall under OCSEA, including sexual harassment and online solicitation, exposure to sexual content, sexual bullying, pressure on children to share sexual images of themselves, wider sharing of sexual images, grooming, and sexual abuse and exploitation (Livingstone et al., 2017). Child sexual exploitation is distinguishable from child sexual abuse in that the underlying notion involves some form of exchange, and it often encompasses a broader range of activities. This can include activities
such as enticing, manipulating, or threatening a child into performing sexual acts, grooming potential victims, and the production,
distribution, buying, selling, possession, or transmission of CSAM (ECPAT International, 2016; Wager et al., 2018).
The prevalence of OCSEA is deeply concerning. A comprehensive meta-analysis of 88 publications, including findings from 125
studies across 57 countries and published in English, Chinese, and Spanish between January 1st, 2010, and September 30th, 2023,
revealed that approximately 1 in 8 children globally (12.6 %) experienced non-consensual taking, sharing, or exposure to sexual
images and videos within the past year. Similarly, nearly the same proportion (12.5 %) reported experiencing online solicitation during
the same period. Moreover, an overall prevalence rate of OCSEA at 8.1 % was observed across 15 included studies that measured
exposure to at least one type of OCSEA when multiple types of OCSEA were assessed within the same sample (Krzeczkowska et al.,
2024). Regarding the extent and characteristics of CSAM, over 36 million online cases of CSAM were identified and reported from 2018
to 2022 by five organizations that publish public reports on the accessibility and dissemination of CSAM. The presence of CSAM
perpetuates ongoing violations of the rights of each victim-survivor, with its distribution, production, and commercialization further
exacerbating harm to these individuals (Stevenson et al., 2024).
Legislation addressing OCSEA is growing globally. For example, the recent Online Safety Bill in the UK requires online content-
sharing services to report OCSEA to the National Crime Agency within specified timeframes based on when OCSEA content is
detected (Online Safety Bill, 2021). Services are required to publish annual transparency reports to inform users about the measures providers are implementing to enhance safety, and to empower the Office of Communications (Ofcom) to hold them accountable (Online
Safety Act, 2023). In the European Union, the proposed Digital Services Act (DSA) aims to create a safer digital space by requiring
online platforms to remove illegal content, including OCSEA, and to cooperate with law enforcement authorities. The DSA also
mandates transparency reports and sets out clear obligations for online platforms to protect children from illegal and harmful content
(European Commission, 2022).
The eSafety commissioner in Australia provides guidance to online content-sharing services on expectations related to OCSEA and
reporting. The guidance provided pertains to the Basic Online Safety Expectations established under the Online Safety Act 2021 in
Australia. These Expectations outline the Australian Government's requirements for online service providers. Online content-sharing
services are expected to prioritize responding to the most harmful risks on their service, particularly where these involve unlawful
material or activity including OCSEA (eSafety Commissioner, 2023).
The African Union (AU) (African Union, (n.d.)) detailed a five-year strategic plan to tackle OCSEA following a poor response from
member states in ratifying the AU cyber convention 2014. Currently there is no legislation that mandates online content-sharing
services to remove, block, and report instances of OCSEA (African Union, n.d.). The plan details that industry should collaboratively develop and share innovative tools to identify and remove OCSEA, and should address OCSEA in policies and processes to identify, measure, prevent, and mitigate against it. Several indicators have been established, including legal obligations for internet industries to
report, remove and block CSAM.
In North America, the US Kids Online Safety Act (2023) requires large platforms to provide transparency reports, including foreseeable risks to children. Platforms are required to implement reasonable measures in the design and operation of their products or services used by minors to prevent and address certain potential harms, such as sexual exploitation. Canada's most recently proposed amendments to its Online Harms Bill require social media platforms to make non-consensually distributed intimate images and CSAM inaccessible within 24 hours. Additionally, compliance with the Act mandates services to develop a plan that
includes statistics related to moderating harmful content and managing electronic data (Parliament of Canada, 2023).
Complementing these legislative efforts, the National Center for Missing & Exploited Children (NCMEC; National Center for
Missing & Exploited Children, n.d) is a private, non-profit organization established in 1984 by the United States Congress to serve as
the nation's resource on issues related to the abduction, sexual exploitation, and victimization of children. One of its key roles is
managing the CyberTipline, a central mechanism for electronic service providers to report instances of online child sexual exploitation
(Grossman et al., 2024). The organization has played a pivotal role in increasing public awareness, advocating for legislative changes,
and enhancing technological tools to combat OCSEA.
2. Method
Content analysis is a “research technique for making replicable and valid inferences from texts (or other meaningful matter) to the
contexts of their use” (Krippendorff, 2018, p.18). The goal is to achieve a concise yet comprehensive description of the phenomenon.
The analysis results in concepts or categories that describe the phenomenon. Typically, these concepts or categories are used to develop
a model, conceptual system, conceptual map, or categories (Elo & Kyngäs, 2008).
Given the rapid advancement of technology in the field of CSEA metrics, relying solely on academic literature may not capture the
most relevant or current information, as these metrics are typically reported in services' transparency reports. A transparency report is
“a public communication document released by an internet company that discloses key metrics and information about digital
governance and enforcement measures on its platform(s)” (Trust and Safety Professional Association, 2024). Our approach was to
focus specifically on mapping the key metrics outlined by the online content-sharing services included in a recent report by the Organisation for Economic Co-operation and Development (OECD) on transparency reporting on OCSEA (OECD, 2023). This benchmarking
study provided a robust evidence base on how leading content-sharing service providers publicly report on their efforts to combat
OCSEA. Online content-sharing services are defined as “any online service that enables the transfer, transmission and dissemination of
content, in whatever form, whether one-to-one, one-to-few or one-to-many and irrespective of whether the content is public-facing,
semi-private or private. All of the services profiled in this report are Online Content-Sharing Services” (OECD, 2023, p.156). The
report analyzed how the top 50 global content-sharing services address CSAM and offered a detailed analysis of the policies and procedures for addressing CSAM on their platforms and services (OECD, 2023). To identify the most popular social media platforms,
video-sharing sites, and communication services, the metric of monthly active users (MAU) was selected as the most appropriate
measure. MAU is commonly used in the industry to gauge online engagement and platform reach, making it a suitable basis for ranking
the most frequently used services. However, MAU is less relevant for other types of services. Therefore, market share was used to
identify the leading cloud-based file-sharing services. Additionally, two services - the WordPress content management system and the
Wikipedia reference site - were included in the original top-50 ranking, even though their relative popularity compared to other
services could not be directly determined (OECD, 2023).
By leveraging the OECD study and transparency reports from various online content-sharing services, we aimed to provide a
comprehensive overview of the current landscape of OCSEA metrics. Two authors conducted comprehensive searches on the websites
of all 50 online content-sharing services identified by the OECD report, as well as Verizon, a member of Tech Coalition,1 and
downloaded the latest transparency reports where available.
Following the retrieval of transparency reports, two authors reviewed each report to extract information including: 1) the name of
the online content-sharing service, 2) the specific metrics pertaining to OCSEA, 3) publication period of the transparency report. To
ensure accuracy and reliability, each author was assigned half of the total number of reports for initial extraction, while cross-checking
the other half extracted by the other reviewer. Each metric was recorded as a single data point; subsequently, the authors collaboratively grouped similar metrics into broader categories and synthesized them based on thematic areas. This process facilitated a
comprehensive analysis of the identified metrics and allowed for the identification of overarching trends and patterns in the strategies
employed by online content-sharing services to address OCSEA.
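The coding workflow described above can be pictured as a simple data structure: each extracted figure becomes a data point tagged with a service, a thematic metric, and a category, and the per-category service counts then follow from grouping. The sketch below is a minimal illustration of that grouping step in Python; the service names, metric labels, and categories shown are hypothetical placeholders rather than values taken from the analyzed reports.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class DataPoint:
    """One metric value extracted from a transparency report."""
    service: str   # name of the online content-sharing service
    metric: str    # metric as labelled in the report
    theme: str     # thematic metric assigned during grouping
    category: str  # one of the six synthesized categories

# Hypothetical extracted data points; names are illustrative only.
points = [
    DataPoint("TikTok", "Proactive removal rate", "Proactive rate", "Detection-related"),
    DataPoint("Google", "CyberTipline reports", "Reports to NCMEC", "Report"),
    DataPoint("Meta", "Content actioned", "Content removed/actioned", "Content-related"),
    DataPoint("Snap", "Reports to NCMEC", "Reports to NCMEC", "Report"),
]

# Group data points by category, then count how many distinct services
# report at least one metric in each category.
services_by_category = defaultdict(set)
for p in points:
    services_by_category[p.category].add(p.service)

for category, services in sorted(services_by_category.items()):
    print(f"{category}: {len(services)} service(s)")
```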
The study was registered with the Open Science Framework (https://osf.io/fsp5n).
3. Results
The majority of online content-sharing services do not publish transparency metrics on OCSEA data. Even among the companies
that provide transparency metrics, differences in the metrics used often make it challenging to compare data across all services. Among
the 50 online content-sharing services examined, transparency reports were successfully located and downloaded from 19 services.
Table 1 presents the name, country, and publication period of these services and Table A1 details the services that do and do not
publish transparency reports.
As outlined in Table 1, the publication periods of transparency reports vary among services. Specifically, four services publish their transparency reports annually, nine release them half-yearly, and six publish them quarterly.
1 The Tech Coalition (https://www.technologycoalition.org/) is an alliance of technology companies that collaborate to combat OCSEA.
Table 1
Characteristics of online content-sharing services.
Name of Service | Country | Publication period
Amazon | United States | Annually
Reddit | United States | Annually
Microsoft | United States | Annually
Microsoft (Bing) | United States | Annually
Dropbox | United States | Half year
LINE | Japan | Half year
LinkedIn | United States | Half year
Snap | United States | Half year
Twitch | United States | Half year
X | United States | Half year
Verizon | United States | Half year
Yubo | France | Half year
Zoom | United States | Half year
Discord | United States | Quarterly
Google | United States | Quarterly
Meta | United States | Quarterly
Pinterest | United States | Quarterly
TikTok (ByteDance Co.) | China | Quarterly
YouTube | United States | Quarterly
traditional focus on CSAM, namely depictions of nudity, abuse, and exploitation such as images and videos. In addition to lacking definitions of OCSEA, online content-sharing services do not define "child" or "age" in their transparency reports. Furthermore, there is wide variation in how services define and categorize their OCSEA-related metrics. See Table 2 for a summary of the definitions that online content-sharing services use for their OCSEA-related metrics.
TikTok provides additional information about the definitions of CSEA underpinning its metrics with reported violations indexed to
the respective policies and sub-policies. The metrics reported by TikTok under the minor safety policy can be traced to the different
sub-policies. The most relevant for CSEA are sexual exploitation of minors, grooming behavior and nudity and sexual activity involving
minors. TikTok's report states:
The “sexual activity involving minors” sub-policy prohibits a broad range of content, including “minors in minimal clothing”, and
“sexually explicit dancing”. These two categories represent the majority of content removed under that sub-policy. We report CSAM
and supporting evidence to the NCMEC and to any additional relevant legal authorities.
Such detailed descriptions allow a much more fine-grained understanding of CSEA dynamics on the platform.
The initial mapping exercise of OCSEA metrics resulted in 132 data points, indicating the heterogeneity across different online content-sharing services regarding the reporting of OCSEA (data points are available upon request). After grouping them into broader themes, a total of 22 metrics on OCSEA were identified. Based on the thematic metrics, we further synthesized them according to the target objectives of CSEA; as a result, six categories were identified: 1) content-related metrics, 2) account-related metrics, 3) detection-related metrics, 4) report metrics, 5) time-related metrics, and 6) escalations or legal requests. Table 3 presents the categories of the metrics identified from the transparency reports. Additionally, a table of which services report each of these categories can be found in Table A3.
Table 2
Summary of services definitions of OCSEA metrics in their transparency reports.
Definition of OCSEA | Services
Child sexual abuse material (CSAM) | Amazon, Discord, Google, YouTube, Verizon, Yubo, Pinterest
CSAM including content in images, videos, or text | Amazon, Pinterest
CSAM as "visual depictions, including but not limited to photos, videos, and computer-generated imagery" of sexually explicit conduct involving minors | Google and YouTube
Child sexual exploitation and abuse imagery (CSEAI) | Snap, Microsoft, Bing
Content depicting sexual exploitation of a minor or grooming behavior | Verizon
Child exploitation (content depicting or promoting sexual activity or abuse of minors, solicitation of such materials, or the solicitation of a minor) | LinkedIn
Content depicting sexual exploitation of a minor or grooming behavior | Twitch
Child nudity, physical abuse and child sexual exploitation | Meta
Child sexual exploitation, which includes media, text, illustrated, or computer-generated images, and URLs | X
Grooming | Yubo
Minor sexualization | Reddit
Abuse of children | LINE
Sexual exploitation of minors, grooming behavior, and nudity and sexual activity involving minors | TikTok
Table 3
Overview of metrics on CSEA reported by online content-sharing services.
Category | Thematic metrics | Number of services
Detection-related metrics (n = 9) | Proactive rate | 5
 | Content detected (manually, with PhotoDNA/hash matching, or with hybrid tools) | 4
Content-related (n = 27) | Appealed content | 1
 | Restored content | 1
 | Content removed/actioned | 14
 | Content detected | 6
 | Impressions | 1
 | URLs deindexed | 2
 | Prevalence of child endangerment | 1
Report (n = 19) | Reports to NCMEC | 13
 | Reports per month | 1
 | Disclosed data on child abuse | 1
 | User reports | 4
Time-related metrics (n = 3) | Removal before any views | 1
 | Removal within 24 h | 1
 | Reach of content deactivated for child sexual exploitation | 1
Escalations or legal requests (n = 13) | Grooming or endangerment escalations | 1
 | Legal requests, including info disclosed | 12
Account-related metrics (n = 18) | Account reinstated | 2
 | Account appealed | 4
 | Accounts actioned | 12
3.2.1.1. Appealed content. Content appealed refers to the number of times users have challenged removed content. YouTube provides
metrics related to videos that have been appealed and videos that have been reinstated; however, these metrics are not disaggregated by content type. Instead, they represent all removals for any content violations. Meta provides metrics regarding actioned content appeals for both Facebook and Instagram. These metrics may provide some insights regarding the fairness and accuracy of content
moderation practices. However, Meta warns of limitations:
This metric should not be interpreted as an indicator of the accuracy of our decisions on content, as people may choose to appeal for
many different reasons. We report the total number of pieces of content that had an appeal submitted in each quarter – for example, 1
January to 31 March. Bear in mind that this means that the numbers can't be compared directly to content actioned or to content
restored for the same quarter. Some restored content may have been appealed in the previous quarter, and some appealed content may
be restored in the next quarter.
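Meta's caveat can be made concrete with a small, purely illustrative example: if a piece of content is actioned at the end of one quarter, appealed early in the next, and restored in the quarter after that, each event lands in a different quarterly total. The dates and counts in the sketch below are hypothetical.

```python
from datetime import date

def quarter(d: date) -> str:
    return f"{d.year} Q{(d.month - 1) // 3 + 1}"

# Hypothetical lifecycle of a single piece of content: removed in Q1,
# appealed in Q2, restored in Q3 (dates are illustrative only).
events = [("actioned", date(2023, 3, 28)),
          ("appealed", date(2023, 4, 2)),
          ("restored", date(2023, 7, 1))]

# Tally each event under the quarter in which it occurred, mirroring
# how per-quarter totals are typically published.
tallies: dict[str, dict[str, int]] = {}
for event, d in events:
    q = quarter(d)
    tallies.setdefault(q, {}).setdefault(event, 0)
    tallies[q][event] += 1

print(tallies)
# {'2023 Q1': {'actioned': 1}, '2023 Q2': {'appealed': 1}, '2023 Q3': {'restored': 1}}
# The same item contributes to three different quarterly totals, so appeals
# in one quarter cannot be reconciled with content actioned or restored in
# that same quarter.
```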
3.2.1.2. Other content related metrics. Several additional metrics have been reported by various online content-sharing services,
providing further insights into their approaches to reporting OCSEA. For example, Meta and TikTok have metrics related to “restored
content”, representing the number of pieces of content reinstated after initial removal or warning. Meta has also incorporated a metric
concerning the prevalence of child endangerment, although no data has been collected thus far. Meta states that “we are working on
estimating prevalence for child endangerment violations. We will continue to expand prevalence measurement to more areas as we
confirm accuracy and meaningful data” (Meta Platforms, Inc., 2023).
Pinterest used a multifaceted approach involving automated tools, manual review, and hybrid methods, combining elements of
both, to deactivate policy-violating content including OCSEA. Pinterest also reports the percentage of content deactivated manually
versus hybrid tools. Meanwhile, X introduces the concept of “impressions,” defined as any time at least half of the area of a given Tweet
is visible to a user for at least half a second (including while scrolling). Additionally, X provides metrics on content deleted by country
and removed globally for OCSEA. Lastly, Google provides data on URLs deindexed for CSAM from its search results, reflecting its
efforts to remove harmful content reported on third-party web pages, although it lacks control over the content itself. These diverse
metrics contribute to a more comprehensive understanding of content moderation strategies employed by online content-sharing
services and their efforts to combat OCSEA.
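To make X's counting rule concrete, the sketch below encodes the stated definition (at least half of the Tweet's area visible for at least half a second) as a simple predicate; the function name and the sample viewport figures are hypothetical and are not part of X's published methodology.

```python
def counts_as_impression(visible_area_fraction: float, visible_seconds: float) -> bool:
    """Stated rule: at least half of the Tweet's area visible for at least
    half a second, including while scrolling."""
    return visible_area_fraction >= 0.5 and visible_seconds >= 0.5

# Illustrative viewport samples: (fraction of the Tweet visible, seconds visible).
samples = [(0.9, 1.2), (0.6, 0.3), (0.4, 2.0), (0.5, 0.5)]
impressions = sum(counts_as_impression(area, secs) for area, secs in samples)
print(impressions)  # 2 -> only the first and last samples meet both thresholds
```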
3.2.3.1. Reports to NCMEC (n = 13). NCMEC reporting is mandatory for online content-sharing services operating in the U.S. This is the case for Google (for both Google and YouTube), Snap, Yubo, Microsoft, Verizon and Amazon (for both their consumer services and Twitch). Some of the services provide the number of NCMEC reports in their transparency reports. However, simply reporting the total number of reports made to NCMEC does not provide sufficient insight into the nature and extent of CSEA in these online environments. In addition, the number of reports made by the services is already made public in NCMEC's annual reports. Beyond the absolute number of NCMEC reports, some services provide further information regarding this reporting. Twitch, for instance, publishes this metric relative to hours watched: the number of NCMEC CyberTipline reports made for every 1,000 hours of live-streamed content watched. This metric is particularly useful because it normalizes reporting by viewership. The ratio of reports to viewership seeks to reveal how the magnitude of CSEA relates to the total amount of streamed content on the platform. In addition, Google reports not only the number of reports made but also the total pieces of content reported to NCMEC, as well as the CSAM hashes contributed by the service to the NCMEC database. This type of contribution strengthens the ability to quickly locate re-emerging material using automated hash-matching techniques. Google's strategy thus provides a more complete picture of the scope of its efforts to tackle OCSEA.
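Twitch's viewership-normalized figure is effectively a simple rate. The sketch below shows the arithmetic with made-up numbers; the values are not drawn from Twitch's report.

```python
def reports_per_thousand_hours(cybertipline_reports: int, hours_watched: float) -> float:
    """NCMEC CyberTipline reports per 1,000 hours of live-streamed content watched."""
    return cybertipline_reports / (hours_watched / 1_000)

# Hypothetical half-year figures (illustrative only).
rate = reports_per_thousand_hours(cybertipline_reports=3_000, hours_watched=5_000_000_000)
print(f"{rate:.4f} reports per 1,000 hours watched")  # 0.0006
```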
3.2.3.2. User reports (n = 4). Overall, there appears to be a disparity in user-report metrics on OCSEA, with only a few online content-sharing services reporting on this. Challenges to self-reporting OCSEA have been identified on services that lack intuitive methods of reporting or specificity in reporting OCSEA.
that 85 % of the reports processed by the company during the second half of 2023 resulted from proactive detection strategies (Yubo,
2023). TikTok also provides information on the proactive removal rate of harmful content, with 96.70 % of harmful content being
removed via proactive detection (TikTok, 2023). However, Zoom reports that PhotoDNA technology has been inefficient, with 96.85 % of all reports found to be false positives after review by its Trust and Safety team; the company is therefore currently seeking better CSAM hash-matching and other meta-driven technologies (Zoom, 2023).
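The percentages cited above are straightforward ratios. The following minimal sketch shows the calculation, assuming hypothetical counts chosen only to reproduce figures of the same magnitude as those cited; the counts are not taken from any service's report.

```python
def proactive_rate(proactively_detected: int, total_actioned: int) -> float:
    """Share of actioned content surfaced by the service's own detection
    rather than by user reports."""
    return proactively_detected / total_actioned

def false_positive_rate(false_positives: int, total_automated_flags: int) -> float:
    """Share of automated flags judged incorrect after human review."""
    return false_positives / total_automated_flags

# Hypothetical counts, chosen only to illustrate the calculation.
print(f"{proactive_rate(8_500, 10_000):.2%}")        # 85.00%
print(f"{false_positive_rate(9_685, 10_000):.2%}")   # 96.85%
```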
In addition, Meta (for both Facebook and Instagram), Snap, and Microsoft (for both its consumer services and Bing) report metrics on proactively detected content. Amazon also reports on the number of images detected using Safer on Amazon Photos. Safer is a CSAM classifier, a machine learning tool developed by Thorn which can detect known and unknown CSAM in images and videos. Reporting on proactively detected content offers insight into a platform's commitment to keeping its community safe and secure. In addition, Amazon provides two metrics
which can be seen as complementary to proactive detection. These are the number of reports of other content such as chat interactions and URLs from third parties, and the reports by trusted reporters for content quickly removed. These metrics illustrate how
Amazon has a system in place whereby external agents (e.g., NCMEC, Internet Watch Foundation (n.d.), Canadian CyberTipline, and
INHOPE hotlines) can flag content directly to the company, resulting in swift removal.
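The hash-matching approach referred to in this section can be sketched as a set-membership check. The example below uses a cryptographic hash purely to stay self-contained; production systems such as PhotoDNA use perceptual hashes that tolerate re-encoding and cropping, and classifiers such as Safer are needed for previously unseen material. All names and inputs here are placeholders.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hypothetical hash database seeded from previously verified material.
known_hashes = {sha256_hex(b"placeholder-bytes-of-a-known-file")}

def matches_known_material(file_bytes: bytes) -> bool:
    """Flag a file whose hash appears in the shared hash database."""
    return sha256_hex(file_bytes) in known_hashes

print(matches_known_material(b"placeholder-bytes-of-a-known-file"))  # True: exact re-upload is caught
print(matches_known_material(b"previously-unseen-bytes"))            # False: unknown material needs other tools
```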
4. Discussion
This content analysis offers a comprehensive evaluation of the metrics related to OCSEA as reported in transparency reports by
online content-sharing services. Building on insights from the OECD report, this study further refined these metrics to present a
nuanced analysis of the data provided by these platforms. Of the 19 included transparency reports, 132 data points and 22 distinct
metrics were identified. However, despite the number of metrics, the data make clear that any given metric may be reported by only a couple of services. Similar metrics are also not measured in the same way, or even with enough clarity on how they
are measured to determine if they are comparable. In essence, there is a data landscape, but it is not comparable and not usable for
enacting change to keep children safe across the sector.
discrete content-related measures such as content removed only. Findings from the content analysis highlight the importance of effective and proactive detection metrics in the fight against CSAM. They emphasize the proactive nature of detection measures, which are essential for intercepting and combating the dissemination of CSAM before it can inflict further harm. Through a comprehensive
review of detection-related metrics extracted from transparency reports, findings highlight the important role of proactive detection in
safeguarding children. Notably, online content-sharing services such as Meta, Yubo, TikTok, and Microsoft Bing have made progress in
identifying and addressing CSAM through proactive detection measures. A proactive approach not only enables online content-sharing
services to detect and remove CSAM promptly but also plays a crucial role in disrupting the cycle of abuse associated with CSAM. In
essence, the detection-related metrics highlight the proactive detection strategies employed by leading online content-sharing services,
recognizing their instrumental role in combating CSAM and fostering a safer online environment for children.
Similarly, metrics related to accounts took various forms, ranging from overall actioned-account metrics to individual metrics related to accounts being suspended or warned. Few transparency reports gave specific measures related to detection, with the mechanism of
detection incorporating multiple strategies including technology approaches, content moderators and user reports. There is a paucity
of comparable data in the field to allow for accurate estimation of the prevalence of OCSEA, with variation in how online content-
sharing services produce metrics with limited information on how the metrics are calculated. In addition, some of the metrics are
not mutually exclusive, which might lead to an over-estimation of the prevalence rate.
4.1.6. Limitations
In addition to the limitations identified from the metrics, the content analysis has several other limitations. Firstly, it only included
available transparency reports from the top 50 online content-sharing services. Future research should aim to incorporate transparency
reports from a broader range of services. Secondly, similar metrics are often not measured in a consistent manner. This inconsistency
hampers the ability to determine if these metrics are truly comparable. Therefore, any comparability of these metrics should be
interpreted with caution.
5. Conclusion
Online content-sharing services seek to demonstrate a commitment to transparency and accountability by publishing transparency
reports. It is essential that they continue to refine their methods of reporting exploitative content, so that this reporting can effectively
reflect emerging trends in OCSEA. We should note that simply reporting numbers and ratios over time does not necessarily indicate how much offending behavior takes place, nor how successfully a platform's algorithms and moderation efforts are identifying and acting on offending material. Insights into the patterns and trends associated with OCSEA as reported by online content-sharing services lack the contextual information that would allow one to infer how much of the observed change over time is due to changes in policies, procedures, and technological innovation, and how much relates to offending behavior and victimization. Reporting could provide more specific
details, namely the types of content being actioned, the number of unique images or videos, the age of the victims depicted, and the
frequency with which certain types of content appear. By providing this kind of granular data, online content-sharing services could
better equip researchers, policymakers, law enforcement agencies, and civil society organizations to understand and address the
problem. Furthermore, this approach would allow consumers to make informed decisions about which products and services to use
based on the services' track record when it comes to protecting vulnerable populations such as children.
CRediT authorship contribution statement
Mengyao Lu: Writing – original draft, Validation, Supervision, Resources, Project administration, Methodology, Investigation,
Formal analysis, Data curation, Conceptualization. Maria Lamond: Writing – review & editing, Writing – original draft, Methodology,
Investigation, Formal analysis, Data curation. Deborah Fry: Writing – review & editing, Supervision, Methodology, Investigation,
Funding acquisition, Conceptualization.
Data availability
Acknowledgement
The authors wish to express their gratitude to Dr. Pedro Jacobetty for his valuable feedback and contributions to the original draft.
They also extend their thanks to the authors of the original OECD study that served as the foundation for this research.
Author contributions
Mengyao Lu: conceptualization, methodology, data extraction and analysis, original draft preparation. Maria Lamond: data
extraction and analysis, editing of the manuscript. Deborah Fry: conceptualization, reviewing, and editing of the manuscript.
Appendix A
Table A1
. Online content-sharing services that do and do not have transparency reports.
Online content sharing service with transparency reports Online content sharing service without transparency reports
Appendix B
Table A2
Links to included transparency reports.
Discord https://discord.com/safety-transparency-reports/2023-
Microsoft Bing 2023 October Microsoft Bing EU DSA Report
Pinterest https://policy.pinterest.com/en/transparency-report
Reddit 2022 Transparency Report - Reddit (redditinc.com)
Dropbox https://help.dropbox.com/transparency/reports
LinkedIn https://about.linkedin.com/transparency/community-report
Microsoft https://www.microsoft.com/en-us/corporate-responsibility/digital-safety-content-report?activetab=pivot_1:primaryr3
Google https://transparencyreport.google.com/youtube-policy/featured-policies/child-safety?
Meta https://transparency.fb.com/reports/community-standards-enforcement/
Youtube https://transparencyreport.google.com/youtube-policy/removals?hl=en_GB
Twitch https://safety.twitch.tv/s/article/H1-2023-NetzDG-Transparency-Report?language=en_US
Snap https://values.snap.com/privacy/transparency
Yubo https://yubo.cdn.prismic.io/yubo/4bb10550-0506-4f82-a455-20ba06fd9ecd_Yubo_Transparency+Report_Second_Half_2023.pdf
X (formerly Twitter) https://transparency.twitter.com/en/resources.html
Tiktok https://www.tiktok.com/transparency/en/community-guidelines-enforcement-2023-2/
Verizon https://www.verizon.com/about/sites/default/files/International-Transparency-Report-1H-2023.pdf
LINE https://linecorp.com/en/security/transparency/2022h2
Zoom https://explore.zoom.us/en/trust/transparency/
Amazon https://brandservices.amazon.com/transparency
Appendix C
Table A3
Online content-sharing services metrics reported by category.
Metrics Category
Pinterest x x x x x x
Tiktok x x x x
YouTube x x x
References
African Union, (n.d.) African Union Initiative on: Strengthening Regional and National Capacity and Action against Online Child Sexual Exploitation and Abuse in
Africa Strategy and Plan of Action 2020–2025 Online-Child-Sexual-Exploitation-and-Abuse-OCSEA-2020-2025-Strategy-1.pdf (aucecma.org).
Ali, S., Haykal, H. A., & Youssef, E. Y. M. (2021). Child sexual abuse and the internet—A systematic review. Human Arenas, 1–18.
Álvarez-Guerrero, G., Fry, D., Lu, M., & Gaitis, K. K. (2024). Online child sexual exploitation and abuse of children and adolescents with disabilities: A systematic
review. Disabilities, 4(2), 264–276.
Canadian Centre for Child Protection (2017). Survivors' Survey Full Report. C3P_SurvivorsSurveyFullReport2017.pdf.
Canadian Centre for Child Protection (2020). Reviewing Child Sexual Abuse Material Reporting Functions on Popular Platforms. C3P_
ReviewingCSAMMaterialReporting_en.pdf.
ECPAT International. (2016). Terminology Guidelines for the Protection of Children from Sexual Exploitation and Sexual Abuse. UK: Terminology guidelines | ECPAT.
ECPAT. (2021). Terminology guidelines: Terminology related to sexual exploitation of children. ECPAT International. Retrieved from https://ecpat.org/wp-content/
uploads/2021/05/Terminology-guidelines-396922-EN-1.pdf.
Elo, S., & Kyngäs, H. (2008). The qualitative content analysis process. Journal of Advanced Nursing, 62(1), 107–115.
European Commission. (2022). Digital Services Act. Retrieved from: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32022R2065.
Grossman, S., Pfefferkorn, R., Thiel, D., Shah, S., Stamos, A., DiResta, R., Perrino, J., Cryst, E., & Hancock, J. (2024). The strengths and weaknesses of the online child
safety ecosystem: Perspectives from platforms, NCMEC, and law enforcement on the CyberTipline and how to improve it. Stanford Internet Observatory. https://
purl.stanford.edu/pr592kc5483.
Internet Watch Foundation. (n.d.). IWF. https://www.iwf.org.uk/).
Kids Online Safety Act (2023) S.1409 - 118th Congress (2023–2024): Kids Online Safety Act | Congress.gov | Library of Congress.
Krippendorff, K. (2018). Content analysis: An introduction to its methodology (4th ed.). Sage Publications.
Krzeczkowska, A., Fry, D., Anderson, N., Ren, J., Lu, M., Lu, Y., … Fang, X. (2024). Indicator 1: The prevalence of online victimisation, technical note in Into the light: Childlight's global child sexual exploitation and abuse index. Edinburgh: Childlight. https://childlight.org/sites/default/files/2024-05/technical-note-1.pdf
Livingstone, S., Davidson, J., Bryce, J., with Batool, S., Haughton, C., & Nandi, A. (2017). Children's online activities, risks and safety: A literature review by the UKCCIS evidence group. London: London School of Economics (LSE) Consulting.
Meta Platforms, Inc.. (2023). Integrity and transparency reports: First quarter 2023. In Meta. https://about.meta.com/newsroom/.
National Center for Missing & Exploited Children. (n.d.). NCMEC. https://www.missingkids.org/.
OECD. (2023). Transparency reporting on child sexual exploitation and abuse online. OECD Digital Economy Papers, 357. Paris: OECD Publishing. https://doi.org/10.1787/554ad91f-en
Online Safety Act. (2023). Parliamentary Bills, UK Parliament.
Online Safety Bill. (2021). Parliament of Australia. aph.gov.au.
Parliament of Canada. (2023). Bill C-63: An act to enact the Online Harms Act, to amend the Broadcasting Act and to make related amendments to other acts. Parliament of
Canada. https://www.parl.ca/DocumentViewer/en/44-1/bill/C-63/first-reading.
Quayle, E. (2016). Researching online child sexual exploitation and abuse: Are there links between online and offline vulnerabilities?. http://globalkidsonline.net/
wp-content/uploads/2016/05/Guide-7-Child-sexual-exploitation-and-abuse-Quayle.pdf.
Reddit. (2023). 2023 H1 transparency report. Reddit Inc.. Retrieved from https://www.redditinc.com/policies/2023-h1-transparency-report-.
Stevenson, J., Vermeulen, I., & Fry, D. (2024). Indicator 3: The global scale and nature of child sexual abuse material (CSAM) online, technical note for into the light 2024:
Childlight’s global child sexual exploitation and abuse index. Edinburgh: Childlight. https://intothelight.childlight.org/indicator-3.html.
Tech Coalition (2022) Tech Coalition | Annual Report (technologycoalition.org).
TikTok. (2023). Transparency reports. TikTok Transparency Center. https://www.tiktok.com/transparency/en/reports.
Trust and Safety Professional Association (2024). Transparency report categories. Retrieved August 5, 2024, from https://www.tspa.org/curriculum/ts-
fundamentals/transparency-report/transparency-report-categories/.
Wager, N., Armitage, R., Christmann, K., Gallagher, B., Ioannou, M., Parkinson, S., et al. (2018). Rapid evidence assessment: Quantifying the extent of online-
facilitated child sexual abuse: Report for the independent inquiry into child sexual abuse. Available at http://cdn.basw.co.uk/upload/basw_103534-9.pdf.
Yubo. (2023). Transparency report. Yubo. https://www.yubo.live/safety/transparency-report.
Zoom. (2023). Transparency report. Zoom Trust Center. https://explore.zoom.us/en/trust/transparency/.