'Google’s hash matching solution has revolutionised our workflows and led to better, faster results. Because of Google’s significant contribution to the National Center for Missing & Exploited Children, automated processes reduce the need for human review of previously reported CSAM, which minimises our staff’s exposure. It’s critical that these images are taken down as quickly as possible because every time a child’s photo is re-shared, they are re-victimised all over again. This also allows us to focus our work on new and unknown child victims and survivors.' – National Center for Missing & Exploited Children
'Our platform is making visual content accessible, and it is our responsibility to make sure that all the user-contributed content is moderated effectively. Google's Content Safety API and the AI behind it enables us to streamline a tedious process with an efficient way to quickly review, take down and report any offending content – helping us in doing our part to make the world a safer place.' – Plugon
'Easy to integrate, fast to respond, stable and accurate, the Content Safety API has significantly reduced the amount of time spent by the content analysts reviewing CSAM images, optimised the analysis process, and had a positive impact on their wellbeing.' – Safernet Brasil
'The Content Safety API is an easy-to-use and scalable solution that we integrated a few months ago. We are pretty sure that it can become an important technology to protect our community and prevent the sharing of inappropriate content.' – Yubo
'The Content Safety API helps analysts working on identifying images depicting sexual abuse on minors with prioritisation. Given the ever-increasing volume of reports to process, successfully prioritising those reports for review is a real challenge. This technology enables analysts to see reports that include content showing sexual abuse on minors faster, and therefore lets them act faster to protect victims and remove the content.' – Point de Contact
'CSAI Match was straightforward to deploy and continues to be an effective tool to help us further detect and disrupt the distribution of child sexual abuse material.' – Adobe
'Combatting the rise of video-based child sexual abuse material presents new challenges and requires new technology. CSAI Match has proven to be part of the solution, enabling us to better detect and remove video CSAM from our platforms.' – Yahoo!
Content Safety API
Does the Content Safety API work for video?
The Content Safety API is designed for images, but through YouTube’s CSAI Match, organisations can access fingerprinting software and an API to identify matches against our database of known abusive video content.
Who can sign up to access the technology and Content Safety API?
Industry and civil society third parties seeking to protect their platform against abuse can sign up to access the Content Safety API. Applications are subject to approval.
Why do you make these tools so widely available?
We believe that the best approach to tackling online child exploitation is to collaborate with other companies and NGOs. We have long worked across industry and with NGOs to support the development of new data-driven tools, boost technical capacity and raise awareness. We believe that making these tools widely available, so that our partners can use AI to better review content at scale, is an important part of this fight.
Does CSAI Match work for images?
CSAI Match is designed for video, but through Google’s Content Safety API, a collection of tools is available to industry and NGO partners, offering machine learning-powered classification for images.
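To make the classification workflow concrete, here is a minimal sketch of classifier-assisted triage: images in a review queue are ordered by a model-assigned priority score so that human reviewers see the likeliest matches first. The field names and score values are invented for illustration; they are not the actual Content Safety API schema.

```python
# Hypothetical triage sketch. "priority" stands in for a classifier
# score; higher means more likely to need urgent human review.
review_queue = [
    {"id": "img-001", "priority": 0.12},
    {"id": "img-002", "priority": 0.97},
    {"id": "img-003", "priority": 0.55},
]

# Sort the queue so the highest-priority items are reviewed first.
triaged = sorted(review_queue, key=lambda item: item["priority"], reverse=True)

print([item["id"] for item in triaged])  # img-002 is reviewed first
```

The point of this ordering is the one the partner quotes above describe: when report volume exceeds review capacity, ranking by classifier score lets analysts reach the most urgent content sooner.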
What information is returned with an identified match?
The match will identify which portion of the video matches known CSAI, as well as a standardised categorisation of the type of content that was matched.
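As an illustration of the two pieces of information described above, a match result can be thought of as a list of matched segments, each carrying its time range and a standardised category. This shape is a hypothetical sketch for clarity; the real CSAI Match response format is not shown here.

```python
# Hypothetical match result. Field names, structure, and the category
# value are assumptions for illustration, not the actual API schema.
match_result = {
    "matches": [
        {
            "start_seconds": 42.0,  # where the matching segment begins
            "end_seconds": 57.5,    # where the matching segment ends
            "category": "category-placeholder",  # standardised categorisation
        }
    ],
}

# A consumer would walk the matched segments and act on each one.
for m in match_result["matches"]:
    duration = m["end_seconds"] - m["start_seconds"]
    print(f"matched segment: {duration:.1f}s, category {m['category']}")
```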
What makes the CSAI Match technology so efficient?
CSAI Match detects near-duplicate segments of known CSAI content. This includes exact duplicates, which MD5 hash matching would also catch, as well as near-duplicates such as re-encodings, obfuscations, truncations or rescalings of CSAI videos – even if a video contains only a small portion of CSAI, possibly mixed with non-CSAI content. Partners run a fingerprinting binary to produce a 'fingerprint' of the video, a byte sequence similar to an MD5 hash. This fingerprint is then sent to Google’s CSAI Match service, which is specifically designed to scan a video efficiently against YouTube’s corpus of known CSAI references.
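The contrast with plain MD5 matching can be shown in a few lines of Python. An MD5 digest only identifies byte-identical files: changing a single byte, as any re-encoding or truncation would, produces a completely different digest. That is why exact hashing misses near-duplicates and a similarity-preserving fingerprint is needed. (This is an illustration of the limitation only; the actual CSAI Match fingerprint format is not public.)

```python
import hashlib

# Stand-in for a video's byte stream.
original = b"example video byte stream"

# Simulate a minimal alteration, e.g. one byte changed by re-encoding.
altered = bytearray(original)
altered[0] ^= 0x01

digest_a = hashlib.md5(original).hexdigest()
digest_b = hashlib.md5(bytes(altered)).hexdigest()

# The two digests share no useful similarity, so an exact-hash lookup
# against a database of known digests would miss the altered copy.
print(digest_a == digest_b)  # False
```

A fingerprint designed for near-duplicate detection behaves the opposite way: small changes to the input produce a fingerprint that still matches the reference, which is what lets CSAI Match catch re-encoded or partially embedded segments.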