Google offers AI toolkit to report child sex abuse images

Numerous organizations have taken on the noble task of reporting pedophilic images, but it's both technically difficult and emotionally challenging to review vast amounts of the horrific content. Google is promising to make this process easier. It's launching an AI toolkit that helps organizations review vast amounts of child sex abuse material quickly while minimizing the need for human inspection. Deep neural networks scan images for abusive content and prioritize the most likely candidates for review. This promises to both dramatically increase the number of responses (700 percent more than before) and reduce the number of people who have to look at the imagery.
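To make the triage idea concrete, here's a minimal sketch of how classifier-driven prioritization might work in practice. The `model` object and `load_model` helper are hypothetical placeholders, not Google's actual Content Safety API; the point is simply that images are scored and sorted so reviewers see the most likely material first.

```python
# Minimal sketch of classifier-driven triage: score every image with a model,
# then surface the highest-confidence candidates to human reviewers first.
# `model` and `load_model` are hypothetical stand-ins, not Google's API.

from pathlib import Path


def prioritize_for_review(image_paths, model, top_k=100):
    """Return the images most likely to contain abusive content, highest score first."""
    scored = [(model.predict(path), path) for path in image_paths]
    scored.sort(key=lambda pair: pair[0], reverse=True)  # most likely candidates first
    return [path for _score, path in scored[:top_k]]


# Illustrative usage only:
# model = load_model("abuse-classifier")            # hypothetical classifier
# queue = prioritize_for_review(Path("uploads").glob("*.jpg"), model)
```

Sorting by model confidence is what lets a small review team concentrate on the small fraction of content that matters most, which is where the claimed increase in responses comes from.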

Unlike the conventional approach, which simply compares image hashes against known offending images, the AI method can also flag previously undiscovered material. That, in turn, could help authorities catch active offenders and prevent further abuse.
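For contrast, here's a sketch of the conventional hash-matching check described above. Real deployments use perceptual hashes (such as PhotoDNA) rather than a cryptographic hash, and the function names here are illustrative, but the limitation is the same: an image is only flagged if its fingerprint is already in the database.

```python
# Minimal sketch of hash matching against a database of known offending images.
# SHA-256 is a simplistic stand-in for the perceptual hashes real systems use;
# it illustrates why previously unseen material slips past this kind of check.

import hashlib


def is_known_image(image_bytes: bytes, known_hashes: set[str]) -> bool:
    """Flag the image only when its hash matches a previously catalogued one."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes


# A brand-new abusive image has no entry in `known_hashes`, so this check misses it.
# A trained classifier, by contrast, can score unseen content on its visual features.
```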

The tool is free to both corporate partners and non-governmental organizations through Google's Content Safety API. While there's no certainty that it'll dramatically reduce the volume of horrible images online, it could help outlets detect and report child sex abuse even if they have only limited resources.

Via: CNET

Source: Google


