Microsoft and Google are cooperating in an effort to make it much more difficult for child predators to find illegal images online by blocking search results for about 100,000 search terms. The companies also are collaborating on methods to better identify illegal abuse images and remove them more quickly.
Both Microsoft’s Digital Crimes Unit and teams within Google have been working on this issue separately for several years, and each company has made considerable progress. They work closely with law enforcement agencies and child exploitation prevention groups not just to remove illegal abuse content but also to help victims recover. Now, the two companies are pooling their resources to prevent users from finding the content they’re looking for when they use any of about 100,000 search terms related to child abuse.
“We’ve fine-tuned Google Search to prevent links to child sexual abuse material from appearing in our results,” Google Chairman Eric Schmidt wrote in the Daily Mail on Monday.
“While no algorithm is perfect – and Google cannot prevent paedophiles adding new images to the web – these changes have cleaned up the results for over 100,000 queries that might be related to the sexual abuse of kids.”
Google and Microsoft both have teams that work on identifying and removing illegal abuse images from the Web, and because this process can’t be done reliably by computers alone, humans have to be involved in distinguishing abuse images from legitimate ones. Microsoft has been developing technology to help fingerprint illegal images, which it has shared with Google. Similar technology is also being applied to videos on YouTube.
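Microsoft’s fingerprinting technology (PhotoDNA) is proprietary and its details are not public, but the general idea behind image fingerprinting can be illustrated with a toy “average hash”: reduce an image to a short bit signature, then compare signatures by how many bits differ, so that re-encoded or slightly altered copies of the same image still match. The code below is purely an illustrative sketch of that concept, not Microsoft’s actual method.

```python
# Illustrative sketch only: a toy "average hash" perceptual fingerprint.
# Real systems such as PhotoDNA use far more robust, proprietary techniques.

def average_hash(pixels):
    """One bit per pixel: set when the pixel is brighter than the mean.

    `pixels` is a flattened list of grayscale values (0-255)."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests visually similar images."""
    return sum(a != b for a, b in zip(h1, h2))

# Two nearly identical 4x4 grayscale "images" (flattened to 16 values):
original = [10, 200, 10, 200,  10, 200, 10, 200,
            200, 10, 200, 10,  200, 10, 200, 10]
altered = original[:]
altered[0] = 15  # slight noise, e.g. from re-encoding

h1, h2 = average_hash(original), average_hash(altered)
print(hamming_distance(h1, h2))  # → 0: the copies share a fingerprint
```

Because the hash captures coarse brightness structure rather than exact bytes, a matching fingerprint can flag a known image even after minor edits, which is why a shared fingerprint database lets both companies find copies of already-identified material without humans re-reviewing every file.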
Google also will be lending engineers to the Internet Watch Foundation in the U.K. and the National Center for Missing and Exploited Children in the United States.