Google has announced a plan to do more to tackle online images of child sexual abuse.
Using both technology and funding, it hopes to find and eradicate images and track down abusers.
Google said it was helping create a database of images to improve collaboration between law enforcement, companies and anti-abuse charities.
It has also set up a $2m (£1.3m) fund to bankroll developers creating better tools to tackle such images.
Spot and stop
Web firms in the UK have been at the centre of the debate about online images showing the sexual abuse of children following two high-profile court cases in which offenders were known to have sought child pornography online.
Google said that since 2008 it had used technology that classified images, giving each one a unique identifier, or “hash”, to make it easier to spot abuse pictures.
In a blogpost, the company said it was going further by helping to create unique fingerprints of images it saw and contributing them to a larger industry-wide database. This, it said, would help police forces, companies and charities work together to detect and remove images, and would also help track down abusers.
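The blogpost does not say which hashing scheme Google uses; as an illustration only, a cryptographic digest such as SHA-256 shows how an exact-match fingerprint works — a known image can be checked against a shared database without the database ever storing the picture itself (the `fingerprint` function and sample bytes here are hypothetical):

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # An exact-match fingerprint: identical files always yield the same
    # digest, so a known image can be matched against a shared database
    # of digests without anyone having to store or view the image.
    return hashlib.sha256(image_bytes).hexdigest()

image = b"...raw bytes of an image file..."
assert fingerprint(image) == fingerprint(image)            # exact copies match
assert fingerprint(image) != fingerprint(image + b"\x00")  # any change breaks the match
```

The limitation of this approach, which the article goes on to raise, is that changing even a single byte of the file produces a completely different digest.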
Google has also put $2m into what it called a Child Protection Technology Fund that would reward software developers who were working on programs to help eradicate abuse images.
“We’re in the business of making information widely available, but there’s certain ‘information’ that should never be created or found,” wrote Jacquelline Fuller, director of Google Giving, in the blogpost.
“We can do a lot to ensure it’s not available online – and that when people try to share this disgusting content they are caught and prosecuted,” she added.
Christian Berg, co-founder of NetClean which helped to pioneer the classification of images shared online by abusers and paedophiles, said there were many other initiatives already underway that helped to spot the pictures Google was targeting.
As well as hashing systems, police forces around the world and cross-border agencies such as Interpol were using a tool known as PhotoDNA to identify images. This, he said, was a more reliable way of producing a signature of an image, as the signature could survive changes made when images were cropped, resized or otherwise manipulated by paedophiles in a bid to hide them.
Microsoft, Facebook and others had already adopted PhotoDNA and were using it to stop images of child sexual abuse being shared by their users, said Mr Berg.
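PhotoDNA itself is proprietary, but the idea Mr Berg describes — a signature that survives resizing, unlike a byte-level digest — can be sketched with a much simpler perceptual "average hash". The image data and function below are illustrative and are not PhotoDNA:

```python
import hashlib

def average_hash(pixels, size=8):
    # Downscale the grayscale image to size x size by block averaging,
    # then set one bit per cell: 1 if the cell is brighter than the
    # overall mean. The 64-bit result depends on the picture's broad
    # light/dark pattern, not on its exact bytes.
    h, w = len(pixels), len(pixels[0])
    cells = []
    for r in range(size):
        for c in range(size):
            block = [pixels[y][x]
                     for y in range(r * h // size, (r + 1) * h // size)
                     for x in range(c * w // size, (c + 1) * w // size)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return sum(1 << i for i, v in enumerate(cells) if v > mean)

# A tiny synthetic 16x16 grayscale image: bright left half, dark right half.
img = [[255] * 8 + [0] * 8 for _ in range(16)]
# The same picture scaled up 2x: every pixel and row duplicated.
scaled = [[p for p in row for _ in (0, 1)] for row in img for _ in (0, 1)]

# The perceptual signature survives the resize...
assert average_hash(img) == average_hash(scaled)
# ...while a cryptographic digest of the raw pixels does not.
flat = lambda im: bytes(p for row in im for p in row)
assert hashlib.sha256(flat(img)).hexdigest() != hashlib.sha256(flat(scaled)).hexdigest()
```

Real systems such as PhotoDNA use far more robust signatures and compare them by distance rather than exact equality, but the contrast shown here is the point Mr Berg is making.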
Despite this, he said, Google’s initiative was a good move.
“We welcome them to the field and it’s great that they have put attention on the problem,” he said.
Google’s announcement comes as BT and TalkTalk refine the way they block access to sites known to harbour images of child sexual abuse. Instead of a generic “page not found” error, people will instead get a detailed warning saying access was denied because the page may contain illegal images.