Monday, October 28, 2013

Google is still displaying vile child porn


It’s child’s play for paedophiles to locate shockingly explicit images using Google’s search function, which even auto-suggests more disgusting child pornography for sickos to download on results pages.
Notorious internet paedophile Tristan Jackson.
  • Google’s search serving up vile and indecent images of children
  • The company is failing to keep its indexes free from child porn
  • So-called ‘SafeSearch’ mode does not prevent images appearing
  • Auto-complete function leads to even more shocking content
  • Company failed to respond or act when contacted by The Kernel
A Kernel investigation has revealed that Google Images is both displaying child pornography and pointing users in the direction of even more hardcore content via its “auto-complete” function.
Google is openly displaying sexually provocative images of children for a variety of keyword searches.
While shocking images of child pornography are mainly kept hidden on the so-called “Deep Web” and private messaging services, paedophiles continue to post on easy-to-access image hosting websites, which are indexed by the search giant.
Typing the first three letters of one of the biggest image hosting services, Imgur, into Google prompts a suggestion that you might mean another board with a similar name. The latter board is hosted in Russia and is notorious for hosting indecent images of children.
Readers should not attempt to reproduce these steps as, depending on their browser settings, they may be committing a criminal offence.
Searching for the name of this hosting service in Google Images and on the main Google homepage brings up pages of images of naked and nearly naked young children, many of them sexually provocative.
Staggeringly, Google admits that “suspected child abuse content has been removed” from the page, implying they are fully aware of the type of results searching for this term yields, yet are failing to police some of the most vile content on the internet.
Such negligence would be bad enough, but Google’s suggestions for search terms to append to the name of the board reveal a deeply worrying lack of accountability over the “auto-complete” suggestions offered up by the search giant’s algorithms.
One of the suggestions given by Google’s algorithm is to add the word “growing” to the name of the image board above. Clicking through will bring up nude images of young teenagers.
This is how much of a Google search results page we can legally publish.
It also suggests appending the words “lolmaster” or “lilliloli”, which are the names of prolific posters of indecent images of children on the board.
Once these suggestions have been clicked, Google Images then suggests new terms, such as “Kids”, “Kids Album Passwords”, “Toddler” and names of notorious sets of child pornography photos.
The prompt to look for “Kids Album Passwords” is particularly troubling and points to a dramatic lack of oversight from Google.
While the images Google readily displays, drawn both from the image board in question and from other sites that tag photos with its name to attract attention, are public, much of the board’s content is password-protected.
Users exchange passwords, encouraging each other to create their own sick content so they can view other people’s. The Kernel wasn’t prepared to view the content hidden behind these passwords, but we can assume it is even worse than what is available via a Google search.
Turning Google’s SafeSearch feature on or off made little difference to the images that could be found, according to our testing.
Google knows this is happening: it has removed images from the results before. But despite our communication with the company, as we went to press it had neither commented on nor removed the images in question from its search results.
