How Google, Facebook and Microsoft can scan for child porn in your account without actually violating your privacy

For decades now, there has been an ongoing struggle between the masses and their governments over regulating the seemingly vast world wide web. While freedom activists such as Julian Assange and Edward Snowden have demanded that the internet remain free, 'the powers that be' have dedicated their efforts to controlling and monitoring it.

And despite the many ethical conflicts and some illegal spying, the web has largely remained unrestrained. Or so we believe.

Earlier this month, Google reported one of its users who possessed child abuse imagery in his Gmail account. And even as most cheered Google on for its "heroic" act, what everyone was really wondering was, "How often does Google go through our emails searching for objectionable content?"

Their answer: we don't! Or at least, "not really".

So then how do they catch the bad guys?
Well, it turns out the tech giants, Google, Facebook, Microsoft and others, have a nexus. But it's not your regular nexus of evil, the NSA scandal notwithstanding. This is a collaboration of the good kind, called the Technology Coalition.

In 2009, Microsoft, along with Dartmouth College, developed a new technology that would help identify and remove some of the "worst of the worst" images of child sexual exploitation from the Internet. Being the good guys that they are, they shared this technology with their fellow tech giants and even the National Center for Missing & Exploited Children (NCMEC) in the US.

Dubbed PhotoDNA, the technology works solely to find and disrupt the spread of child pornography. Following this, PhotoDNA was deployed on Bing, OneDrive and Outlook.com, and later on Facebook and Gmail.

How does PhotoDNA work?
PhotoDNA is an image-matching technology that helps identify images that are pornographic in nature and depict child abuse. "It creates a unique signature for a digital image, something like a fingerprint, which can be compared with the signatures of other images to find copies of that image," Microsoft explains on its blog.
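
Microsoft has not published PhotoDNA's exact algorithm, but the description above places it in the family of "perceptual hashing" techniques: reduce an image to a compact signature that survives resizing and re-encoding, then compare signatures rather than pixels. Purely as an illustration, the Python sketch below computes a much simpler fingerprint (a difference hash) and compares two of them by counting differing bits; the file names and matching threshold are hypothetical, and real PhotoDNA signatures are considerably more robust.

# A rough sketch of fingerprint-style image matching, in the spirit of
# Microsoft's description. PhotoDNA itself is proprietary; this uses a simple
# "difference hash" (dHash) purely for illustration.
from PIL import Image

def dhash(path, hash_size=8):
    """Produce a compact signature ('fingerprint') for an image."""
    # Shrinking and converting to greyscale makes the signature ignore size,
    # colour and minor re-encoding differences.
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = []
    # Compare each pixel with its right-hand neighbour, row by row.
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits.append("1" if left > right else "0")
    return int("".join(bits), 2)

def hamming_distance(h1, h2):
    """Count differing bits between two signatures; a small distance suggests a copy."""
    return bin(h1 ^ h2).count("1")

if __name__ == "__main__":
    a = dhash("original.jpg")        # hypothetical file name
    b = dhash("re-uploaded.jpg")     # hypothetical file name
    if hamming_distance(a, b) <= 5:  # threshold chosen only for illustration
        print("The two images are probably copies of one another")

The point of a signature like this, rather than a byte-for-byte comparison, is that a resized or recompressed copy of a known image still lands close to the original's fingerprint.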

"When child pornography images are shared and viewed amongst predators online, it is not simply the distribution of objectionable content – it is community rape of a child. These crimes turn a single horrific moment of sexual abuse of a child into an unending series of violations of that child. We simply cannot allow people to continue trading these horrifying images online when we have the technology to help do something about it," they share on their blog.

How effective is it?
In 2011 alone, PhotoDNA evaluated more than two billion images, leading to the identification of more than 1,000 matches on SkyDrive and 1,500 matches through Bing’s image search indexing.

However, Google's Chief Legal Officer David Drummond pointed out in the Telegraph: "While computers can detect the colour of naked flesh, only humans can effectively differentiate between innocent pictures of children and images of abuse. And even humans don’t get it 100% right."

But it works, Google confirms. "Since 2008, we have used "hashing" technology to tag known child sexual abuse images, allowing us to identify duplicate images which may exist elsewhere. Each offending image in effect gets a unique fingerprint that our computers can recognise without humans having to view them again," Drummond writes. (By hashing, he means PhotoDNA. They just don't use the actual name. It must be a competitor thing.)
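
Drummond's description boils down to checking each new image against a database of fingerprints of already-identified material, so that no human has to view the content again. The sketch below shows that lookup step in its simplest form, assuming a hypothetical set of known fingerprints; it uses an exact SHA-256 hash of the file bytes only to keep the example short, whereas PhotoDNA-style signatures also catch copies that have been resized or re-encoded.

# A minimal sketch of the "tag once, recognise everywhere" idea: new uploads
# are matched against fingerprints of previously identified images. The hash
# list below is a hypothetical placeholder; real deployments use robust
# perceptual hashes supplied by bodies such as NCMEC, not plain SHA-256.
import hashlib

KNOWN_FINGERPRINTS = {
    "placeholder-fingerprint-1",  # stand-ins, not real values
    "placeholder-fingerprint-2",
}

def fingerprint(path):
    """Hash the raw bytes of a file; identical files give identical fingerprints."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def is_known_image(path):
    """True if this exact file has already been tagged."""
    return fingerprint(path) in KNOWN_FINGERPRINTS

if __name__ == "__main__":
    if is_known_image("upload.jpg"):  # hypothetical upload
        print("Match against the known database: flag and report")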

Available freely
Microsoft has since donated the technology, making PhotoDNA available to law enforcement globally at no charge via NetClean.

So while it is only a little reassuring to know that it is not the corporations themselves reading our emails but their automated technologies, we can at least take comfort in the fact that it is for a good cause. "We’re in the business of making information widely available, but there’s certain 'information' that should never be created or found," explains Google. "We can do a lot to ensure it’s not available online—and that when people try to share this disgusting content they are caught and prosecuted."

You are right, dear Google. But the question remains: can this technology be trusted to do the right thing?
