Apple to scan iCloud photos to crack down on child pornography - Details inside

Apple is yet to officially announce the launch of this technology to identify Child Sexual Abuse Material (CSAM) on your phones

Apple is taking its software a notch further in the name of better protecting rights and privacy. The Tim Cook-led company is gearing up to introduce a tool that will scan through your Apple products to identify Child Sexual Abuse Material (CSAM).

How will this work?

Apple will encode identifiers of known CSAM on its end. These identifiers will then be matched against the photos that users' iPhones, iPads and Macs store in iCloud, and the results, based on the number of matches, will be sent back to Apple.
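
To make the idea concrete, here is a minimal, purely illustrative sketch of that kind of client-side matching: each photo is reduced to a fingerprint and compared against a list of known identifiers, and only the number of matches is reported back. The hashing scheme, file paths and function names below are assumptions for illustration only; Apple has not publicly detailed how its system works.

```python
import hashlib
from pathlib import Path

# Hypothetical identifiers ("fingerprints") of known CSAM images supplied by
# the provider. A real system would use perceptual hashes that survive
# resizing and re-encoding, not plain SHA-256 digests.
KNOWN_IDENTIFIERS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(photo_path: Path) -> str:
    """Compute a fingerprint for a photo (SHA-256 here, for illustration only)."""
    return hashlib.sha256(photo_path.read_bytes()).hexdigest()

def scan_library(photo_dir: Path) -> int:
    """Count photos whose fingerprint matches a known identifier."""
    matches = 0
    for photo in photo_dir.glob("*.jpg"):
        if fingerprint(photo) in KNOWN_IDENTIFIERS:
            matches += 1
    return matches

if __name__ == "__main__":
    # Only the match count, not the photos themselves, would be reported.
    count = scan_library(Path("~/Pictures").expanduser())
    print(f"Matches against known identifiers: {count}")
```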

Though Apple is yet to make it official, the information was shared by Matthew Green, a cryptography and cybersecurity expert and associate professor of computer science at Johns Hopkins University.

He shared the details in a tweet: "Initially, this will be used to perform client-side scanning for cloud-stored photos. Eventually, it could be a key ingredient in adding surveillance to encrypted messaging systems. The ability to add a scanning system like this to E2E (end-to-end encrypted) messaging systems has been a major “ask” by law enforcement the world over."

This does seem a bit odd coming from Apple, which has generally been a strong advocate of privacy, because the technology the company is set to release will go through all pictures, even non-encrypted ones, in order to crack down on child pornography. But it is still too early to say whether this technology will do more harm than good.
