Apple is facing a lot of heat over its plan for new software that would scan iPhones to detect child sexual abuse material (CSAM). The software will scan photos for matches against known CSAM before they are uploaded to iCloud Photos. The tech giant is getting applause from child protection groups, but there are concerns about the technology being misused, and many have started to call it a ‘surveillance system.’ Some have even said that governments could use the technology to keep an eye on their citizens. The smartphone manufacturer is also planning to scan users’ encrypted messages for sexually explicit content, which has likewise not gone down well with privacy advocates.
The tool will scan images, and if it finds a match, a human will review the image. According to reports, if child pornography is confirmed, the user’s account will be disabled and the company will notify the National Center for Missing and Exploited Children (NCMEC) for further action. According to the company, the software will flag only images that are already in the center’s database and marked as child pornography. Parents who often take photos of their child in the bath therefore need not worry about the system.
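The matching step described above can be sketched roughly as a lookup against a database of known hashes. The sketch below is a heavy simplification with made-up placeholder data: Apple's actual system uses a perceptual hash ("NeuralHash") and cryptographic matching techniques, not a plain cryptographic digest, and the function name and hash set here are purely illustrative.

```python
import hashlib

# Hypothetical stand-in for the NCMEC-derived database of known hashes.
# Real systems use perceptual hashes so that resized or re-encoded
# copies still match; SHA-256 is used here only for illustration.
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def scan_before_upload(image_bytes: bytes) -> str:
    """Decide what happens to an image about to be uploaded."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_HASHES:
        # A match is not acted on automatically: per the process
        # described above, it is queued for human review first.
        return "flag-for-human-review"
    return "upload"

print(scan_before_upload(b"family-photo-bytes"))         # ordinary photo
print(scan_before_upload(b"known-flagged-image-bytes"))  # database match
```

Because only hashes present in the database can match, an ordinary family photo produces no hit and is uploaded normally, which is the point the article makes about bath-time photos.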
Commenting on Apple’s new system, WhatsApp head Will Cathcart called the move “very concerning.” In a Twitter thread, Cathcart said the surveillance system operated by Apple could very easily be used to scan users’ private content, and he expressed apprehension about how such a system could be exploited by the government of China or other countries. The remarks are not surprising, given that Apple has criticized Facebook, WhatsApp’s parent company, over its privacy record, while WhatsApp itself has successfully embraced end-to-end encryption.