The WhatsApp head posted in a Twitter thread: "I read the information Apple put out yesterday and I am concerned. I think this is the wrong approach and a setback for people's privacy all over the world."
WhatsApp head Will Cathcart has criticized Apple over its plans to introduce photo-identification tools that detect images of child sexual abuse in the iOS photo library, saying such tools will not be allowed to run on WhatsApp. Cathcart said Apple has long needed to do more to fight child sexual abuse material (CSAM), but that the approach it is taking is deeply concerning for the world.
In the thread, Cathcart wrote, "I read the information Apple put out yesterday and I am concerned. I think this is the wrong approach and a setback for people's privacy all over the world." He added that people have asked whether WhatsApp will adopt this system, and the answer is no. On Thursday, Apple confirmed plans to introduce new technology within iOS, macOS, watchOS and iMessage that will detect potential child sexual abuse imagery, but said further details of the ongoing project were still to come.
Private content can be easily scanned
According to a report in The Verge, the tools use new applications of cryptography, will roll out this fall in new versions of iOS and iPadOS in the US, and are designed to limit the spread of CSAM online while preserving user privacy. Cathcart, however, described it as an Apple-built and Apple-operated surveillance system that could very easily be used to scan private content for anything the company, or a government, decides it wants to control.
Cathcart said that the countries where iPhones are sold have different definitions of what is acceptable, and Apple has said that other child-protection groups may be added as hash sources as the program expands. Cathcart asked: Will this system be used in China? What content would be considered illegal there, and how would anyone find out? How will Apple handle requests from governments around the world to add other types of content to the list for scanning?
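In outline, scanning of this kind compares a fingerprint ("hash") of each photo against a database of hashes supplied by child-safety organizations; a photo is only flagged when its hash appears in that database. The following is a minimal sketch in Python with hypothetical hash values, using an exact SHA-256 digest purely for illustration. Apple's actual system uses a perceptual hash it calls NeuralHash, so that resized or re-encoded copies of an image still match, and performs the comparison with cryptographic techniques rather than a plain set lookup.

```python
import hashlib

# Hypothetical database of known-image digests supplied by a hash source.
# Real systems use perceptual hashes, not SHA-256, so near-duplicates match.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def matches_blocklist(image_bytes: bytes) -> bool:
    """Return True if this image's digest appears in the hash database."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(matches_blocklist(b"known-bad-image-bytes"))  # True: digest is listed
print(matches_blocklist(b"an-unrelated-photo"))     # False: digest not listed
```

Cathcart's concern follows directly from this design: nothing in the mechanism itself restricts what the hash database contains, so whoever controls the list controls what gets flagged.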
According to 9to5Mac, an internal memo from Sebastien Marineau-Mes, a vice president of software at Apple, acknowledged that some people have concerns about the implications of the new child-protection features, but said Apple will maintain its commitment to user privacy.