Apple says child sexual abuse material (CSAM) can be identified on iPhone devices using photo hashing, and that these measures have been developed in collaboration with child safety experts. The CSAM detection tool will allow abusive photos of children stored in iCloud Photos to be flagged.
Apple is reportedly planning to announce a photo-identification tool that will flag abusive images of children in the iOS photo library. Apple has previously removed individual apps from the App Store over concerns about child pornography, but is now said to be rolling out such detection systems more widely.
Apple has announced new measures to limit the spread of child sexual abuse material (CSAM). The Cupertino-based tech giant has introduced a tool to scan for CSAM, or child sexual abuse content, stored on your iPhone. These new CSAM detection features will roll out with upcoming versions of iOS 15, iPadOS 15, watchOS 8 and macOS Monterey. The CSAM detection feature will work in three areas: Photos, Siri and Search, and Messages.
Apple says that these measures have been developed in collaboration with child safety experts and are designed to protect users' privacy.
CSAM will be identified by photo hashing
Child sexual abuse material (CSAM) can be identified on iPhone devices using photo hashing, AppleInsider reported on Thursday. The source is security expert Matthew Green, a cryptographer and associate professor at the Johns Hopkins Information Security Institute. According to Green, the plan will initially be client-side – that is, the matching will run on the user's iPhone.
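Client-side photo hashing, as described above, amounts to computing a fingerprint of each photo on the device and checking it against a list of known fingerprints. The following is a minimal sketch of that idea; it uses SHA-256 as a stand-in hash and a placeholder hash set, whereas Apple's actual system uses NeuralHash, a perceptual hash that also matches visually similar (not just byte-identical) images.

```python
import hashlib

# Placeholder set of "known" hashes; this entry is simply the SHA-256 of
# empty bytes, used here only so the example has something to match.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def photo_hash(image_bytes: bytes) -> str:
    """Compute a fingerprint of the image contents (cryptographic stand-in
    for a perceptual hash like NeuralHash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    """Client-side check: does this photo's hash appear in the known set?"""
    return photo_hash(image_bytes) in KNOWN_HASHES

print(is_flagged(b""))   # matches the placeholder entry -> True
print(is_flagged(b"x"))  # unknown content -> False
```

The key point the sketch illustrates is that only hashes are compared; the matching step never needs to interpret the photo itself.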
Abusive material could be found on the phone
However, Matthew Green argues that this could be the beginning of a process that ends with monitoring the data traffic sent and received by the phone. Ultimately, it could be an essential component in adding surveillance to encrypted messaging systems, Green said. The ability to integrate such scanning systems into end-to-end encrypted messaging systems has been a key ask worldwide, he said. According to Green, such a tool could prove to be a boon in finding child pornography on people's phones. Green and Johns Hopkins University have previously worked with Apple to fix security bugs in Messages.
The new CSAM detection tool will allow the detection of abusive photos of children stored in iCloud Photos. Apple says that instead of scanning images in the cloud, the new tool performs "on-device matching" using a database of CSAM image hashes from NCMEC (the National Center for Missing and Exploited Children) and other child protection organizations.
With privacy in mind, Apple says this database is "transformed into an unreadable set of hashes that is securely stored on users' devices."
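One way to picture an "unreadable set of hashes" is a database whose entries are re-hashed under a key the device never learns in raw form, so the stored set reveals nothing about the original images. The sketch below models that with an HMAC blinding step; it is a loose illustration only, since Apple's actual protocol relies on private set intersection and threshold cryptography, and `SECRET_KEY` here is a hypothetical blinding key.

```python
import hashlib
import hmac

# Hypothetical blinding key used to transform database entries.
SECRET_KEY = b"hypothetical-blinding-key"

def blind(image_hash: bytes) -> str:
    """Re-hash an entry so the stored set does not expose the raw hash."""
    return hmac.new(SECRET_KEY, image_hash, hashlib.sha256).hexdigest()

# The device ships with the blinded set, never the raw hash database.
raw_db = [b"\x01" * 32, b"\x02" * 32]   # stand-ins for CSAM image hashes
blinded_db = {blind(h) for h in raw_db}

def matches(image_hash: bytes) -> bool:
    """On-device matching: blind the photo's hash, then test membership."""
    return blind(image_hash) in blinded_db

print(matches(b"\x01" * 32))  # in the database -> True
print(matches(b"\x03" * 32))  # not in the database -> False
```

The design point the sketch captures is that matching happens on the device against an opaque set, so neither the user nor an attacker reading the device can enumerate the underlying database entries.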
(with IANS input)