In A Surprising Turn Of Events, Apple Admits Its CSAM Scanning Initiative For Detecting Child Abuse Material Could Be Misused
In a surprising turn of events, iPhone maker Apple has confirmed that its approach to tackling child sexual abuse material, better known as CSAM scanning, could be abused. The company now acknowledges that repressive governments could repurpose such a system to scan for other material, such as plans related to political protests. When critics raised this very concern at the time, the Cupertino firm rejected their reasoning. The ironic twist came in a reply Apple recently put forward to the government of Australia.

In that reply, Apple noted that it had planned to roll out on-device scanning with the help of digital fingerprinting techniques. Those fingerprints are a means of matching certain pictures without any individual getting the chance to view them. They are designed to work in a fuzzy manner, matching pictures even after they have been cropped or edited, while producing only a small number of false positives.

To be a little clearer on this point, Apple confirmed that its recent proposal was an ap...
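To make the fingerprinting idea concrete, below is a minimal sketch of a generic perceptual hash (a difference hash) in Python, assuming the Pillow imaging library is installed. The file names are hypothetical, and this illustrates fuzzy image matching in general, not the proprietary system Apple described.

```python
from PIL import Image

def dhash(image_path, hash_size=8):
    """Compute a difference hash: a compact fingerprint that
    survives resizing and mild edits such as recompression."""
    img = Image.open(image_path).convert("L")        # grayscale
    img = img.resize((hash_size + 1, hash_size))     # shrink to a 9x8 grid
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)      # 1 if brightness rises
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Two images "match" when their fingerprints differ in only a few
# bits; the threshold trades recall against false positives.
if hamming(dhash("original.jpg"), dhash("cropped.jpg")) <= 10:
    print("likely the same picture")
```

Because such a hash encodes coarse brightness gradients rather than exact pixels, a cropped or recompressed copy usually lands within a few bits of the original. That is what "fuzzy" matching means here, and it is also why a small number of false positives is unavoidable.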