Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused by governments looking to surveil their citizens.
Apple said its messaging app will use on-device machine learning to warn about sensitive content without making private communications readable by the company. A separate tool, which Apple calls "neuralMatch," will detect known images of child sexual abuse without decrypting people's messages. If it finds a match, the image will be reviewed by a human, who can notify law enforcement if necessary.
But researchers say the tool could be put to other purposes such as government surveillance of dissidents or protesters.
— Read on apnews.com/article/technology-business-child-abuse-apple-inc-7fe2a09427d663cda8addfeeffc40196