Apple Starts to Scan Users' Photos

Enjoyed this video? Join my Locals community for exclusive content at thedickshow.locals.com!

Apple announced it would start searching your photos under the guise of fighting child harm. "The new feature scans iCloud Photos images to find child sexual abuse material, or CSAM, and reports it to Apple moderators — who can pass it on to the National Center for Missing and Exploited Children, or NCMEC."

From Episode 270
For the full episode (video) and bonus episodes, go to http://patreon.com/thedickshow and https://thedickshow.com/

#iPhone #Privacy #TheDickShow
