At the beginning of this month, Reuters reported that Apple is officially going to start scanning photos and messages stored in its cloud service, iCloud, with the justification that the system will detect Child Sexual Abuse Material (CSAM). As was already posted in the following thread...