Apple's PLAN To SCAN Your Photos [Their REAL Motive]
Payette Forward
1.5M subscribers
85,317 views

Published on Sep 27, 2021

David & David discuss Apple's plan to scan everyone's iCloud Photos for CSAM, or child sexual abuse material. It's a complicated issue.

Of course we believe in protecting children from predators — but at what cost? And is Apple really doing this for the children, or for themselves?

We dive into Apple's plan to scan everyone's photos for matches to known CSAM and talk about the potential ramifications of rolling out this image-scanning algorithm to the masses.

Let us know your thoughts in the comments section below!

🎁 Join this channel to get access to perks: /@payetteforward

📱 Visit https://www.payetteforward.com for more iPhone help, tips, and tricks!

Compare every cell phone, every plan, and coverage maps on UpPhone: https://www.upphone.com

Compare wireless coverage: https://www.upphone.com/coverage-map/

👋 Get Social With Payette Forward:
On Facebook: /payetteforwardinc
On Twitter: /payetteforward
On Instagram: /payetteforward
