Apple Scanning Kids' Photos For Nudity: Have They Gone Too Far?!?
Payette Forward

 Published On Oct 10, 2021

David & David talk about Apple's plan to scan your child's photos looking for sexually explicit content. This feature is available on iPhones enrolled in a Family Sharing plan.

If a child sends or tries to view a sexually explicit image in the Messages app, they'll receive a series of warnings and dialogs cautioning them about what the image may contain.

And, if the child is 12 or younger, their parents can opt in to receive a notification if the child chooses to view the potentially sexually explicit image.

❓What do you think? Is this an invasion of privacy, or a useful tool for parents? Let us know in the comments section below!

🎁 Join this channel to get access to perks:
   / @payetteforward  

📱 Learn more about Apple's plan to scan iCloud photos:    • Apple's PLAN To SCAN Your Photos [The...  

 Learn more about Apple Child Safety features: https://www.apple.com/child-safety/

 Security Threat Model Review of Apple’s Child Safety Features: https://www.apple.com/child-safety/pd...

Compare every cell phone, every plan, and coverage maps on UpPhone: https://www.upphone.com

Compare wireless coverage: https://www.upphone.com/coverage-map/

Visit https://www.payetteforward.com for more iPhone help, tips, and tricks!

👋 Get Social With Payette Forward:
On Facebook:   / payetteforwardinc  
On Twitter:   / payetteforward  
On Instagram:   / payetteforward  

#Apple #iMessage #Privacy

