Apple Faces $1.2B Lawsuit Over Abandoned Child Safety Scanning Feature
A child sexual abuse survivor is suing Apple for abandoning its planned CSAM detection system for iCloud, seeking damages exceeding $1.2 billion. Filed on behalf of thousands of victims, the lawsuit challenges how Apple balances user privacy against child protection.
Apple Empowers Kids to Report Nudity in iMessages with iOS 18.2 Update
Apple's iOS 18.2 introduces a child safety feature in Australia that lets children report nude content received in iMessages directly to Apple. The update builds on Apple's existing safety measures and aligns with new regulations targeting child sexual abuse material online.