NIXsolutions: Apple’s New Feature Protects Minors’ Safety

Apple has added a new feature in iOS 18.2 designed to protect the safety and mental wellbeing of minors. When a nude image is detected, the system blurs it, displays a warning, and requires password confirmation before the image can be viewed. The feature maintains end-to-end encryption and introduces no backdoors for authorities, preserving user privacy.


Machine Learning and Protective Options

The feature relies on on-device machine learning algorithms to identify nudity in images. If nudity is detected, the image is blurred and a message informs the user that it contains sensitive content. Options include exiting the conversation, blocking the sender, accessing online safety resources, or sending an alert to a parent or guardian. These actions aim to empower minors to make safe decisions when encountering inappropriate material.
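Apple has not published its implementation, but the detection-and-warning flow described above can be sketched in Swift. Everything in this sketch — the `NudityClassifier` protocol, the `SafetyAction` options, and the confidence threshold — is a hypothetical illustration of the described behavior, not Apple's actual API.

```swift
// Hypothetical sketch of the on-device decision flow; not Apple's actual API.

/// Actions offered to a minor when sensitive content is detected,
/// mirroring the options described in the article.
enum SafetyAction: CaseIterable {
    case exitConversation
    case blockSender
    case viewSafetyResources
    case alertGuardian
}

/// Stand-in for an on-device ML model that scores nudity likelihood.
protocol NudityClassifier {
    func nudityScore(for imageData: [UInt8]) -> Double // 0.0 ... 1.0
}

struct IncomingImage {
    let data: [UInt8]
    var isBlurred = false
}

struct CommunicationSafetyFilter {
    let classifier: NudityClassifier
    let threshold: Double = 0.8 // hypothetical confidence cutoff

    /// Blurs a flagged image locally and returns the warning options;
    /// returns nil if the image is not flagged. Nothing leaves the device.
    func process(_ image: inout IncomingImage) -> [SafetyAction]? {
        guard classifier.nudityScore(for: image.data) >= threshold else {
            return nil // image is displayed normally
        }
        image.isBlurred = true
        return SafetyAction.allCases
    }
}
```

Because the classifier runs entirely on-device and the filter only mutates local state, a design like this would be compatible with the end-to-end encryption guarantee the article mentions.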

Supported Platforms and Regional Rollout

On iPhone and iPad, the feature works across Messages, AirDrop, FaceTime video messages, contact posters, and some third-party apps with photo-sharing capabilities. It will also be available on Macs, Apple Watches, and the Vision Pro headset. Devices must run iOS 18, iPadOS 18, macOS 15 Sequoia, or visionOS 2 to support the feature.

The system initially launched in Australia, where regulators are exploring rules requiring tech companies to monitor terrorist content and child abuse material "where technically feasible." Apple had previously explored a different plan in 2021 that involved scanning content locally on devices and notifying authorities, but it faced backlash over privacy concerns, with critics warning that such a system could be misused by authoritarian governments. Following the controversy, Apple abandoned that initiative in 2022, notes NIXsolutions.

We’ll keep you updated as Apple refines this feature and rolls it out across more regions.