Apple testing iMessage feature that protects children from nude images

Apple is testing an iMessage feature that protects children from sending or receiving nude images by analyzing all attachments in the chats of users marked as minors

  • Apple is testing a new feature in the second beta of iOS 15.2
  • It is designed to protect children from sending or receiving nude images
  • The system analyzes attachments in iMessages of accounts marked as children
  • If it suspects nudity in a photo, the image will be blurred and the child will be warned about the content – but they can still send or receive the image 

Apple announced a new iMessage feature on Tuesday that protects children from sending or receiving nude images.

The feature, offered in beta testing, analyzes attachments in the messages of users who are marked as children in their accounts.

If a child receives or attempts to send an image with nudity, the photo will be blurred and the child will be warned about the content – but they still have the option to send or view the sensitive content.

In addition, children can message someone they trust for help if they choose.

In a blog post, Apple said the feature preserves message encryption and promised that neither the photos nor any indication of a detection leaves the device.

The safety feature is enabled in the second beta of Apple’s iOS 15.2, which was released Tuesday, but it is not yet clear when it will arrive as an official feature. 


With this protective feature, if a sensitive photo is discovered in a message thread, the image will be blurred and a label will appear below the photo that states ‘this may be sensitive’, with a link to view the photo.

If the child chooses to view the photo, another screen appears with more information.

Here, a message informs the child that sensitive photos and videos ‘show the private body parts that you cover with bathing suits’ and that ‘it’s not your fault, but sensitive photos and videos can be used to harm you.’
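The behavior described above amounts to a simple decision flow: analyze the attachment on the device, and if nudity is suspected, blur and warn rather than block. Below is a minimal sketch of that logic in Python; the classifier, threshold value and function names are hypothetical stand-ins, since Apple has not published its implementation.

```python
# Minimal sketch of the Communication Safety flow described above.
# NOT Apple's code: the classifier, threshold and helper names are
# hypothetical stand-ins, since Apple has not published this API.

SENSITIVITY_THRESHOLD = 0.9  # hypothetical confidence cutoff


def nudity_score(image: bytes) -> float:
    """Stand-in for the on-device machine-learning classifier (0.0 to 1.0)."""
    return 0.95  # pretend the model flagged this image


def present_attachment(image: bytes, is_child_account: bool) -> str:
    """Return how the Messages UI would present an incoming attachment."""
    if not is_child_account:
        return "shown normally"  # analysis applies only to accounts marked as children
    if nudity_score(image) < SENSITIVITY_THRESHOLD:
        return "shown normally"
    # Suspected nudity: blur and warn, but never block outright,
    # since the child keeps the final choice to view or send.
    return "blurred, labelled 'this may be sensitive', tap to view"


print(present_attachment(b"\x89PNG...", is_child_account=True))
```

The key design point the article reports is visible in the last branch: the system warns and adds friction, but the final choice stays with the child.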

The latest update is part of the tech giant’s new Communication Safety in Messages, which is a Family Sharing feature enabled by parents or guardians and is not a default function, MacRumors reports.


Communication Safety was first announced in August and, at the time, was designed for parents with children under the age of 13 who wanted to opt in to receive notifications if their child viewed a nude image in iMessages.

However, Apple is no longer sending parents or guardians a notification.

Apple told DailyMail.com in an email that the change is due to criticism it received from individuals and organizations that fear notifying parents or guardians could lead to parental violence or other abuse.

This feature is separate from the controversial plan Apple announced in August to scan iPhones for child abuse images. 

The tech giant said it would use an algorithm to scan photos for explicit images, a plan that has sparked criticism among privacy advocates who fear the technology could be exploited by the government. 

However, the feature has been delayed due to criticism from data privacy advocates who accused Apple of opening a new back door to personal data and of ‘appeasing’ governments that could harness it to snoop on their citizens.

Apple has yet to announce when the feature will roll out.

How Apple will scan your phone for ‘child abuse images’ – and send suspicious photos to a company employee who will check them before they are sent to police 

The new image-monitoring feature is part of a series of tools heading to Apple mobile devices, according to the company. 

Here is how it works:

1.) A user’s photos are compared with ‘fingerprints’ from America’s National Center for Missing and Exploited Children (NCMEC), drawn from its database of known child abuse videos and images, which allow the technology to detect them, stop them and report them to the authorities. 

Those images are translated into ‘hashes’, a type of code that can be ‘matched’ to an image on an Apple device to see if it could be illegal (a simplified sketch of this matching appears after these steps).

2.) Before an iPhone or other Apple device uploads an image to iCloud, the ‘device creates a cryptographic safety voucher that encodes the match result. It also encrypts the image’s NeuralHash and a visual derivative. This voucher is uploaded to iCloud Photos along with the image.’   

3.) Apple’s ‘system ensures that the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content,’ Apple has said.  

At the same time, Apple’s texting app, Messages, will use machine learning to recognize sexually explicit photos and warn children and their parents when such images are received or sent, the company said in the statement.

‘When receiving this type of content, the photo will be blurred and the child will be warned,’ Apple said.

‘As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it.’

Similar precautions are triggered if a child tries to send a sexually explicit photo.

Personal assistant Siri, meanwhile, will be taught to ‘intervene’ when users try to search topics related to child sexual abuse, according to Apple.

4.) Apple says that if the ‘voucher’ threshold is crossed and the image is deemed suspicious, its staff ‘manually reviews all reports made to NCMEC to ensure reporting accuracy’.

Users can ‘file an appeal to have their account reinstated’ if they believe it has been wrongly flagged. 

5.) If the image is a child sexual abuse image, the NCMEC can report it to the authorities with a view to prosecution. 
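Taken together, steps 1-4 describe on-device hash matching with a reporting threshold. The sketch below, in Python, illustrates only that idea; it is not Apple’s NeuralHash or its cryptographic safety-voucher protocol. An ordinary SHA-256 digest stands in for the perceptual hash, and the hash database and threshold value are invented for the example.

```python
# Illustrative sketch of steps 1-4 above, not Apple's actual system.
# A real perceptual hash (NeuralHash) tolerates small image edits;
# SHA-256 is used here only as a stand-in, and the encrypted 'safety
# vouchers' are reduced to a simple match counter.

import hashlib

MATCH_THRESHOLD = 30  # illustrative value, not Apple's published figure

# Step 1: NCMEC 'fingerprints', modelled as a set of hex digests (made up)
KNOWN_CSAM_HASHES = {"deadbeef" * 8}


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as NeuralHash."""
    return hashlib.sha256(image_bytes).hexdigest()


def account_crosses_threshold(images: list[bytes]) -> bool:
    """Steps 2-3: match on device; nothing is readable below the threshold."""
    matches = sum(image_hash(img) in KNOWN_CSAM_HASHES for img in images)
    # Step 4: only past the threshold would the vouchers become
    # interpretable and manual review of the flagged images begin.
    return matches >= MATCH_THRESHOLD


print(account_crosses_threshold([b"holiday photo", b"cat picture"]))  # False
```

In Apple’s published design the match result is encrypted into the voucher on the device, so the company can read nothing for accounts below the threshold; the plain counter above is just the simplest way to show the thresholding step.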

 

  
