iPhones will send warnings to parents if their children are sexting

iPhones will send sexting warnings to parents if their children send or receive explicit images – and will automatically report child abuse pictures on devices to authorities

  • New safety tools unveiled to protect young people and limit spread of material
  • The measures are initially only being rolled out in the US, the tech giant said
  • But there are plans for it to soon be available in the UK and across the globe

iPhones will send sexting warnings to parents if their children send or receive explicit images – and will automatically report child abuse pictures on devices to the authorities, Apple has announced.

A trio of new safety tools has been unveiled in a bid to protect young people and limit the spread of child sexual abuse material (CSAM), the tech giant said.

While the measures are initially only being rolled out in the US, Apple plans for the technology to soon be available in the UK and other countries worldwide.

The new Messages system will show a warning to a child who is sent a sexually explicit photo, blurring the image, reassuring the child that it is OK not to view it, and presenting helpful resources.

Parents using linked family accounts will also be warned under the new plans. 

Furthermore, as an extra precaution, children will be told that if they do choose to view the image, a notification will be sent to their parents.

Similar protections will be in place if a child attempts to send a sexually explicit image, Apple said. 

Among the other features is new technology that will allow the company to detect known CSAM images stored in iCloud Photos and report them to law enforcement agencies.

It will be joined by new guidance in Siri and Search which will point users to helpful resources when they perform searches related to CSAM.

The iPhone maker said the new detection tools have been designed to protect user privacy and do not allow the tech giant to see or scan a user’s photo album.

Instead, the system will look for matches, securely on the device, based on a database of ‘hashes’ – a type of digital fingerprint – of known CSAM images provided by child safety organisations.

This matching will only take place when a user attempts to upload an image to their iCloud Photo Library.

Apple said that only if a threshold of matches for harmful content is exceeded would it be able to manually review the content, confirm the match and send a report to safety organisations.
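The on-device matching and threshold described above can be illustrated with a minimal Python sketch. This is a simplification for illustration only: the hash database, threshold value and function names here are hypothetical, and SHA-256 stands in for Apple's actual perceptual-hashing and cryptographic matching system, which is not reproduced here.

```python
import hashlib

# Hypothetical database of 'hashes' (digital fingerprints) of known
# images, as would be supplied by child safety organisations.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes-1").hexdigest(),
    hashlib.sha256(b"known-image-bytes-2").hexdigest(),
}

# Illustrative threshold: no manual review is triggered until at least
# this many matches have accumulated for an account.
MATCH_THRESHOLD = 2

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital fingerprint for an image (simplified to SHA-256)."""
    return hashlib.sha256(image_bytes).hexdigest()

def exceeds_threshold(uploads: list) -> bool:
    """Check queued uploads against the known-hash database and report
    True only once the number of matches reaches the threshold."""
    matches = sum(1 for img in uploads if fingerprint(img) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD
```

The key design point the sketch captures is that a single match is not enough: the system stays silent below the threshold, which is intended to reduce the chance of a false positive triggering a review.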

The new tools are set to arrive later this year as part of the iOS 15 and iPadOS 15 software updates due in the autumn, initially in the US only, with plans to expand further over time.

The company reiterated that the new CSAM detection tools would only apply to those using iCloud Photos and would not allow the firm or anyone else to scan the images on a user’s camera roll.

The announcement is the latest in a series of major updates from the iPhone maker aimed at improving user safety, following security updates earlier this year designed to curb third-party data collection and improve privacy for iPhone users.
