Apple delays plans to scan iPhones for child abuse images over privacy concerns


A leading child safety charity has urged Apple to press ahead with the move (Picture: SOPA)

Plans to build software capable of detecting child abuse images into iPhones have been temporarily shelved over privacy concerns.

Apple has been developing a system which would automatically recognise illegal images when they are uploaded to iCloud and alert the authorities.

But now the technology giant has announced the launch has been put back to ‘make improvements’ after campaigners said the programme breached privacy standards.

Some have suggested the tool could be hijacked by authoritarian governments to look for other types of images.

Apple has previously said it would not allow that to happen and promised secure software which would not regularly scan a user’s camera roll.

In a statement, the company said: ‘Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material.

‘Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.’

The software would automatically recognise abuse images already identified by police (Picture: ZUMA Press Wire/REX/Shutterstock)

Andy Burrows, head of child safety online policy at children’s charity the NSPCC, said the delay was ‘incredibly disappointing’.

He said the technology would ‘undeniably make a big difference in keeping children safe from abuse online and could have set an industry standard’.

Mr Burrows added: ‘[Apple] sought to adopt a proportionate approach that scanned for child abuse images in a privacy-preserving way, and that balanced user safety and privacy.

‘We hope Apple will consider standing their ground instead of delaying important child protection measures in the face of criticism.’

Some campaigners are worried the programme could be hijacked by authoritarian governments (Picture: Artur Widak/NurPhoto)

The system works by looking for digital markers of known child sex abuse material provided by child safety organisations.
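In rough terms, that matching step amounts to checking an image’s digest against a database of known markers. The sketch below is a simplified illustration, not Apple’s implementation: the real system reportedly uses a perceptual hash (‘NeuralHash’) with on-device cryptographic matching, whereas this example substitutes a plain SHA-256 digest and a hypothetical in-memory set of known markers.

```python
import hashlib

def digest(data: bytes) -> str:
    """Return a hex digest standing in for an image's 'digital marker'.

    Note: a real CSAM system uses a perceptual hash so that minor edits
    to an image still match; SHA-256 is used here purely for illustration.
    """
    return hashlib.sha256(data).hexdigest()

def matches_known_material(image_bytes: bytes, known_digests: set[str]) -> bool:
    """Flag an upload only if its marker appears in the known database."""
    return digest(image_bytes) in known_digests

# Hypothetical database of markers supplied by child safety organisations.
known = {digest(b"flagged-image-bytes")}

print(matches_known_material(b"flagged-image-bytes", known))   # True
print(matches_known_material(b"harmless-photo-bytes", known))  # False
```

Because only digests are compared, the database holder never needs the images themselves, and images absent from the database are never flagged.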

Apple is also planning a new feature in the Messages app, which warns children and their parents using linked family accounts when sexually explicit photos are sent or received.

The company will issue new guidance in Siri and Search which will point users to helpful resources when they perform searches related to child abuse images.

Apple said the Messages feature and the iCloud scanning tool are separate and do not use the same technology, adding that it will ‘never’ gain access to communications as a result of the improvements to Messages.

