

5,000 Industry Experts Urge Apple To Halt Attempts To Identify Child Abuse Material Over Privacy Concerns

Sun Aug 08 2021

Apple have been urged in an open letter to halt their plans for automated photo scanning, over fears the technology could be exploited by threat actors.

More than 5,000 tech and privacy experts and organisations have signed and published an open letter addressed to Apple, urging them to rethink their plans to roll out a new photo-scanning feature designed to automatically detect child sexual abuse material (CSAM) stored on Apple devices or servers.

The feature was announced last week by Apple for its upcoming versions of iOS and iPadOS. The statement revealed these updates would feature ‘new applications of cryptography’, allowing the tech giant to identify any CSAM images being stored on Apple devices or uploaded to iCloud Photos, Apple’s online storage service.

Industry experts and privacy advocates have been pushing back, however, penning an open letter warning Apple of their concerns that the update could bypass the end-to-end encryption that would normally safeguard a user's privacy.


While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple's proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products.

Open letter to Apple


The letter has now been signed by well over 5,000 individuals and organisations, including tech executives, privacy advocates, legal experts, researchers, professors and more.

It cautions Apple that the update has the potential to create a backdoor into their software, one that could be exploited by hackers in the longer term.

The letter goes on to request Apple immediately stop the roll-out of the new auto photo-scanning software and issue a public statement that will "reaffirm their commitment to end-to-end encryption and to user privacy".


Apple's compromise on end-to-end encryption may appease government agencies in the US and abroad, but it is a shocking about-face for users who have relied on the firm's leadership in privacy and security. It is impossible to create a client-side scanning system that can only be used for sexually explicit images sent or received by children. That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change.

Electronic Frontier Foundation (EFF) blog post


When announcing the new software, Apple did confirm that the system would ensure nobody could learn about images stored on a device or server if they were not sexually explicit.

Before an image is ever stored in iCloud Photos, an on-device matching process is performed against a database of known CSAM images compiled by the US National Center for Missing and Exploited Children (NCMEC).

The image being checked is first converted to a hash key, a unique set of numbers, which the system then attempts to match against NCMEC’s database using cryptography.

If the system flags an image, it will be reviewed by a human to confirm the match; if confirmed, the user's account will be instantly disabled and the findings reported to NCMEC.
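
Apple hasn't published the code behind this pipeline, and its NeuralHash algorithm and private set intersection protocol are only described at a high level, but the core idea is a set-membership check on image hashes. The Python sketch below is purely illustrative and rests on stated assumptions: a plain SHA-256 digest stands in for Apple's perceptual NeuralHash, and an in-memory set of placeholder digests stands in for NCMEC's database.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Convert an image to a 'hash key', a unique set of numbers.

    SHA-256 stands in for Apple's NeuralHash here: SHA-256 only matches
    byte-identical files, whereas NeuralHash is a perceptual hash that
    also survives resizing and re-encoding.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Placeholder for NCMEC's database of known-image hashes. The real system
# ships blinded NeuralHash values and compares them via a cryptographic
# private set intersection, so neither side sees the other's raw data.
KNOWN_IMAGE_HASHES = {image_hash(b"placeholder known image")}

def check_before_upload(image_bytes: bytes) -> str:
    """On-device check run before an image is stored in iCloud Photos."""
    if image_hash(image_bytes) in KNOWN_IMAGE_HASHES:
        # A confirmed match is escalated: human review, the account is
        # disabled and the findings are reported to NCMEC.
        return "match: flag for human review"
    return "no match: nothing is learned about this image"

print(check_before_upload(b"placeholder known image"))    # match
print(check_before_upload(b"an ordinary holiday photo"))  # no match
```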


When announcing the technology, Apple stated it had an error rate of less than one in a trillion and that it didn't breach their users' privacy agreements.
