Apple to Scan images for child abuse

boxst

Original Poster:

3,806 posts

169 months

Friday 6th August 2021
quotequote all
Although full of good intentions, I find this terrifying: https://www.bbc.co.uk/news/technology-58109748

It is the first step to allowing everything to be scanned to check if you are doing anything wrong. And what is allowed today, might not be tomorrow and you have all those documents or images scanned and hashed to be theoretically searched through at a later date.

s1962a

7,428 posts

186 months

Friday 6th August 2021
quotequote all
This wouldn't bother me as I don't really do anything the law would be interested in. If it helps reduce abuse/terrorism etc then it can only be a good thing. They have probably been doing all this for a while anyway - just look at how much data Apple/Google already store about you, including recording your GPS location for exactly where you've been.

Apple Photos - how do you think they recognise faces and try to 'suggest' pictures of the same people to you?

I went to a mate's wedding a few weeks ago. The other groomsmen and I were staying in a random hotel the night before, and about an hour after I got there I got an alert from Apple suggesting a made-up photo album with party pictures of the same guys from a few years ago. How did Apple know that I was there in the hotel with the same fellas and recommend these random photos? Some of them have Android phones and some Apple, and we communicate using the Signal app as they are all geeky like that.

I suspect big brother has already been watching us for a while.

Mr Pointy

12,854 posts

183 months

Friday 6th August 2021
quotequote all
It's nothing new. Look here:

https://www.pistonheads.com/gassing/topic.asp?h=0&...

Guy stores his family photos in OneDrive & is permanently locked out of all of his MS accounts with no practical means of appeal or redress.

wiggy001

7,051 posts

295 months

Friday 6th August 2021
quotequote all
I'm in 2 minds about this.

On the one hand I agree that the privacy aspect is very concerning. However, this will be scanning images you choose to upload to Apple's iCloud service, so as a private company surely Apple have the right to do this, just as you have the right not to use that service?

Images uploaded on PH are checked and I would assume if there is anything seriously dodgy the mods might see fit to report this to the relevant authorities, so is what Apple want to do much different?

As I say, I am conflicted. And it's a moot point anyway because this is being done in the name of stopping child abuse so anyone arguing against (for any reason) will be shouted down.

Starfighter

5,307 posts

202 months

Friday 6th August 2021
quotequote all
I wonder if it will false positive on pictures taken in galleries, with the cherubs etc on the masterpieces.

Ari

19,765 posts

239 months

Friday 6th August 2021
quotequote all
I might be missing the point here, but won't anyone with dodgy images simply not upload them to the iCloud?

Jamescrs

5,921 posts

89 months

Friday 6th August 2021
quotequote all
Starfighter said:
I wonder if it will false positive on pictures taken in galleries, with the cherubs etc on the masterpieces.
No - it will compare photos uploaded to images already stored on a central database used by law enforcement, based on specific hash values given to each image, so it's not going to flag up people's holiday pics or pictures painted by Da Vinci, for example.

Regarding another poster's comments about people not being stupid enough to upload bad images to iCloud - never underestimate the power of stupidity.

I personally wouldn't have an issue with it. I'm not an Apple fan myself, as I find their devices limiting, but if I was an Apple user this wouldn't put me off.

Mr Pointy

12,854 posts

183 months

Friday 6th August 2021
quotequote all
Ari said:
I might be missing the point here, but won't anyone with dodgy images simply not upload them to the iCloud?
But Apple's (or Microsoft's) view of dodgy might be very different from yours. You might want to store pictures of your kids playing in the back garden but Apple's automatic process might decide they are obscene & start the process of having you prosecuted.

Jakg

3,958 posts

192 months

Friday 6th August 2021
quotequote all
Mr Pointy said:
But Apple's (or Microsoft's) view of dodgy might be very different from yours. You might want to store pictures of your kids playing in the back garden but Apple's automatic process might decide they are obscene & start the process of having you prosecuted.
I think you've misunderstood the technology.

The iCloud-bit isn't scanning pictures looking for child abuse.

It's comparing the pictures that are uploaded to see if they are the *same image* as an existing known abuse image.

It's not going to recognise a picture taken of a naked child, or of a political slogan, unless that picture is already in the database (well, the hash is).

The idea is to stop the sharing of the images, not their creation.


There is a second part which analyses images for explicit content, but this is done on-device to flag warnings when they are being sent or received, for child accounts only, and is not part of iCloud.

EDIT:
https://www.apple.com/child-safety/

Apple said:
Another important concern is the spread of Child Sexual Abuse Material (CSAM) online. CSAM refers to content that depicts sexually explicit activities involving a child.

To help address this, new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

Pixelpeep 135

8,600 posts

166 months

Friday 6th August 2021
quotequote all
This has vibes of 'does your daddy bath you' questions in schools.

Yes, my dad baths me, because he's my dad. It doesn't mean he's abusing me.

(waiting for the.. yeh, but he shouldn't be bathing his 14 year old daughter biggrin)

Snapping your kid's first bath, where they're obviously undressed, is such an innocent, private thing. You don't wanna be having to beg Apple to 'release' the lockout on all your Apple devices whilst you explain that it was little Jasmin's first splash around.

The article does say the images will be compared to existing child abuse ones, and if they're a match it will flag, but I'm sure it will move on to actually analysing the content of non-matched pics.

I think I would think hard about staying with Apple (been an iPhone-only user since 2008, have iPads and an iMac at home) if this becomes a reality. I understand the need, and it will no doubt help, but I'm sure all the wrong'uns will just store pics in zip files, or buy an Android.

markcoznottz

7,155 posts

248 months

Friday 6th August 2021
quotequote all
wiggy001 said:
I'm in 2 minds about this.

On the one hand I agree that the privacy aspect is very concerning. However, this will be scanning images you choose to upload to Apple's iCloud service, so as a private company surely Apple have the right to do this, just as you have the right not to use that service?

Images uploaded on PH are checked and I would assume if there is anything seriously dodgy the mods might see fit to report this to the relevant authorities, so is what Apple want to do much different?

As I say, I am conflicted. And it's a moot point anyway because this is being done in the name of stopping child abuse so anyone arguing against (for any reason) will be shouted down.
It’s weird that PH hardly ever has people posting images. When forums first started, that was their whole purpose.

Getragdogleg

9,880 posts

207 months

Friday 6th August 2021
quotequote all
The internet has been like the wild west for a long time, and I can see the merits of this sort of tech to catch dodgy bds doing horrible things, but the trouble with all this is the scope creep of the tech and how it will be used in future.

It's child abuse today, adverse political views tomorrow.

Besides, the pedos will simply not use cloud storage or cameras in phones; a non-"smart" camera recording to a small flash card is not going to ping anything apart from a Mk1 eyeball if it's physically discovered.


eliot

11,988 posts

278 months

Friday 6th August 2021
quotequote all
so presumably changing 1 pixel in a photo will change the hash and it will go undetected?

JulianHJ

8,860 posts

286 months

Friday 6th August 2021
quotequote all
If they are going to use a hash database supplied by law enforcement then I think it's not unreasonable. That said, offenders will just switch to Android I guess...

boxst

Original Poster:

3,806 posts

169 months

Friday 6th August 2021
quotequote all
eliot said:
so presumably changing 1 pixel in a photo will change the hash and it will go undetected?
No one really knows yet, it seems a bit more intelligent than that: https://www.theregister.com/2021/08/05/apple_csam_...

Quite a few applications (like WhatsApp) will automatically save images to your phone, so if something is sent to you randomly then theoretically it could trigger this. All explainable but the Police turning up to ask questions and confiscate your phone ...
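The Register piece's point about "a bit more intelligent" comes down to the kind of hash used. A cryptographic hash (e.g. SHA-256) changes completely after a one-byte edit, while a perceptual hash is designed to survive small changes. A toy sketch of the contrast, using a simple "average hash" over a notional 8x8 grayscale thumbnail (not Apple's actual NeuralHash; the pixel values are made up):

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if brighter than average."""
    avg = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > avg)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

original = [10] * 32 + [200] * 32  # notional 8x8 thumbnail: half dark, half bright
tweaked = list(original)
tweaked[0] += 1                    # "change one pixel"

# Cryptographic hash: totally different after the one-pixel edit.
print(hashlib.sha256(bytes(original)).hexdigest() ==
      hashlib.sha256(bytes(tweaked)).hexdigest())   # False

# Perceptual hash: unchanged, so the image would still match.
print(hamming(average_hash(original), average_hash(tweaked)))  # 0
```

Real perceptual-hash matchers accept any image within a small Hamming-distance threshold rather than requiring an exact hash match, which is why trivial pixel tweaks don't defeat them.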

JulianHJ

8,860 posts

286 months

Friday 6th August 2021
quotequote all
Starfighter said:
I wonder if it will false positive on pictures taken in galleries, with the cherubs etc on the masterpieces.
That's not how hashing technology works. They will be comparing hashes (unique digital fingerprints) derived from known and confirmed child abuse images with hashes of their users' files. The chances of a false positive are much lower than the chance of you being a DNA match with a random stranger.

ChocolateFrog

34,954 posts

197 months

Friday 6th August 2021
quotequote all
It will create a tsunami of work for law enforcement.

JulianHJ

8,860 posts

286 months

Friday 6th August 2021
quotequote all
ChocolateFrog said:
It will create a tsunami of work for law enforcement.
Surely this can only be viewed as a positive? Considering the material in question, I can't see how anyone would have an issue with it. Anyway, there's already a tsunami of work for the teams (often known as Paedophile Online Investigation Teams, or POLIT) that deal with this. I'll happily pay an extra couple of quid a year on my council tax police precept if it means POLIT teams get a big boost in funding.

pquinn

7,167 posts

70 months

Friday 6th August 2021
quotequote all
JulianHJ said:
The chances of false positives are much lower than the chance of you being a DNA match with a random stranger.
Total bullst. The tech is not even within 3 orders of magnitude of being that good.

Teddy Lop

8,301 posts

91 months

Friday 6th August 2021
quotequote all
Getragdogleg said:
The internet has been like the wild west for a long time, and I can see the merits of this sort of tech to catch dodgy bds doing horrible things, but the trouble with all this is the scope creep of the tech and how it will be used in future.

It's child abuse today, adverse political views tomorrow.

Besides, the pedos will simply not use cloud storage or cameras in phones; a non-"smart" camera recording to a small flash card is not going to ping anything apart from a Mk1 eyeball if it's physically discovered.
I dunno, remember how they caught Glitter.

I'm very mixed. I'm a huge advocate of your middle paragraph concerning the mission creep, and for those of us able to stand back and take the broad view it's a very current and very portentous problem.

OTOH, enforcement has to change to some degree to keep abreast of fast-moving technology; there's only so much liberty a policed society can have before it breaks down into anarchy. What's the balance?