Apple is being sued for giving up on scanning for child sexual abuse material (CSAM).
The feature was first announced back in August 2021.
The idea was that hash values of photos users upload to iCloud would be compared against hashes of known child sexual abuse material registered by investigative agencies, so that all of it could be blocked.
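Conceptually, that matching step is just a membership check against a database of registered hashes. Here is a minimal sketch in Python; the byte strings and the registered-hash set are made up for illustration, and SHA-256 stands in for the perceptual hash (reportedly called NeuralHash) that Apple's design actually relied on:

```python
import hashlib

def photo_hash(photo_bytes: bytes) -> str:
    """Stand-in for the hashing step; Apple's design reportedly used a
    perceptual hash (NeuralHash) so visually similar images map to the
    same value, unlike the exact cryptographic hash used here."""
    return hashlib.sha256(photo_bytes).hexdigest()

# Hypothetical database of hashes registered by investigative agencies.
registered_hashes = {photo_hash(b"<bytes of a known illegal image>")}

def should_flag(uploaded_photo: bytes) -> bool:
    """Flag an upload when its hash matches a registered hash."""
    return photo_hash(uploaded_photo) in registered_hashes

print(should_flag(b"<bytes of a known illegal image>"))  # True: hashes match
print(should_flag(b"<bytes of an ordinary photo>"))      # False: no match
```

The collision criticism below is aimed exactly at that substitution: a perceptual hash, unlike an exact cryptographic hash, can be coaxed into giving two very different images the same value.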
However, after Apple announced that it would introduce the feature, criticism poured in from around the world:
censoring user data goes against the privacy and security Apple has always championed, which is inconsistent on its face;
government authorities could exploit it for political repression;
and the hashing algorithm used for the comparison can be abused, since completely different pictures can be crafted to produce the same hash value.
Criticism and concerns kept pouring in, and eventually Apple announced in 2022 that it was stopping development of the feature.
And last Saturday, 2,680 victims of child sexual abuse and their families filed a class action in a Northern California court.
They claim that because Apple gave up on the CSAM scanning it had promised, the material continues to circulate and victims keep suffering, and they are demanding a total of $1.2 billion in damages.
In response to the lawsuit, Apple said that, although not in the way originally planned with CSAM scanning, it has been rolling out various protective measures to prevent the spread of illegal content without compromising privacy and security.
"We'll scan for it." – "You're not protecting personal information!!!"
"OK, then we won't." – "Are you ignoring child victims?!!"
= lawsuit