ljtryout (ljtryout) wrote in ohnotheydidnt,

ONTD Original: Apple introduces new Tools against Child Abuse... and Hollywood is scared.



First of all, a BIG trigger warning for sexual and child abuse. If you're not comfortable with it, I recommend skipping even the comments. Second, I know ONTD doesn't read™, so I'll try to make this as simple as I can... in ONTD terms.


1. OK, so WTF is happening? Gimme the deets. But be brief, ONTD doesn't read.


Basically, Apple is introducing three new features to protect children from abuse:
1. Siri will now give you more accurate information about what to do about CSAM (Child Sexual Abuse Material).
2. If your kid receives or sends a nude through iMessage, it will be blurred and their parents will be notified (if they've set it up that way).
3. (And the most controversial one) Apple will detect whether you have CSAM on your iPhone/iPad and report you to the authorities.


2. Why tho? Why now? Is Tim Cook bored or somethin?



Basically, Apple's doing this because they've been a very privacy-focused company. They rarely go through people's stuff, and that has made their devices a great way for pedos to cover their shit. Unlike other services that do scan content, their CSAM reporting numbers in the last few years have been shitty to say the least, as YouTuber Rene Ritchie points out.

Last year Apple reported only 265 cases of CSAM out of the 21.4 million reports made industry-wide (which is a huge, horrifying number by itself).

(Screencap from Rene's video)


3. So Apple is looking at my photos? AWWWW HAAAALLEEE NAWWWW. I thought they were privacy-focused and some shit.



No, they're not looking at your pictures... not exactly. They've developed code that doesn't look at pixels but at something called a hash, basically a digital fingerprint derived from a photo's data. Apple worked with agencies that keep a database of hashes of known CSAM pictures and developed an algorithm that compares your photos' hashes against that database if you have iCloud Photo Library enabled (that thing that syncs your pics with the cloud). Also, this only applies to the US. You also need around 30 pictures flagged as matches, and then a human reviewer looks at them before anyone reports you to the authorities.
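For the two of you who actually want the techy techy part, here's a rough sketch of that flow in Python. To be crystal clear: this is NOT Apple's actual code or its NeuralHash algorithm. The function names, the SHA-256 stand-in, and the threshold handling are all assumptions, just to illustrate the idea of "match fingerprints against a known-CSAM database, and only involve humans past roughly 30 hits."

# Hypothetical sketch of the hash-matching idea described above -- this is
# NOT Apple's real NeuralHash system, just a toy illustration.

import hashlib

HUMAN_REVIEW_THRESHOLD = 30  # Apple's stated match count before human review

def compute_hash(photo_bytes: bytes) -> str:
    # Stand-in fingerprint. Apple uses a perceptual hash ("NeuralHash");
    # a plain SHA-256 is used here only to keep the sketch runnable.
    return hashlib.sha256(photo_bytes).hexdigest()

def count_matches(photo_library: list[bytes], known_csam_hashes: set[str]) -> int:
    # How many photos in the library match the database of known-bad hashes.
    return sum(compute_hash(photo) in known_csam_hashes for photo in photo_library)

def should_escalate(photo_library: list[bytes], known_csam_hashes: set[str]) -> bool:
    # Nothing gets reported below the threshold; above it, a human reviews
    # the flagged matches before anything goes to the authorities.
    return count_matches(photo_library, known_csam_hashes) >= HUMAN_REVIEW_THRESHOLD

The big difference in the real system: a perceptual hash is designed so that a cropped or recompressed copy of a known image still matches, which a plain cryptographic hash like the SHA-256 above would not, and the matching is described as happening on-device as part of uploading to iCloud Photos.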


4. OK, too much wibbly wobbly techy techy timey wimey, bitch. I have RuPaul's episodes to catch up on. Why is this an issue?



Privacy. People are saying this is a backdoor that governments could use to force Apple to scan phones for other stuff as well... drugs, weapons, terrorism, etc. (which might sound OK), but also homosexuality in countries where it's criminalized, women fighting for their rights, etc.

Some people are also worried that they'll be reported over algorithm mistakes, like having nude baby pics of their grandchildren or consensual pictures between petite adults, etc. (The human review step should catch these, but people don't want their private photos seen by another human either.)

This has caused A LOT of media (even liberal and Hollywood-adjacent media) to come out against these new measures, surprisingly.


5. Are you telling me male-run companies are scared of Apple finding child pornography on their phones? PretendsToBeShocked.gif



Basically, yes. A lot of media has turned this into a "they're violating our rights" issue and set the child-protection part aside, which pushed Apple's software chief, Craig Federighi, to come out and explain.



A lot of people are threatening to leave Apple services and move to other platforms (even though pretty much all of them already do some sort of CSAM scanning, especially Google) as a way to force Apple to stop (which it won't), especially now that plenty of people know Apple devices have been like the Cayman Islands of child porn.

6. Ok, what about Hollywood?



It's no secret that Hollywood is full of pedos. All the recent news and the last few years of people speaking up thanks to #MeToo have put this issue (a bit more) in the spotlight. And Corey Feldman has been quite vocal about it.


(LOL at Matt Lauer leading this interview, so defensive... now we know why.)

Also, it's no secret that Hollywood loves iPhones. Android who? California has one of the highest iPhone-vs-Android rates, with 66.23% using iPhones (and that was in 2018), leading people to believe that Hollywood's pedos are using iPhones to manage CSAM.

7. What's gonna happen now? What do I have to do?



Apple is quite determined to ship this with iOS 15, so there are no take-backsies, but there will be a lot of pushback from the media against Apple... so, what can you do? Follow this simple two-step guide:

1. Update to iOS 15.
2. Don't be a fucking Pedo.


Source 1,2,3,4,5,6,7

Where's my Pulitzer?!
Tags: #metoo, apple / iphone, ontd original, sexual misconduct