Apple officially drops plans to scan iCloud for child abuse images

Photo: Anton Ivanov (Shutterstock)

Apple has officially pulled the plug on one of its most controversial proposals ever: a plan to scan iCloud images for signs of child sexual abuse material (CSAM).

Last summer, Apple announced that it would introduce on-device scanning, a new iOS feature that would quietly sift through individual users' photos for signs of abusive material. The feature was designed so that, if the scanner found evidence of CSAM, it would alert human reviewers, who could then notify the police.

The plan immediately sparked a torrent of backlash from privacy and security experts, with critics arguing that the scanning feature could eventually be repurposed to hunt for other kinds of content. Even having such scanning capabilities built into iOS would open the door to broader surveillance abuses, critics alleged, and the general consensus was that the tool could quickly become a backdoor for police.

At the time, Apple pushed back hard against this criticism, but the company eventually relented, saying shortly after the initial announcement that it would "delay" the feature's rollout until a later date.

It now appears that date will never come. On Wednesday, while announcing a suite of new iCloud security features, the company also revealed that it will not be moving forward with its on-device scanning plans. In a statement shared with Wired, Apple made it clear that it has decided to take a different route:

Following extensive consultations with experts to gather feedback on the child protection initiatives we introduced last year, we are deepening our investment in communications safety features, which we first rolled out in December 2021. We further decided not to advance our previously proposed CSAM detection iCloud Photos tool. Protecting children doesn’t require companies to comb through personal data, and we will continue to work with governments, child advocates, and other companies to help protect young people, uphold their right to privacy, and make the internet a safer place for children and us all.

Apple's plans seemed well-intentioned. The digital proliferation of CSAM is a major problem, and experts say the situation has only gotten worse in recent years. Clearly, efforts to address this problem are a good thing. That said, the technology Apple proposed to use, and the surveillance dangers it posed, never seemed like the right tool for the job.

