Apple sued for failing to implement tools to detect CSAM in iCloud.
The company announced plans for CSAM detection tools in 2021, but later abandoned them amid privacy concerns.
Apple is facing a major legal battle after a group of child sexual abuse victims filed a lawsuit against it. The case concerns the company's failure to take proactive measures to scan its iCloud service for child sexual abuse material, or CSAM.
In 2021, Apple announced it was developing a tool to detect and flag such images on its platform and to notify the National Center for Missing and Exploited Children when CSAM was found. However, significant public backlash and pressure from privacy advocates led Apple to abandon the plan.
The lawsuit, filed in Northern California, seeks damages of up to $1.2 billion. It involves roughly 2,680 plaintiffs who allege that Apple allowed CSAM to proliferate on its devices, compounding the harm and trauma suffered by victims, when it could have prevented the abuse from spreading further. According to the legal filings, Apple neither followed through on its previously announced measures nor put in place any effective means of detecting or curbing the distribution of CSAM through its services.
In response to the lawsuit, Apple spokesperson Fred Sainz said the company is committed to fighting child sexual abuse material, stressing that Apple is actively pursuing new ways to protect user safety without compromising privacy. The case highlights the fine line technology companies must walk between protecting user privacy and addressing serious social problems such as child exploitation.