She had been abused since she was a baby by a relative who photographed the abuse and shared the images online, and who compounded the harm by allowing another man access to her.
The woman, now 27 and living in the Northeast, is reminded of that abuse almost daily, each time she receives a law enforcement notice that someone has been charged with possessing the images. In late 2021, she was notified that the pictures had been found on a man’s MacBook in Vermont. Her attorney later confirmed with law enforcement that the images had also been stored in Apple’s iCloud.
The alert came months after Apple introduced a feature that allowed it to scan for illicit images of sexual abuse. The company quickly abandoned the plan, however, after cybersecurity experts warned it could open the door to broader demands for government surveillance.
The woman is now suing Apple under a pseudonym, claiming the company failed to fulfill its pledge to safeguard victims like her. Instead of using the tools it had developed to find, remove, and report images of her abuse, the lawsuit contends, Apple allowed the material to spread, forcing victims of child sexual abuse to relive the trauma that has shaped their lives.
The suit, filed late Saturday in U.S. District Court in Northern California, argues that Apple’s failures amount to selling defective products that harmed a particular class of customers, namely victims of child sexual abuse, because the company briefly introduced “a widely touted improved design aimed at protecting children” but “then failed to implement those designs or take any measures to detect and limit” child sexual abuse material.