In an update on its Child Safety page last week, Apple announced that it is pausing plans for a feature that would scan photos on iPhones, iPads, and in iCloud for child sexual abuse material (CSAM). “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the update reads.