The model is a massive part of the AI ecosystem, used by Google and Stable Diffusion. The removal follows discoveries made by Stanford researchers, who found thousands of instances of suspected child sexual abuse material in the dataset.
IIRC from a previous thread, different law enforcement agencies will release hashes or similar so the image can be detected without distributing the original
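To illustrate the idea in that comment: a minimal sketch of hash-list matching, where a file is flagged if its digest appears in a distributed list of known hashes. The `KNOWN_HASHES` set and its placeholder entry are hypothetical, and real systems (e.g. PhotoDNA) typically use perceptual hashes that survive re-encoding rather than the exact SHA-256 used here.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hex-encoded SHA-256 digests, standing in for a hash
# list distributed by an agency or clearinghouse. The entry below is a
# placeholder, not a real hash.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}


def sha256_of_file(path: Path) -> str:
    """Return the hex SHA-256 digest of a file's raw bytes."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def is_flagged(path: Path) -> bool:
    """True if the file's digest appears in the known-hash list."""
    return sha256_of_file(path) in KNOWN_HASHES
```

The point of the scheme is that only the digests need to be shared: a dataset curator can scan files locally against the list without anyone distributing the original images.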