@[email protected] to [email protected]English • 11 months agoAI image training dataset found to include child sexual abuse imagerywww.theverge.commessage-square15fedilinkarrow-up1125
arrow-up1112external-linkAI image training dataset found to include child sexual abuse imagerywww.theverge.com@[email protected] to [email protected]English • 11 months agomessage-square15fedilink
@[email protected] • English • 11 months ago (edited)

Removing these images from the open web has been a headache for webmasters and admins for years on sites that host user-uploaded images. If the millions of images in the training data were automatically scraped from the internet, I don’t find it surprising that there was CSAM in there.
Don’t they need to label the data?
Not manually.