themachinestops@lemmy.dbzer0.com to Technology@lemmy.world · English · 3 months ago
A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It (www.404media.co)
90 comments · cross-posted to: fuck_ai@lemmy.world
Miðvikudagur@lemmy.world · English · 3 months ago
"Child pornography" is a term NGOs and law enforcement are trying to phase out. It makes it sound like CSAM is related to porn, when in fact it is simply the abuse of a minor.
TipsyMcGee@lemmy.dbzer0.com · English · edited 2 months ago
deleted by creator