cross-posted from: https://programming.dev/post/37122335

  • Sexual extortion, or ‘sextortion’ scams against children and young people on the rise, with ‘hideous and callous cruelty’ used to blackmail victims.
  • Boys still at particular risk as numbers surge – making up 97% of confirmed sextortion cases seen by the Internet Watch Foundation (IWF).
  • UK’s Report Remove service, run jointly by Childline and the IWF, sees a significant rise in children self-reporting nude or sexual imagery that may have spread beyond their control online.
  • Passerby6497@lemmy.world

    Think of it as nonconsensual AI-generated CSAM. Like nonconsensual AI porn, you take public SFW photos and use AI to generate explicit images, with the victim’s photo serving as the reference.

    As for the overshare library, think of all the pictures of people’s kids you see in your social media feed, then consider how few people set proper privacy controls on what they post (or just intentionally post them publicly for laughs/attention). All of those images can be used as the basis for generating nonconsensual AI porn/CSAM.