cross-posted from: https://programming.dev/post/37122335

  • Sexual extortion, or ‘sextortion’ scams against children and young people on the rise, with ‘hideous and callous cruelty’ used to blackmail victims.
  • Boys still at particular risk as numbers surge – making up 97% of confirmed sextortion cases seen by the Internet Watch Foundation (IWF).
  • UK’s Report Remove service, run jointly by Childline and the IWF, sees significant rise in children self-reporting nude or sexual imagery which may have got out of control online.
  • 3dcadminA · 1 day ago

    I have started hearing of this happening because of sharenting, that is, parents sharing everything about their kids online.

    • sleen@lemmy.zip · 1 day ago

      This doesn’t hold up much, because how do the two relate? Of course sharenting could be deemed privacy-invasive by anyone, and even a breach of trust, given how loosely consent is treated. But I wouldn’t consider oversharing to be remotely close to such a crime.

      • 3dcadminA · 1 day ago

        The rise of AI combined with easily found child images is what is starting to drive this. There is a massive library of such images overshared on, for example, Facebook over the years.

        • sleen@lemmy.zip · 1 day ago

          I know AI usage is increasing, but I don’t understand how it allows ‘child images’ (which I assume means sexual abuse material) to be easily found.

          Additionally, I’d want clarification on how those massive libraries of overshared images relate to this crime.

          • 3dcadminA · 1 day ago

            They do not need to be sexually abusive material. Scammers find easily scraped images and then use AI to make them sexually abusive or pornographic, then blackmail the child or children and their parents. For instance, in a case two weeks back, a young mother had shared images of her kids: nothing sexual, but a few in swimsuits when they were young. These were then doctored and sent to the girl, who is now older, to blackmail her, and then on to her mother in an attempt to blackmail her as well. The pictures were shared with a lack of understanding of privacy at the time, so anyone could see them. Police are struggling to find out who is behind the blackmail, and struggling to find grounds to investigate at all: they say it is a likeness of the person, not the actual person, and that because the picture was shared publicly years ago, permission was effectively given (i.e. it was allowed to be shared at the time). Of course I am not revealing any details, as it is an ongoing police investigation and doing so would be illegal, but it appears to be going nowhere, and it is deeply disturbing to the kid and the mother.

          • Passerby6497@lemmy.world · 1 day ago

            Think of it as nonconsensual AI-generated CSAM. Like nonconsensual AI porn, you take public SFW photos and use AI to generate explicit images, with the public photo serving as a reference for the victim’s likeness.

            As for the overshare library, think of all the pictures of people’s kids you see in your social media feed, then consider how few people set proper privacy controls on what they post (or just intentionally post publicly for laughs or attention). All of those images can be used as the basis for generating nonconsensual AI porn/CSAM.

      • 3dcadminA · 1 day ago

        I am already seeing it here in the UK. Remember, our privacy laws are totally different from those in plenty of other places, and most people overshared without understanding the hows or whys. Oversharing might not seem like an issue now, but it will be in the future. That is why it is important to understand how it can go wrong.