• OsrsNeedsF2P@lemmy.ml
    1 year ago

    Fourthly, scanning for known, thus old material does not help identify and rescue victims, or prevent child sexual abuse. It will actually make safeguarding victims more difficult by pushing criminals to secure, decentralised communication channels which are impossible to intercept even with a warrant.

    This point is huge, and on its own it explains why half-baked compromises are worthless.

    The criminals will use banned chat apps, while innocent people get their messages read.

    • taladar@feddit.de
      1 year ago

      The reason they don’t care is that the whole thing isn’t about protecting children at all; it’s about surveillance of the vast majority of people.

    • BestBouclettes@jlai.lu
      1 year ago

      Well, that’s the point: catching CSAM is just a very convenient excuse. Once that’s through, it will just be a matter of extending it to terrorism. Then you can declare anyone a terrorist and, bam, you have free rein to monitor anyone you want.