xXthrowawayXx [none/use name]

  • 1 Post
  • 83 Comments
Joined 2 years ago
Cake day: August 23rd, 2022

  • I, uh, actually agree with you almost entirely. Except at the end I’m like “and that’s why it won’t work as protection”.

    Software hasn’t been treated like other fields of engineering, and all operators have needed for protection from liability is the twin shields of “nothing I could do” and “I was doing nothing” to come out of any courthouse relatively unscathed.

    That type of “aww shucks technocracy” is only possible if you do the bare minimum or nothing at all. Once an operator implements some kind of protection (yes, even one with warning labels all over it), both defenses are rendered unusable.

    Now that you’ve done something, you can be held liable both for the effects of what you’ve done and for knowing there was a problem.

    The picture gets even murkier when we look at how things are going! Lawsuits against Tesla for their self-driving deaths are making waves not because they impugn the dignity of America’s biggest car manufacturer by market cap, but because every judge who sees one raises the biggest eyebrow possible at software engineering not being held to the same standard as any other kind, both in a court of law and within its own processes.

    There’s a good chance that software PEs (licensed professional engineers) will become a thing (again?) as a result.

    The long and short of it is this: the only reason monsters like moot are able to exist is their sly lethargy, and given the legal storm rolling into software engineering, having something like this bolted onto the backend would be a bad idea.

    I think automated tools like this could still be put to use, though, if they were hosted separately and provided an api that linked up nicely with some moderation-queue standard and returned something like “entries 1, 5 and 9 are likely csam” back to the moderator (a rough sketch of that shape is at the end of this comment). It would at least save the mod from dealing with the material directly.

    So I guess I agree but come to the opposite conclusion.
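
    To make that concrete, here’s a minimal sketch of what that kind of integration could look like. Everything here is an assumption for illustration: the service URL, the /scan endpoint, the payload and response shapes, and the field names are all hypothetical, not any real tool’s api.

    ```python
    # Hypothetical sketch: a moderation queue asks a separately hosted
    # scanning service which entries are likely csam, so the mod never
    # has to open the material directly. The service URL, endpoint, and
    # response format are assumptions for illustration only.
    import requests

    SCAN_SERVICE = "https://scanner.example.org/scan"  # hypothetical service

    def flag_queue_entries(queue_entries):
        """Send queued media URLs to the external scanner and return
        the IDs of the entries it flags as likely csam."""
        payload = {
            "entries": [
                {"id": e["id"], "media_url": e["media_url"]}
                for e in queue_entries
            ]
        }
        resp = requests.post(SCAN_SERVICE, json=payload, timeout=30)
        resp.raise_for_status()
        # Assumed response shape: {"flagged": [1, 5, 9]}
        return resp.json()["flagged"]

    if __name__ == "__main__":
        queue = [
            {"id": 1, "media_url": "https://instance.example/media/a.jpg"},
            {"id": 5, "media_url": "https://instance.example/media/b.jpg"},
            {"id": 9, "media_url": "https://instance.example/media/c.jpg"},
        ]
        flagged = flag_queue_entries(queue)
        print(f"entries {', '.join(map(str, flagged))} are likely csam")
    ```

    The design point is that the scanner stays someone else’s separately hosted service: the instance operator never bolts the prototype model onto their own backend, and the mod only ever sees a list of entry IDs to act on.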


  • I think you’re off base here. The utopian socialists were arguing against the methods and outcomes of revolutionary socialism; this person is trying their best to explain that this particular tool has serious legal repercussions within the framework we all live under. Those are pretty different.

    The reason I see the logic in their arguments is that there’s longstanding legal precedent that misusing a tool or material because it’s better than nothing is not a defense, even if there are no other options available.

    So if you built a car so big that no type of shock absorber could handle it cornering at speed, and you knew it, using some amazing whiz-bang material for the shocks isn’t a defense: even though it’s the best thing available, you knew it wouldn’t work.

    Legally speaking, the right choice there is not to make an excessively dangerous vehicle if you don’t want to be held liable for negligence.

    It’s also the argument throughout Unsafe at Any Speed, although the courts always seem to side with the automakers 🤔

    Or if one were to get sued for hosting csam, using the latest whiz-bang ai system for detection wouldn’t be a defense or even a point in your favor, because you knew it wasn’t a reasonable use of the underlying technologies. You can’t say “judge, I was relying on the ai csam detector!” when the component parts of the ai csam detector have big “prototype, do not use in production” stickers all over them.

    Ultimately, while these tools might protect mods and users from having to view csam in the moderation process, that’s just one side of the struggle; on the other side they’re a paper shield at best and proof of negligence at worst.


  • Lay off. This person is right.

    We here at hexbear are concerned about protecting people from seeing csam. That’s good. The rest of lemmy is concerned about that and about the very real consequences of csam uploads for the sites themselves: getting dropped by hosting providers and registrars, and getting prosecuted for distribution.

    There are already cases where that kind of legal DoS attack has worked. There are even cases of anti-csam organizations uploading material to reverse image search sites and then serving them papers when the reverse image search site displays the uploaded image to compare against nonexistent results.

    The person you’re replying to isn’t trying to fuck shit up; they’re telling you that this tool won’t actually solve the problem it’s marketed for.

    E: edited for clarity and kindness