  1. Great way for people in poverty to contribute to the server (BONUS: at some point you could assign badges on a person’s profile for how many things they’ve moderated).

  2. Have a captcha at the start of each “session” of moderating 10-20 posts to help prevent people from spamming support for abusive content; after each batch of 10-20 posts you solve another captcha. The batch size could be modifiable based on community size.

  3. Require some kind of consensus of volunteer moderators on each piece of content being moderated: best two of three, three of five, or whatever (see the sketch after this list). This number could also be modifiable based on community size.

  4. Make it simple by showing the rule the content was reported for violating. Each moderation report would be filed per community rule, and the prompt would ask “This was reported for X; does this content contain X?” (with a blurb about the specific rule) to keep things simple for the volunteer.

  5. Editing to add: it would also help to differentiate between server-wide rule violations and community-specific violations. People who are active members of a community should be favored for judging community-specific rule violations. Server-wide rule violations could be judged by anyone, so nazi shit or gore or whatever can get removed promptly. Community-specific violations would wait a couple of hours (or more or less! This could be a setting you change according to your server or community size!) to let all online community members get their vote in and possibly hit the requisite yea-vs-nay threshold before server-wide volunteers get a say.

  6. Like somebody else said, maybe have a system for flagging somebody who’s ALWAYS in the minority (the 1 in 1/3, the 2 in 2/5) for review, to let people double-check whether they’re a troll or just trying to stick up for a genuine minority viewpoint?
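
  A minimal sketch of how items 2, 3, and 6 could fit together, assuming an in-memory store and a stand-in captcha check; the batch size, panel size, and dissent threshold are all hypothetical knobs, not an existing API:

  ```python
  from collections import defaultdict

  SESSION_BATCH = 15   # posts per captcha "session" (item 2); tune to community size
  PANEL_SIZE = 3       # votes collected per report (item 3): best 2 of 3
  DISSENT_RATE = 0.8   # minority-vote rate that triggers a review (item 6)

  votes = defaultdict(dict)     # report_id -> {user: True/False}
  minority = defaultdict(int)   # per-user count of votes against the outcome
  judged = defaultdict(int)     # per-user count of decided reports

  def verify_captcha(user, answer):
      """Stand-in; a real deployment would call an actual captcha provider."""
      return answer is not None

  def start_session(user, captcha_answer):
      """Gate each batch of moderation work behind a captcha (item 2)."""
      if not verify_captcha(user, captcha_answer):
          raise PermissionError("captcha failed")
      return SESSION_BATCH   # reports this user may judge before re-verifying

  def cast_vote(report_id, user, violates):
      """Record one yes/no vote; act once the panel reaches consensus (item 3)."""
      votes[report_id][user] = violates
      if len(votes[report_id]) < PANEL_SIZE:
          return None   # still waiting on the rest of the panel
      yeas = sum(votes[report_id].values())
      outcome = yeas > PANEL_SIZE // 2   # best 2 of 3, 3 of 5, etc.
      for voter, vote in votes[report_id].items():
          judged[voter] += 1
          if vote != outcome:
              minority[voter] += 1   # they were the 1 in 1/3, the 2 in 2/5
      return outcome

  def flag_for_review(user, min_votes=20):
      """Item 6: surface chronic dissenters for a human look (troll, or genuine minority view?)."""
      return judged[user] >= min_votes and minority[user] / judged[user] >= DISSENT_RATE
  ```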

  • Waves@sh.itjust.works · 1 year ago

    When Reddit started posturing about “democracy”, I started thinking about how a direct democracy would work for moderation.

    In my mind, it’s a lot like this - you ask members of the community to moderate a few comments, but you then take their decisions and put them through another round with another temp mod.

    Only if they corroborate each other do you perform a mod action. I’m not sure how mod tools would handle it (long-term obligations stress me out), but I envision you’d have to mark a comment for removal and check off the rule(s) it violates.

    Similar to your idea, it would turn a responsibility into volunteer work you could do in the moment, and it could ensure that the rules are applied according to the understanding the community has of them.

    There are a lot of ways to slice it, and it would need some light statistics, but it seems possible if people are into the idea.
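
    One rough sketch of that corroboration round (all names invented for illustration): each reviewer independently marks a comment for removal and checks off the rule(s), and a mod action only fires when two independent decisions agree:

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Decision:
        reviewer: str
        remove: bool
        rules_violated: frozenset   # rule IDs the reviewer checked off

    def corroborated(first: Decision, second: Decision) -> bool:
        """Perform a mod action only if two independent reviews agree on
        removal and overlap on at least one violated rule."""
        if first.reviewer == second.reviewer:
            return False   # the second round must come from a different temp mod
        if not (first.remove and second.remove):
            return False
        return bool(first.rules_violated & second.rules_violated)

    # corroborated(Decision("alice", True, frozenset({"no-spam"})),
    #              Decision("bob", True, frozenset({"no-spam", "civility"})))  -> True
    ```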

  • jlj@sh.itjust.works · 1 year ago

    Reminds me a bit of Slashdot’s meta moderation, back in the day. I used to pick that up maybe once or twice a week, if I had a moment. The user experience was excellent; its benefits to the community were less clear to me, but I put that down to a transparency issue.

  • GhostedIC@sh.itjust.works · 1 year ago

    Is this supposed to be for admin reports only, or mod reports too? I’d be nervous about community outsiders doing moderation. If it’s admin-only, then there’s less wiggle room with the rules and it’s potentially more appealable. I still think the potential for crappy calls is pretty high.

    • Apytele@sh.itjust.works (OP) · 1 year ago

      Updated idea:

      Differentiate between server-wide rule violations and community-specific violations. People who are active members of a community should be favored for judging community-specific rule violations. Server-wide rule violations could be judged by anyone, so nazi shit or gore or whatever can get removed promptly. Community-specific violations would wait a couple of hours (or more or less! This could be a setting you change according to your server or community size!) to let all online community members get their vote in and possibly hit the requisite yea/nay threshold before server-wide volunteers get a say.
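
      As a rough sketch of that routing, with the delay window as a per-community setting (field names and the window length here are hypothetical):

      ```python
      import time

      COMMUNITY_VOTE_WINDOW = 2 * 60 * 60   # seconds; tune per server or community size

      def eligible_voters(report, now=None):
          """Server-wide violations open to everyone right away; community-specific
          ones are reserved for active community members during the window."""
          now = time.time() if now is None else now
          if report["scope"] == "server":
              return "anyone"                    # the worst stuff gets removed promptly
          if now - report["filed_at"] < COMMUNITY_VOTE_WINDOW:
              return "active_community_members"  # let the community hit its yea/nay threshold first
          return "anyone"                        # window expired: server-wide volunteers weigh in
      ```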

  • atkion@sh.itjust.works · 1 year ago

    I think this could be a pretty good system, but it would require a far more robust moderation framework than we have right now, as well as a lot of thought. I’m just brainstorming here, but I think at a minimum we would need:

    • An elegant appeal system for when bad calls are made, maybe even run the appeals back through the same moderation loop as regular reports (with a rate limit to prevent abuse)
    • A way to (democratically?) remove people with a history of opinionated or incorrect moderation decisions from the system
    • A better modlog, allowing complete transparency rather than “mod banned this user”
    • A distribution engine for moderation that does not allow users to pick and choose what they moderate, to avoid the issue pointed out by @ShadowAether@sh.itjust.works
    • A way to prevent the same few terminally-online users from moderating constantly, to the point that the whole instance slowly warps to their preferences
    • Perhaps a method of giving accounts with strong reputations more weight in mod decisions? This one’s definitely arguable, but I can imagine a future where people create bots or alts to auto-approve their rule-breaking posts
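
    For the last bullet, a minimal sketch of reputation-weighted tallying; the weights, cap, and threshold are invented numbers, not anything that exists today:

    ```python
    def weighted_outcome(votes, reputation, threshold=0.5, cap=3.0):
        """Tally yes/no votes with per-account weights, so fresh alts and bots
        count for less than established accounts.

        votes:      {user: bool}   True = rule violation
        reputation: {user: float}  unknown users default to 1.0
        """
        weight = lambda u: min(reputation.get(u, 1.0), cap)   # cap so no one dominates
        total = sum(weight(u) for u in votes)
        yes = sum(weight(u) for u, v in votes.items() if v)
        return (yes / total) > threshold if total else None

    # weighted_outcome({"old_timer": True, "fresh_alt": False},
    #                  {"old_timer": 3.0, "fresh_alt": 0.5})  -> True
    ```
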
    • tcely@sh.itjust.works · 1 year ago

      A moderation bot that collated reports against rules and selected active users to vote on whether the rule was broken seems to me like a useful thing to create.

      Batching these into small groups and assembling a jury to decide whether a particular rule was broken by each of the 1-7 reported posts should be a short task for the people involved. As long as the same people don’t always end up on the jury, the results should be mostly good too.
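
      A minimal sketch of that selection and batching, assuming the bot already knows who is active and who served recently (all names hypothetical):

      ```python
      import random

      def assemble_jury(active_users, recent_jurors, size=5):
          """Prefer people who haven't served lately, so the same accounts
          don't always end up deciding."""
          pool = [u for u in active_users if u not in recent_jurors]
          if len(pool) < size:
              pool = list(active_users)   # small community: fall back to everyone
          return random.sample(pool, min(size, len(pool)))

      def make_batches(reports, batch_size=7):
          """Split reported posts into short tasks of at most 7 for one jury."""
          return [reports[i:i + batch_size] for i in range(0, len(reports), batch_size)]
      ```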

    • Derproid@sh.itjust.works · 1 year ago

      Ooo, these sound really good. Also, if there are enough people moderating, I think it could be better to have multiple users moderate the same content to avoid bias: say, 2/3 or 3/5 users must agree it’s a rule violation before an action is taken.