There’s a fairly interesting discussion on Quorum.org about moderation systems for online forums. You can go read it if you like, but I’ll try to summarize my own views here:

  • An effective moderation system depends on strong identity/authentication to prevent astroturfing.
  • Moderation activities should be visible, because invisible activities (that affect others’ reputation) are the moderation-abuser’s best friend.
  • There should be a limit to how much one person can affect another’s reputation, to prevent both “karma wars” (two people modding each other down) and “mutual admiration societies” (two people modding each other up) from getting out of control.

My suggestion, therefore, is that there be a cap – two caps, actually – on how many positive and negative moderations a person A can make of another person B’s posts per time period. “Excess moderations” cause all moderations by A of B in the same direction to be “scaled down” so that their total influence equals the limit. For example, let’s say the limit is 5. If person A moderates 3 of person B’s posts down, nothing in particular happens. However, if s/he does so 7 times, each down-moderation is scaled down to a value of 5/7 so that the total is still 5. A corresponding – but perhaps different – limit would also be enforced for up-moderations. This would effectively limit the influence that A could have on the total rating of B’s posts, which constitutes B’s reputation in such a system.
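To make the scaling concrete, here’s a minimal sketch in Python of how the per-pair, per-direction scaling might work. The cap values, the tuple format, and the function name are all my own illustrative choices, not part of any existing forum software:

```python
from collections import defaultdict

# Hypothetical per-pair caps per time period (illustrative values only).
UP_CAP = 5
DOWN_CAP = 5

def effective_moderations(moderations, up_cap=UP_CAP, down_cap=DOWN_CAP):
    """Scale moderations so that, for each (moderator, target) pair and each
    direction, the total influence within one time period never exceeds the cap.

    `moderations` is a list of (moderator, target, value) tuples, where value is
    +1 for an up-moderation and -1 for a down-moderation.
    Returns a parallel list of scaled values.
    """
    # Count moderations per (moderator, target, direction).
    counts = defaultdict(int)
    for moderator, target, value in moderations:
        counts[(moderator, target, value > 0)] += 1

    scaled = []
    for moderator, target, value in moderations:
        n = counts[(moderator, target, value > 0)]
        cap = up_cap if value > 0 else down_cap
        # If A moderated B more than `cap` times in one direction, each
        # moderation is worth cap/n instead of 1, so the total stays at `cap`.
        weight = min(1.0, cap / n)
        scaled.append(value * weight)
    return scaled

# The example from the text: A down-mods B 7 times with a cap of 5;
# each down-moderation ends up worth -5/7, totalling -5.
mods = [("A", "B", -1)] * 7
print(effective_moderations(mods))  # seven values of about -0.714
```

A real system would presumably recompute these weights whenever a new moderation arrives within the time window, so earlier moderations by A of B get retroactively diluted rather than newer ones being rejected outright.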

I’d be interested in hearing from people who know more about this stuff. Feel free to send email or leave a note on PlatSpot.