I think a fully p2p system, where a community, a user, and a post are each identified by a key and connected via asymmetric cryptography, with a reputation system yielding a number between, say, -100 and +100, would work better.
That reputation system wouldn't be like karma; it could also affect whether we even store something scored below -50, and therefore whether we pass it on to others.
It should be relative: we may attach an evaluation to a thing, which would affect its children; or we may attach an evaluation to a user and then derive a thing's score from that user's evaluation of it; or maybe all of the above.
Maybe something like that will be easier to build on Locutus once it becomes operational; a rough sketch of what I mean is below.
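A minimal sketch of that data model in Python, assuming Ed25519 keys; all the names here (Identity, Post, Evaluation, publish) are illustrative, and content addressing, storage, and transport are left out entirely:

```python
from dataclasses import dataclass

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def key_bytes(pub: Ed25519PublicKey) -> bytes:
    """Raw public-key bytes double as the identifier of a user or a community."""
    return pub.public_bytes(
        encoding=serialization.Encoding.Raw,
        format=serialization.PublicFormat.Raw,
    )


@dataclass
class Identity:
    """A user or a community is nothing more than a keypair."""
    private: Ed25519PrivateKey

    @property
    def key(self) -> bytes:
        return key_bytes(self.private.public_key())


@dataclass
class Post:
    """A post is addressed by its author and community keys and carries a signature."""
    author: bytes      # author's public key
    community: bytes   # community's public key
    body: bytes
    signature: bytes   # author's signature over the body


@dataclass
class Evaluation:
    """A signed score in [-100, +100] that one key assigns to another key (user or thing)."""
    rater: bytes
    target: bytes
    score: int
    signature: bytes


def publish(author: Identity, community: Identity, body: bytes) -> Post:
    # Sign the body so any peer holding the author's public key can verify origin.
    return Post(author=author.key, community=community.key,
                body=body, signature=author.private.sign(body))


# Usage: two keypairs, one post, verifiable by anyone who has the author's key bytes.
alice = Identity(Ed25519PrivateKey.generate())
board = Identity(Ed25519PrivateKey.generate())
post = publish(alice, board, b"hello, federation-free world")
Ed25519PublicKey.from_public_bytes(post.author).verify(post.signature, post.body)
```

The point of the sketch is that the key is the identity: there is no server-side account or community registry to moderate on your behalf.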
Eh, that was the whole point. Do not leave moderation to other people, or at least make that easy.
It should be relative
Which means that the score of anything would be derived from 1) what you set directly, 2) what another user sets, weighted by the score you set for that user, 3) what a user sets, weighted by what another user sets for them, where that other user in turn has a score set by you, …
One could even build in logic where things disliked by people you dislike show a high score for you.
There is some computational cost, but nothing serious by today's standards.
Yes.
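A minimal sketch of that derivation in Python, assuming all evaluations have already been collected into one map and capping indirection at a fixed depth; the names, the weighted-average rule, and the -50 storage threshold wiring are illustrative, not a spec:

```python
from typing import Dict, Tuple

# evals[(rater_key, target_key)] = score in [-100, +100]; targets may be users or things.
Evals = Dict[Tuple[str, str], int]

CLAMP = 100


def clamp(x: float) -> float:
    return max(-CLAMP, min(CLAMP, x))


def score(me: str, target: str, evals: Evals, depth: int = 2) -> float:
    """Derive my score for `target`:
    1) my direct evaluation wins outright if present;
    2) otherwise, average other raters' evaluations, each weighted by my
       (recursively derived) score for that rater, normalised to [-1, +1]."""
    if (me, target) in evals:
        return evals[(me, target)]
    if depth == 0:
        return 0.0
    weighted, total_weight = 0.0, 0.0
    for (rater, rated), value in evals.items():
        if rated != target or rater == me:
            continue
        trust = score(me, rater, evals, depth - 1) / CLAMP   # in [-1, +1]
        weighted += trust * value
        total_weight += abs(trust)
    return clamp(weighted / total_weight) if total_weight else 0.0


def should_store(my_score: float, threshold: float = -50) -> bool:
    """Drop (and therefore never re-share) anything I score below the threshold."""
    return my_score >= threshold


# Tiny usage example: I dislike `troll`, and `troll` dislikes `post1`.
evals: Evals = {("me", "troll"): -90, ("troll", "post1"): -80}
print(score("me", "post1", evals))                 # positive: enemy-of-my-enemy effect
print(should_store(score("me", "post1", evals)))   # True, so the post stays stored
```

Because trust carries a sign, a strong negative rating from a user you rate strongly negatively contributes positively, which is exactly the "disliked by people you dislike" effect; the depth cap is what keeps the computation bounded.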