The problem is that the stance is incredibly shortsighted and, in a way, bigoted itself. Take a word filter that contains a regex for n**a. The claim is that slurs, and this word in particular, should never be used in public discourse.
But the word above is used in the lyrics of a music genre with predominantly black musicians. So in addition to saying "we don't want our software to be used by racists", they are also saying "we don't want our software to be used to discuss certain kinds of black music" (arguably a racist stance in itself). Talk about unintended side effects.
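A minimal sketch of why a context-free blocklist behaves this way, assuming a naive regex-based filter; the pattern name and example messages are hypothetical placeholders, not the actual filter or term in question:

```python
import re

# Hypothetical blocklist: one compiled pattern per banned term.
# "badword" stands in for the actual slur; a real filter matches the term
# wherever it appears, with no notion of who wrote it or why.
BLOCKLIST = [re.compile(r"\bbadword\b", re.IGNORECASE)]

def is_blocked(message: str) -> bool:
    """Return True if any blocklisted pattern appears anywhere in the text."""
    return any(pattern.search(message) for pattern in BLOCKLIST)

# A hostile use of the term and a quoted lyric in a discussion about music
# look identical to the filter: both contain the pattern, so both are blocked.
print(is_blocked("you are a badword"))                     # True
print(is_blocked('the chorus goes "... my badword ..."'))  # True (false positive)
print(is_blocked("let's talk about this album"))           # False
```

The false positive on the quoted lyric is the unintended side effect described above: the regex has no way to distinguish abuse from discussion.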
yes, this is one of the trade-offs of any system where one must decide between human moderation/curation and automated moderation/curation.
if automation is chosen, there will absolutely be situations where perfection is impossible. if humans' unparalleled ability to see nuance is chosen, then the cost scales with the amount of information.
the fact is, if we want a community and we want to keep signal above noise, we will need some form of removal of spam, child porn, racism, etc…
automatic tools can’t handle nuance as well as humans.
then human mods start applying nuance, and someone will point at their decisions and call them biased.