This is a long post, so be ready to read it.

Warning: an offensive word appears in this post (hidden behind a spoiler).

After a lot of patrolling, I have seen many slurs added to pages by anonymous users. I really think slurs should be hidden, because whenever I am patrolling, anonymous users tend to add words like the f-word and n-word to wikiHow. That's not all, though: I feel this feature is needed most because some people can be easily offended by such words, and that will make RCP lose editors. I even have personal experience: one time someone added the n-word to an article, and I was discouraged from using RCP for a few days.
Before:
[screenshot: the offensive edit as it currently appears in RCP]
After:
[screenshot: mockup of the same edit with the word blurred]
How wikiHow should fix this: The best way to fix this is to add some JavaScript to wikiHow's interface that blurs out a word if that word is on a list of slurs.
How this could be implemented into Patrol Coach: I don't know exactly, but I do know that Patrol Coach is pretty customizable, so I think it would be fairly easy to bring this into Patrol Coach as well.
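To make the idea concrete, here is a minimal JavaScript sketch of the blurring approach. It assumes the diff text in RCP lives in elements matching a hypothetical .diff-line selector, and the BLOCKLIST entries are placeholders, not a real wikiHow word list.

```js
// Placeholder list; the real one would hold the actual words to hide.
const BLOCKLIST = ["badword1", "badword2"];

// One case-insensitive regex matching any blocked word as a whole word.
const pattern = new RegExp(`\\b(${BLOCKLIST.join("|")})\\b`, "gi");

function blurSlurs(root = document) {
  // ".diff-line" is an assumed selector for the RCP diff text.
  for (const el of root.querySelectorAll(".diff-line")) {
    el.innerHTML = el.innerHTML.replace(
      pattern,
      // Wrap each match in a span blurred with CSS.
      (word) => `<span style="filter: blur(4px)">${word}</span>`
    );
  }
}

blurSlurs();
```

Running blurSlurs() after the diff loads would leave the rest of the text untouched and only soften the listed words.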

2 Likes

It's an interesting idea, but it may also cause unnecessary complications, in my opinion.

For one, I don't think this will help the process of patrolling. When operating RCP, seeing exactly what change was made to the article is important in deciding whether to patrol or to revert. I'm not much of a tech person, but such systems might not be the most reliable. I'm worried that one may detect non-offensive material and blur it out, leading to reverts of potentially clean edits (e.g., a detection system may blur out a section because an innocent word happens to have a slur spelled inside it). Or perhaps it wouldn't be able to detect slurs when they aren't written "correctly", like when a typo is made or the word is added without spacing.
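To illustrate that worry, here is a small JavaScript sketch, using a mild word purely as a stand-in for a slur: a naive substring check flags clean words, while a whole-word check avoids that but then misses obfuscated spellings.

```js
// "ass" stands in for a real slur purely for illustration.
const blocked = "ass";

// Naive substring check: flags perfectly clean words.
console.log("classic assignment".includes(blocked)); // true (a false positive)

// Whole-word regex: avoids that false positive...
const re = new RegExp(`\\b${blocked}\\b`, "i");
console.log(re.test("classic assignment")); // false, clean text passes

// ...but misses the obfuscated spellings described above.
console.log(re.test("a s s")); // false, spacing evades it
console.log(re.test("@ss"));   // false, substitution evades it
```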

Secondly, I don't really see the necessity of implementing a system like this. As far as I'm aware, seeing slurs in RCP has not been reported as a problem by any editor; RCP's popularity with editors does not seem to be threatened by whether or not an editor can see an offensive edit made by a bad-faith user.

RCP exists to make sure such edits get reverted. Being able to see whether an edit is of that nature, without having to doubt the accuracy of code, is important. As of right now, being able to see such content makes my patrolling a lot quicker, especially as personally seeing content like this doesn't affect my willingness to use RCP.

That being said, if seeing slurs in RCP is a problem that I simply don't know of, I see no harm in implementing something like this, so long as it works as intended.

3 Likes

Usually where I see slurs is when an edit gets reverted: it then says "see what was undone," and when you click "see what was undone," what you see will mostly be cuss words.


[screenshot: a reverted edit in RCP with the "see what was undone" link]
Do you see the "see what was undone" button?

Being able to see the exact change made in Recent Changes Patrol is critical in determining whether the change should be marked as patrolled, rolled back, skipped, or edited.

For those especially sensitive to potentially offensive content, it may be best to avoid participating in RC Patrol. As long as wikiHow remains a wiki, we’re subject to trolls adding offensive things, and I’m afraid censoring those edits to patrollers doesn’t serve us well.

9 Likes

Plus, you know, this would essentially interfere with patrolling.

I'm late to the discussion, but I just wanted to add that slurs/offensive words are subjective too. For example, "monkey" is usually used in an innocent way (talking about these guys: 🐒), but the other day I reverted an edit where someone used it as a racial slur.

It might be better to implement a warning that says “It looks like this edit might be offensive. Are you sure you want to publish it?”, although I don’t know how helpful that would be in actually stopping offensive content from making its way to RC patrol.
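For what it's worth, a bare-bones JavaScript sketch of that prompt, using the browser's built-in confirm() dialog; looksOffensive() is a hypothetical check that wikiHow would have to supply, not an existing function.

```js
// looksOffensive() is a stand-in for whatever detection wikiHow might use.
function confirmPublish(editText) {
  if (looksOffensive(editText)) {
    // The wording suggested above.
    return confirm(
      "It looks like this edit might be offensive. Are you sure you want to publish it?"
    );
  }
  return true; // nothing flagged, publish without asking
}
```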

2 Likes

While this is possible to do, the reality is that people adding slurs to articles like this are doing it intentionally, so adding a warning won’t help.

3 Likes

Yeah, but what if you had to create an account to publish the edit if it contained a racial slur or other slur?

It's not really in the wiki spirit to make someone create an account just to edit. You can see some of the history behind this choice here: wikiHow:Anonymous - wikiHow

1 Like

Would it be possible to just prevent edits that contain banned words, similar to how it won't let you publish an edit if you're trying to add an article to a nonexistent category? We could keep a small list of offensive words with no alternative meanings and have those stop an edit from publishing.

I don’t know how practical/helpful it would be, but just an idea.
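As a rough sketch of what that publish-time check could look like in JavaScript: BANNED, newArticleText, and showError are all placeholders for illustration, not real wikiHow code.

```js
// Placeholder list of words with no innocent meaning, as proposed above.
const BANNED = ["slur1", "slur2"];

// True when the edit contains none of the banned words.
function canPublish(editText) {
  const pattern = new RegExp(`\\b(${BANNED.join("|")})\\b`, "i");
  return !pattern.test(editText);
}

// Hypothetical use in the publish flow.
if (!canPublish(newArticleText)) {
  // Mirror the nonexistent-category behavior: block the save and say why.
  showError("This edit contains a word that is not allowed.");
}
```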

2 Likes

While this is indeed possible, vandals may be able to work around it. For example, they may use le3tspe@k, where they replace an a with an @, or use a 1 for an i, among other things. They could also use áccent chäractérs, or wedge emoji between letters (emojℹ️s 😫😫😫). The possibilities are endless, and some people see circumventing filters as a game, which can make the problem worse.
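To show how fragile a fixed filter is, here is a small JavaScript sketch that normalizes a few of these substitutions before checking; the mapping is illustrative, and a determined vandal can always find a swap it misses.

```js
// Illustrative (not exhaustive) map of common character substitutions.
const SUBSTITUTIONS = { "@": "a", "4": "a", "3": "e", "1": "i", "!": "i", "0": "o", "$": "s" };

function normalize(text) {
  return text
    .toLowerCase()
    // Strip accents: decompose, then drop combining marks (ä becomes a).
    .normalize("NFD")
    .replace(/[\u0300-\u036f]/g, "")
    // Undo the leetspeak swaps listed above.
    .replace(/[@431!0$]/g, (ch) => SUBSTITUTIONS[ch]);
}

console.log(normalize("le3tspe@k"));         // "leetspeak"
console.log(normalize("áccent chäractérs")); // "accent characters"
// Emoji wedged between letters would still need separate handling.
```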

If there is a particular word or phrase that keeps getting added repeatedly, though, you can always report it to the Administrator Notice Board. Some repeat vandals like to use specific words as a "mark" that they were the ones who made the edit, so the filters can work well in those situations. And if you ever see a spam word or phrase that shows up repeatedly, you can report those as well.

4 Likes