
WorthlessSlavicShit
There are no happy endings in Eastern Europe.
★★★★★
- Joined: Oct 30, 2022
- Posts: 15,951
Interesting

View: https://www.reddit.com/r/MensRights/comments/1k7u4ag/women_who_hate_men_a_comparative_analysis_across/

Women who hate men: a comparative analysis across extremist Reddit communities - Scientific Reports
In the present online social landscape, while misogyny is a well-established issue, misandry remains significantly underexplored. In an effort to rectify this discrepancy and better understand the phenomenon of gendered hate speech, we analyze four openly declared misogynistic and misandric...

So basically, we finally got a study that's about as unbiased as could be hoped for, comparing the text in feminist and "misogynistic" communities. Because it was done by an AI and not by humans who have the "women-are-wonderful" effect hard-baked into their brains, you just know the results will be very nice, and they don't disappoint.
First, we get some unsurprising charts showing just how much more research and attention "online misogyny" gets compared to "online misandry".


Then, there are some interesting results. First, they looked at "content toxicity".
For the task, we adopted a version of RoBERTa, a transformer-based text classifier, which has been fine-tuned for hate speech detection [49]. As reported by the authors, the training dataset comprised 41,255 entries, of which 18,993 have been manually annotated as “not hate” and 22,262 as “hate”. The posts tagged as “hate” have been in turn divided into sub-categories, such as “Animosity”, “Dehumanization”, “Derogation”, “Support”, “Threatening”, and “Other”.
Basically, they used a tool trained to detect "toxic speech" and "hate speech" to see how common it is in posts in the studied subreddits.
So far, the study isn't something soys and feminists would be surprised by, or something those of us wanting to prove them wrong would celebrate. After doing this, the researchers got the following graph. It shows that posts in all four communities follow the same distribution, with most posts being either not toxic at all (0) or extremely toxic (1), but the misogynistic communities lead with extremely toxic content.

But then, things get more interesting. After that, they looked at the prevalent emotions in the texts. They went about this two ways, looking at both the text level and the user level. The former just means looking at the raw text and extracting emotions from it. However, that means one person writing walls of text can drown out multiple others who write simple posts and thus skew the distribution. That's why they also did the latter analysis, which corrects for that by treating all the posts written by a single author as a single unit and giving equal weight to each of those units.
Once that was corrected for and the various users' posts were normalized like that, the distribution completely changed.
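The difference between the two aggregation levels can be sketched in a few lines. Note this is a toy illustration with invented authors and labels, and it uses a simple per-author majority vote as a stand-in for the paper's actual pooling of each author's posts into one text unit:

```python
from collections import Counter

# Toy corpus: one prolific author and two occasional ones.
# Each tuple is (author, dominant emotion label of one post).
posts = [
    ("prolific", "joy"), ("prolific", "anger"), ("prolific", "fear"),
    ("prolific", "joy"), ("prolific", "sadness"), ("prolific", "anger"),
    ("casual_1", "hate"),
    ("casual_2", "hate"),
]

# Text-level view: every post counts once, so the one prolific
# author's output dominates the distribution.
text_level = Counter(emotion for _, emotion in posts)

# User-level view: pool each author's posts, reduce them to that
# author's most frequent emotion, then count one vote per author.
by_author = {}
for author, emotion in posts:
    by_author.setdefault(author, []).append(emotion)
user_level = Counter(
    Counter(emotions).most_common(1)[0][0]
    for emotions in by_author.values()
)

print(text_level)  # "hate" is only 2 of 8 posts
print(user_level)  # "hate" is 2 of 3 authors
```

At the text level, "hate" is a small minority of posts; at the user level, it becomes the majority category, because the prolific author's many mixed-emotion posts collapse into a single vote.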

At the text level, the various emotions were relatively evenly distributed. Clearly, the wall-of-text posters experience a wide spectrum of emotions. But at the user level, hate absolutely dominated: pound for pound, the users of those subreddits overwhelmingly tended to post hateful content. However, one side clearly predominates, and as you can see, it's the misandric subreddits whose users specifically tended to post hateful content, while the misogynistic ones posted it notably less. In fact, you can clearly see that r/incels posters were also much less likely than the users of the other three subreddits to post content expressing anger, and much more likely to post content showing sadness, with the researchers acknowledging all of that.
Despite no significant differences can be observed regarding GenderCritical and Incels (which remain mainly skewed toward fear and sadness, respectively), the user-level perspective shifts the distributions peaks of Feminism and Mensrights: while at a text-level, the two communities do not show a visible gap, here we can see that Feminism consistently overcomes the others in terms of hate, followed by GenderCritical and Mensrights.
Conversely, when a user-level perspective is taken into account, distributions drastically change, magnifying the hate peak of Feminism, which significantly overcomes the other communities. Also GenderCritical, despite maintaining an inclination toward fear, skews on anger and hate as well. Under this optic, indeed, misandric communities express more negative sentiments than misogynistic ones.
So much for "hateful incels" who are all "angry white men" according to feminists, redditors, and other people obsessed with us.
@based_meme @DarkStar @Regenerator @Mecoja @Stupid Clown @Sewer Sloth @Sergeant Kelly @To koniec @reveries @NIGGER BOJANGLES @veryrare @LeFrenchCel @PersonalityChad @GeckoBus @Lazyandtalentless @OutcompetedByRoomba @weaselbomber @ItsovERfucks @Grodd @Epedaphic @KING VON @Wumbus