May 7 (UPI) -- Hate communities often flourish online for years, raising the question of how they persist. My research team has found that powerful stories keep members of a hate group galvanized, either by repeating the story over and over or by constantly adding fresh accusations and interpretations to it.
I'm a computational social scientist who studies social and political networks. My colleagues and I uncovered these trends by examining 10 years of posts, reactions and participation patterns in Facebook groups that shared antisemitic and Islamophobic content. Our findings have been accepted at the 2026 International Conference on Web and Social Media.
First, we measured who was posting and how that related to engagement within each group. Groups in which a small number of people produced most of the content tended to attract more reactions and responses. Then we looked at subjects the group members discussed -- religion, immigration, geopolitics -- and the kinds of stories members told about those topics, such as describing an entire group of people as criminals or warning that certain types of people are secretly taking over a country's way of life.
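For readers curious what "a small number of people produced most of the content" looks like in practice, here is a minimal sketch, not the authors' actual analysis pipeline: it computes the share of posts produced by a group's most active members and prints it alongside average reactions per post. The group names, post counts, reaction counts and the 10% "most active" cutoff are all made-up illustrative assumptions.

```python
# Illustrative sketch only -- not the study's actual method or data.
# Measures how concentrated posting is within a group and compares
# that to average engagement (reactions per post).
from statistics import mean

def top_poster_share(posts_per_user, top_fraction=0.1):
    """Fraction of all posts produced by the most active members.

    posts_per_user: list of post counts, one entry per group member.
    top_fraction: share of members counted as "most active"
    (10% is an arbitrary illustrative cutoff).
    """
    counts = sorted(posts_per_user, reverse=True)
    k = max(1, round(len(counts) * top_fraction))
    return sum(counts[:k]) / sum(counts)

# Hypothetical groups: (posts per member, reactions per post)
groups = {
    "concentrated_group": ([120, 95, 4, 3, 2, 2, 1, 1, 1, 1],
                           [30, 28, 35, 25]),
    "diffuse_group": ([12, 11, 10, 10, 9, 9, 8, 8, 7, 6],
                      [5, 6, 4, 7]),
}

for name, (posts, reactions) in groups.items():
    share = top_poster_share(posts)
    print(f"{name}: top posters produce {share:.0%} of content, "
          f"mean reactions per post {mean(reactions):.1f}")
```

In this toy data the group dominated by a couple of prolific posters also shows higher average engagement, mirroring the pattern the researchers describe; the real study, of course, relies on far richer measures over 10 years of Facebook data.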
When we put these pieces together, we discovered some clear patterns. Messages posted by a few very active people were strongly associated with higher engagement in the form of likes and shares in the near term. And repetition -- espousing the same ideas again and again -- was an effective tactic. We also found that when many users kept adding fresh accusations, conspiracy theories and explanations, a group tended to persist. By contrast, very uniform content that reused the same framing led to less engagement over time.