Examining Cyberbullying and Responding to Systemic Racism

It is estimated that more than four billion people worldwide will be using social media by 2025. Although most people use social media to communicate with family and friends, many also use platforms and apps to gain information and engage with communities on a range of issues. The polarization and sharing of news content in an era of "fake news" and misinformation exacerbates potential conflicts online and can reinforce false rhetoric about specific social issues and racial groups. As a result, social media provides a forum in which hate speech and cyberbullying thrive, while users have only a limited understanding of the tools or tactics available to counter these attacks. Indeed, around 70% of people report having done something aggressive toward someone online, and the majority of them report having been cyberbullied themselves. Even more disturbing, nearly 90% of teens say they have witnessed bullying online.

While false rhetoric, hate speech, and cyberbullying have many deleterious effects, there is a silver lining: over 80% of young people say they have seen other people stand up and intervene during incidents of cyberbullying online. This high percentage shows the power of bystander intervention, a strategy that has proven effective in promoting the dissemination of evidence-based public health information, and it holds great promise for addressing and curbing online interactions that reinforce systemic racism. Even more promising, a majority of young people say they want to learn effective strategies for intervening in cyberbullying situations.

While existing studies focus primarily on gender-related or LGBTQIA-related cyberbullying, hate speech and cyberbullying related to race and racism have received less attention. Racism remains one of the most polarizing topics in America. The polarization of social media has helped reopen a Pandora's box, allowing white supremacy and racism to wreak havoc in people's lives. As previous research has shown, reactions to the #BlackLivesMatter movement have created echo chambers on social media that reinforce hate speech and cyberbullying linked to race and racism. Social media gives people the opportunity to hide their identities, much like the KKK hoods of the past.

This report aims to identify effective strategies to combat hate speech and disinformation online. By examining how people react to cyberbullying, our goal is to highlight bystander intervention strategies that are effective in building healthy communication, defusing anger and frustration, and changing attitudes. This research has broader implications for leveraging strategies, tools, and tactics, many of which have already helped counter the spread of public health disinformation, and for developing and implementing positive coping strategies that support better mental and emotional health outcomes among marginalized communities.

Accordingly, we conceptualize effective bystander strategies as those that:

  • are viewed by other social media users as supportive;
  • shift the discussion in a more positive, objective, and less antagonistic direction; and
  • change the online behavior of the bully or agitator.

Through this effort, the team aims to answer the following questions: How do people combat disinformation online, especially disinformation related to systemic racism, and, more specifically, how do people engage in bystander responses on social media? What strategies do they use, and how effective are those strategies in changing attitudes? How do people encourage healthy coping strategies for better mental and emotional health outcomes?

By analyzing over two million tweets and posts drawn from Twitter and Reddit since 2020, we examined the effectiveness of bystander strategies used online to combat racism. These social media platforms were chosen because they have built-in ranking systems that allow us to examine which strategies are considered most effective. On Twitter, people like and retweet posts. On Reddit, people upvote or downvote comments, moving them up or down in the ranking so that higher-ranked comments are more visible to others. Both platforms are also open, allowing most people to comment on most tweets or posts.
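
For illustration only, the sketch below shows one way such engagement signals could be turned into a rough proxy for how supportive other users found a bystander reply. The field names, weights, and example replies are our own assumptions for the sketch, not the report's actual measurement pipeline.

```python
from dataclasses import dataclass

@dataclass
class Reply:
    platform: str       # "twitter" or "reddit"
    text: str
    likes: int = 0      # Twitter likes
    retweets: int = 0   # Twitter retweets
    upvotes: int = 0    # Reddit upvotes
    downvotes: int = 0  # Reddit downvotes

def engagement_score(reply: Reply) -> float:
    """Crude proxy for how supportive other users found a bystander reply."""
    if reply.platform == "twitter":
        # Assumption: retweets amplify reach, so weight them above likes.
        return reply.likes + 2 * reply.retweets
    # Reddit's net score is simply upvotes minus downvotes.
    return reply.upvotes - reply.downvotes

replies = [
    Reply("twitter", "Here's a source debunking that claim...", likes=120, retweets=40),
    Reply("reddit", "That's a stereotype, and here's why it's wrong...", upvotes=85, downvotes=5),
    Reply("twitter", "lol ok", likes=2, retweets=0),
]

# Rank bystander replies from most to least engaged-with.
for r in sorted(replies, key=engagement_score, reverse=True):
    print(f"{engagement_score(r):>6.1f}  [{r.platform}] {r.text[:50]}")
```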

Methodologically, we conducted a quantitative analysis of tweets and posts and a content analysis of comments. The analysis focused on four areas related to anti-racism (systemic racism, police brutality, educational inequality, and employment and wealth), using synonyms for each term to search for hashtags on Twitter and posts on Reddit that use these terms.
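
As a minimal sketch of the kind of keyword-and-synonym matching this implies, the snippet below tags posts with the focus areas whose terms they contain. The synonym lists and sample posts are hypothetical stand-ins, not the actual query terms or data used in the study.

```python
# Illustrative synonym lists for the four focus areas (hypothetical terms,
# not the report's actual query set).
FOCUS_AREAS = {
    "systemic racism": ["systemic racism", "institutional racism", "structural racism"],
    "police brutality": ["police brutality", "police violence", "excessive force"],
    "educational inequality": ["educational inequality", "school funding gap", "achievement gap"],
    "employment and wealth": ["wealth gap", "racial wealth gap", "hiring discrimination"],
}

def match_focus_areas(text: str) -> list[str]:
    """Return the focus areas whose synonyms appear in a post or hashtag."""
    lowered = text.lower()
    return [area for area, synonyms in FOCUS_AREAS.items()
            if any(term in lowered for term in synonyms)]

# Example posts (invented for illustration).
posts = [
    "New study on the racial wealth gap and hiring discrimination",
    "Police brutality protests continue downtown",
    "Weekend recipes thread",
]

for post in posts:
    areas = match_focus_areas(post)
    if areas:
        print(f"{post!r} -> {areas}")
```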

We found four main types of racist speech: stereotyping, scapegoating, accusations of reverse racism, and echo chambers. We also found four types of bystander intervention strategies: call-outs, insults or mockery, attempts to educate or provide evidence, and content moderation. However, only one in six Twitter threads and just under 40% of Reddit threads involved bystander action. Our findings contribute to research identifying online communication patterns and effective strategies to address hate speech and disinformation about systemic racism.

In this report, we provide an overview of academic research on cyberbullying and social support, a detailed methods section describing our analytical approach, and the quantitative and qualitative results of our investigation into how discussions of systemic racism unfold on social media.

>> Download the full report

