Using moderation to build communities

The classic Silicon Valley moderation model is all about the content – identifying material that is abusive or incites violence. For RNW Media, moderation goes deeper than removing the bad stuff: it’s about looking at the conversations taking place and generating dialogue. Our moderation strategies are based on choices about how we want our online communities to interact, and on encouraging positive behaviours that result in safe, inclusive spaces where young people can speak freely.

Citizens’ Voice operates in restrictive settings where freedom of expression, assembly and association are limited and young people lack access to reliable information and alternative points of view. Local teams create and distribute alternative (and independent) sources of digital media, offering young people pluralistic information and perspectives on sociocultural norms and values. Our platforms and multimedia content are the foundation for digital communities – safe spaces where young people from across political, ethnic, racial, regional or religious divides can engage in discussion and ask questions about sensitive topics in a way that is often impossible in the offline space.

Local expertise
Careful moderation is an essential tool for building vibrant, respectful and safe digital communities. When applied as part of a holistic engagement strategy, moderation allows us to convert passive lurkers into active participants – bringing marginalised groups into the conversation and creating a safe space where young people not only feel able to take part but also benefit from doing so. RNW Media’s country teams all have local moderators who follow the online discussions seven days a week. By engaging with users and asking specific questions on posts and comments, they encourage young people to think critically and engage in constructive discussion.

Ignore, challenge, warn
Moderators adopt different strategies depending on their goals. To create a safe and secure environment where people feel able to participate in the discussion and feel heard, our moderation strategy focuses on reinforcing constructive conversation and behaviours and limiting the impact of antagonistic or unconstructive user comments. The tactics employed may be as simple as ignoring unwanted comments so they don’t get extra visibility, but moderators may also choose to challenge antagonistic users with an alternative viewpoint. Moderators can also use a yellow card system to warn users who do not abide by the community guidelines, with repeat offenders ultimately being blocked. When the goal is to inform users and validate their opinions, moderators will answer users’ questions, provide additional information and settle disputes with facts or by referring to research.
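The yellow card escalation can be pictured as a simple per-user counter. The sketch below is only an illustration, assuming a hypothetical two-warning threshold; the actual rules and tooling our moderators use are not described in this article.

```python
from collections import defaultdict

WARNING_LIMIT = 2              # hypothetical: yellow cards allowed before a block
warnings = defaultdict(int)    # user id -> number of yellow cards issued
blocked = set()                # users who have been blocked

def issue_yellow_card(user_id: str) -> str:
    """Record a guideline violation and return the resulting moderation action."""
    if user_id in blocked:
        return "already blocked"
    warnings[user_id] += 1
    if warnings[user_id] > WARNING_LIMIT:
        blocked.add(user_id)
        return "blocked"
    return f"warning {warnings[user_id]} of {WARNING_LIMIT}"

# A user who repeatedly ignores the community guidelines is warned twice, then blocked.
for _ in range(3):
    print(issue_yellow_card("user_42"))
```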

Many of the topics Citizens’ Voice covers are sensitive, touching on social, cultural, gender and religious norms. This kind of content can generate strong reactions, and the moderators aim to stimulate constructive and respectful dialogue and to create space for different opinions to be expressed. Tactics can include prompting people with opposing viewpoints to engage with each other, as well as encouraging users to read the content in question or other related material.

Taking the middle ground
Polarisation is always a risk when discussing sensitive issues, and it creates an atmosphere where users feel unable or unwilling to participate. When working to defuse polarisation, moderators look to manage the conversation and move it from monologues to dialogue. The most effective tactic for achieving this is to target the middle ground. This can take the form of ignoring extreme positions and steering the discussion away from them. It also requires changing the tone, using mediating language to try to connect with a diverse range of users rather than focusing on ‘right’ or ‘wrong’.

Does it work?
We are working to measure the effectiveness of our moderation using A/B testing. This involves comparing the results when discussions around the same content/post are moderated or left unmoderated. Early findings from several small tests carried out on the Facebook pages of our Yaga Burundi and Benbere (Mali) platforms show that moderated posts attracted more comments, and that a higher proportion of these were ‘real’ comments rather than phatic ones (simple comments such as ‘hello’) or one-word comments, compared with unmoderated posts.
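As an illustration of that comparison, the sketch below tallies the average comment count and the share of ‘real’ comments for moderated versus unmoderated posts. The sample data, field names and classification labels are hypothetical and stand in for the actual test data.

```python
from statistics import mean

# Hypothetical sample: each post is labelled as moderated or not, and each
# comment is classified as 'real', 'phatic' or 'one-word'.
posts = [
    {"moderated": True,  "comments": ["real", "real", "phatic", "real"]},
    {"moderated": True,  "comments": ["real", "one-word", "real"]},
    {"moderated": False, "comments": ["phatic", "one-word"]},
    {"moderated": False, "comments": ["real", "phatic"]},
]

def summarise(group):
    """Average comments per post and average share of 'real' comments."""
    counts = [len(p["comments"]) for p in group]
    real_shares = [
        sum(c == "real" for c in p["comments"]) / len(p["comments"])
        for p in group
        if p["comments"]
    ]
    return mean(counts), mean(real_shares)

for label in (True, False):
    group = [p for p in posts if p["moderated"] == label]
    avg_comments, avg_real = summarise(group)
    print(f"moderated={label}: {avg_comments:.1f} comments/post, {avg_real:.0%} real")
```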

Making space for women
Analysis of our channels in the countries where we work also reveals that in some countries more women engage (like, comment, share) with our content than they do with content on Facebook in general. Fourteen percent of all Yemenis on Facebook are women, and we have the same percentage of women followers on our Yemen Youth Panel platform. However, 26% of all engagements on our platform come from women, while the average for social media in Yemen is just 10%. We see similar results in Libya. The percentage of women participating on our Facebook pages is very close to the figure for Facebook in Libya in general, yet women’s engagement on our page is around 65%, while the Facebook average for Libya is around 35%.

These early findings back up our assumptions around community moderation. More extensive A/B testing is planned to further confirm that moderation can influence the quality of conversations and help create an inclusive environment that encourages more users to participate.