Facebook and SRHR censorship

Facebook is the world’s biggest social network, and its policies have a huge influence on the ideas and information its users are exposed to. Our Love Matters Global Network platforms rely on digital channels to spread information about safe, healthy, pleasurable sex, love, and relationships for young people. But the content moderation policies of online gatekeepers such as Facebook are making that work increasingly complicated.

Between January 2015 and August 2020, 1,792 ads from six Love Matters platforms were rejected by Facebook. Facebook often categorises educational content on sexual and reproductive health and rights (SRHR) simply as ‘sexual content’ such as pornography or explicit nudity. Despite differences in cultural settings, our Love Matters teams in India, Kenya and Egypt face the same challenges when publishing and promoting their content. Whether they talk about sexual health, relationships, bodies, or LGBTQ-related topics, there is always a risk that the content will be disapproved or taken down.

Male bodies are “ok”
What is accepted in one country may not be accepted in another. Vithika Yadav, Head of Love Matters India, believes the algorithms Facebook uses to monitor content can be a problem:

“They are based on Facebook’s understanding of the region. An example would be content on LGBTQ rights. In some countries it is legal, in some – not. But Facebook’s understanding of regions is not based on what is important, but on how to attract more and more people to their platform. If most Facebook users in India are men, Facebook possibly uses algorithms to get into the needs of men. That is why it is so skewed towards men. Facebook does not have a problem with a half-nude male body.”

Users’ attitudes
Yet it is not only algorithms: Facebook users themselves often come between the young audience and information about their SRHR. Restrictive cultural attitudes are an issue, according to Love Matters Arabic’s Social Media Editor May Elhosseiny:

“If you post a picture of a woman who is wearing a swimming suit, sometimes the ad gets approved by Facebook, but after one day they reject it again. That usually happens because other users, who are not followers of our page, may think that this is ‘nudity’. In the Middle East, the topics we talk about on our platform may be considered sensitive. People are very resistant in terms of talking about sex and women’s rights. Facebook is a business, and it follows the trends and shows what people want to see on their platform.”

Creativity in action
In response to the challenges posed by Facebook’s content moderation policies, Love Matters teams find creative ways to keep spreading the word about love, sex and relationships. Love Matters Kenya, for instance, uses Swahili or local slang to avoid English words such as ‘sex’ or ‘vagina’. As Social Media Editor Fiona Nzingo explains,

“A Kenyan would understand, but someone from Facebook’s HQ would be wondering what it means, and it is hard to notice and find translations of these words”.

Elhosseiny employs a similar tactic for Love Matters Arabic:

“Anything that has a sexual context will most likely be disapproved. […] Sometimes we are playing with words. Instead of saying sexual relationship we would, for example, say marital or intimate relationship.”

Constantly adapting and learning what can and cannot be promoted or published has become a huge part of the daily work for the Love Matters teams.

“That kind of strategising happens week after week, because with algorithms you cannot follow the same pattern the whole year. It is really a core part of our everyday work because we have to constantly check on what is ‘safe’ as per Facebook,” says Yadav.

Why does it matter?
These constant restrictions have wider implications, particularly for women in Kenya, according to Nzingo:

“Overall, in Kenya the majority of social media users are men because they are privileged to have phones and time to access social media. When we do have a chance to share information that would help women, we want to amplify it. For some women social media is the only channel that they can access. They rely on this platform, and they should be able to benefit from it, not only in terms of entertainment but also in terms of learning about themselves.”

The same is true for Love Matters Arabic, says Elhosseiny:

“If I did not have these restrictions, I would have been able to reach more people and make more impact; I could have opened discussions with lots of people about different topics. In order to create change, I need to talk to people who are not aware of these topics and who are not on the same page as myself.”

Yadav observes that what works for one region may not work for another, and that there is a need for advocacy on the issue.

“Guidelines and algorithms can’t be one size fits all. Facebook needs to understand this and have a much more localised understanding of issues and problems. They need to set up a policy which specifically looks at SRHR and young people, and this should be without any bias and judgment.”

Not just Facebook
The issue of content disapproval and takedowns goes beyond Facebook. In China, for example, Facebook is banned, but our Love Matters team faces the same challenges with local social media channels such as WeChat and must also find ways to circumvent the country’s strict censorship. A frequent approach is using alternative words, such as ‘little brother’ to refer to the penis and ‘little sister’ to refer to the vagina – terms that are understood by users. The team is constantly on the alert for workarounds as new words and phrases are added to the country’s automated content filters.

The way forward
As an organisation working to create social change with young people through our digital communities, RNW Media recognises the fundamental importance of an enabling online environment. We advocate for the digital rights of young people by fighting for freedom of expression online. As part of these efforts, we are a member of the Dynamic Coalition on the Sustainability of Journalism and News Media of the UN’s Internet Governance Forum (IGF). During this year’s online IGF conference we are drawing attention to the issue of Facebook censorship of SRHR advertisements through a case study in the Dynamic Coalition’s annual report. During the IGF session ‘#Netgov and news media sustainability in the times of crisis’ on Thursday 5 November (14:40-16:10 CET), Fiona Nzingo, Social Media Editor for Love Matters Kenya, will present this case study and call upon online gatekeepers to ensure more transparency and accountability in their content moderation policies.