“You are incomplete.”
“You are deficient in intellect.”
“You have half a brain.”
These are not just phrases we might come across in a book from the medieval era. Women in many regions and countries across the world are still bombarded with such degrading messages, both offline and online, that cast them as inferior. I myself have had to listen to such language in real life.
In this blog post, I will study the nature of explicit misogynistic narratives in Arabic and explore their real, harmful impact on women’s lives and wellbeing in the region. In this region, being cast as inferior doesn’t just mean being seen as lesser; it can lead to death. I will compare these Arabic narratives with their English counterparts, and I will also explore how implicit hate speech against women manifests online and why it is harder to detect in English.
The statistics about violence against women are staggering. The UN has reported that gender-based violence against women and girls, including sexual violence, is one of the most systematic and widespread human rights violations in the world.
The persistent use of degrading and offensive language online discourages women from speaking up and encourages abuse and violence offline. Men and women alike are regularly exposed to narratives questioning female autonomy, intellect and capability.
Simultaneously, narratives that glorify motherhood, family “honour” and the qualities of “a good wife” are continuously spread on mainstream media and circulate on social media. This reinforces archaic representations of women that don't account for our full individuality and potential. These narratives might not break platform policies, but it is important for us to think about misogyny on a spectrum.
We have seen the real-world consequences of online misogyny in everything from sexual harassment to violent attacks on women, many of which start as online conversations.
Combating online sexism and misogyny starts with understanding the complexity of explicit and implicit hate speech against women, decoding hidden messages and reading between the lines. It is a challenging mission, and it gets harder still when the attacks on women are buried in long conversations and in audio and video speech.
Killing, domestic violence, female genital mutilation, abuse, harassment, and early marriages are the daily reality for many women and girls around the world. The statistics are deeply shocking, yet they are not a true reflection of the phenomenon. In the Arab region, for example, figures are highly underestimated, as violence against women and girls goes greatly underreported. Most women are afraid to speak up, so their stories of abuse and violence often go untold.
Recently, people in the Arab world have been shocked by a wave of killings of women in broad daylight. The recent killings of two university students, Naiyera Ashraf from Egypt and Iman Rashid from Jordan, are not isolated cases. They are part of a wave of violence, a problem that crosses socio-political and cultural lines.
The language is equally shocking, blatant and undisguised. Women in the Arab world are bombarded with humiliating and demeaning words and phrases in school textbooks, mainstream media and social media. Some online users feel no shame in using such hateful language, and they face no consequences. When the state, police, society, and language are all in cahoots, women are left with no protection in the real world. It is critical that anyone hosting conversations online attempts to create safe and free spaces for women in the virtual world.
At Kinzen, we study misogynistic narratives in multiple languages, including Arabic, decode the language around them, and assess the risk and real-world impact. On a daily basis, we detect new and emerging hashtags, keywords and phrases that are loaded with hate speech, in a region that is already struggling with abuse and violence against women.
Some of the most common undisguised misogynistic narratives blatantly describe women as “incomplete” or as having “half a brain”. A simple and quick search on social media platforms using such phrases leads to many pages, groups and videos with obviously hateful names and titles. For example, the phrase ناقصات عقل و دين (“deficient in intellect and religion”) is used to claim that women lack reason and intellect. This misogynistic phrase is the headline of hundreds of videos, some of which have been watched millions of times on social media.
Videos and podcasts with headlines loaded with misogyny are not difficult to spot online. Yet, they still exist. The deeper challenge comes when hate speech against women occurs in casual conversation with neutral headlines.
Blatant and undisguised phrases occur in the Arabic audio and video that I analyse on a daily basis in my work. The screenshot below is taken from Kinzen’s audio dashboard. It shows the keyword “adulteress” highlighted in yellow where it occurs in the transcript. Although the speaker called for both the adulteress and the adulterer to be punished, he singled out the “woman adulterer” in particular and blamed her for being the “first instigator of the crime of adultery.” A researcher checking only the episode title would miss this example of misogyny; they need to dive into the content of the show itself and understand the context.
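To make the idea concrete, here is a minimal sketch of this kind of keyword spotting in a transcript. It is illustrative only, not Kinzen’s actual tooling: the function name, the keyword list and the transcript are all invented for the example. The point it demonstrates is the one above: a keyword hit alone is not enough, so each match is returned with its surrounding context for a human reviewer to judge.

```python
import re

def find_keyword_contexts(transcript, keywords, window=40):
    """Find each keyword in a transcript and capture its surrounding context.

    Returns a list of dicts, one per match, each holding the keyword,
    its character position, and a snippet of nearby text for review.
    """
    hits = []
    for kw in keywords:
        # Case-insensitive literal match; re.escape guards special characters.
        for m in re.finditer(re.escape(kw), transcript, flags=re.IGNORECASE):
            start = max(0, m.start() - window)
            end = min(len(transcript), m.end() + window)
            hits.append({
                "keyword": kw,
                "position": m.start(),
                "context": transcript[start:end],
            })
    return hits

# Invented transcript echoing the example discussed above.
transcript = ("The speaker called to punish both adulteress and adulterer, "
              "but blamed the adulteress as the first instigator.")
for hit in find_keyword_contexts(transcript, ["adulteress"]):
    print(hit["position"], "->", hit["context"])
```

A real system would of course work on tokenised, language-aware text (Arabic morphology in particular defeats naive string matching), but the workflow is the same: surface the match, keep the context, and leave the judgment to a subject matter expert.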
I lived in a region where I never saw a man “punished” for “adultery,” yet women were, and still are, beaten, abused or even killed by their partners or families when such accusations are made against them.
Although we don’t see much blatantly degrading language used by the state or the media in the Western world, implicit biases, incidental language and casual misogyny still exist.
Casual misogyny is more disguised in the West. It hides behind sophisticated conversation, messages and comments. Although it does not always rise to the level of blatant, explicit misogyny, it creates an environment that allows hate speech against women to spread widely.
Indirect misogynistic messages often target women’s personal appearance, weight, clothes, etc. Women who get promotions have to face questions that no man would ever be asked. These problems are reflected in the audio format too.
For example, this screenshot is taken from the Kinzen audio dashboard in English. It shows speakers targeting female public figures and politicians by accusing them of using their bodies to gain power. Although the transcripts don’t include explicitly abusive language, we are able to detect implicit narratives using Kinzen’s tool. Our subject matter experts are able to contextualise possible sexist messages given the nature of the topic as well as the female politicians mentioned in the conversation.
Because explicit and implicit misogyny are so complex, Trust and Safety teams struggle to detect such messages and act on them.
At Kinzen, our team of experts examines harmful online content, including audio and video, and provides insights to inform difficult decision-making. Crucially, our work is not limited to English. Our editorial network of experts gathers insights from across countries and languages, including Arabic, Hindi, German, French, Turkish, Spanish, and many more.
Human expertise is critical to understanding the complex evolution of these narratives, and when it is married with technology we can scale detection broadly and identify harm with speed. At Kinzen, we continue to build the technology needed to detect explicit and implicit hate speech against women in the online spaces where they should be safely represented.