By: Hammad Ahmad

Stereotypes have long influenced societal perceptions, shaping biased and preconceived opinions about individuals and communities. According to Merriam-Webster, a stereotype is a standardized mental picture held by members of a group that represents an oversimplified, prejudiced, or uncritical judgment. In simpler terms, it is an assumption made without any solid basis.

The Indo-Pak subcontinent is rife with such stereotypes—ranging from the belief that women are inferior to men, to the notion that men must not express emotions, to the perception that minority groups are only fit for meager jobs. Among these, one particular stereotype stands out for its inherent hypocrisy—the belief that married men are afraid of their wives. This stereotype, widely observed across the region, implicitly portrays wives as antagonists in their spouses’ lives.

However, data tells a different story. According to a report by the United Nations Population Fund (UNFPA), 34% of married women have experienced physical, sexual, or emotional violence at the hands of their spouses. A study by the Asian Development Bank further revealed that during COVID-19, physical spousal abuse rose to 46% in Punjab and Sindh. Clearly, the stereotype of “husbands fearing their wives” does not align with reality.

This raises an important question: Should we trust stereotypes or data? The answer is obvious—data must prevail. However, there is a growing concern over how artificial intelligence (AI) models, such as ChatGPT and other generative AI systems, contribute to reinforcing societal stereotypes.

AI, Stereotypes, and the Power of Data

Exactly a month ago, on February 20, 2025, a user engaged in a conversation with ChatGPT, requesting jokes in Roman Urdu. However, the model’s responses carried a noticeable Hindi linguistic flavor. Interestingly, nearly 70% of the jokes it generated revolved around married men being afraid of their wives, reinforcing the same stereotypical narrative prevalent in society.

One such joke read:

A man said to his wife: “Jaan, do you know marriage is like the Titanic?”
The wife replied: “Oh, you mean romantic and luxurious?”
The husband responded: “No, everything seems romantic at first, but then the ship (marriage) starts to sink.”
The wife took off her sandal, and the man is still in orbit.

Notably, before delivering this joke, the AI model asked, “Do you want me to go ‘big bang’?”, to which the user responded affirmatively.

The punchline, “the wife took off her sandal, and the man is still in orbit,” reinforces the stereotype that husbands live in fear of their wives, painting women as domineering and tyrannical figures.

Is AI to Blame?

AI models like ChatGPT are data-driven—they learn from the vast pool of information they are trained on. This means that if biased narratives exist within the data, the AI will reflect and reproduce them. Therefore, the problem does not lie with AI itself but with the underlying data that shapes its responses.

The solution? A deliberate effort to create, refine, and revamp datasets to ensure gender-neutral, stereotype-free content. This requires AI experts who not only understand technology but also grasp the intricate relationship between society, stereotypes, and data.
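To make this concrete, below is a minimal sketch, in Python, of what a first pass at such a dataset audit might look like. It is purely illustrative: the marker phrases, the function name flag_stereotyped, and the toy data are all assumptions made for this example, and a serious curation effort would rely on trained bias classifiers and human review rather than simple keyword matching.

    # Illustrative sketch only: a naive keyword heuristic for flagging
    # stereotype-laden examples in a text corpus. The marker list and
    # data below are hypothetical; real audits use trained classifiers
    # and human review.
    STEREOTYPE_MARKERS = [
        "afraid of his wife",
        "scared of his wife",
        "took off her sandal",
    ]

    def flag_stereotyped(examples):
        """Split examples into (flagged, clean) lists by keyword match."""
        flagged, clean = [], []
        for text in examples:
            lowered = text.lower()
            if any(marker in lowered for marker in STEREOTYPE_MARKERS):
                flagged.append(text)
            else:
                clean.append(text)
        return flagged, clean

    # Toy usage with made-up data.
    dataset = [
        "He told a joke and everyone laughed.",
        "The man is afraid of his wife, so he hid the remote.",
    ]
    flagged, clean = flag_stereotyped(dataset)
    print(f"Flagged {len(flagged)} of {len(dataset)} examples for review.")

Even a crude filter like this can reveal how often a single trope recurs in a corpus, which is the kind of visibility dataset curators need before deciding what to rebalance or remove.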

AI: A Double-Edged Sword

AI holds immense potential in combating gender biases across all sectors of life. It can help dismantle prejudices against both men and women, challenge stereotypes about minority communities, and foster a more inclusive digital ecosystem. However, if left unchecked, it can also reinforce and amplify existing biases.

While AI engineers and professionals have a crucial role to play in fine-tuning algorithms and improving data, the responsibility does not rest solely on them. As consumers of AI-generated content, we must be critical of the information we consume and engage in thoughtful analysis of the narratives AI models present.

The future of AI is in our hands. It can either perpetuate societal biases or become a tool for progress—the choice is ours.
