Are Indian women stereotyped by AI? Gender bias in AI has been around for a while
Social media frequently enthralls us with unusual ideas and creations that make the most of technology. As AI-generated art and graphics take over the online world, gender bias in artificial intelligence (AI) has emerged as a talking point. Users have embraced the trend enthusiastically, and a Delhi-based artist has given it a somewhat serious turn with a series of AI-created stereotypes.
The artist has now created part two of his AI stereotypes series, after his Twitter thread featuring artificial intelligence-generated images of stereotypical Indian men went viral on social media. This time he shared stereotyped AI representations of Indian women, a follow-up to the images that had previously circulated on the internet.
Madhav Kohli published pictures imagining what women from various Indian states would look like if they embodied the traditional stereotypes. The thread featured artificial intelligence (AI)-generated women from several states. Some internet users thought the portraits looked “genuine” because they were rendered with such a delicate touch.
Following user requests to rework a few, such as the Uttar Pradesh woman who was depicted with an angry look, Kohli also uploaded new versions. He published a picture of what a younger woman from the state might look like and dubbed it “UP 2.0.”
While some netizens clamored for renderings of women from numerous other regions, others applauded Kohli for the superb images and noted how creative they appeared. “I adore every single one of these. So creative!” one person said, and another added, “This thread… Worth a look… It doesn’t matter if you concur or not.”
Another group of users objected to the images. “Matlab kisi bhi jagah ki ladki chubby nahi hoti kya? (So no place has chubby girls?) Unreal aesthetic standards and no inclusiveness at all,” one wrote. “All appear the same for some reason,” another user noted. Some even made fun of the stereotypes, asking, “Can you share the stereotype you used? Simply curious.”
Many organizations base their decisions on artificial intelligence (AI) systems that use machine learning (ML), in which algorithms analyze vast amounts of data and learn from it to detect patterns and anticipate future outcomes. These systems help determine how much credit financial institutions extend to particular clients, who receives COVID-19 vaccines first, and which job applicants are called in for interviews.
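To make that learn-then-predict loop concrete, here is a minimal sketch. The task (loan screening), the feature names, and the synthetic data are all hypothetical, invented purely for illustration; real systems are far larger but follow the same shape.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic historical records: [income_in_thousands, years_employed]
X_train = rng.normal(loc=[50, 5], scale=[15, 3], size=(500, 2))
# Past approval decisions the model will learn to imitate
y_train = (X_train[:, 0] + 5 * X_train[:, 1] > 60).astype(int)

# "Learn from the data": fit a classifier to the historical patterns
model = LogisticRegression().fit(X_train, y_train)

# "Anticipate future events": score applicants the system has never seen
new_applicants = np.array([[45, 2], [80, 10]])
print(model.predict(new_applicants))        # predicted approve/deny
print(model.predict_proba(new_applicants))  # approval probabilities
```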
However, gender prejudice is ubiquitous in these systems, and it significantly harms women’s psychological, economic, and physical security in both the short and long term. It can also accentuate and perpetuate the undesirable gender stereotypes and prejudices that already exist.
Because they are developed by humans, AI systems are prejudiced. Who makes the decisions that affect AI systems, and who works on the teams creating them, shapes how such systems develop. And there is undeniably a large gender gap: women make up only 22% of professionals in the AI and data science disciplines, and they are more likely to work in lower-status positions.
On a more granular level, humans produce, gather, and label the data that make up datasets. Humans also choose the datasets, variables, and rules that the algorithms use to make predictions. Both of these stages can introduce biases that permeate AI systems.
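A minimal sketch of how this happens, using purely synthetic data: if the historical labels humans supplied were skewed against one group, a model trained on them faithfully reproduces that skew, even when the underlying qualification is identical across groups. All names and numbers below are assumptions made up for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2_000

gender = rng.integers(0, 2, n)   # 0 = men, 1 = women (synthetic)
skill = rng.normal(0, 1, n)      # identically distributed by design

# Human-labeled outcomes: equal skill, but women were approved less often
y = (skill + rng.normal(0, 0.5, n) - 0.8 * gender > 0).astype(int)

X = np.column_stack([skill, gender])
model = LogisticRegression().fit(X, y)

# The learned weight on `gender` encodes the historical skew directly
print(dict(zip(["skill", "gender"], model.coef_[0].round(2))))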
For ML systems, making gender equity and justice a top priority can shape both design and management choices. We must acknowledge that ML systems are not objective. Even systems built for good (for instance, one intended to improve equity in hiring or in creditworthiness assessments) can be vulnerable to bias, just like their human designers.
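One common first step in auditing such a system is to compare its positive-prediction rates across groups, a check often called demographic parity. The sketch below is self-contained; random decisions stand in for a trained model, and the groups and selection rates are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000

gender = rng.integers(0, 2, n)   # 0 = men, 1 = women (synthetic)
# Stand-in for a trained model's screening decisions: men are
# selected 60% of the time, women 40%, mimicking a biased model
preds = rng.random(n) < np.where(gender == 0, 0.6, 0.4)

rate_men = preds[gender == 0].mean()
rate_women = preds[gender == 1].mean()
print(f"selection rate, men:    {rate_men:.2f}")
print(f"selection rate, women:  {rate_women:.2f}")
print(f"demographic parity gap: {rate_men - rate_women:.2f}")
```

A gap near zero does not prove fairness on its own, but a large gap like this one is a clear signal that a system warrants scrutiny before deployment.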
For instance, the Commission of Women Victims for Victims (KOFAVIV), a community organization in Haiti that uses technology to protect the rights of marginalized communities, collaborated with Digital Democracy to develop a safe method for gathering data on gender-based violence. The system enabled local women to track, analyze, map, and share that data.