In a world where we like to talk about neutrality, one more thing appears neutral but is not: AI, which has proven to be a mirror of ourselves. Because it is built by humans, it carries the same biases we do, including gender bias. Since humans are the creators of AI, the first thing to highlight is the data on the gender gap itself: women make up only around 22-23 percent of professionals working in AI and data science, and those women are disproportionately found in lower-status roles.
Now let us see how this gender gap makes AI itself gender biased, starting with prejudice and stereotyping; when those two words come up, something serious is usually going on. Consider medical testing: because so few AI and data science professionals are women, medical trial datasets are dominated by data on male bodies, while data on female bodies is scarce. Access to mobile phones is also far lower among women, which makes it difficult to build comprehensive data about them.
Before credit cards, creditworthiness was often assessed on the basis of marital status and gender. The Gender Shades project tackles a related problem today: it highlights the under-representation of women in facial-recognition datasets and works to improve it. Women, and especially women of colour, are far more likely to be misclassified by these systems than men; the systems simply do not serve women the way they serve men.
AI also takes the blame for being sexist because women have had to set up fake profiles on educational platforms to access information and knowledge while protecting themselves from trolling, harassment, and other forms of abuse. This may be hard to believe, but it is the reality we have to deal with if these are the signals and commands we feed into AI. Translation software reinforces existing harmful stereotypes in a similar way: a gender-neutral term (such as "the army" in English) returns a gendered translation (such as "el ejército" in Spanish), implicitly telling the world that the army is a field for men only. Many gender-neutral terms have lost their neutrality in this way.
“AI is MALE”
"The workforce in AI is male-dominated; the data drawn from it is sexist, which makes the output sexist too once the patterns in it are analysed." Often unconscious of gender, humans write rules that benefit themselves, and in AI those rules become the algorithms.
Word embeddings are at the core of Natural Language Processing. But have we kept them free from inherent bias? The answer is no. Imagine googling "cool sports t-shirts" and getting back results that are almost all tees for men. Signs of gender bias like this have been observed in Google Search itself.
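A small sketch can make this concrete. The vectors below are tiny made-up embeddings, not taken from any real pretrained model; the technique, however, is the standard one: subtract the "she" vector from the "he" vector to get a gender direction, then project occupation words onto it. A biased embedding space places "engineer" on the "he" side and "nurse" on the "she" side.

```python
import numpy as np

# Toy 4-dimensional "embeddings" -- hypothetical vectors for illustration only,
# not drawn from any real pretrained model.
vectors = {
    "he":       np.array([ 1.0, 0.1, 0.0, 0.2]),
    "she":      np.array([-1.0, 0.1, 0.0, 0.2]),
    "engineer": np.array([ 0.6, 0.8, 0.1, 0.0]),
    "nurse":    np.array([-0.6, 0.7, 0.1, 0.0]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The "gender direction": the difference between the he and she vectors.
gender_direction = vectors["he"] - vectors["she"]

# Projecting occupation words onto this direction exposes the bias:
# a positive score leans "he", a negative score leans "she".
for word in ("engineer", "nurse"):
    print(word, round(cosine(vectors[word], gender_direction), 3))
```

With real embeddings trained on web text, exactly this kind of projection is how researchers have measured gendered associations for occupation words.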
Through its immense capacity to identify and eliminate distorted or damaged inputs, AI can maximise the efficiency and accuracy of machines: a great promise, and one beyond what humans could achieve, or even conceive of, on their own.
For that to happen, we need to remove this prejudice and train our data in a way that exposes the flaws in our existing predictive models. Training samples can be made diverse (for example, by using as many female-oriented data samples as male ones in the training data that feeds the system). We can then focus on sensitive groups and address unfairness by collecting more training data for them. From there, we can apply modern learning algorithms and debiasing techniques that penalise the model not only for errors in predicting the primary variable, but also for producing such prejudiced outputs.
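The first step above, balancing the training data, can be sketched in a few lines. This is a minimal illustration with a hypothetical toy dataset and a hand-rolled oversampler; in practice one would reach for a library such as imbalanced-learn, but the idea is the same: duplicate records from the underrepresented group until every group is equally represented.

```python
import random

random.seed(0)

# Hypothetical toy training set: each record carries a "gender" attribute.
# The female group is underrepresented, mirroring the gap described above.
data = ([{"gender": "male", "label": 1}] * 80
        + [{"gender": "female", "label": 1}] * 20)

def oversample_balance(records, attr):
    """Oversample minority groups until each matches the largest group's size."""
    groups = {}
    for r in records:
        groups.setdefault(r[attr], []).append(r)
    target = max(len(g) for g in groups.values())
    balanced = []
    for g in groups.values():
        balanced.extend(g)
        # Randomly duplicate minority-group records up to the target size.
        balanced.extend(random.choices(g, k=target - len(g)))
    return balanced

balanced = oversample_balance(data, "gender")
counts = {}
for r in balanced:
    counts[r["gender"]] = counts.get(r["gender"], 0) + 1
print(counts)  # both groups now contain 80 records
```

Resampling alone does not fix every bias, which is why the text also mentions fairness penalties in the loss function, but it is the simplest lever and a natural first experiment.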
One more way to move AI toward gender neutrality is to draw inspiration from systems that imagine how AI itself can be used to achieve a seemingly impossible feat: abolishing the unfairness.
If we want AI to be fairer than we are today, we must believe in its potential; to make that happen, we can advance gender equality, diversity, and inclusion among the people and teams who work on, manage, and develop AI systems.
This is the opportunity to revolutionise: to unlock new and better methods and let go of our fears about how we think, plan, design, and manage AI systems. None of this guarantees an unbiased AI, but with the growing pace of invention in technology we can certainly take these steps and work on them. And once we grab these opportunities, we can happily remove the "racist" and "sexist" tags from AI.
Author Nimisha Agrawal - 20mca001@nirmauni.ac.in