Does AI Reflect Human Behaviour? Exploring the Presence of Gender Bias in AI Translation Tools

EasyChair Preprint no. 10954

18 pages · Date: September 23, 2023


Natural language processing tools are becoming increasingly important in our daily lives, enabling us to perform many tasks in a timely and efficient manner. However, as the utilisation of these tools grows, so does the risk of unexpected consequences due to the presence of bias. This study investigates the presence of gender bias within the most popular neural machine translation and large language model tools. We defined a set of Italian sentences concerning ten specific jobs, where the gender of the subjects is not explicitly mentioned. Employing those AI tools, we translated the sentences from Italian to English, requiring the gender to be explicitly mentioned. Afterwards, we developed a survey to obtain human translations for the same sentences, allowing us to compare the differences between the responses generated by the tools and those from individuals. Results show a high presence of gender bias, especially for jobs associated with a male gender, and demonstrate a consistency between the outcomes obtained by the tools and the results of the survey. These findings serve as a starting point for exploring the origins of gender bias within natural language processing tools and how they reflect gender distributions in our society and human behaviour regarding job occupations.

Keyphrases: Artificial Intelligence Bias, gender bias, large language models, Natural Language Processing, Neural Machine Translation

BibTeX entry
BibTeX does not have a dedicated entry type for preprints; the following workaround produces a correct reference:

@booklet{EasyChair:10954,
  author = {Marco Smacchia and Stefano Za and Álvaro Arenas},
  title = {Does AI Reflect Human Behaviour? Exploring the Presence of Gender Bias in AI Translation Tools},
  howpublished = {EasyChair Preprint no. 10954},
  year = {EasyChair, 2023}}