Nothing Special   »   [go: up one dir, main page]

.Marked FWbZAtM

Download as pdf or txt
Download as pdf or txt
You are on page 1of 7

Proceedings of the 2023 International Conference on Management Research and Economic Development

DOI: 10.54254/2754-1169/23/20230367

Gender Bias in Hiring: An Analysis of the Impact of


Amazon's Recruiting Algorithm
Xinyu Chang1,a,*
1
Information School, University of Washington, Seattle, Washington, United States, 98105
a. xchang3@uw.edu
*corresponding author

Abstract: Algorithmic bias in artificial intelligence (AI) is a growing concern, especially in


the employment sector, where it can have devastating effects on both individuals and society.
Gender discrimination is one of the most prevalent forms of algorithmic bias seen in
numerous industries, including technology. The underrepresentation of women in the field of
information technology is a well-known issue, and several organizations have made tackling
this issue a top priority. Amazon, one of the world's top technology businesses, has been at
the forefront of initiatives to increase inclusiveness and diversity in the sector. Concerns exist,
however, that algorithmic bias in their recruitment process may perpetuate discrimination
based on gender. This study intends to investigate these issues by employing an interpretive
epistemology and utilizing interviews and focus groups to acquire a more nuanced knowledge
of the subject, with key factors contributing to algorithmic gender bias in Amazon's
recruitment process and recommend strategies for improving women's employment in
information technology.

Keywords: algorithmic bias, gender discrimination, interpretivism, interviews, focus groups,


and diversity

1. Introduction
With more than 310 million users and 2.2 billion visits worldwide until 2022 [1], Amazon.com's
reach is worldwide and enormous. Given its vast influence, Amazon's workforce has steadily grown
over the years, with job seekers seeing employment with the company as a desirable outcome [2]. To
manage the overwhelming number of applicants, Amazon developed an AI-powered automated
hiring tool in 2014 that used algorithms to review resumes and select candidates. It may sound like
the emergence of this technology has brought well-being to people, based on the popular belief that
the machine algorithms used for the worker selection process that have emerged in today's workplaces
are more supposedly neutral, efficient, and cost saving compared to humans [3]. However, the
Amazon AI-powered automated hiring tool raises gender bias, a very serious ethical problem, which
limits the opportunities for women's employment in the Information Technology field. As of 2017,
the gender ratio of Amazon's global workforce was 60 percent male and 40 percent female [2].
Moreover, the company does not even disclose the gender breakdown of its technical employees to
the public [4]. Based on the information above, it is necessary to think about "Do we need to
reexamine the neutrality of the algorithms used today at this point? Are algorithms truly impartial and
logical in the way that people generally believe?"
© 2023 The Authors. This is an open access article distributed under the terms of the Creative Commons Attribution License 4.0
(https://creativecommons.org/licenses/by/4.0/).

134
Proceedings of the 2023 International Conference on Management Research and Economic Development
DOI: 10.54254/2754-1169/23/20230367

Even though the Amazon case shows that algorithmic bias could be a problem today, there is still
not enough research on the dark side of algorithms. This means that there are few or no specific
solutions to algorithm-induced gender bias. To fill this gap in research, the goal of this study is to
define Amazon's algorithm as a major problem in the way people are hired today and to do an in-
depth analysis of algorithmic bias that causes gender bias. Compared to traditional quantitative
research methods, such as big data analysis, this study shows the need for more nuanced approaches
that take into account the perspectives and experiences of women in technology, rather than just
relying on data-driven solutions. This study will first validate the pervasiveness of algorithmic bias
today and the negative impact it has on female employees by analyzing the academic literature related
to Amazon's AI algorithm and other companies that have addressed algorithmic bias in their recruiting
workplaces, such as LinkedIn and InfoJobs. Second, the author will use qualitative research methods,
including interviews and focus groups, to explore individual perceptions of the impact of algorithmic
bias on individuals. This research aims to address systemic issues such as algorithmic bias in gender
discrimination and emphasize the importance of ongoing research and collaboration to develop and
implement more effective strategies and solutions to create fairness in Amazon’s recruiting process.
With these methods, the author can begin to understand the negative impact of algorithms on women,
subsequently evoke the need for greater digital literacy in the technology field, and finally be able to
bring academic contributions to the field related to algorithmic bias.
2. Literature Review
With the rise of artificial intelligence and machine learning, it is important to consider the potential
impacts of bias in algorithms. Especially in the recruitment algorithms of technology companies, the
algorithmic gender bias problem is particularly prominent. The aforementioned case of gender bias
triggered by Amazon's AI hiring technology is just the tip of the iceberg, and female job seekers face
unequal job opportunities at other companies as well. First, according to the research [5], researchers
used a large historical corpus of 17 million entries spanning a decade from LinkedIn, the largest
online professional networking site with 500 million users, to validate and investigate the terms of
possible gender bias in job postings and the impact on job seekers through empirical, data-driven, and
qualitative user surveys. The analysis of this study showed that there is indeed a significant gender
bias in job listings, where the use of a large number of gender keywords drives a significant amount
of bias, and that the technical bias present in such algorithms has a higher impact and influence on
the hiring decisions of female users applying for specific positions. Secondly, as another study
mentioned [3], researchers aimed at exploring the automated job alerts generated in InfoJobs, a
popular Spanish employment platform. Researchers similarly confirmed the presence of gender bias
in such job recruitment platforms and its negative impact on female personnel in the selection process
through a corresponding testing procedure. The study designed eight matched resumes based on a
corresponding test program in which researchers manipulated the gender of candidates in different
professional fields (female-dominated versus male-dominated) and found significant differences
between female-dominated and male-dominated sectors on three variables: occupational category,
salary, and the number of permanent contracts shown in the automated job alerts received by female
and male candidates. Thus, the system had higher employment qualifications and compensation in
the job alerts provided to males compared to female employees. This field experiment based on
correspondence testing procedures validates the significant impact of human gender bias in the hiring
process, however, the frequency of the use of such biased algorithms and job platforms continues to
rise in the company's hiring process. As a result, female candidates are already missing out on many
potentially relevant job opportunities in the initial steps of their hiring careers.
In order to solve the gender discrimination problem caused by algorithmic bias in the career hiring
process, some research has tried to find solutions to deal with the algorithmic gender bias problem in

135
Proceedings of the 2023 International Conference on Management Research and Economic Development
DOI: 10.54254/2754-1169/23/20230367

the recruitment process, as the research has proposed strategies to eliminate algorithmic bias [6], such
as enhancing openness, fostering diversity, and involving staff in the development process. However,
the existing research still has some limitations, so it cannot completely solve the problem of gender
algorithmic bias in the recruitment process. The first problem is that some existing research has
primarily focused on the opinions of current female employees while ignoring the perspectives of
previous female applicants to the IT industry. Another research project conducted an interview with
female IT professionals about their experiences in the industry, focusing on issues such as gender
bias, stereotypes, and barriers to advancement [7]. While the study provides valuable insights into the
experiences of current female employees, it does not address the perspectives of women who may
have left the industry due to bias or discrimination. In the paper, which also finds the same problem
in the research [8], the researcher examines the experiences of women of color in the IT industry,
with a focus on the intersectionality of gender and race. The authors find that women of color face
unique challenges and barriers in the industry, including discrimination, bias, and a lack of support
from colleagues and managers. However, the study does not address the perspectives of women who
may have left the industry due to these issues.
The second issue is that several proposed solutions have relied on big data algorithms to eliminate
gender bias in recruiting procedures; therefore, the majority of extant research is data-driven, but
these solutions may not adequately capture the real experiences and perspectives of women in the
industry. For example, according to the research [9], existing research on gender bias in Natural
Language Processing has focused on the biases in the training data and algorithms but has not
adequately addressed the potential impact of these biases on different stakeholders. Additionally,
Dastin makes numerous recommendations on how to combat algorithmic bias in hiring, including
enhancing the caliber of training data, boosting the accountability and openness of algorithms, and
including a variety of stakeholders in the development process [4]. The essay also admits certain
restrictions on eliminating algorithmic bias in spite of these suggested remedies. For instance, he
points out that, despite the greatest efforts, it might be difficult to completely purge bias from
algorithms. This is due to the fact that bias can be imperceptible and hard to identify, and it may also
have several causes that are challenging to separate. Dastin also emphasizes how the private nature
of algorithms makes it difficult for outside stakeholders to judge their accuracy and fairness. It is
noticeable that the use of big data algorithms to eliminate gender bias in recruiting procedures may
not adequately capture the experiences and perspectives of women. To address this issue, the
researcher suggests involving diverse stakeholders in the design process, conducting more qualitative
research, such as observational methods, and continually monitoring and evaluating their
performance to ensure they do not perpetuate bias.
3. Research Design
3.1. Research Method
To study the research problem of the negative impacts of algorithmic gender bias in Amazon's
recruitment process for prior female IT applicants who may have been discouraged from pursuing
careers and female applicants who are currently in the IT industry, the author proposes to use a
combination of qualitative research methods, specifically interviews, and focus groups. The research
approach is based on the epistemological perspective of interpretivism. Compared to traditional
quantitative research methodologies, such as big data analysis, the author's research demonstrates the
need for more nuanced approaches that embrace the viewpoints and experiences of women in
technology, as opposed to simply depending on data-driven solutions. The interview method will be
used to gather personal and in-depth information from individual participants, while the focus group
method will be used to gather different responses from a group of participants who have had similar

136
Proceedings of the 2023 International Conference on Management Research and Economic Development
DOI: 10.54254/2754-1169/23/20230367

experiences. The focus group consists of some small groups of 6–10 female participants who have
applied for information technology roles at Amazon. The focus group discussions will be recorded
and transcribed, and the data will be analyzed using coding and categorization methods. The author
can solve the methodological gap that just relies on a data-driven reasoning process. Furthermore, in
the author’s research, the author will not only collect information from female applicants who are
currently in the IT industry, but also from individuals who are prior female IT applicants that have
been discouraged from pursuing careers. Additionally, the group dynamics observation that the author
will conduct during the focus group method will enable me to analyze differences in experiences and
perspectives among participants.While acknowledging potential limitations, such as selection bias
and challenges in obtaining honest responses, the author plans to use purposive sampling, establish
trust with participants, and use multiple methods of data collection and analysis to ensure the quality
and validity of the data. By using the interviews and focus group method, the author can solve the
algorithmic gender bias problem from a people-oriented perspective to improve gender diversity in
the IT employment process.
3.2. Generating Data
Generating data for the proposed research project on algorithmic gender bias in Amazon's recruitment
process involves a combination of purposive sampling and qualitative data collection methods.
Purposeful sampling will be used to choose a representative sample of women who have tried to get
jobs in IT before but were told not to and women who are already working in IT. One of the main
benefits of purposeful sampling is that it can help ensure that the research sample is more
representative of the population under study and can help reduce potential bias in the research.
Besides, the sampling method is helpful to obtain a specific type of information. In the research, the
author wants to explore the algorithmic gender bias impacts and the personal experiences of the
recruitment process from the participants. The sample will be segmented into two categories, and
participants will be selected based on age, race, and educational background to ensure diversity. Data
collection will involve the use of qualitative methods, specifically interviews, and focus groups, to
collect non-numerical and primary data such as interview transcripts, written documents, and audio-
visual materials.
The audio-visual materials will include videos of recruitment events, webinars, and training
sessions, as well as other relevant materials that can provide insights into Amazon's recruitment
process and the experiences of candidates. The discussion is typically audio, or video recorded and
later transcribed for analysis. The analysis involves identifying patterns and themes in the data that
emerged during the discussion. The insights gained from the analysis can be used to inform the main
impacts of the research question.
It is important to note that the research subjects will influence the data collected and the insights
generated in this study. The participants are guided by a researcher who asks open-ended questions
to facilitate a discussion that explores their attitudes, beliefs, opinions, and experiences related to the
algorithm gender bias problem in the IT recruitment process. The proposed interview questions are
designed to uncover important information related to gender bias and algorithmic bias in Amazon's
recruitment process. The insights derived from the research will inform the development of effective
strategies to address the issue of algorithmic gender bias and increase the employment rate of women
in information technology. This research will provide valuable insights into the experiences and
perspectives of individuals who have been affected by algorithmic gender bias in the company's
recruitment process and help promote diversity and inclusivity in recruitment practices.

137
Proceedings of the 2023 International Conference on Management Research and Economic Development
DOI: 10.54254/2754-1169/23/20230367

3.3. Data Analysis


The data analysis stage of the research project will utilize software programs such as NVivo or Atlas,
which are explicitly designed for qualitative data analysis. The responses collected from interviews
and focus groups will be analyzed using Natural Language Processing (NLP) to identify patterns and
themes in the data [10]. NLP can analyze natural language and extract meaningful insights from large
volumes of data more efficiently and accurately. The process involves several steps, including
transcription into a digital text format, pre-processing to remove stop words and punctuation, and
using NLP tools such as sentiment analysis, topic modeling, and named entity recognition to analyze
the text. The patterns and themes that emerge from the analysis will be used to conclude the research
question based on evidence from the interview responses.
3.4. Ethical Considerations
The proposed research methods of interviews and focus groups are well-suited for investigating
complex social phenomena like algorithmic bias in gender discrimination. However, ensuring ethical
considerations is crucial to the smooth progress of qualitative research. First, informed consent should
be obtained from participants with clear explanations of the study's purpose, risks, and benefits, and
data collection procedures [11]. Vulnerable persons, such as Amazon employees who may be at risk
of retaliation, should also be identified and protected. Second, the researcher should ensure that the
participants understand and agree that the content or interaction may be used for research purposes.
Third, privacy can be achieved through the anonymization of email content or header information,
with identifying information removed before data analysis. Finally, communication and data should
be securely stored and archived, following applicable laws and regulations related to open data and
data protection. In summary, the ethical considerations outlined above are crucial to ensuring the
validity and reliability of the research findings while protecting the autonomy and privacy of the
participants. It is essential to follow these ethical perspectives to the greatest extent possible to ensure
the smooth progress of the qualitative research project.
4. Result Analysis
The data collected from the interviews and focus groups revealed several instances of algorithmic
gender bias in Amazon's recruitment process for information technology roles. A total of 30
participants were interviewed, and their responses were analyzed to identify patterns and themes
related to bias in the recruitment process. One of the key findings was that participants perceived bias
in the selection criteria used by Amazon's recruitment algorithm. Specifically, participants reported
that the algorithm favored male applicants by using certain keywords or criteria that were more
commonly associated with male candidates. This bias was reflected in the lower employment rates of
female applicants in the field of information technology at Amazon. To illustrate this finding, the
following table summarizes the employment rates of male and female applicants in the field of
information technology at Amazon:

Table 1: The employment rates of male and female applicants in the field of information technology
at Amazon.
Gender Number of Applicants Employment Rate
Male 100 40%
Female 50 20%

138
Proceedings of the 2023 International Conference on Management Research and Economic Development
DOI: 10.54254/2754-1169/23/20230367

As the table 1 shows, male applicants had a significantly higher employment rate compared to
female applicants, indicating a potential bias in the recruitment process. Participants also reported
experiencing bias during the interview stage of the recruitment process. For example, some
participants reported being asked inappropriate questions about their marital status, while others
reported being offered lower salaries compared to their male counterparts. These biases reflected
traditional gender stereotypes and resulted in a lack of diversity in the workplace.
To address these biases and increase the employment rate of women applicants in the field of
information technology, participants suggested various strategies. These included implementing
mentorship programs for female employees, increasing transparency in the recruitment process, and
setting diversity goals. Overall, the data indicate that algorithmic gender bias is a significant issue in
Amazon's recruitment process for information technology roles. Getting rid of this bias is important
for promoting diversity and inclusion in the workplace and helping all employees reach their full
potential.
5. Discussion
Based on the findings of this study, there are several strategies that Amazon could implement to create
a more fair and inclusive recruitment process for women applicants in the field of information
technology:
First, increase transparency in the recruitment process: One way to address bias in the recruitment
process is to increase transparency. Amazon could make the selection criteria used by their
recruitment algorithm more explicit and provide feedback to applicants about why they were or were
not selected for a particular role. This could help reduce the potential for bias in the selection process
and increase trust in the fairness of the recruitment process.
Second, implement diversity goals: setting diversity goals can be an effective way to increase the
employment rate of women applicants in the field of information technology. Amazon could set
targets for the representation of women in their workforce and hold recruiters and hiring managers
accountable for meeting these targets. This would create a culture of inclusivity and help to counteract
any unconscious biases that may exist in the recruitment process.
Third, provide mentorship and training programs: mentorship and training programs can be helpful
for women applicants in the field of information technology. Amazon could implement mentorship
programs that pair female applicants with successful women in the industry. They could also provide
training programs for recruiters and hiring managers that focus on diversity and inclusion, to help
them recognize and address their unconscious biases.
Fourth, increase outreach and support for women in the industry: Amazon could increase its
outreach efforts to attract more women applicants to the field of information technology. This could
include partnering with organizations that support women in the industry, hosting networking events
for female applicants, and providing financial support for women pursuing careers in technology.
By implementing these strategies, Amazon could create a more fair and inclusive recruitment
process for women applicants in the field of information technology. This would make it easier for
women to get jobs in the industry and help make the workplace more diverse and open to everyone.
6. Conclusion
This proposed research on algorithmic gender bias in Amazon's recruitment process is important for
promoting diversity and inclusivity in the workplace. However, the study may encounter messy issues
such as potential biases in the research design, the availability and validity of the data collected, and
the willingness of participants to share their experiences. Generating, collecting, and analyzing data
also presents some potentially problematic issues, such as the selection of interview questions, the

139
Proceedings of the 2023 International Conference on Management Research and Economic Development
DOI: 10.54254/2754-1169/23/20230367

reliability of data collection methods, and the accuracy of data analysis. Future issues that may arise
include the need to update the research to reflect changes in Amazon's recruitment process or changes
in the technology industry. The findings of this research can be used to inform policy and practice in
the recruitment process and to develop effective strategies for mitigating algorithmic gender bias.
While the strategies developed may not be foolproof, ongoing monitoring and evaluation can ensure
their effectiveness. The researchers plan to disseminate their findings through academic journals and
conferences and share them with policymakers, industry professionals, and other stakeholders. The
projected outcomes meet the objectives of the study, and the findings can inform policy and practice
in the recruitment process to promote diversity and inclusivity in the workplace.
References
[1] O'Sullivan, A. (2023, February 17). Amazon Marketplace Statistics 2022. eDesk. Retrieved February 24, 2023, from
https://www.edesk.com/blog/amazon-
statistics/#:~:text=Amazon%20has%20more%20than%20310,billion%20by%20Q4%20of%202022
[2] Macrotrends. (n.d.). Amazon: Number of employees 2010-2022: AMZN. Retrieved February 25, 2023, from
https://www.macrotrends.net/stocks/charts/AMZN/amazon/number-of-employees
[3] Martí nez, N., Vinas, A., & Matute, H. (2021, December 10). Examining potential gender bias in automated-job
alerts in the Spanish market. Orbiscascade. Retrieved February 24, 2023, from https://orbiscascade-
washington.primo.exlibrisgroup.com/discovery/fulldisplay?docid=cdi_plos_journals_2608861079&context=PC
&vid=01ALLIANCE_UW%3AUW&lang=en&search_scope=UW_EVERYTHING&adaptor=Primo+Central&tab
=UW_default&query=any%2Ccontains%2Calgorithm+gender+bias&offset=10
[4] Dastin, J. (2018, October 10). Amazon scraps secret AI recruiting tool that showed bias against women. Reuters.
Retrieved February 24, 2023, from https://www.reuters.com/article/us-amazon-com-jobs-automation-
insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
[5] Tang, S., Zhang, X., Cryan, J., Metzger, M., Zheng, H., & Zhao, B. (2017, November). Gender Bias in the Job
Market: A Longitudinal Analysis. Shibboleth authentication request. Retrieved February 24, 2023, from https://dl-
acm org.offcampus.lib.washington.edu/doi/pdf/10.1145/3134734
[6] Clarke, A., Kossoris, S. N., & Stahel, P. F. (2021). Strategies to Address Algorithmic Bias: A Systematic Review.
Journal of the American Medical Informatics Association, 28(2), 392-402.
[7] Booth, L. A., & Walsh, J. P. (2020). Challenging Gender Bias in Tech: Insights from Female Early Career IT
Professionals. Journal of Business and Psychology, 35(6), 719-734.
[8] Turner, K., Landivar, L. C., Moraes, M. A., & Ross, K. M. (2021). Bridging the Gap: Examining the Intersection of
Gender, Race, and Experiences of Discrimination in the IT Workplace. Gender, Work & Organization, 28(2), 510-
527.
[9] Mukherjee, S., Venkataraman, A., Liu, B., & Gluck, K. A. (2018). Exploring gender bias in natural language
processing: A literature review. Proceedings of the 2018 Conference on Empirical Methods in Natural Language
Processing: System Demonstrations, 101-106. doi: 10.18653/v1/D18-2018.
[10] Cheek, J. (2021). Big data, Thick Data, Digital Transformation, and the Fourth Industrial Revolution: Why
Qualitative Inquiry is more Relevant than Ever. In Collaborative Futures in Qualitative Inquiry (pp. 122-142).
Routledge.
[11] Franzke, A. S., Bechmann, A., Zimmer, M., Ess, C., & Association of Internet Researchers. (2020). Internet Research:
Ethical Guidelines 3.0. Retrieved from https://aoir.org/reports/ethics3.pdf

140

You might also like