
Beyond the Pipeline: Addressing Attrition as a Barrier to Diversity in AI

As a field, AI struggles to both recruit and retain team members from diverse backgrounds. Why is this such a widespread phenomenon, and more importantly, what can be done to close the gap?

Companies Working to Mitigate AI Bias Must First Look Within

The lack of Diversity, Equity, and Inclusion (DEI) within the tech sector continues to be an area of great concern. Discrimination, unfair treatment, and hostile work environments have all contributed to high attrition rates among women and minoritized individuals in tech.

The work environment on artificial intelligence (AI) teams is no different and often mirrors the power structures responsible for disparities in wider society. Low representation of women and Black people in AI may lead to significant racial bias being encoded within algorithms. Organizations that conduct AI research or use AI in their work are increasingly aware of the need to actively challenge bias and discrimination in the products they produce.

But they often neglect to begin this work of bias mitigation by first examining their own work environment.

In recent months, many organizations have launched DEI efforts internally as a means of fostering a culture of inclusion. There is a lack of research, however, into how – and how well – these initiatives work.

While much attention has been paid to improving the hiring pipeline in tech, in the hopes of making the industry more inclusive, organizations are often left to tackle the issue of high attrition of those diverse hires on their own, with varying degrees of success.

Why They Leave: A New PAI Study Examines AI’s Diversity Crisis

In partnership with DeepMind, PAI is launching a DEI research study on the experiences of women and minoritized individuals in AI. This study is part of our broader Fairness, Transparency, and Accountability (FTA) Research Initiative and will be led by Jeff Brown, Diversity and Inclusion Research Fellow, and Alice Xiang, Head of FTA Research.

We hope this research will deliver actionable insights that our Partner organizations, as well as others in the AI ecosystem, can use to foster a more inclusive work environment.

The need for this research is timely and comes as PAI has redoubled its own mission to strive for diversity, equity, and inclusion in our work.

PAI’s multistakeholder approach to this work will yield insights with both breadth and depth about the factors that lead to greater inclusion on AI teams. The study will consist of interviews and questionnaires with DEI leaders and managers (of all backgrounds), as well as with individuals working on AI teams who identify as women or with another minoritized identity.

This work aims to be inclusive, and will seek participants of diverse racial/ethnic backgrounds, abilities, sexual orientations, and gender identities who have used AI in their current or previous work, including but not limited to technical roles.

Stemming from this research, the FTA team at PAI plans to release resources aimed at addressing the attrition of diverse professionals in AI. Organizations can turn to these resources as they work to create a more inclusive environment for people working in AI.

Underrepresented in AI? Share Your Experiences With Our Research Team

Please visit the project overview if you are interested in learning more or have questions, and sign up here if you are interested in being interviewed. All data collected will be kept private and confidential; more detailed information is available in the study’s data privacy plan.

If you know anyone who may be eligible and willing to share their experiences, we encourage you to share this post with them.

We’re calling on organizations working in AI to embrace innovative solutions that make their workplaces more inclusive for diverse professionals: spaces free from everyday acts of discrimination and supportive of workers who have been historically oppressed, both by wider society and by the tech industry. This work is urgent, amid a social climate that is ready to challenge oppressive power structures and the traditional racial status quo.


PAI would like to thank DeepMind for supporting this work.