After the Offer: The Role of Attrition in AI’s ‘Diversity Problem’

Jeffrey Brown

Executive Summary

As a field, AI struggles to retain team members from diverse backgrounds. Given the far-reaching effects of algorithmic systems and the documented harms to marginalized communities, the fact that these communities are not represented on AI teams is particularly troubling. Why is this such a widespread phenomenon, and what can be done to close the gap? This research paper, “After the Offer: The Role of Attrition in AI’s ‘Diversity Problem,’” seeks to answer these questions and provides four recommendations for how organizations can make the AI field more inclusive.

Amid heightened attention to society-wide racial and social injustice, organizations in the AI space have been urged to investigate the harmful effects that AI has had on marginalized populations. It’s an issue that engineers, researchers, project managers, and various leaders in both tech companies and civil society organizations have devoted significant time and resources to in recent years. In examining the effects of AI, organizations must consider who exactly has been designing these technologies.

Diversity reports have revealed that the people working at the organizations that develop and deploy AI lack diversity across several dimensions. While organizations have blamed pipeline problems in the past, research increasingly shows that once workers belonging to minoritized identities are hired in these spaces, they face systemic difficulties that their peers from dominant groups do not.

Attrition in the tech industry is a problem that disproportionately affects minoritized workers. In AI, where technologies already have a disproportionately negative impact on these communities, this is especially troublesome.

Minoritized Workers

This report uses minoritized workers as an umbrella term to refer to people whose identities (in categories such as race, ethnicity, gender, or ability) have been historically marginalized by those in dominant social groups. The minoritized workers in this study include people who identified as minoritized within the identity categories of race and ethnicity, gender identity, sexual orientation, ability, and immigration status. Because this study was international in scope, it is important to note that these categories are relative to their social context.

We are left wondering: What leads these workers to leave their teams, their organizations, or even the AI field altogether? What about the AI field in particular influences their decisions to stay or leave? And what can organizations do to stem this attrition and make their environments more inclusive?

The current study uses interviews with people belonging to minoritized identities across the AI field, managers, and DEI (diversity, equity, and inclusion) leaders in tech to gather rich information about which aspects of organizational culture promote inclusion and which contribute to attrition. Themes that emerged during these interviews formed three key takeaways:

  1. Diversity makes for better team climates
  2. Systemic supports are difficult but necessary to undo the current harms to minoritized workers
  3. Individual efforts to change organizational culture fall disproportionately on minoritized folks who are usually not professionally rewarded for their efforts

In line with these takeaways, the study makes four recommendations for making the AI field more inclusive for workers:

  1. Organizations must systemically support employee resource groups (ERGs)
  2. Organizations must intentionally diversify leadership and managers
  3. DEI trainings must be specific in order to be effective and be more connected to the content of AI work
  4. Organizations must interrogate their values as practiced and fundamentally alter them to include the perspectives of people who are not White, cis, or male

These takeaways and recommendations are explored in more depth below.

Key Takeaways

1. Diversity makes for better team climates

Across interviews, participants consistently expressed that managers who belonged to minoritized identities, or who took the time to learn about working with diverse identities, were more supportive of their needs and career goals. Such efforts reportedly resulted in teams that were more diverse, inclusive, and interdisciplinary and that fostered a positive team culture and climate. In these environments, workers belonging to minoritized identities thrived. Diversity in backgrounds and perspectives was particularly important for AI teams that needed to solve interdisciplinary problems.

Conversely, a recurring theme was the negative impact of work environments that were sexist or in which participants experienced acts of prejudice such as microaggressions.

While collaborative or positive work environments were also a common theme, such environments did not in themselves negate predominant cultures that deprioritized “DEI-focused” work, highly interdisciplinary work, or work that did not serve the dominant group. Negative organizational cultures seemed to exacerbate experiences of prejudice or discrimination on AI teams.

2. Systemic supports are difficult but necessary to undo the current harms to minoritized workers

Participants belonging to minoritized identities said that they either left or intended to leave organizations that did not support their continued career growth or whose values did not align with their own. Consistent with this, participants described examples of their organizations not valuing the content of their work.

Participants also tied their desires to leave to instances of prejudice or discrimination, which may also be related to “toxic” work environments. Some participants reported being tokenized or being subjected to negative stereotypes about their identity groups, somewhat reflective of wider contexts in tech beyond AI.

Systemic supports include incentive structures that allow minoritized workers to succeed at every level, from teams actively validating their experiences to managers finding the best ways for them to deliver work products in accordance with both individual and institutional needs. Promotion guidelines that recognize the barriers these workers face in environments shaped largely by dominant-group norms are another important support.

3. Individual efforts to change organizational culture fall disproportionately on minoritized folks who are usually not professionally rewarded for their efforts

Individuals discussed ways in which they tried to make their workplaces or teams more inclusive or otherwise sought to incorporate diverse perspectives into their work around AI. Participants sometimes had to contend with bias against DEI efforts, reporting that other workers in their organizations would dismiss their efforts as lacking rigor or a focus on the product.

There were some institutional efforts to foster a more inclusive culture, most commonly DEI trainings. DEI trainings that were specific to particular groups (e.g., gender-diverse folks, Black people) were reported as being the most effective. However, even when they were specific, DEI trainings seemed to be disconnected from some aspects of the workplace climate or the content of what teams were working on.

Participants who mentioned Employee Resource Groups (ERGs) uniformly praised them, describing the large positive impact ERGs had on a personal level: these groups formed the bases of their social support networks in their organizations and strengthened their ability to integrate aspects of their identities, and other “DEI topics” they were passionate about, into their work.

Recommendations

1. Organizations must systemically support ERGs

Employees specifically named ERGs as one of their main sources of support, even in work environments that were otherwise toxic. Additionally, ERGs provided built-in mentorship for those who did not have ready access to mentors or whose supervisors had not done the work to understand the kinds of support needed for people of minoritized identities to thrive in predominantly White and male environments.

What makes this recommendation work?

Within these ERGs there existed other grassroots initiatives that supported workers, such as informal talking circles and employee networks that essentially provided peer mentoring, which participants found crucial to navigating White- and male-dominated spaces. The mentorship provided by ERGs was also essential when HR failed to provide systemic support for staff and instead prioritized protecting the organization.

What must be in place?

While participants uniformly praised ERGs, these groups required large amounts of staff members’ time, which detracted from their other work. Such groups also ran the risk of being taken over by leadership and having their original mission derailed. Institutions should seek a balance between supporting these groups and giving them the freedom to organize in pursuit of their own best interests.

What won’t this solve?

ERGs will not necessarily make an organization’s AI or tech more inclusive. Rather, systemically supporting ERGs will provide more support and community for minoritized workers, which in turn is meant to promote a more inclusive workplace in general.

2. Organizations must intentionally diversify leadership and managers

What makes this recommendation work?

Participants repeatedly pointed to managers and upper-level leaders who belonged to minoritized identities (especially racial ones) as important influences whose policy changes permeated various levels of their organizations. A diverse workforce may also bring with it multiple perspectives, including those of people from different disciplines who may be drawn to the AI field by the opportunity for interdisciplinary collaboration, research, and product development. Bringing in folks from various academic, professional, and technical backgrounds to solve problems is especially crucial for AI teams.

What must be in place?

There must be an understanding of the reasons behind the lack of diversity and of the “bigger picture” of how powerful groups more easily perpetuate the power structures already in place. Participants spoke of managers who did not belong to minoritized identities themselves but who took the time to learn in depth about differences in power and privilege in the tech ecosystem and who appreciated the diverse perspectives that workers brought. These managers, while not perfect, tended to take advocating for their reports very seriously, particularly female reports, who often went overlooked.

What won’t this solve?

Intentionally diversifying leadership and managers will not automatically create a pipeline for diversity at the leadership level, nor will it automatically override institutional culture or policies that ignore DEI best practices.

3. DEI trainings must be specific in order to be effective and be more connected to the content of AI work

What makes this recommendation work?

Almost all participants reported that their organizations mandated some form of DEI training for all staff. These ranged widely, from very general trainings to highly specific ones that addressed cultural competency regarding particular groups of people (e.g., participants reported trainings on anti-Black racism). Participants noted that the more specific trainings tended to be more impactful.

What must be in place?

Organizations must invest in employees who see the importance of inclusive values in AI research and product design. Participants pointed to the importance of managers who had an ability to foster inclusive team values, which was not something that HR could mandate.

What won’t this solve?

As several participants observed, DEI trainings will not uproot or counteract institutional stigmas against DEI. It would take sustained effort and deliberate alignment of values for an organization to emphasize DEI in its work.

4. Organizations must interrogate their values as practiced and fundamentally alter them to include the perspectives of people who are not White, cis, or male

What makes this recommendation work?

Participants frequently reported that a misalignment of values was a primary reason for leaving, or wanting to leave, their organizations. Participants in this sample discussed joining the AI field to create a positive impact while growing professionally, and they felt disappointed when their organizations did not prioritize these goals (despite the goals being among the organizations’ stated values).

What must be in place?

Participants found it frustrating when organizations stated that they valued diversity and then failed to live up to this value in hiring, promotion, and day-to-day operations, ignoring the voices of minoritized individuals. If diversity is truly a value, organizations may have to interrogate systems of norms and expectations that are fundamentally male and Eurocentric and that do not make space for people from diverse backgrounds. They must then take additional steps to consider how such systems influence their work in AI.

What won’t this solve?

Because a fundamental realignment of this kind is a comprehensive, long-term solution, it cannot satisfy the most immediate and urgent needs for reform, such as the need for more diversity in leadership and on teams. In the short term, organizations must work with DEI professionals to recognize how they are perpetuating potentially harmful norms of the dominant group and work to create more equitable policies.
