Most proposed algorithmic fairness techniques require access to demographic attribute data. To meet this requirement, there are a growing number of calls for increased collection of demographic data. Through conversations with Partners, we found that a lack of clarity around the acceptable uses of demographic data poses a key barrier to addressing algorithmic bias in practice.
The Demographic Data Workstream seeks to understand what types of data collection practices and governance frameworks are required to ensure that fairness assessments of algorithmic systems are conducted in the public interest.
Shedding Light on the Trade-offs of Using Demographic Data for Algorithmic Fairness
Knowing the Risks: A Necessary Step to Using Demographic Data for Algorithmic Fairness
Fairer Algorithmic Decision-Making and Its Consequences: Interrogating the Risks and Benefits of Demographic Data Collection, Use,…
In this white paper, we explore the trade-offs of using demographic data to create fairer algorithmic decision-making systems.
“What We Can’t Measure, We Can’t Understand”: Challenges to Demographic Data Procurement in the Pursuit of…
As calls for unbiased algorithmic systems increase, so too does the number of individuals working on algorithmic fairness. However, these practitioners often do not have access to the data they need to detect bias.
Demographic-Reliant Algorithmic Fairness: Characterizing the Risks of Demographic Data Collection in the Pursuit of Fairness
By characterizing a range of risks, this paper explores what conditions must be met before demographic data can be collected and used to enable algorithmic fairness methods.