Throughout the ABOUT ML Reference Document, yellow “Promising Interventions” callout boxes highlight practices for practitioners to try. Below, the collected interventions are arranged by the section of the Reference Document in which they appear.
Those attempting documentation practices within any phase of the machine learning lifecycle can consider procurement-specific insights by paying particular attention to:
- Ensuring accountability and safety of personally identifiable information.
- Practicing software quality assurance.
- Engaging in open source community development.
- Prioritizing change management processes.
Those attempting documentation practices within any phase of the machine learning lifecycle can consider how documentation processes and artifacts would function when interacting with cultural, social, normative, and policy forces by paying particular attention to:
- Impact
- Human effects
- Ethical considerations
Those attempting documentation practices within any phase of the machine learning lifecycle can target ethical and human-focused considerations by paying particular attention to:
- Real-world examples
- Individual context
Those attempting documentation practices within any phase of the machine learning lifecycle can consider how ethics approval might be customizable for different disciplines or change over time by paying particular attention to:
- Establishing ethics scores and approvals
- Developing clear objectives during data collection, with benchmarks and constraints for review (e.g. in 5, 7, or 10 years)
- Ensuring ease of contact for the participants to whom the data belongs, asking questions such as “If you are doing longitudinal processes with a fairly transient population, how do you ensure you can find that person later to re-establish consent?” and “Is that information still relevant for our use?”
- Committing to, after finding issues with a dataset, discontinuing use and/or putting mitigation practices in place
- Considering your team’s ability to fix, update, or remove any model or data released from distribution
- Replacing problematic benchmarks and encouraging use of better alternatives
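As one illustrative sketch of the review benchmarks and constraints above, a team could keep a small machine-readable record alongside each dataset that tracks its approval date, review interval, and participant contact point. All field and class names here are hypothetical, not part of any standard:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DatasetEthicsRecord:
    """Hypothetical documentation record for a dataset's ethics review.

    Field names are illustrative assumptions, not an established schema.
    """
    name: str
    collection_objective: str
    approval_date: date
    review_interval_years: int   # e.g. a 5, 7, or 10 year review constraint
    participant_contact: str     # how to re-establish consent later
    known_issues: list = field(default_factory=list)

    def next_review(self) -> date:
        """Date by which continued use of the dataset should be re-reviewed."""
        d = self.approval_date
        return d.replace(year=d.year + self.review_interval_years)

    def review_overdue(self, today: date) -> bool:
        return today >= self.next_review()


record = DatasetEthicsRecord(
    name="transit-survey-v1",
    collection_objective="Model commuter patterns; no re-identification",
    approval_date=date(2018, 6, 1),
    review_interval_years=5,
    participant_contact="ethics-board@example.org",
)
print(record.next_review())                      # 2023-06-01
print(record.review_overdue(date(2024, 1, 1)))   # True
```

A record like this makes the “is that information still relevant for our use?” question answerable on a schedule, rather than only when an issue surfaces.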
Those attempting documentation practices within any phase of the machine learning lifecycle can consider lessons learned from flawed ML systems currently in use by paying particular attention to:
- Instituting more rigorous tests, including comparisons of system decisions with those a human would reach
- Putting a mitigation strategy in place for foreseeable risks of homegrown systems
Those attempting documentation practices within any phase of the machine learning lifecycle can consider potential misuse by paying particular attention to:
- Determining the auditing method
- Archiving the old model (versioning for safety and recovery)
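The archiving intervention above can be sketched as a simple deploy-and-rollback routine: before a new model replaces the current one, the outgoing version is copied into an archive so recovery is a file copy rather than a retraining job. The directory layout, file names, and manifest format here are assumptions for illustration:

```python
import json
import shutil
from pathlib import Path

def deploy_model(model_path: Path, deploy_dir: Path, version: str) -> None:
    """Archive any currently deployed model, then promote the new one.

    Layout is illustrative: deploy_dir holds 'current_model.bin', a
    'manifest.json' naming its version, and an 'archive/' of old versions.
    """
    archive_dir = deploy_dir / "archive"
    archive_dir.mkdir(parents=True, exist_ok=True)
    current = deploy_dir / "current_model.bin"
    manifest = deploy_dir / "manifest.json"

    if current.exists():
        # Version the outgoing model instead of overwriting it.
        old_version = json.loads(manifest.read_text())["version"]
        shutil.copy2(current, archive_dir / f"model_{old_version}.bin")

    shutil.copy2(model_path, current)
    manifest.write_text(json.dumps({"version": version}))

def rollback(deploy_dir: Path, version: str) -> None:
    """Restore an archived model version after an issue is found."""
    archived = deploy_dir / "archive" / f"model_{version}.bin"
    shutil.copy2(archived, deploy_dir / "current_model.bin")
    (deploy_dir / "manifest.json").write_text(json.dumps({"version": version}))
```

Keeping the archive and manifest next to the deployed artifact means the auditing method chosen above always has a concrete record of which version made which decisions, and a path back to any earlier version.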