ABOUT ML Promising Interventions

Throughout the ABOUT ML Reference Document, yellow “Promising Interventions” callout boxes highlight interventions for practitioners to try. Those interventions are collected below, arranged by the section of the Reference Document in which they appear.

Those attempting documentation practices within any phase of the machine learning lifecycle can consider procurement-specific insights and pay particular attention to:

  1. Ensuring accountability and safety of personally identifiable information (see the sketch following this list).
  2. Practicing software quality assurance.
  3. Engaging in open source community development.
  4. Prioritizing change management processes.
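
One way to make item 1 concrete: a minimal sketch of scrubbing personally identifiable information from records before they enter documentation artifacts or logs. The field names and patterns below are assumptions for illustration only; production PII handling needs dedicated, audited tooling.

```python
import re

# Fields treated as direct identifiers in this sketch (hypothetical names).
PII_FIELDS = {"name", "email", "phone", "address"}

# Illustrative patterns for identifiers embedded in free text.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def scrub_record(record: dict) -> dict:
    """Return a copy of record with identifier fields dropped and
    embedded emails/phone numbers in string values redacted."""
    clean = {}
    for key, value in record.items():
        if key in PII_FIELDS:
            continue  # drop direct identifiers entirely
        if isinstance(value, str):
            value = EMAIL_RE.sub("[REDACTED-EMAIL]", value)
            value = PHONE_RE.sub("[REDACTED-PHONE]", value)
        clean[key] = value
    return clean

print(scrub_record({"name": "Jane Doe",
                    "note": "Call 555-123-4567 or write jane@example.com"}))
# -> {'note': 'Call [REDACTED-PHONE] or write [REDACTED-EMAIL]'}
```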

Those attempting documentation practices within any phase of the machine learning lifecycle can consider how documentation processes and artifacts would function when interacting with cultural, social, normative, and policy forces by paying particular attention to:

  1. Impact
  2. Human effect
  3. Ethical considerations

Those attempting documentation practices within any phase of the machine learning lifecycle can target ethical and human-focused considerations by paying particular attention to:

  1. Real-world examples
  2. Individual context

Those attempting documentation practices within any phase of the machine learning lifecycle can consider how ethics approval might be customized for different disciplines or change over time by paying particular attention to:

  1. Establishing ethics scores and approvals
  2. Developing clear objectives during data collection, with benchmarks and constraints for review (e.g., in 5, 7, or 10 years); see the sketch following this list
  3. Ensuring ease of contact for participants to whom the data belongs, asking questions such as “If you are doing longitudinal processes with a fairly transient population, how do you ensure you can find that person later to re-establish consent?” and “Is that information still relevant for our use?”
  4. Committing to discontinue use of a dataset and/or put mitigation practices in place after issues are found with it
  5. Considering your team’s ability to fix, update, or remove from distribution any released model or data
  6. Replacing problematic benchmarks and encouraging use of better alternatives
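
Items 2 and 3 above can be operationalized in the documentation itself. Below is a hypothetical sketch of pairing a dataset with its collection objective, a review benchmark, and a channel for re-establishing consent; the schema, names, and contact address are assumptions for illustration, not a prescribed ABOUT ML format.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta
from typing import Optional

@dataclass
class DatasetEthicsRecord:
    """Hypothetical documentation entry pairing a dataset with its
    objective, a review deadline, and a re-consent contact channel."""
    dataset_name: str
    collection_objective: str
    approved_on: date
    review_interval_years: int    # e.g., review in 5, 7, or 10 years
    participant_contact: str      # how to reach participants for re-consent
    known_issues: list = field(default_factory=list)

    @property
    def review_due(self) -> date:
        # Approximate deadline; leap days are ignored in this sketch.
        return self.approved_on + timedelta(days=365 * self.review_interval_years)

    def needs_review(self, today: Optional[date] = None) -> bool:
        # Review is due at the deadline, or as soon as issues are logged.
        today = today or date.today()
        return today >= self.review_due or bool(self.known_issues)

record = DatasetEthicsRecord(
    dataset_name="commute-survey-2021",
    collection_objective="Measure commute patterns in one metro area",
    approved_on=date(2021, 3, 1),
    review_interval_years=5,
    participant_contact="consent-desk@example.org",  # hypothetical address
)
print(record.review_due, record.needs_review())
```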

Those attempting documentation practices within any phase of the machine learning lifecycle can consider lessons learned from flawed ML systems currently in use by paying particular attention to:

  1. Instituting more rigorous tests that compare system decisions with those a human would arrive at (a minimal sketch follows this list)
  2. Putting a mitigation strategy in place for foreseeable risks in homegrown systems
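
One way to instantiate item 1 is a regression-style test that measures agreement between system decisions and a human-adjudicated baseline. The decisions and the 0.8 threshold below are placeholders assumed for illustration; the right bar depends on the deployment context.

```python
def agreement_rate(system_decisions, human_decisions):
    """Fraction of cases where the system and the human reviewer agree."""
    if len(system_decisions) != len(human_decisions):
        raise ValueError("decision lists must align case-for-case")
    matches = sum(s == h for s, h in zip(system_decisions, human_decisions))
    return matches / len(system_decisions)

def test_system_tracks_human_baseline():
    # Placeholder decisions; in practice these come from a held-out set
    # of cases independently decided by human reviewers.
    system = ["approve", "deny", "approve", "approve", "deny"]
    human  = ["approve", "deny", "deny",    "approve", "deny"]
    # Assumed threshold for this sketch; calibrate per system and risk.
    assert agreement_rate(system, human) >= 0.8

test_system_tracks_human_baseline()
```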

Those attempting documentation practices within any phase of the machine learning lifecycle can consider potential misuse by paying particular attention to:

  1. Determining the auditing method
  2. Archiving the old model (versioning for safety and recovery); see the sketch below
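
For item 2, a minimal sketch of archiving a model before replacing it, so a prior version can be audited or restored if the new one misbehaves. The paths, filenames, and metadata fields are assumptions for illustration, not part of the ABOUT ML recommendations.

```python
import json
import shutil
from datetime import datetime, timezone
from pathlib import Path

ARCHIVE_DIR = Path("model_archive")  # hypothetical location

def archive_model(model_path: Path, metadata: dict) -> Path:
    """Copy the current model into a timestamped archive slot and
    record metadata needed to audit or restore it later."""
    ARCHIVE_DIR.mkdir(exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    slot = ARCHIVE_DIR / f"{model_path.stem}-{stamp}"
    slot.mkdir()
    shutil.copy2(model_path, slot / model_path.name)
    (slot / "metadata.json").write_text(json.dumps(metadata, indent=2))
    return slot

def restore_model(slot: Path, destination: Path) -> None:
    """Roll back by copying an archived model over the live one."""
    archived = next(p for p in slot.iterdir() if p.name != "metadata.json")
    shutil.copy2(archived, destination)

# Usage: archive before deploying a replacement, restore on regression.
# slot = archive_model(Path("model.pkl"),
#                      {"trained_on": "dataset-v3", "approved_by": "review board"})
# restore_model(slot, Path("model.pkl"))
```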