AI Fairness 360 - Demo

  • Data
  • Check
  • Mitigate
  • Compare

1. Choose a sample dataset

Bias can occur in the data used to train a model. We provide three sample datasets that you can use to explore bias checking and mitigation. Each dataset contains one or more protected attributes, listed below with their privileged and unprivileged groups.

COMPAS (recidivism): Predict a criminal defendant's likelihood of reoffending.
Protected Attributes:
- Sex, privileged: Female, unprivileged: Male
- Race, privileged: Caucasian, unprivileged: Not Caucasian

German credit scoring: Predict an individual's credit risk.
Protected Attributes:
- Sex, privileged: Male, unprivileged: Female
- Age, privileged: Old, unprivileged: Young

Adult census income: Predict whether an individual's income exceeds $50K/yr based on census data.
Protected Attributes:
- Race, privileged: White, unprivileged: Non-white
- Sex, privileged: Male, unprivileged: Female
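To make the privileged/unprivileged distinction concrete, here is a minimal sketch (plain Python, independent of the AIF360 library) of one common bias check: the statistical parity difference, i.e. the favorable-outcome rate of the unprivileged group minus that of the privileged group. The toy records and the `favorable` field are invented for illustration.

```python
# Hypothetical toy data: each record has a protected attribute ("sex")
# and whether the outcome was favorable (e.g. predicted low risk).
def statistical_parity_difference(records, protected, privileged_value):
    """P(favorable | unprivileged) - P(favorable | privileged).

    A value near 0 suggests parity; a negative value means the
    unprivileged group receives favorable outcomes less often.
    """
    def favorable_rate(group):
        return sum(1 for r in group if r["favorable"]) / len(group)

    privileged = [r for r in records if r[protected] == privileged_value]
    unprivileged = [r for r in records if r[protected] != privileged_value]
    return favorable_rate(unprivileged) - favorable_rate(privileged)

toy = [
    {"sex": "Male", "favorable": True},
    {"sex": "Male", "favorable": True},
    {"sex": "Male", "favorable": False},
    {"sex": "Female", "favorable": True},
    {"sex": "Female", "favorable": False},
    {"sex": "Female", "favorable": False},
]

# Privileged group "Male" gets favorable outcomes 2/3 of the time,
# unprivileged "Female" only 1/3, so the difference is -1/3.
spd = statistical_parity_difference(toy, "sex", "Male")
print(f"statistical parity difference: {spd:.3f}")
```

The later "Check" step of the demo reports metrics of this kind for whichever protected attribute you select here.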