Machine learning systems are prone to bias, which often originates in the data used to train them. This can cause many kinds of problems as the model makes its judgements: for example, unfair outcomes for groups that are not well represented in the training data. Let’s understand why that is so.
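A minimal sketch of the effect, using entirely hypothetical synthetic data: two groups have different feature/label relationships, but the minority group contributes few training examples. A model fit to maximize overall accuracy (here, a simple one-parameter threshold classifier) is pulled toward the majority group, so its accuracy on the minority group suffers:

```python
import numpy as np

# Hypothetical synthetic data: two groups whose labels follow different
# decision boundaries. Group A is the majority, group B the minority.
rng = np.random.default_rng(0)

n_a, n_b = 950, 50                     # heavy imbalance in the training data
x_a = rng.normal(0.0, 1.0, n_a)
y_a = (x_a > 0.0).astype(int)          # group A: true boundary at 0.0
x_b = rng.normal(1.0, 1.0, n_b)
y_b = (x_b > 1.0).astype(int)          # group B: true boundary at 1.0

x = np.concatenate([x_a, x_b])
y = np.concatenate([y_a, y_b])
group = np.array(["A"] * n_a + ["B"] * n_b)

# "Train" a one-parameter model: choose the threshold that maximizes
# *overall* accuracy. The majority group dominates this objective.
candidates = np.linspace(-2.0, 2.0, 401)
accs = [np.mean((x > t).astype(int) == y) for t in candidates]
threshold = candidates[int(np.argmax(accs))]

pred = (x > threshold).astype(int)
acc_a = np.mean(pred[group == "A"] == y[group == "A"])
acc_b = np.mean(pred[group == "B"] == y[group == "B"])
print(f"learned threshold: {threshold:.2f}")
print(f"accuracy on majority group A: {acc_a:.2%}")
print(f"accuracy on minority group B: {acc_b:.2%}")
```

The learned threshold lands near the majority group's boundary, so the minority group's accuracy is markedly lower even though the model scores well overall. This is one concrete way underrepresentation in the data translates into unfair model behaviour.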