sherlock

Gradient Based Activations for Accurate Bias-Free Learning

Vinod K Kurmi1, Rishabh Sharma2, Yash Vardhan Sharma2, Vinay P. Namboodiri3

1KU Leuven, Belgium, 2IIT Roorkee, India, 3University of Bath, UK

Work done at Indian Institute of Technology Kanpur, India

[Paper] [ArXiv] [Code] [Poster]

Abstract

Bias mitigation in machine learning models is imperative, yet challenging. Among the several approaches proposed, one line of work mitigates bias through adversarial learning: a discriminator is trained to identify bias attributes such as gender, age, or race, and is then used adversarially so that the learned features cannot distinguish these attributes. The main drawback of such a model is that it directly introduces a trade-off with accuracy, since the features the discriminator deems sensitive for bias discrimination may also be relevant for classification. In this work we address this trade-off. We show that a biased discriminator can actually be used to improve the bias-accuracy trade-off. Specifically, this is achieved by masking features using the discriminator's gradients: features favoured for bias discrimination are de-emphasized and unbiased features are enhanced during classification. We show that this simple approach works well to reduce bias as well as improve accuracy significantly. We evaluate the proposed model on standard benchmarks, improving the accuracy of adversarial methods while maintaining or even improving unbiasedness, and also outperforming several other recent methods.

Training Method

Code is Coming Soon! Stay Tuned!!
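
Until the official code is released, here is a minimal PyTorch sketch of the idea described in the abstract: a bias discriminator predicts the sensitive attribute from the features, and the magnitude of its gradient with respect to those features is turned into a mask that de-emphasizes bias-sensitive feature dimensions before classification. The module names, backbone, and the exact gradient-to-mask mapping below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of gradient-based feature masking for bias-free learning.
# Architecture, shapes, and the masking rule are assumptions for illustration;
# this is NOT the paper's released code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradientMaskedClassifier(nn.Module):
    def __init__(self, in_dim=784, feat_dim=512, num_classes=10, num_bias_classes=2):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())  # assumed backbone
        self.classifier = nn.Linear(feat_dim, num_classes)                    # target-task head
        self.bias_discriminator = nn.Linear(feat_dim, num_bias_classes)       # bias-attribute head

    def forward(self, x, bias_labels=None):
        feats = self.encoder(x)

        if bias_labels is not None:
            # Gradient of the discriminator's loss w.r.t. the features:
            # large magnitudes flag dimensions the discriminator relies on.
            feats_d = feats.detach().requires_grad_(True)
            bias_loss = F.cross_entropy(self.bias_discriminator(feats_d), bias_labels)
            (grads,) = torch.autograd.grad(bias_loss, feats_d)

            # Assumed masking rule: down-weight bias-sensitive dimensions,
            # keep dimensions the discriminator ignores close to untouched.
            mask = torch.exp(-grads.abs())
            feats = feats * mask.detach()

        class_logits = self.classifier(feats)
        bias_logits = self.bias_discriminator(feats.detach())  # discriminator trained separately
        return class_logits, bias_logits
```

In a training loop one would update the discriminator with the bias-attribute labels and the encoder/classifier with the task labels on the masked features; the specific losses, schedules, and hyperparameters in the paper may differ from this sketch.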

Vinod K Kurmi, Rishabh Sharma, Yash Vardhan Sharma, Vinay P. Namboodiri. "Gradient Based Activations for Accurate Bias-Free Learning." In AAAI, 2022.

BibTex

@InProceedings{Kurmi_2021_aai_gba,
  author    = {K Kurmi, Vinod and Sharma, Rishabh and Sharma, Yash Vardhan and Namboodiri, Vinay P.},
  title     = {Gradient Based Activations for Accurate Bias-Free Learning},
  booktitle = {AAAI},
  month     = {Feb},
  year      = {2022}
}