Consistent plug-in classifiers for complex objectives and constraints
Date Issued
01-01-2020
Author(s)
Tavker, Shiv Kumar
Indian Institute of Technology, Madras
Narasimhan, Harikrishna
Abstract
We present a consistent algorithm for constrained classification problems where the objective (e.g. F-measure, G-mean) and the constraints (e.g. demographic parity fairness, coverage) are defined by general functions of the confusion matrix. Our approach reduces the problem to a sequence of plug-in classifier learning tasks. The reduction is achieved by posing the learning problem as an optimization over the intersection of two sets: the set of confusion matrices that are achievable and those that are feasible. This decoupling of the constraint space then allows us to solve the problem by applying Frank-Wolfe-style optimization over the individual sets. For objectives and constraints that are convex functions of the confusion matrix, our algorithm requires O(1/ε²) calls to the plug-in subroutine, which improves on the O(1/ε³) calls needed by the reduction-based algorithm of Narasimhan (2018) [29]. We show empirically that our algorithm is competitive with prior methods, while being more robust to choices of hyper-parameters.
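To illustrate the core idea described in the abstract (not the authors' full constrained algorithm), the following is a minimal sketch of a Frank-Wolfe-style loop over the achievable confusion-matrix set for an unconstrained convex objective ψ(C), where each linear-minimization step is solved by a cost-sensitive plug-in classifier built from class-probability estimates. The function names (plug_in_classifier, frank_wolfe_plugin) and the psi_grad interface are hypothetical and chosen for exposition only.

import numpy as np

def plug_in_classifier(eta, cost_matrix):
    """Cost-sensitive plug-in rule: given class-probability estimates
    eta (shape [n, k]) and a k x k cost matrix L, predict the class
    that minimizes the expected cost sum_i eta_i(x) * L[i, y_hat]."""
    return np.argmin(eta @ cost_matrix, axis=1)

def confusion_matrix(y_true, y_pred, k):
    """Empirical confusion matrix with C[i, j] = P(y = i, y_hat = j)."""
    C = np.zeros((k, k))
    for t, p in zip(y_true, y_pred):
        C[t, p] += 1.0
    return C / len(y_true)

def frank_wolfe_plugin(eta, y_true, psi_grad, k, num_iters=100):
    """Frank-Wolfe-style loop over achievable confusion matrices.

    psi_grad(C) returns the gradient of the convex objective psi at C.
    Each iteration linearizes psi at the current iterate and passes the
    gradient as a cost matrix to the plug-in subroutine; the resulting
    classifier's confusion matrix plays the role of the linear
    minimization oracle. The output is a randomized mixture of the
    per-iteration plug-in classifiers (represented by their cost matrices).
    """
    # Initialize with the plug-in classifier for the 0-1 cost matrix.
    L0 = np.ones((k, k)) - np.eye(k)
    C = confusion_matrix(y_true, plug_in_classifier(eta, L0), k)
    cost_matrices, weights = [L0], [1.0]

    for t in range(1, num_iters + 1):
        G = psi_grad(C)                       # linearize psi at the current C
        preds_t = plug_in_classifier(eta, G)  # oracle call = plug-in learning task
        C_t = confusion_matrix(y_true, preds_t, k)
        gamma = 2.0 / (t + 2)                 # standard Frank-Wolfe step size
        C = (1 - gamma) * C + gamma * C_t
        weights = [w * (1 - gamma) for w in weights] + [gamma]
        cost_matrices.append(G)

    return cost_matrices, weights, C

At prediction time, one of the stored cost matrices would be drawn with its mixture weight and applied via plug_in_classifier; handling the constraint set (the "feasible" confusion matrices) requires the additional machinery described in the paper and is omitted here.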
Volume
2020-December