This section relies on definitions from our Fairness in Machine Learning guide,
including “estimator”, “reduction”, “sensitive features”, “moment”, and
“parity”.
Unfairness mitigation algorithms take the form of scikit-learn-style estimators.
Any algorithm-specific parameters are passed to the constructor. The resulting
instance of the algorithm should support methods fit and
predict with APIs resembling those of scikit-learn as much as
possible. Any deviations are noted below.
Reduction constructors require a parameter corresponding to an estimator that
implements the fit method with the sample_weight argument.
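To make the required contract concrete, here is a hypothetical minimal estimator (illustrative only, not part of Fairlearn or scikit-learn) whose fit method accepts sample_weight; any scikit-learn estimator supporting sample_weight satisfies the same interface:

```python
class WeightedMajorityClassifier:
    """Toy estimator: predicts the label with the largest total sample weight.

    Illustrative only -- any scikit-learn estimator whose ``fit`` accepts
    ``sample_weight`` satisfies the same contract expected by reductions.
    """

    def fit(self, X, y, sample_weight=None):
        if sample_weight is None:
            sample_weight = [1.0] * len(y)
        totals = {}
        for label, weight in zip(y, sample_weight):
            totals[label] = totals.get(label, 0.0) + weight
        # Remember the label carrying the most total weight.
        self.majority_ = max(totals, key=totals.get)
        return self

    def predict(self, X):
        return [self.majority_ for _ in X]
```

Because the reduction reweights training examples internally, an estimator that ignores sample_weight would silently defeat the mitigation, which is why the argument is required.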
Parity constraints for reductions are expressed via instances of various
subclasses of the class fairlearn.reductions.Moment. Formally,
instances of the class Moment implement vector-valued random variables
whose sample averages over the data are required to be bounded (above and/or
below). A reduction is constructed from an estimator and a constraints object:

    constraints = Moment()
    reduction = Reduction(estimator, constraints)
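Moment and Reduction above stand in for concrete subclasses. As a sketch of the underlying idea (hypothetical code, not the fairlearn.reductions.Moment API), a parity moment can be viewed as a vector-valued function of the data whose sample averages must stay within a bound:

```python
class ToyParityMoment:
    """Sketch of a parity constraint: each group's mean prediction must stay
    within ``bound`` of the overall mean (a demographic-parity-like moment).

    Hypothetical code for illustration, not fairlearn.reductions.Moment.
    """

    def __init__(self, bound=0.05):
        self.bound = bound

    def gamma(self, predictions, sensitive_features):
        """Return per-group deviations of the mean prediction from overall."""
        overall = sum(predictions) / len(predictions)
        deviations = {}
        for group in set(sensitive_features):
            grp = [p for p, a in zip(predictions, sensitive_features) if a == group]
            deviations[group] = sum(grp) / len(grp) - overall
        return deviations

    def satisfied(self, predictions, sensitive_features):
        """Check that every component of the moment is within the bound."""
        return all(abs(v) <= self.bound
                   for v in self.gamma(predictions, sensitive_features).values())
```

Each entry of the returned dictionary plays the role of one component of the vector-valued random variable, and the bound is the constraint the reduction enforces during training.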
Reductions provide fit and predict methods with the following signatures:

    reduction.fit(X, y, **kwargs)
    reduction.predict(X)
All of the currently supported parity constraints (subclasses of
Moment) are based on sensitive features that need to be provided to
fit as a keyword argument sensitive_features. In the future,
it will also be possible to provide sensitive features as columns of X.
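To make this calling convention concrete, the following hypothetical reduction-style wrapper (illustrative, not Fairlearn code) accepts sensitive_features as a keyword argument to fit and delegates to the inner estimator with per-example weights:

```python
from collections import Counter


class ToyReweightingReduction:
    """Illustrative reduction: fits the wrapped estimator with per-example
    weights inversely proportional to group frequency. Not fairlearn code."""

    def __init__(self, estimator, constraints=None):
        self.estimator = estimator
        self.constraints = constraints  # unused in this sketch

    def fit(self, X, y, *, sensitive_features):
        counts = Counter(sensitive_features)
        # Up-weight examples from under-represented groups so every group
        # contributes equal total weight.
        weights = [len(sensitive_features) / (len(counts) * counts[a])
                   for a in sensitive_features]
        self.estimator.fit(X, y, sample_weight=weights)
        return self

    def predict(self, X):
        return self.estimator.predict(X)
```

Note that predict does not take sensitive_features: a reduction bakes the mitigation into the trained model, so prediction follows the plain scikit-learn signature.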
The constructors of postprocessing algorithms require either an already
trained predictor or an estimator (which is trained on the data when executing
fit). For postprocessing algorithms, the constraints argument
is provided as a string:

    postprocessor = PostProcessing(estimator=estimator, constraints=constraints)
Post-processing algorithms (such as the ones under
fairlearn.postprocessing) provide the same methods as the reductions
above, albeit with sensitive_features as a required argument for
predict. In the future, we will make sensitive_features
optional if the sensitive features are already provided through X.
For example:

    postprocessor.fit(X, y, sensitive_features=sensitive_features)
    postprocessor.predict(X, sensitive_features=sensitive_features)
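As a sketch of why sensitive_features is needed at prediction time (hypothetical code, not the fairlearn.postprocessing implementation), a minimal postprocessor might learn one decision threshold per group from the scores of an already-trained predictor:

```python
class ToyGroupThresholdPostprocessor:
    """Illustrative postprocessor: thresholds the wrapped predictor's scores
    at each group's mean score. Not the fairlearn implementation."""

    def __init__(self, predictor):
        self.predictor = predictor

    def fit(self, X, y, *, sensitive_features):
        scores = self.predictor.predict(X)
        self.thresholds_ = {}
        for group in set(sensitive_features):
            grp = [s for s, a in zip(scores, sensitive_features) if a == group]
            # One threshold per group: here, simply the group's mean score.
            self.thresholds_[group] = sum(grp) / len(grp)
        return self

    def predict(self, X, *, sensitive_features):
        # Prediction needs each example's group to pick its threshold,
        # which is why sensitive_features is required here.
        scores = self.predictor.predict(X)
        return [int(s >= self.thresholds_[a])
                for s, a in zip(scores, sensitive_features)]
```

Unlike a reduction, a postprocessor applies a group-dependent transformation at prediction time, so it cannot produce a prediction without knowing each example's sensitive feature value.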