Uncertainty quantification with conformal prediction

Conformal prediction is a method for uncertainty quantification based on the idea of prediction sets. A conformal prediction set is a set of candidate outputs that is guaranteed to contain the true value with a user-specified probability, called the confidence (or coverage) level of the set. The sets are constructed so that this guarantee holds regardless of the data distribution and of the underlying model's structure or its own uncertainty estimates, provided the data are exchangeable. In this sense, conformal prediction is distribution-free and model-agnostic. It is also very simple to compute: it requires only a calibration step, in which one takes a quantile of a so-called conformal score evaluated on held-out calibration data.
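To make the calibration step concrete, here is a minimal sketch of split conformal prediction for regression. The model choice, the toy data, the 50/50 train/calibration split, and the miscoverage level alpha = 0.1 are all illustrative assumptions, not prescriptions; the same recipe works with any point predictor.

```python
# Minimal split conformal prediction sketch (illustrative choices throughout).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy data: y = 2x + noise (purely for illustration).
X = rng.uniform(-1, 1, size=(500, 1))
y = 2 * X.ravel() + rng.normal(scale=0.3, size=500)

# Split into a proper training set and a held-out calibration set.
X_train, X_cal, y_train, y_cal = train_test_split(
    X, y, test_size=0.5, random_state=0
)

model = LinearRegression().fit(X_train, y_train)

# Conformal scores on the calibration set: here, absolute residuals.
scores = np.abs(y_cal - model.predict(X_cal))

# Calibration step: the finite-sample-adjusted (1 - alpha) quantile of the scores.
alpha = 0.1  # target miscoverage, i.e. 90% coverage
n = len(scores)
q_level = np.ceil((n + 1) * (1 - alpha)) / n
q_hat = np.quantile(scores, q_level, method="higher")

# Prediction set for a new point: an interval around the point prediction that
# covers the true y with probability >= 1 - alpha under exchangeability.
x_new = np.array([[0.5]])
y_pred = model.predict(x_new)
print(f"90% conformal interval: [{y_pred[0] - q_hat:.3f}, {y_pred[0] + q_hat:.3f}]")
```

Note the (n + 1) correction in the quantile level: it accounts for the finite size of the calibration set and is what makes the coverage guarantee hold exactly in finite samples.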

In this talk we introduce conformal prediction and its applications to classification and regression. We highlight the key assumptions of the method, most notably exchangeability, and discuss its main challenges. We also discuss how conformal prediction differs from other uncertainty quantification methods.
