Reference

Data Shapley Valuation for Efficient Batch Active Learning, Amirata Ghorbani, James Zou, Andre Esteva. 2022 56th Asilomar Conference on Signals, Systems, and Computers (2022)

Abstract

Annotating the right subset of all available data points is a key challenge in many machine learning applications. Batch active learning is a popular approach to address this, in which batches of unlabeled data points are selected for annotation while the underlying learning algorithm is subsequently updated. In this work, we introduce Active Data Shapley (ADS), a filtering layer for batch active learning that significantly increases the efficiency of existing active learning algorithms by pre-selecting, via a linear-time computation, the highest-value points from an unlabeled dataset. Using the notion of the Shapley value of data, our method estimates the value of unlabeled data points with regard to the prediction task at hand. We show that ADS is particularly effective when the pool of unlabeled data exhibits real-world caveats: noise, heterogeneity, and domain shift. Our experiments demonstrate that when ADS is used to pre-select the highest-ranking portion of an unlabeled dataset, the efficiency of state-of-the-art batch active learning methods increases by an average factor of 6x while preserving their performance.
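
The abstract only outlines the pre-selection step, so the following is a minimal sketch of how such a value-based filtering layer could look, not the authors' implementation. It assumes a KNN-Shapley-style closed-form estimator (Jia et al., 2019) as a stand-in for the paper's linear-time value computation, uses pseudo-labels from the current model for the unlabeled pool, and introduces a hypothetical keep_fraction parameter controlling how much of the pool survives the filter; the surviving indices would then be handed to any existing batch active learning acquisition strategy (e.g. BADGE or core-set).

import numpy as np

def knn_shapley(X_candidates, y_candidates, X_ref, y_ref, K=5):
    """KNN-Shapley-style values of candidate points with respect to a labeled
    reference set, computed in closed form (one sort per reference point)
    rather than by enumerating coalitions. Used here only as a stand-in
    value estimator; the paper's own estimator may differ."""
    N = len(X_candidates)
    values = np.zeros(N)
    for x, y in zip(X_ref, y_ref):
        dists = np.linalg.norm(X_candidates - x, axis=1)
        order = np.argsort(dists)                    # closest candidate first
        match = (y_candidates[order] == y).astype(float)
        s = np.zeros(N)
        s[N - 1] = match[N - 1] / N                  # farthest candidate
        for i in range(N - 2, -1, -1):               # recursion from far to near
            s[i] = s[i + 1] + (match[i] - match[i + 1]) / K * min(K, i + 1) / (i + 1)
        values[order] += s / len(X_ref)              # average over reference points
    return values

def ads_prefilter(values, keep_fraction=0.2):
    """Keep only the highest-value fraction of the unlabeled pool
    (keep_fraction is a hypothetical knob, not from the paper)."""
    n_keep = max(1, int(keep_fraction * len(values)))
    return np.argsort(values)[::-1][:n_keep]

# Toy usage: pseudo-label the pool with the current model, score, then pre-filter.
rng = np.random.default_rng(0)
X_lab, y_lab = rng.normal(size=(40, 8)), rng.integers(0, 2, size=40)
X_pool = rng.normal(size=(500, 8))
pseudo = rng.integers(0, 2, size=500)                # placeholder for model predictions
vals = knn_shapley(X_pool, pseudo, X_lab, y_lab, K=5)
candidate_idx = ads_prefilter(vals, keep_fraction=0.2)
# candidate_idx would then be passed to an off-the-shelf batch acquisition function.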