Influence Functions and Data Pruning: From Theory to Non-Convergence

Today’s session puts influence functions under the spotlight: the theory, non-convergence issues, and their use for data pruning. Fabio will uncover the fragile nature of influence functions in deep learning, help us understand what neural networks memorize, and explore the possibility of beating the power-law scaling of model performance with dataset size.
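For readers new to the topic, the classical influence function (a standard definition, not stated in the announcement itself) estimates how the loss on a test point would change if a training point were upweighted, without retraining:

$$
\mathcal{I}(z, z_{\text{test}}) = -\nabla_\theta L(z_{\text{test}}, \hat{\theta})^\top H_{\hat{\theta}}^{-1} \nabla_\theta L(z, \hat{\theta}),
\qquad
H_{\hat{\theta}} = \frac{1}{n} \sum_{i=1}^{n} \nabla^2_\theta L(z_i, \hat{\theta}),
$$

where $\hat{\theta}$ is the empirical risk minimizer. In deep learning the Hessian $H_{\hat{\theta}}$ is typically singular or indefinite and training does not converge to a minimizer, which is exactly the fragility the session examines.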
