Minimax lower bounds for transfer learning with linear and one-hidden layer neural networks
Mohammadreza Mousavi Kalan, Zalan Fabian, Salman Avestimehr, Mahdi Soltanolkotabi
Published in Advances in Neural Information Processing Systems 33, 2020
Transfer learning has emerged as a powerful technique for improving the performance of machine learning models on new domains where labeled training data may be scarce. In this approach, a model trained on a source task, where plenty of labeled training data is available, is used as a starting point for training a model on a related target task with only a few labeled training examples. Despite the recent empirical success of transfer learning approaches, the benefits and fundamental limits of transfer learning are poorly understood. In this paper we develop a statistical minimax framework to characterize the fundamental limits of transfer learning in the context of regression with linear and one-hidden layer neural network models. Specifically, we derive a lower bound on the target generalization error achievable by any algorithm as a function of the number of labeled source and target data points as well as appropriate notions of similarity between the source and target tasks. Our lower bound provides new insights into the benefits and limitations of transfer learning. We further corroborate our theoretical findings with various experiments.
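To make the setup described in the abstract concrete, here is a schematic of the minimax quantity being lower-bounded, written in illustrative notation that may differ from the paper's: $\theta_S$ and $\theta_T$ denote source and target model parameters, $n_S$ and $n_T$ the numbers of labeled source and target samples, and $\Delta$ a bound on a source-target similarity measure $d(\cdot,\cdot)$.

\[
\mathcal{R}(n_S, n_T, \Delta) \;=\; \inf_{\hat{\theta}} \;\; \sup_{(\theta_S,\,\theta_T):\; d(\theta_S, \theta_T) \le \Delta} \; \mathbb{E}\big[\, \mathcal{L}_T(\hat{\theta}) \,\big]
\]

Here the infimum ranges over all estimators $\hat{\theta}$ (i.e., all transfer learning algorithms) that may use both the $n_S$ source and $n_T$ target samples, and $\mathcal{L}_T$ denotes the target generalization error. A minimax lower bound is a statement of the form $\mathcal{R}(n_S, n_T, \Delta) \ge f(n_S, n_T, \Delta)$, which, consistent with the abstract, typically grows as the tasks become less similar (larger $\Delta$) and shrinks as more labeled data becomes available.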
Recommended citation: Mousavi Kalan, M., Fabian, Z., Avestimehr, S. and Soltanolkotabi, M., 2020. "Minimax lower bounds for transfer learning with linear and one-hidden layer neural networks". Advances in Neural Information Processing Systems, 33, pp. 1959-1969.
BibTeX
@article{mousavi2020minimax,
  title={Minimax Lower Bounds for Transfer Learning with Linear and One-hidden Layer Neural Networks},
  author={Mousavi Kalan, Mohammadreza and Fabian, Zalan and Avestimehr, Salman and Soltanolkotabi, Mahdi},
  journal={Advances in Neural Information Processing Systems},
  volume={33},
  pages={1959--1969},
  year={2020}
}