Transfer Learning: Optimality and adaptive algorithms
Laplace Lecture

Presented at the 10th World Congress in Probability and Statistics on July 21, 2021


Abstract: Human learners have the natural ability to use knowledge gained in one setting for learning in a different but related setting. This ability to transfer knowledge from one task to another is essential for effective learning. In this talk, we consider statistical transfer learning in various settings, with a focus on nonparametric classification based on observations from different distributions under the posterior drift model, a general framework that arises in many practical problems.
We first establish the minimax rate of convergence and construct a rate-optimal weighted K-NN classifier. The results precisely characterize the contribution of the observations from the source distribution to the classification task under the target distribution. We then propose a data-driven adaptive classifier and show that it simultaneously attains, up to a logarithmic factor, the optimal rate over a large collection of parameter spaces.
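As a rough illustration of the idea behind a weighted K-NN classifier that pools source and target data, the sketch below takes a fixed number of nearest neighbors from each sample and combines their votes with different weights. This is not the paper's exact procedure: the weights w_src, w_tgt and neighbor counts k_src, k_tgt are hypothetical tuning parameters, whereas the rate-optimal choices in the talk are derived from the posterior drift model.

```python
import numpy as np

def weighted_knn_classify(x, X_src, y_src, X_tgt, y_tgt,
                          k_src=5, k_tgt=3, w_src=0.5, w_tgt=1.0):
    """Classify point x (labels in {0, 1}) by a weighted vote over
    k_src nearest source neighbors and k_tgt nearest target neighbors.

    w_src < w_tgt discounts source observations, whose regression
    function may differ from the target's under posterior drift.
    """
    def knn_vote(X, y, k):
        # Fraction of positive labels among the k nearest neighbors of x.
        dists = np.linalg.norm(X - x, axis=1)
        idx = np.argsort(dists)[:k]
        return y[idx].mean()

    vote = w_src * knn_vote(X_src, y_src, k_src) \
         + w_tgt * knn_vote(X_tgt, y_tgt, k_tgt)
    # Classify as 1 when the weighted vote exceeds half the total weight.
    return 1 if vote >= 0.5 * (w_src + w_tgt) else 0
```

For instance, with a small labeled target sample and a larger source sample drawn from a related distribution, the source neighbors sharpen the decision near the boundary while the target neighbors anchor it to the target distribution.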


Papers:

