Theoretical results in the functional linear regression literature have so
far focused on minimax estimation, where the smoothness parameters are
assumed to be known and the estimators typically depend on them.
In this paper we consider adaptive estimation in functional linear
regression. The goal is to construct a single data-driven procedure that
achieves optimality results simultaneously over a collection of parameter
spaces. Such an adaptive procedure automatically adjusts to the smoothness
properties of the underlying slope and covariance functions. The main
technical tools for the construction of the adaptive procedure are
functional principal component analysis and block thresholding. The
estimator of the slope function is shown to adaptively attain the
optimal rate of convergence over a large collection of function spaces.
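For concreteness, the functional linear model underlying this discussion is commonly written as follows; this display is a standard formulation sketched here for orientation, not taken verbatim from the paper:
\[
  Y = \alpha + \int_0^1 X(t)\,\beta(t)\,dt + \varepsilon,
\]
where $\beta(\cdot)$ is the unknown slope function, $X(\cdot)$ is the predictor process, and $\varepsilon$ is a noise term independent of $X$. An FPCA-based estimator expands $\beta$ in the estimated eigenfunctions $\hat\phi_k$ of the covariance kernel $K(s,t)=\mathrm{Cov}\{X(s),X(t)\}$, and block thresholding then retains or discards entire blocks of the resulting empirical coefficients according to each block's estimated signal energy, which is what allows the procedure to adapt without knowledge of the smoothness parameters.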