Similar papers 3
February 23, 2022
We propose inferential tools for functional linear quantile regression where the conditional quantile of a scalar response is assumed to be a linear functional of a functional covariate. In contrast to conventional approaches, we employ kernel convolution to smooth the original loss function. The coefficient function is estimated under a reproducing kernel Hilbert space framework. A gradient descent algorithm is designed to minimize the smoothed loss function with a roughness...
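A minimal sketch of the smoothing idea, reduced to a scalar covariate for readability (the functional case would replace the inner product below with an integral against the coefficient function). The Gaussian kernel, bandwidth, and step size are illustrative choices, not necessarily the paper's:

```python
import math
import numpy as np

def Phi(x):
    # Standard normal CDF, vectorized via math.erf.
    return 0.5 * (1.0 + np.vectorize(math.erf)(x / math.sqrt(2.0)))

def fit_smoothed_qr(X, y, tau, h=0.5, lr=0.5, n_iter=500):
    # Gradient descent on the kernel-smoothed check loss.  Convolving
    # rho_tau(u) = u * (tau - 1{u < 0}) with a Gaussian kernel at
    # bandwidth h yields a loss whose derivative in u has the closed form
    # psi(u) = tau - Phi(-u / h), so plain gradient descent applies.
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        r = y - X @ beta                    # residuals
        psi = tau - Phi(-r / h)             # smoothed-loss gradient in u
        beta += lr * (X.T @ psi) / len(y)   # descent step on beta
    return beta

# Toy check: median regression (tau = 0.5) on a linear model.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
X = np.column_stack([np.ones(500), x])      # intercept + covariate
y = 1.0 + 2.0 * x + 0.1 * rng.normal(size=500)
beta_hat = fit_smoothed_qr(X, y, tau=0.5)
```

The smoothing makes the objective differentiable everywhere, which is what allows a first-order method in place of linear programming.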
July 18, 2019
Regressing a scalar response on a random function is now a common situation. In the nonparametric setting, this paper establishes projection-based local linear regression as a prominent method for this problem. Our asymptotic results demonstrate that functional local linear regression outperforms its functional local constant counterpart. Beyond the estimation of the regression operator itself, the local linear regressio...
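As a hedged illustration of the projection idea (not the paper's exact construction), each curve can be reduced to a scalar score by projecting onto a known direction, after which a one-dimensional local linear smoother runs on the scores; the direction, kernel, and bandwidth below are made up for the example:

```python
import numpy as np

def local_linear(scores, y, s0, h):
    # Local linear smoother at s0 with a Gaussian kernel: weighted least
    # squares fit of intercept + slope; the intercept is the estimate.
    w = np.exp(-0.5 * ((scores - s0) / h) ** 2)
    Z = np.column_stack([np.ones_like(scores), scores - s0])
    WZ = Z * w[:, None]
    coef = np.linalg.solve(Z.T @ WZ, WZ.T @ y)
    return coef[0]

# Toy functional sample: curves X_i(t) = a_i * sin(pi t) on a grid,
# projected onto sin(pi t), so the scores recover the a_i.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 101)
basis = np.sin(np.pi * t)
a = rng.uniform(-1.0, 1.0, size=400)
curves = a[:, None] * basis[None, :]
y = a ** 2 + 0.1 * rng.normal(size=400)

scores = (curves @ basis) / (basis @ basis)   # projection coefficients
est = local_linear(scores, y, s0=0.5, h=0.2)  # target m(0.5) = 0.25
```

The local linear fit removes the leading design bias that a local constant (Nadaraya-Watson) smoother incurs, which is the comparison the abstract refers to.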
June 2, 2020
We study the problem of estimating the derivatives of a regression function, which has a wide range of applications as a key nonparametric functional of unknown functions. Standard analysis may be tailored to specific derivative orders, and parameter tuning remains a daunting challenge particularly for high-order derivatives. In this article, we propose a simple plug-in kernel ridge regression (KRR) estimator in nonparametric regression with random design that is broadly appl...
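A plug-in KRR derivative estimator is easy to sketch for a univariate design: fit kernel ridge regression with a Gaussian kernel, then differentiate the fitted kernel expansion analytically. The bandwidth and ridge level below are illustrative, not tuned as in the article:

```python
import numpy as np

def krr_derivative(x_train, y_train, x_eval, s=0.1, lam=1e-6):
    # Kernel ridge regression f_hat(x) = sum_i alpha_i k(x, x_i) with the
    # Gaussian kernel k(x, x') = exp(-(x - x')^2 / (2 s^2)); the plug-in
    # derivative differentiates the kernel analytically:
    #   f_hat'(x) = sum_i alpha_i * (x_i - x) / s^2 * k(x, x_i).
    n = len(x_train)
    D = x_train[:, None] - x_train[None, :]
    K = np.exp(-(D ** 2) / (2 * s ** 2))
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y_train)
    d = x_train[None, :] - np.asarray(x_eval)[:, None]   # x_i - x
    Ke = np.exp(-(d ** 2) / (2 * s ** 2))
    return (Ke * d / s ** 2) @ alpha

# Toy check: f(x) = sin(2 pi x), so f'(0.5) = 2 pi cos(pi) = -2 pi.
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x)
dhat = krr_derivative(x, y, np.array([0.5]))[0]
```

The appeal of the plug-in approach is visible here: the same fitted coefficients alpha serve every derivative order, since only the kernel is differentiated.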
July 27, 2021
For the nonparametric regression models with covariates contaminated with normal measurement errors, this paper proposes an extrapolation algorithm to estimate the nonparametric regression functions. By applying the conditional expectation directly to the kernel-weighted least squares of the deviations between the local linear approximation and the observed responses, the proposed algorithm successfully bypasses the simulation step needed in the classical simulation extrapola...
August 9, 2011
Consider the nonparametric regression model Y = m(X) + E, where the function m is smooth but unknown, and E is independent of X. An estimator of the density of the error term E is proposed and its weak consistency is obtained. The contribution of this paper is twofold. First, we evaluate the impact of the estimation of the regression function on the error density estimator. Second, the optimal choices of the first- and second-step bandwidths used for estimating the regression fu...
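The two-step scheme the abstract alludes to can be sketched directly: estimate m with one bandwidth, then run a kernel density estimate on the residuals with a second bandwidth. Everything below (Nadaraya-Watson for the first step, Silverman's rule for the second) is an illustrative choice, not necessarily the paper's:

```python
import numpy as np

def nw_fit(x, y, x_eval, h1):
    # Step 1: Nadaraya-Watson estimate of m with Gaussian weights.
    w = np.exp(-0.5 * ((x_eval[:, None] - x[None, :]) / h1) ** 2)
    return (w @ y) / w.sum(axis=1)

def error_density(x, y, u, h1=0.05, h2=None):
    # Step 2: Gaussian KDE on the residuals Y - m_hat(X).  h1 and h2 are
    # the first- and second-step bandwidths discussed in the abstract.
    resid = y - nw_fit(x, y, x, h1)
    if h2 is None:
        h2 = 1.06 * resid.std() * len(resid) ** (-0.2)  # Silverman's rule
    z = (np.asarray(u)[:, None] - resid[None, :]) / h2
    return np.exp(-0.5 * z ** 2).sum(axis=1) / (len(resid) * h2 * np.sqrt(2 * np.pi))

# Toy check: m(x) = sin(2 pi x), E ~ N(0, 0.5^2); the true error density
# at zero is 1 / (0.5 * sqrt(2 pi)), about 0.798.
rng = np.random.default_rng(2)
x = rng.uniform(size=1000)
y = np.sin(2 * np.pi * x) + 0.5 * rng.normal(size=1000)
dens0 = error_density(x, y, np.array([0.0]))[0]
```

The interplay of h1 and h2 is exactly the paper's subject: a poor first-step bandwidth contaminates the residuals, which the second-step KDE then inherits.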
March 30, 2016
We establish minimax convergence rates for classification of functional data and for nonparametric regression with functional design variables. The optimal rates are of logarithmic type under smoothness constraints on the functional density and the regression mapping, respectively. These asymptotic properties are attainable by conventional kernel procedures. The bandwidth selector does not require knowledge of the smoothness level of the target mapping. In this work, the func...
March 1, 2022
In this paper, we consider a functional linear regression model, where both the covariate and the response variable are functional random variables. We address the problem of optimal nonparametric estimation of the conditional expectation operator in this model. A collection of projection estimators over finite-dimensional subspaces is first introduced. We provide a non-asymptotic bias-variance decomposition for the mean squared prediction error in the case where these subspace...
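A hedged sketch of a projection estimator in this setting: expand both curves in the first m functions of a fixed orthonormal basis and estimate the projected operator by least squares on the scores. The cosine basis, subspace dimension, and simulation design below are assumptions for the example, not the paper's choices:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 201)
dt = t[1] - t[0]
qw = np.full_like(t, dt)            # trapezoid quadrature weights
qw[0] *= 0.5
qw[-1] *= 0.5

m = 5                               # dimension of the projection subspace
B = np.array([np.ones_like(t)] +
             [np.sqrt(2.0) * np.cos(j * np.pi * t) for j in range(1, m)])

# Simulate function-on-function data with beta(s, t) = phi_1(s) phi_1(t),
# so the projected operator has a single nonzero entry at (1, 1).
n = 300
xi = rng.normal(size=(n, m)) / np.sqrt(np.arange(1, m + 1))
Xcurves = xi @ B                                   # covariate curves
Ycurves = np.outer(xi[:, 1], B[1]) + 0.1 * rng.normal(size=(n, m)) @ B

# Projection estimator: basis scores by numerical L2 inner products,
# then ordinary least squares of response scores on covariate scores
# over the m-dimensional subspace.
Xc = (Xcurves * qw) @ B.T
Yc = (Ycurves * qw) @ B.T
Bhat, *_ = np.linalg.lstsq(Xc, Yc, rcond=None)
```

Varying m traces out exactly the bias-variance trade-off the abstract decomposes: a small subspace truncates the operator, a large one inflates the variance of the score regression.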
December 1, 2020
We study a functional linear regression model that deals with functional responses and allows for both functional covariates and high-dimensional vector covariates. The proposed model is flexible and nests several functional regression models in the literature as special cases. Based on the theory of reproducing kernel Hilbert spaces (RKHS), we propose a penalized least squares estimator that can accommodate functional variables observed on discrete sample points. Besides a c...
November 17, 2012
In this article, we consider convergence rates in functional linear regression with functional responses, where the linear coefficient lies in a reproducing kernel Hilbert space (RKHS). Without assuming that the reproducing kernel and the covariate covariance kernel are aligned, or assuming a polynomial rate of decay of the eigenvalues of the covariance kernel, convergence rates in prediction risk are established. The corresponding lower bound on the rates is derived by reducing to...
October 29, 2021
In this paper, we establish minimax optimal rates of convergence for prediction in a semi-functional linear model that consists of a functional component and a less smooth nonparametric component. Our results reveal that the smoother functional component can be learned with the minimax rate as if the nonparametric component were known. More specifically, a double-penalized least squares method is adopted to estimate both the functional and nonparametric components within the ...