Functional regression

Functional regression is a version of regression analysis in which responses or covariates include functional data. Functional regression models can be classified into four types depending on whether the responses or covariates are functional or scalar: (i) scalar responses with functional covariates, (ii) functional responses with scalar covariates, (iii) functional responses with functional covariates, and (iv) scalar or functional responses with both functional and scalar covariates. In addition, functional regression models can be linear, partially linear, or nonlinear. In particular, functional polynomial models, functional single and multiple index models, and functional additive models are three special cases of functional nonlinear models.

Functional linear models (FLMs)

Functional linear models (FLMs) are an extension of linear models (LMs). A linear model with scalar response [math]\displaystyle{ Y\in\mathbb{R} }[/math] and scalar covariates [math]\displaystyle{ X\in\mathbb{R}^p }[/math] can be written as

[math]\displaystyle{ Y = \beta_0 + \langle X,\beta\rangle + \varepsilon, }[/math]

(1)

where [math]\displaystyle{ \langle\cdot,\cdot\rangle }[/math] denotes the inner product in Euclidean space, [math]\displaystyle{ \beta_0\in\mathbb{R} }[/math] and [math]\displaystyle{ \beta\in\mathbb{R}^p }[/math] denote the regression coefficients, and [math]\displaystyle{ \varepsilon }[/math] is a random error with mean zero and finite variance. FLMs can be divided into two types based on the responses.

Functional linear models with scalar responses

Functional linear models with scalar responses can be obtained by replacing the scalar covariates [math]\displaystyle{ X }[/math] and the coefficient vector [math]\displaystyle{ \beta }[/math] in model (1) by a centered functional covariate [math]\displaystyle{ X^c(\cdot) = X(\cdot) - \mathbb{E}(X(\cdot)) }[/math] and a coefficient function [math]\displaystyle{ \beta = \beta(\cdot) }[/math] with domain [math]\displaystyle{ \mathcal{T} }[/math], respectively, and replacing the inner product in Euclidean space by that in the Hilbert space [math]\displaystyle{ L^2 }[/math],

[math]\displaystyle{ Y = \beta_0 + \langle X^c, \beta\rangle +\varepsilon = \beta_0 + \int_\mathcal{T} X^c(t)\beta(t)\,dt + \varepsilon, }[/math]

(2)

where [math]\displaystyle{ \langle \cdot, \cdot \rangle }[/math] here denotes the inner product in [math]\displaystyle{ L^2 }[/math]. One approach to estimating [math]\displaystyle{ \beta_0 }[/math] and [math]\displaystyle{ \beta(\cdot) }[/math] is to expand the centered covariate [math]\displaystyle{ X^c(\cdot) }[/math] and the coefficient function [math]\displaystyle{ \beta(\cdot) }[/math] in the same functional basis, for example, a B-spline basis or the eigenbasis used in the Karhunen–Loève expansion. Suppose [math]\displaystyle{ \{\phi_k\}_{k=1}^\infty }[/math] is an orthonormal basis of [math]\displaystyle{ L^2 }[/math]. Expanding [math]\displaystyle{ X^c }[/math] and [math]\displaystyle{ \beta }[/math] in this basis, [math]\displaystyle{ X^c(\cdot) = \sum_{k=1}^\infty x_k \phi_k(\cdot) }[/math], [math]\displaystyle{ \beta(\cdot) = \sum_{k=1}^\infty \beta_k \phi_k(\cdot) }[/math], model (2) becomes [math]\displaystyle{ Y = \beta_0 + \sum_{k=1}^\infty \beta_k x_k +\varepsilon. }[/math] For implementation, regularization is needed and can be done through truncation, [math]\displaystyle{ L^2 }[/math] penalization, or [math]\displaystyle{ L^1 }[/math] penalization.[1] In addition, a reproducing kernel Hilbert space (RKHS) approach can also be used to estimate [math]\displaystyle{ \beta_0 }[/math] and [math]\displaystyle{ \beta(\cdot) }[/math] in model (2).[2]
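
As a rough numerical sketch of the truncation approach (not part of the original article), the following Python code simulates data from model (2) under an assumed cosine basis and truncation level [math]\displaystyle{ K }[/math], recovers the basis scores by numerical integration, and estimates [math]\displaystyle{ \beta_0 }[/math] and [math]\displaystyle{ \beta(\cdot) }[/math] by ordinary least squares; the simulation setup and all names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, K = 200, 101, 3                  # sample size, grid size, truncation level (assumed)
t = np.linspace(0, 1, m)

# Orthonormal cosine basis on [0, 1]: phi_k(t) = sqrt(2) * cos(k*pi*t)
phi = np.sqrt(2) * np.cos(np.pi * np.outer(np.arange(1, K + 1), t))   # (K, m)

true_beta = np.array([1.0, -0.5, 0.25])            # coefficients beta_k of beta(t)
scores = rng.normal(size=(n, K))                   # basis scores x_k
X = scores @ phi                                   # X^c(t): centered covariate curves
Y = 2.0 + scores @ true_beta + 0.1 * rng.normal(size=n)

# Recover scores via trapezoidal integration <X^c, phi_k>, then run OLS on them
w = np.full(m, 1.0 / (m - 1))                      # trapezoid quadrature weights on [0, 1]
w[0] *= 0.5
w[-1] *= 0.5
xhat = (X * w) @ phi.T                             # (n, K) estimated scores
design = np.column_stack([np.ones(n), xhat])
coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
beta0_hat, beta_hat = coef[0], coef[1:]
beta_fun_hat = beta_hat @ phi                      # estimated beta(t) on the grid
```

Replacing the plain least-squares step with ridge or lasso penalties on the score coefficients would give the [math]\displaystyle{ L^2 }[/math]- and [math]\displaystyle{ L^1 }[/math]-penalized variants mentioned above.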

Adding multiple functional and scalar covariates, model (2) can be extended to

[math]\displaystyle{ Y = \sum_{k=1}^q Z_k\alpha_k + \sum_{j=1}^p \int_{\mathcal{T}_j} X_j^c(t) \beta_j(t) \,dt + \varepsilon, }[/math]

(3)

where [math]\displaystyle{ Z_1,\ldots,Z_q }[/math] are scalar covariates with [math]\displaystyle{ Z_1=1 }[/math], [math]\displaystyle{ \alpha_1,\ldots,\alpha_q }[/math] are regression coefficients for [math]\displaystyle{ Z_1,\ldots,Z_q }[/math], respectively, [math]\displaystyle{ X^c_j }[/math] is a centered functional covariate given by [math]\displaystyle{ X_j^c(\cdot) = X_j(\cdot) - \mathbb{E}(X_j(\cdot)) }[/math], [math]\displaystyle{ \beta_j }[/math] is the regression coefficient function for [math]\displaystyle{ X_j^c(\cdot) }[/math], and [math]\displaystyle{ \mathcal{T}_j }[/math] is the domain of [math]\displaystyle{ X_j }[/math] and [math]\displaystyle{ \beta_j }[/math], for [math]\displaystyle{ j=1,\ldots,p }[/math]. However, due to the parametric component [math]\displaystyle{ \alpha }[/math], the estimation methods for model (2) cannot be used in this case,[3] and alternative estimation methods for model (3) are available.[4][5]

Functional linear models with functional responses

For a functional response [math]\displaystyle{ Y(\cdot) }[/math] with domain [math]\displaystyle{ \mathcal{T} }[/math] and a functional covariate [math]\displaystyle{ X(\cdot) }[/math] with domain [math]\displaystyle{ \mathcal{S} }[/math], two FLMs regressing [math]\displaystyle{ Y(\cdot) }[/math] on [math]\displaystyle{ X(\cdot) }[/math] have been considered.[3][6] One of these two models is of the form

[math]\displaystyle{ Y(t) = \beta_0(t) + \int_{\mathcal{S}} \beta(s,t) X^c(s)\,ds + \varepsilon(t),\ \text{for}\ t\in\mathcal{T}, }[/math]

(4)

where [math]\displaystyle{ X^c(\cdot) = X(\cdot) - \mathbb{E}(X(\cdot)) }[/math] is still the centered functional covariate, [math]\displaystyle{ \beta_0(\cdot) }[/math] and [math]\displaystyle{ \beta(\cdot,\cdot) }[/math] are coefficient functions, and [math]\displaystyle{ \varepsilon(\cdot) }[/math] is usually assumed to be a random process with mean zero and finite variance. In this case, at any given time [math]\displaystyle{ t\in\mathcal{T} }[/math], the value of [math]\displaystyle{ Y }[/math], i.e., [math]\displaystyle{ Y(t) }[/math], depends on the entire trajectory of [math]\displaystyle{ X }[/math]. Model (4), for any given time [math]\displaystyle{ t }[/math], is an extension of multivariate linear regression with the inner product in Euclidean space replaced by that in [math]\displaystyle{ L^2 }[/math]. An estimating equation motivated by multivariate linear regression is [math]\displaystyle{ r_{XY} = R_{XX}\beta, \text{ for } \beta\in L^2(\mathcal{S}\times\mathcal{T}), }[/math] where [math]\displaystyle{ r_{XY}(s,t) = \text{cov}(X(s),Y(t)) }[/math], and [math]\displaystyle{ R_{XX}: L^2(\mathcal{S}\times\mathcal{T}) \rightarrow L^2(\mathcal{S}\times\mathcal{T}) }[/math] is defined as [math]\displaystyle{ (R_{XX}\beta)(s,t) = \int_\mathcal{S} r_{XX}(s,w)\beta(w,t)dw }[/math] with [math]\displaystyle{ r_{XX}(s,w) = \text{cov}(X(s),X(w)) }[/math] for [math]\displaystyle{ s,w\in\mathcal{S} }[/math].[3] Regularization is needed and can be done through truncation, [math]\displaystyle{ L^2 }[/math] penalization, or [math]\displaystyle{ L^1 }[/math] penalization.[1] Various estimation methods for model (4) are available.[7][8]
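
A minimal sketch of a truncation-based estimator for model (4), assuming for simplicity a common domain [math]\displaystyle{ \mathcal{S}=\mathcal{T}=[0,1] }[/math], a known orthonormal basis, and mean-zero responses; expanding both [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math] in the basis reduces the problem to a multivariate regression of response scores on covariate scores (the setup is an illustration, not the source's specific method):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, K = 300, 101, 2                              # sample size, grid size, truncation (assumed)
grid = np.linspace(0, 1, m)                        # common grid for s and t
phi = np.sqrt(2) * np.cos(np.pi * np.outer(np.arange(1, K + 1), grid))  # (K, m)

B = np.array([[1.0, 0.3],
              [-0.4, 0.8]])                        # B[k, l] pairs phi_k(s) with phi_l(t)
x_scores = rng.normal(size=(n, K))
X = x_scores @ phi                                 # centered covariate curves X^c(s)
y_scores = x_scores @ B + 0.05 * rng.normal(size=(n, K))
Y = y_scores @ phi                                 # response curves Y(t) (mean zero here)

# Project both sets of curves onto the basis, then regress scores on scores
w = np.full(m, 1.0 / (m - 1))                      # trapezoid quadrature weights
w[0] *= 0.5
w[-1] *= 0.5
xh = (X * w) @ phi.T                               # (n, K) covariate scores
yh = (Y * w) @ phi.T                               # (n, K) response scores
B_hat, *_ = np.linalg.lstsq(xh, yh, rcond=None)    # (K, K) score-regression coefficients
beta_surface = phi.T @ B_hat @ phi                 # estimate of beta(s, t) on the grid
```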
When [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math] are concurrently observed, i.e., [math]\displaystyle{ \mathcal{S}=\mathcal{T} }[/math],[9] it is reasonable to consider a historical functional linear model, where the current value of [math]\displaystyle{ Y }[/math] only depends on the history of [math]\displaystyle{ X }[/math], i.e., [math]\displaystyle{ \beta(s,t)=0 }[/math] for [math]\displaystyle{ s\gt t }[/math] in model (4).[3][10] A simpler version of the historical functional linear model is the functional concurrent model (see below).
Adding multiple functional covariates, model (4) can be extended to

[math]\displaystyle{ Y(t) = \beta_0(t) + \sum_{j=1}^p\int_{\mathcal{S}_j} \beta_j(s,t) X^c_j(s)\,ds + \varepsilon(t),\ \text{for}\ t\in\mathcal{T}, }[/math]

(5)

where for [math]\displaystyle{ j=1,\ldots,p }[/math], [math]\displaystyle{ X_j^c(\cdot)=X_j(\cdot) - \mathbb{E}(X_j(\cdot)) }[/math] is a centered functional covariate with domain [math]\displaystyle{ \mathcal{S}_j }[/math], and [math]\displaystyle{ \beta_j(\cdot,\cdot) }[/math] is the corresponding coefficient function with domain [math]\displaystyle{ \mathcal{S}_j\times\mathcal{T} }[/math].[3] In particular, taking [math]\displaystyle{ X_j(\cdot) }[/math] as a constant function yields a special case of model (5), [math]\displaystyle{ Y(t) = \sum_{j=1}^p X_j \beta_j(t) + \varepsilon(t),\ \text{for}\ t\in\mathcal{T}, }[/math] which is an FLM with functional responses and scalar covariates.
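
This scalar-covariate special case can be fit by ordinary least squares applied separately at each grid point. A short sketch (the simulated coefficient functions and grid are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, p = 150, 51, 2                               # sample size, grid size, number of covariates
t = np.linspace(0, 1, m)

X = rng.normal(size=(n, p))                        # scalar covariates
beta = np.vstack([np.sin(2 * np.pi * t),           # beta_1(t)
                  np.cos(2 * np.pi * t)])          # beta_2(t); shape (p, m)
Y = X @ beta + 0.1 * rng.normal(size=(n, m))       # functional responses on the grid

# Pointwise least squares: for each t_j, regress Y(., t_j) on X;
# lstsq with a 2-D right-hand side solves all grid points at once
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)   # (p, m) estimated coefficient functions
```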

Functional concurrent models

Assuming that [math]\displaystyle{ \mathcal{S} = \mathcal{T} }[/math], another model, known as the functional concurrent model, sometimes also referred to as the varying-coefficient model, is of the form

[math]\displaystyle{ Y(t) = \alpha_0(t) + \alpha(t)X(t)+\varepsilon(t),\ \text{for}\ t\in\mathcal{T}, }[/math]

(6)

where [math]\displaystyle{ \alpha_0 }[/math] and [math]\displaystyle{ \alpha }[/math] are coefficient functions. Note that model (6) assumes the value of [math]\displaystyle{ Y }[/math] at time [math]\displaystyle{ t }[/math], i.e., [math]\displaystyle{ Y(t) }[/math], only depends on that of [math]\displaystyle{ X }[/math] at the same time, i.e., [math]\displaystyle{ X(t) }[/math]. Various estimation methods can be applied to model (6).[11][12][13]
Adding multiple functional covariates, model (6) can also be extended to [math]\displaystyle{ Y(t) = \alpha_0(t) + \sum_{j=1}^p\alpha_j(t)X_j(t)+\varepsilon(t),\ \text{for}\ t\in\mathcal{T}, }[/math] where [math]\displaystyle{ X_1,\ldots,X_p }[/math] are multiple functional covariates with domain [math]\displaystyle{ \mathcal{T} }[/math] and [math]\displaystyle{ \alpha_0,\alpha_1,\ldots,\alpha_p }[/math] are the coefficient functions with the same domain.[3]
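
When curves are observed on a common grid, model (6) can be fit by pointwise least squares: at each grid point, regress the observed [math]\displaystyle{ Y(t_j) }[/math] values on the [math]\displaystyle{ X(t_j) }[/math] values. The sketch below uses an invented setup; in practice the raw pointwise estimates are typically smoothed over [math]\displaystyle{ t }[/math], which this illustration omits:

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 200, 51                                     # sample size, grid size (assumed)
t = np.linspace(0, 1, m)

alpha0 = np.sin(2 * np.pi * t)                     # intercept function alpha_0(t)
alpha1 = 1.0 + t                                   # slope function alpha(t)
X = rng.normal(size=(n, m))                        # covariate curves on the grid
Y = alpha0 + alpha1 * X + 0.1 * rng.normal(size=(n, m))

# At each grid point t_j, run OLS of Y(., t_j) on [1, X(., t_j)]
a0_hat = np.empty(m)
a1_hat = np.empty(m)
for j in range(m):
    design = np.column_stack([np.ones(n), X[:, j]])
    c, *_ = np.linalg.lstsq(design, Y[:, j], rcond=None)
    a0_hat[j], a1_hat[j] = c
```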

Functional nonlinear models

Functional polynomial models

Functional polynomial models are an extension of the FLMs with scalar responses, analogous to extending linear regression to polynomial regression. For a scalar response [math]\displaystyle{ Y }[/math] and a functional covariate [math]\displaystyle{ X(\cdot) }[/math] with domain [math]\displaystyle{ \mathcal{T} }[/math], the simplest example of functional polynomial models is functional quadratic regression:[14] [math]\displaystyle{ Y = \alpha + \int_\mathcal{T}\beta(t)X^c(t)\,dt + \int_\mathcal{T} \int_\mathcal{T} \gamma(s,t) X^c(s)X^c(t) \,ds\,dt + \varepsilon, }[/math] where [math]\displaystyle{ X^c(\cdot) = X(\cdot) - \mathbb{E}(X(\cdot)) }[/math] is the centered functional covariate, [math]\displaystyle{ \alpha }[/math] is a scalar coefficient, [math]\displaystyle{ \beta(\cdot) }[/math] and [math]\displaystyle{ \gamma(\cdot,\cdot) }[/math] are coefficient functions with domains [math]\displaystyle{ \mathcal{T} }[/math] and [math]\displaystyle{ \mathcal{T}\times\mathcal{T} }[/math], respectively, and [math]\displaystyle{ \varepsilon }[/math] is a random error with mean zero and finite variance. By analogy to FLMs with scalar responses, estimation of functional polynomial models can be carried out by expanding both the centered covariate [math]\displaystyle{ X^c }[/math] and the coefficient functions [math]\displaystyle{ \beta }[/math] and [math]\displaystyle{ \gamma }[/math] in an orthonormal basis.[14]
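
Under a basis truncation, the double integral reduces to a regression on pairwise products of basis scores. A sketch with an assumed truncation level [math]\displaystyle{ K=2 }[/math], working directly with the scores (the simulation setup is illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
n, K = 400, 2
x = rng.normal(size=(n, K))                        # basis scores of X^c
beta = np.array([1.0, -0.5])                       # linear coefficients beta_k
gamma = np.array([[0.5, 0.2],
                  [0.2, -0.3]])                    # symmetric quadratic coefficients gamma_{kl}
Y = (0.7 + x @ beta + np.einsum('ni,ij,nj->n', x, gamma, x)
     + 0.1 * rng.normal(size=n))

# Design: intercept, linear scores, and the distinct quadratic score products
quad = np.column_stack([x[:, 0]**2, x[:, 0] * x[:, 1], x[:, 1]**2])
design = np.column_stack([np.ones(n), x, quad])
coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
# coef estimates [alpha, beta_1, beta_2, gamma_11, 2*gamma_12, gamma_22]
```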

Functional single and multiple index models

A functional multiple index model is given by [math]\displaystyle{ Y = g\left(\int_{\mathcal{T}} X^c(t) \beta_1(t)\,dt, \ldots, \int_{\mathcal{T}} X^c(t) \beta_p(t)\,dt \right) + \varepsilon. }[/math] Taking [math]\displaystyle{ p=1 }[/math] yields a functional single index model. However, for [math]\displaystyle{ p\gt 1 }[/math], this model is problematic due to the curse of dimensionality. With [math]\displaystyle{ p\gt 1 }[/math] and relatively small sample sizes, the estimator given by this model often has large variance.[15] An alternative [math]\displaystyle{ p }[/math]-component functional multiple index model can be expressed as [math]\displaystyle{ Y = g_1\left(\int_{\mathcal{T}} X^c(t) \beta_1(t)\,dt\right)+ \cdots+ g_p\left(\int_{\mathcal{T}} X^c(t) \beta_p(t)\,dt \right) + \varepsilon. }[/math] Estimation methods for functional single and multiple index models are available.[15][16]

Functional additive models (FAMs)

Given an expansion of a functional covariate [math]\displaystyle{ X }[/math] with domain [math]\displaystyle{ \mathcal{T} }[/math] in an orthonormal basis [math]\displaystyle{ \{\phi_k\}_{k=1}^\infty }[/math]: [math]\displaystyle{ X(t) = \sum_{k=1}^\infty x_k \phi_k(t) }[/math], a functional linear model with scalar responses shown in model (2) can be written as [math]\displaystyle{ \mathbb{E}(Y|X)=\mathbb{E}(Y) + \sum_{k=1}^\infty \beta_k x_k. }[/math] One form of FAMs is obtained by replacing the linear function of [math]\displaystyle{ x_k }[/math], i.e., [math]\displaystyle{ \beta_k x_k }[/math], by a general smooth function [math]\displaystyle{ f_k }[/math], [math]\displaystyle{ \mathbb{E}(Y|X)=\mathbb{E}(Y) + \sum_{k=1}^\infty f_k(x_k), }[/math] where [math]\displaystyle{ f_k }[/math] satisfies [math]\displaystyle{ \mathbb{E}(f_k(x_k))=0 }[/math] for [math]\displaystyle{ k\in\mathbb{N} }[/math].[3][17] Another form of FAMs consists of a sequence of time-additive models: [math]\displaystyle{ \mathbb{E}(Y|X(t_1),\ldots,X(t_p))=\sum_{j=1}^p f_j(X(t_j)), }[/math] where [math]\displaystyle{ \{t_1,\ldots,t_p\} }[/math] is a dense grid on [math]\displaystyle{ \mathcal{T} }[/math] with increasing size [math]\displaystyle{ p\in\mathbb{N} }[/math], and [math]\displaystyle{ f_j(x) = g(t_j,x) }[/math] with [math]\displaystyle{ g }[/math] a smooth function, for [math]\displaystyle{ j=1,\ldots,p }[/math].[3][18]
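
For the first form of FAM, when the basis scores are independent (as for Gaussian processes), each component satisfies [math]\displaystyle{ f_k(x_k)=\mathbb{E}(Y-\mathbb{E}(Y)\,|\,x_k) }[/math], so [math]\displaystyle{ f_k }[/math] can be estimated by smoothing the centered responses against the [math]\displaystyle{ k }[/math]-th score alone. An illustrative sketch using a Nadaraya–Watson smoother (the setup, component functions, and bandwidth are assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)
n, K = 5000, 2
x = rng.normal(size=(n, K))                        # independent basis scores x_1, x_2
f1 = np.sin                                        # E f_1(x_1) = 0 under N(0, 1)
f2 = lambda u: u**2 - 1                            # E f_2(x_2) = 0 under N(0, 1)
Y = 1.0 + f1(x[:, 0]) + f2(x[:, 1]) + 0.1 * rng.normal(size=n)

def smooth(xk, yc, grid, h=0.2):
    """Nadaraya-Watson estimate of E(Y - EY | x_k = grid) with a Gaussian kernel."""
    w = np.exp(-0.5 * ((grid[:, None] - xk[None, :]) / h) ** 2)
    return (w @ yc) / w.sum(axis=1)

grid = np.linspace(-1, 1, 5)
yc = Y - Y.mean()                                  # centered responses
f1_hat = smooth(x[:, 0], yc, grid)                 # recovers f_1 on the grid
f2_hat = smooth(x[:, 1], yc, grid)                 # recovers f_2 on the grid
```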

Extensions

A direct extension of FLMs with scalar responses shown in model (2) is to add a link function to create a generalized functional linear model (GFLM), by analogy to extending linear regression to the generalized linear model (GLM). The three components of the GFLM are:

  1. Linear predictor [math]\displaystyle{ \eta = \beta_0 + \int_{\mathcal{T}} X^c(t)\beta(t)\,dt }[/math];
  2. Variance function [math]\displaystyle{ \text{Var}(Y|X) = V(\mu) }[/math], where [math]\displaystyle{ \mu = \mathbb{E}(Y|X) }[/math] is the conditional mean;
  3. Link function [math]\displaystyle{ g }[/math] connecting the conditional mean and the linear predictor through [math]\displaystyle{ \mu=g(\eta) }[/math].
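
The three components above can be illustrated with binary responses and the logit link: project the curves onto a truncated basis, then fit a logistic regression on the scores by Newton's method (iteratively reweighted least squares). The basis, truncation level, and simulation setup below are assumptions made for the sketch:

```python
import numpy as np

rng = np.random.default_rng(6)
n, m, K = 1000, 101, 2                             # sample size, grid size, truncation (assumed)
t = np.linspace(0, 1, m)
phi = np.sqrt(2) * np.cos(np.pi * np.outer(np.arange(1, K + 1), t))  # orthonormal basis

true_beta = np.array([1.5, -1.0])
scores = rng.normal(size=(n, K))
X = scores @ phi                                   # centered functional covariate
eta = 0.5 + scores @ true_beta                     # linear predictor
Y = rng.binomial(1, 1 / (1 + np.exp(-eta)))        # mu = g(eta) with logit link; V(mu) = mu(1 - mu)

# Project curves onto the basis via trapezoid quadrature, then run IRLS
w = np.full(m, 1.0 / (m - 1))
w[0] *= 0.5
w[-1] *= 0.5
xh = (X * w) @ phi.T                               # (n, K) estimated scores
Z = np.column_stack([np.ones(n), xh])
b = np.zeros(K + 1)
for _ in range(25):                                # Newton / IRLS iterations
    p = 1 / (1 + np.exp(-(Z @ b)))
    W = p * (1 - p)
    b += np.linalg.solve(Z.T @ (W[:, None] * Z), Z.T @ (Y - p))
beta_fun_hat = b[1:] @ phi                         # estimated beta(t) on the grid
```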

References

  1. 1.0 1.1 Morris, Jeffrey S. (2015). "Functional Regression". Annual Review of Statistics and Its Application. 2 (1): 321–359. doi:10.1146/annurev-statistics-010814-020413. Bibcode: 2015AnRSA...2..321M.
  2. Yuan and Cai (2010). "A reproducing kernel Hilbert space approach to functional linear regression". The Annals of Statistics. 38 (6):3412–3444. doi:10.1214/09-AOS772.
  3. 3.0 3.1 3.2 3.3 3.4 3.5 3.6 3.7 Wang, Jane-Ling; Chiou, Jeng-Min; Müller, Hans-Georg (2016). "Functional Data Analysis". Annual Review of Statistics and Its Application. 3 (1): 257–295. doi:10.1146/annurev-statistics-041715-033624. Bibcode: 2016AnRSA...3..257W. https://zenodo.org/record/895750.
  4. Kong, Xue, Yao and Zhang (2016). "Partially functional linear regression in high dimensions". Biometrika. 103 (1):147–159. doi:10.1093/biomet/asv062.
  5. Hu, Wang and Carroll (2004). "Profile-kernel versus backfitting in the partially linear models for longitudinal/clustered data". Biometrika. 91 (2): 251–262. doi:10.1093/biomet/91.2.251.
  6. Ramsay and Silverman (2005). Functional data analysis, 2nd ed., New York: Springer, ISBN:0-387-40080-X.
  7. Ramsay and Dalzell (1991). "Some tools for functional data analysis". Journal of the Royal Statistical Society. Series B (Methodological). 53 (3):539–572. https://www.jstor.org/stable/2345586.
  8. Yao, Müller and Wang (2005). "Functional linear regression analysis for longitudinal data". The Annals of Statistics. 33 (6):2873–2903. doi:10.1214/009053605000000660.
  9. Grenander (1950). "Stochastic processes and statistical inference". Arkiv för Matematik. 1 (3):195–277. doi:10.1007/BF02590638.
  10. Malfait and Ramsay (2003). "The historical functional linear model". Canadian Journal of Statistics. 31 (2):115–128. doi:10.2307/3316063.
  11. Fan and Zhang (1999). "Statistical estimation in varying coefficient models". The Annals of Statistics. 27 (5):1491–1518. doi:10.1214/aos/1017939139.
  12. Huang, Wu and Zhou (2004). "Polynomial spline estimation and inference for varying coefficient models with longitudinal data". Statistica Sinica. 14 (3):763–788. https://www.jstor.org/stable/24307415.
  13. Şentürk and Müller (2010). "Functional varying coefficient models for longitudinal data". Journal of the American Statistical Association. 105 (491):1256–1264. doi:10.1198/jasa.2010.tm09228.
  14. 14.0 14.1 Yao and Müller (2010). "Functional quadratic regression". Biometrika. 97 (1):49–64. doi:10.1093/biomet/asp069.
  15. 15.0 15.1 Chen, Hall and Müller (2011). "Single and multiple index functional regression models with nonparametric link". The Annals of Statistics. 39 (3):1720–1747. doi:10.1214/11-AOS882.
  16. Jiang and Wang (2011). "Functional single index models for longitudinal data". The Annals of Statistics. 39 (1):362–388. doi:10.1214/10-AOS845.
  17. Müller and Yao (2008). "Functional additive models". Journal of the American Statistical Association. 103 (484):1534–1544. doi:10.1198/016214508000000751.
  18. Fan, James and Radchenko (2015). "Functional additive regression". The Annals of Statistics. 43 (5):2296–2325. doi:10.1214/15-AOS1346.