
Functional data analysis


In order to bypass the "curse" of dimensionality and the metric selection problem, we are motivated to consider nonlinear functional regression models that are subject to some structural constraints but do not overly infringe on flexibility. One desires models that retain polynomial rates of convergence while being more flexible than, say, functional linear models. Such models are particularly useful when diagnostics for the functional linear model indicate a lack of fit, which is often encountered in real-life situations. In particular, functional polynomial models, functional single and multiple index models, and functional additive models are three special cases of functional nonlinear regression models.
often time, but may also be spatial location, wavelength, probability, etc. Intrinsically, functional data are infinite dimensional. The high intrinsic dimensionality of these data brings challenges for theory as well as computation, where these challenges vary with how the functional data were sampled. However, the high or infinite dimensional structure of the data is a rich source of information and there are many interesting challenges for research and data analysis.
Here, the amplitude variation is the growth rate and the time variation explains the difference in children's biological age at which the pubertal and pre-pubertal growth spurts occur. In the presence of time variation, the cross-sectional mean function may not be an efficient estimate, as peaks and troughs are located randomly and meaningful signals may therefore be distorted or hidden.
covariates as predictors. For regression-based functional classification models, functional generalized linear models or, more specifically, functional binary regression models such as functional logistic regression for binary responses are commonly used classification approaches. More generally, the generalized functional linear regression model based on the
respectively. In addition to the parameter function β that the above functional quadratic regression model shares with the functional linear model, it also features a parameter surface γ. By analogy to FLMs with scalar responses, estimation of functional polynomial models can be obtained through expanding both the centered covariate and the coefficient functions in an orthonormal basis.
Landmark registration (or feature alignment) assumes that well-expressed features are present in all sample curves and uses the locations of such features as a gold standard. Special features such as peak or trough locations in functions or derivatives are aligned to their average locations on the template
was done in the 1970s by Kleffe, Dauxois and Pousse, including results about the asymptotic distribution of the eigenvalues. More recently, in the 1990s and 2000s, the field has focused more on applications and on understanding the effects of dense and sparse observation schemes. The term "Functional Data Analysis" was coined by James O. Ramsay.
are two main approaches. These classical clustering concepts for vector-valued multivariate data have been extended to functional data. For clustering of functional data, k-means clustering methods are more popular than hierarchical clustering methods. For k-means clustering on functional data, mean
that analyses data providing information about curves, surfaces or anything else varying over a continuum. In its most general form, under an FDA framework, each sample element of functional data is considered to be a random function. The physical continuum over which these functions are defined is
Functional classification assigns a group membership to a new data object either based on functional regression or functional discriminant analysis. Functional data classification methods based on functional regression models use class levels as responses and the observed functional data and other
is used. Functional Linear Discriminant Analysis (FLDA) has also been considered as a classification method for functional data. Functional data classification involving density ratios has also been proposed. A study of the asymptotic behavior of the proposed classifiers in the large sample limit
The template function is determined through an iterative process: starting from the cross-sectional mean, registration is performed and the cross-sectional mean of the warped curves is recalculated, with convergence expected after a few iterations. DTW minimizes a cost function through dynamic programming.
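As a rough illustration of the dynamic-programming step, the sketch below fills in the usual DTW cost matrix for two discretized curves. The function name, the absolute-difference local cost and the toy curves are illustrative assumptions, not a reference implementation.

```python
import numpy as np

def dtw_cost(x, y):
    """Minimal dynamic-programming sketch of dynamic time warping (DTW).

    x, y: 1-D arrays holding two discretized curves (possibly of different
    lengths). Returns the accumulated alignment cost; the warping path itself
    could be recovered by backtracking through D.
    """
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            local = abs(x[i - 1] - y[j - 1])          # pointwise cost
            D[i, j] = local + min(D[i - 1, j],        # expansion
                                  D[i, j - 1],        # compression
                                  D[i - 1, j - 1])    # match
    return D[n, m]

# toy example: a curve and a time-shifted copy of it
t = np.linspace(0, 1, 50)
print(dtw_cost(np.sin(2 * np.pi * t), np.sin(2 * np.pi * (t - 0.1))))
```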
function. Then the warping function is introduced through a smooth transformation from the average location to the subject-specific locations. A problem of landmark registration is that the features may be missing or hard to identify due to the noise in the data.
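A toy sketch of single-landmark registration is given below. Using the curve maximum as the only landmark and a piecewise-linear warp are simplifying assumptions made for illustration.

```python
import numpy as np

def landmark_register(curves, t):
    """Toy sketch of landmark registration with a single landmark (the peak).

    curves: (n, len(t)) array of sampled curves; t: common time grid on [0, 1].
    Each curve is warped so that its peak is moved to the average peak time.
    """
    peaks = t[np.argmax(curves, axis=1)]        # subject-specific peak times
    target = peaks.mean()                       # template landmark location
    registered = np.empty_like(curves)
    for i, curve in enumerate(curves):
        # warping function h maps template time to subject time:
        # h(0) = 0, h(target) = peaks[i], h(1) = 1, linear in between
        h = np.interp(t, [0.0, target, 1.0], [0.0, peaks[i], 1.0])
        registered[i] = np.interp(h, t, curve)  # evaluate curve at warped times
    return registered
```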
is also widely used in clustering vector-valued multivariate data and has been extended to functional data clustering. Furthermore, Bayesian hierarchical clustering also plays an important role in the development of model-based functional clustering.
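In practice, k-means clustering of functional data is often carried out on a few leading FPC scores rather than on the raw curves. The sketch below assumes scikit-learn's KMeans and treats the score matrix as already computed (it could come from any FPCA routine).

```python
import numpy as np
from sklearn.cluster import KMeans

# scores: (n_subjects, K) matrix of leading FPC scores, assumed precomputed
scores = np.random.default_rng(1).normal(size=(100, 3))
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(labels))   # cluster sizes
```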
Time warping, also known as curve registration, curve alignment or time synchronization, aims to identify and separate amplitude variation and time variation. If both time and amplitude variation are present, then the observed functional data
of the domain to itself, that is, loosely speaking, a class of invertible functions that map the compact domain to itself such that both the function and its inverse are smooth. The set of linear transformations is contained in the set of diffeomorphisms.
In addition to amplitude variation, time variation may also be assumed to be present in functional data. Time variation occurs when the subject-specific timing of certain events of interest varies among subjects. One classical example is the Berkeley Growth Study data.
There are Python packages for working with functional data and their representation, for exploratory analysis and preprocessing, and for other tasks such as inference, classification, regression or clustering of functional data.
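As one example, the scikit-fda package (import name skfda) covers several of these tasks. The sketch below is a minimal illustration; the module layout differs between versions, and the FPCA import path in particular is an assumption that matches recent releases.

```python
# Minimal sketch using the scikit-fda package (import name: skfda).
import numpy as np
import skfda
from skfda.preprocessing.dim_reduction import FPCA   # path is version-dependent

t = np.linspace(0, 1, 101)                        # common evaluation grid
shifts = np.random.default_rng(0).uniform(-0.1, 0.1, 20)
curves = np.sin(2 * np.pi * (t[None, :] - shifts[:, None]))

fd = skfda.FDataGrid(data_matrix=curves, grid_points=t)
print(fd.mean())                                  # cross-sectional mean function

fpca = FPCA(n_components=2)
scores = fpca.fit_transform(fd)                   # FPC scores, shape (20, 2)
print(scores.shape)
```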
Another traditional method for time warping is landmark registration, which aligns special features such as peak locations to an average location. Other relevant warping methods include pairwise warping, registration using the $\mathcal{L}^2$ distance, and elastic warping.

The former is mathematically convenient, whereas the latter is somewhat more suitable from an applied perspective. These two approaches coincide if the random functions are continuous and a condition called mean-squared continuity is satisfied.
Direct nonlinear extensions of the classical functional linear regression models (FLMs) still involve a linear predictor, but combine it with a nonlinear link function, analogous to the idea of generalized linear models.
Functional polynomial regression models may be viewed as a natural extension of functional linear models (FLMs) with scalar responses, analogous to extending the linear regression model to polynomial regression models.
functions are usually regarded as the cluster centers. Covariance structures have also been taken into consideration. Besides k-means type clustering, functional clustering based on
Structures in the cross-sectional mean are destroyed if time variation is ignored. In contrast, structures in the cross-sectional mean are well captured after restoring the time variation.
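The effect can be reproduced with a small simulation. The peak shape, the shift range and the alignment by the known shifts below are all illustrative assumptions; in practice the warps would have to be estimated.

```python
import numpy as np

# Curves with a single peak whose location varies across subjects. Averaging
# the raw curves flattens the peak; averaging after aligning the peaks does not.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 201)
shifts = rng.uniform(-0.15, 0.15, size=100)
curves = np.exp(-((t[None, :] - 0.5 - shifts[:, None]) ** 2) / 0.002)

raw_mean_peak = curves.mean(axis=0).max()
aligned = np.array([np.interp(t, t - s, c) for s, c in zip(shifts, curves)])
aligned_mean_peak = aligned.mean(axis=0).max()
print(raw_mean_peak, aligned_mean_peak)   # the aligned mean has the taller peak
```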
Functional data analysis has roots going back to work by Grenander and Karhunen in the 1940s and 1950s. They considered the decomposition of a square-integrable continuous-time stochastic process into eigencomponents, now known as the Karhunen–Loève decomposition.
or future value. Hence, it is a "concurrent regression model", which is also referred to as a "varying-coefficient" model. Furthermore, various estimation methods have been proposed.
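A minimal sketch of one such estimation approach, pointwise least squares on a common grid, is given below. The single functional covariate and the omission of any smoothing of the raw coefficient estimates are simplifying assumptions.

```python
import numpy as np

def concurrent_ols(Y, X):
    """Pointwise least-squares sketch for a concurrent (varying-coefficient)
    model Y_i(s) = beta0(s) + beta1(s) X_i(s) + eps_i(s) on a common grid.

    Y, X: (n_subjects, n_gridpoints) arrays. Returns (beta0, beta1) evaluated
    on the grid; smoothing of these raw estimates is left out of this sketch.
    """
    n, m = Y.shape
    beta0 = np.empty(m)
    beta1 = np.empty(m)
    for s in range(m):
        design = np.column_stack([np.ones(n), X[:, s]])
        coef, *_ = np.linalg.lstsq(design, Y[:, s], rcond=None)
        beta0[s], beta1[s] = coef
    return beta0, beta1
```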
Pigoli, D; Hadjipantelis, PZ; Coleman, JS; Aston, JAD (2017). "The statistical analysis of acoustic phonetic data: exploring differences between spoken Romance languages".
Coffey, N; Hinde, J; Holian, E. (2014). "Clustering longitudinal profiles using P-splines and mixed effects models applied to time-course gene expression data".
For a scalar response $Y$ and a functional covariate $X(\cdot)$, the simplest and most prominent member of the family of functional polynomial regression models is the quadratic functional regression, given as follows,
$\mathbb{E}(Y\mid X)=\alpha+\int_0^1\beta(t)X^c(t)\,dt+\int_0^1\int_0^1\gamma(s,t)X^c(s)X^c(t)\,ds\,dt,$
where $X^c(\cdot)=X(\cdot)-\mathbb{E}(X(\cdot))$ is the centered functional covariate.
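Following the expansion idea mentioned above, a truncated version of this model can be fit by ordinary least squares in the leading FPC scores and their pairwise products. The sketch below assumes the scores have already been computed and truncated at K components.

```python
import numpy as np

def quadratic_fpc_fit(Y, scores):
    """Sketch of functional quadratic regression via FPC-score expansion.

    Expanding X^c and the coefficient functions in the eigenbasis reduces the
    model to least squares in the leading FPC scores A_ik and their products
    (an assumed truncation). Y: (n,) responses; scores: (n, K) FPC scores.
    """
    n, K = scores.shape
    # linear terms, then quadratic terms A_k * A_l for k <= l
    quad = np.column_stack([scores[:, k] * scores[:, l]
                            for k in range(K) for l in range(k, K)])
    design = np.column_stack([np.ones(n), scores, quad])
    coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
    return coef
```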
shows that under certain conditions the misclassification rate converges to zero, a phenomenon that has been referred to as "perfect classification".
Shi, M; Weiss, RE; Taylor, JMG. (1996). "An analysis of paediatric CD4 counts for acquired immune deficiency syndrome using flexible random curves".
which warps the time of an underlying template function by a subject-specific shift and scale. A more general class of warping functions includes diffeomorphisms of the domain to itself.
Problems of non-smooth differentiable warps or greedy computation in DTW can be resolved by adding a regularization term to the cost function.
ensures identifiability in the sense that the estimates of these additive component functions do not interfere with that of the intercept term $\mathbb{E}(Y)$.
Huang, JZ; Wu, CO; Zhou, L. (2002). "Varying-coefficient models and basis function approximations for the analysis of repeated measurements".
from the conventional linear model. Developments towards fully nonparametric regression models for functional data encounter problems such as the curse of dimensionality.
Carroll, C; Müller, HG; Kneip, A (2021). "Cross-component registration for multivariate functional data, with application to growth curves".
3910: 2068:{\displaystyle \sup _{s,t\in }\left|\Sigma (s,t)-\sum _{j=1}^{K}\lambda _{j}\varphi _{j}(s)\varphi _{j}(t)\right|\to 0,\qquad K\to \infty .} 7788:{\displaystyle \mathbb {E} (Y|X)=\alpha +\int _{0}^{1}\beta (t)X^{c}(t)\,dt+\int _{0}^{1}\int _{0}^{1}\gamma (s,t)X^{c}(s)X^{c}(t)\,ds\,dt} 12579:
Anirudh, R; Turaga, P; Su, J; Srivastava, A (2015). "Elastic functional coding of human actions: From vector-fields to latent variables".
Angelini, C; Canditiis, DD; Pensky, M. (2012). "Clustering time-course microarray data using functional Bayesian infinite mixture model".
Happ, C; Greven, S (2018). "Multivariate Functional Principal Component Analysis for Data Observed on Different (Dimensional) Domains".
Huang, JZ; Wu, CO; Zhou, L. (2004). "Polynomial spline estimation and inference for varying coefficient models with longitudinal data".
9222: 6280:{\displaystyle Y(s)=\alpha _{0}(s)+\sum _{j=1}^{p}\int _{0}^{1}\alpha _{j}(s,t)X_{j}^{c}(t)\,dt+\varepsilon (s),\ {\text{for}}\ s\in } 2808: 11337:
Eggermont, PPB; Eubank, RL; LaRiccia, VN. (2010). "Convergence rates for smoothing spline estimators in varying coefficient models".
8509: 8290:{\displaystyle \mathbb {E} (Y|X)=g\left(\int _{0}^{1}X^{c}(t)\beta _{1}(t)\,dt,\ldots ,\int _{0}^{1}X^{c}(t)\beta _{p}(t)\,dt\right)} 7798: 12815:
Chen, K; Delicado, P; Müller, HG (2017). "Modelling function-valued stochastic processes, with applications to fertility dynamics".
The Hilbertian point of view is mathematically convenient, but abstract; the above considerations do not necessarily even view
Rice, JA; Silverman, BW. (1991). "Estimating the mean and covariance structure nonparametrically when the data are curves".
are the functional principal components (FPCs), sometimes referred to as scores. The Karhunen–Loève expansion facilitates
Marron, JS; Ramsay, JO; Sangalli, LM; Srivastava, A (2015). "Functional data analysis of amplitude and phase variation".
of the inherently infinite-dimensional functional data to finite-dimensional random vector of scores. More specifically,
Chiou, JM; Yang, YF; Chen, YT (2014). "Multivariate functional principal component analysis: a normalization approach".
Gasser, T; Müller, HG; Kohler, W; Molinari, L; Prader, A. (1984). "Nonparametric regression analysis of growth curves".
Chen, D; Hall, P; Müller HG. (2011). "Single and multiple index functional regression models with nonparametric link".
6380: 5714: 10536: 9988:
is a latent time warping function that corresponds to a cumulative distribution function. The time warping functions
3557: 10196: 8113:
A functional multiple index model is given below, with symbols having their usual meanings as described above,
$\mathbb{E}(Y\mid X)=g\left(\int_0^1 X^c(t)\beta_1(t)\,dt,\ldots,\int_0^1 X^c(t)\beta_p(t)\,dt\right).$
7378: 2761: 12770:
Dai, X; Müller, HG (2018). "Principal component analysis for functional data on Riemannian manifolds and spheres".
9943: 9895: 8416: 7286:
is usually assumed to be a random process with mean zero and finite variance. This model assumes that the value of
5542:{\displaystyle Y=\langle Z,\theta \rangle +\sum _{j=1}^{p}\int _{0}^{1}X_{j}^{c}(t)\beta _{j}(t)\,dt+\varepsilon ,} 4737: 466: 8969: 7464: 12866: 11070:
He, G; Müller, HG; Wang, JL. (2003). "Functional canonical analysis for square integrable stochastic processes".
5568: 1654: 682: 10067:
The simplest case of a family of warping functions to specify phase variation is a linear transformation, that is, $h(t)=\delta+\gamma t$.
4812: 4777: 2297: 775: 10011: 9497: 4279: 3150: 7097:{\displaystyle Y(s)=\beta _{0}(s)+\sum _{j=1}^{p}\beta _{j}(s)X_{j}(s)+\varepsilon (s),\ {\text{for}}\ s\in ,} 3760:
are real-valued nonnegative eigenvalues in descending order with the corresponding orthonormal eigenfunctions
3209: 12871: 10396: 6918:{\displaystyle Y(s)=\alpha _{0}(s)+\sum _{j=1}^{p}X_{j}\alpha _{j}(s)+\varepsilon (s),\ {\text{for}}\ s\in ,} 5320: 2165: 1313: 10478: 9551: 7924: 7318: 6655: 3797: 3594: 2164:
and the Hilbert space machinery can subsequently be applied. Continuity of sample paths can be shown using the Kolmogorov continuity theorem.
37: 12065:
Hall, P; Poskitt, DS; Presnell, B. (2001). "A Functional Data—Analytic Approach to Signal Discrimination".
10588: 10070: 4929: 4550: 4519:
that associates vector responses with vector covariates. The traditional linear model with scalar response
2486: 1467: 543: 11221:
Wu, CO; Yu, KF. (2002). "Nonparametric varying-coefficient models for the analysis of longitudinal data".
9789: 4186: 2639: 10894:
Kong, D; Xue, K; Yao, F; Zhang, HH. (2016). "Partially functional linear regression in high dimensions".
8830:, analogous to the extension of multiple linear regression models to additive models and is expressed as, 7123: 5269: 12225:
Dai, X; Müller, HG; Yao, F. (2017). "Optimal Bayes classifiers for functional data and density ratios".
10731:
Kleffe, J. (1973). "Principal components of random variables with values in a seperable hilbert space".
10139: 2101: 435: 9029: 4522: 138: 12462:
Sakoe, H; Chiba, S. (1978). "Dynamic programming algorithm optimization for spoken word recognition".
11135:
He, G; Müller, HG; Wang, JL; Yang, WJ. (2010). "Functional linear regression via canonical analysis".
6497: 4995: 1300:{\displaystyle \mu (t)=\mathbb {E} X(t),\qquad \Sigma (s,t)={\textrm {Cov}}(X(s),X(t)),\qquad s,t\in } 1075: 11903:
Petrone, S; Guindani, M; Gelfand, AE. (2009). "Hybrid Dirichlet mixture models for functional data".
11469:
Müller HG; Wu Y; Yao, F. (2013). "Continuously additive models for nonlinear functional regression".
9398: 9084: 8769: 7260: 6539: 4872:
random error (noise). Functional linear models can be divided into two types based on the responses.
3763: 2920: 1818: 10446: 9276: 8297:
Here g represents an (unknown) general smooth function defined on a p-dimensional domain. The case
7895: 6306: 3570: 2731: 1704: 1630: 1138: 930: 902: 10849:
Hilgert, N; Mas, A; Verzelen, N. (2013). "Minimax adaptive tests for the functional linear model".
10415: 9665: 9391: 7456: 6729: 6342: 6051: 5858: 4021: 3411: 3407: 2430: 11940:"Clustering in linear mixed models with approximate Dirichlet process mixtures using EM algorithm" 11768:
Heinzl, F; Tutz, G. (2014). "Clustering in linear-mixed models with a group fused lasso penalty".
4847: 3736: 2182: 1871: 871: 9711: 9594: 8376: 7460: 1805:{\displaystyle {\mathcal {C}}=\sum _{j=1}^{\infty }\lambda _{j}\varphi _{j}\otimes \varphi _{j}.} 6565:
is usually a random process with mean zero and finite variance. In this case, at any given time
5799: 295:{\displaystyle \mathbb {E} \|X\|_{L^{2}}^{2}=\mathbb {E} (\int _{0}^{1}|X(t)|^{2}dt)<\infty } 12178:"Robust Classification of Functional and Quantitative Image Data Using Functional Mixed Models" 11302:Şentürk, D; Müller, HG. (2010). "Functional varying coefficient models for longitudinal data". 11100:
Yao, F; Müller, HG; Wang, JL. (2005). "Functional data analysis for sparse longitudinal data".
10284: 10252: 8405:
and relatively small sample sizes, the estimator given by this model often has large variance.
7506: 5826: 5682: 325: 11674:"Funclust: A curves clustering method using functional random variables density approximation" 5977: 3417: 12329: 11541: 11490: 10969: 10817:
Theoretical Foundations of Functional Data Analysis, with an Introduction to Linear Operators
10557:
Theoretical Foundations of Functional Data Analysis, with an Introduction to Linear Operators
10488: 10461: 10451: 10441: 10436: 8088: 7991: 7875: 7480: 5378: 4512: 1447: 1427: 10431: 10248: 8382: 8352: 8326: 8068: 4903: 3726:{\displaystyle \Sigma (s,t)=\sum _{k=1}^{\infty }\lambda _{k}\varphi _{k}(s)\varphi _{k}(t)} 3338: 3252: 3118: 2701: 1845: 12316: 12100:
Ferraty, F; Vieu, P. (2003). "Curves discrimination: a nonparametric functional approach".
11820: 10956: 10620: 10582: 10573: 10498: 10368: 10313: 10129: 9762: 9645: 9337: 9057: 8942: 8806: 8742: 8467: 8323:
yields a functional single index model while multiple index models correspond to the case
8041: 7567: 6568: 6013: 5623: 5072: 5030: 4493: 4232: 3368: 2238: 1366: 1006: 143: 12597:
Dubey, P; Müller, HG (2021). "Modeling Time-Varying Random Objects and Dynamic Networks".
9625: 7289: 6626: 5916: 1407: 951:, in a non-increasing order. Truncating this infinite series to a finite order underpins 8: 11714:
Jacques, J; Preda, C. (2014). "Model-based clustering for multivariate functional data".
10456: 10426: 8300: 4767: 3564: 1896: 1049:
consist of equivalence classes, not functions. The stochastic process perspective views
514: 12320: 12153: 12128: 11824: 10960: 10624: 12842: 12797: 12779: 12752: 12734: 12688: 12670: 12624: 12606: 12479: 12444: 12426: 12394: 12334: 12285: 12234: 12202: 12177: 12158: 12082: 12047: 11962: 11939: 11920: 11880: 11855: 11836: 11793: 11696: 11654: 11619: 11602:
Banfield, JD; Raftery, AE. (1993). "Model-based Gaussian and non-Gaussian clustering".
11584: 11519: 11417: 11397: 11319: 11238: 11234: 11203: 11162: 11144: 11117: 11052: 11044: 11009: 11005: 10974: 10876: 10858: 10669: 10638: 10483: 10348: 10133: 9991: 9707: 9605: 8732:{\displaystyle \mathbb {E} (Y|X)=\mathbb {E} (Y)+\sum _{k=1}^{\infty }\beta _{k}x_{k}.} 7486: 6606: 5603: 4883: 4501: 4259: 4212: 3453: 2892: 2872: 2466: 2177: 2147: 2081: 1052: 986: 966: 858:{\displaystyle X=\mu +\sum _{i=1}^{\infty }\langle X,\varphi _{i}\rangle \varphi _{i},} 757: 520: 305: 120: 100: 80: 62: 33: 12113: 11083: 8598: 7959: 7535: 7169: 6465: 5945: 5650: 3032: 2912: 2265: 12756: 12628: 12289: 12277: 12272: 12255: 12207: 12193: 12162: 12006: 11916: 11885: 11785: 11588: 11579: 11562: 11242: 11013: 10776: 10759: 10642: 10560: 10546: 10532: 10524: 10508: 9752: 9492: 4516: 3450:
in a functional basis consisting of the eigenfunctions of the covariance operator on
3243: 415:{\displaystyle \mathbb {E} \langle X,h\rangle =\langle \mu ,h\rangle ,\qquad h\in H.} 12846: 12801: 12692: 12483: 12448: 12398: 12338: 12086: 12051: 12001: 11966: 11924: 11797: 11700: 11451:
Jiang, CR; Wang JL. (2011). "Functional single index models for longitudinal data".
11421: 11323: 11207: 11166: 11056: 10978: 10880: 10422:
Some packages can handle functional data under both dense and longitudinal designs.
10392:
and further to nonlinear manifolds, Hilbert spaces and eventually to metric spaces.
8932:{\displaystyle \mathbb {E} (Y|X)=\mathbb {E} (Y)+\sum _{k=1}^{\infty }f_{k}(x_{k}),} 6925:
which is a functional linear model with functional responses and scalar covariates.
12832: 12824: 12789: 12744: 12680: 12616: 12558: 12549:
Tang, R; Müller, HG. (2008). "Pairwise curve synchronization for functional data".
12510: 12471: 12436: 12384: 12324: 12267: 12197: 12189: 12148: 12140: 12109: 12074: 12037: 11996: 11954: 11912: 11875: 11867: 11840: 11828: 11777: 11750: 11723: 11688: 11658: 11646: 11611: 11574: 11529: 11478: 11407: 11346: 11311: 11265: 11230: 11193: 11154: 11121: 11109: 11079: 11036: 11001: 10964: 10903: 10868: 10771: 10740: 10696: 10665: 10628: 3247: 2098:
has continuous sample paths, namely that with probability one, the random function
1624: 751: 430: 12684: 12620: 1600:{\displaystyle ({\mathcal {C}}f)(t)=\int _{0}^{1}\Sigma (s,t)f(s)\,\mathrm {d} s.} 11982:"Classification using functional data analysis for temporal gene expression data" 11832: 11692: 11637:
James, GM; Sugar, CA. (2003). "Clustering for sparsely sampled functional data".
10520: 4771: 2483:
independent subjects. The sampling schedule may vary across subjects, denoted as
538: 426: 46: 12475: 11754: 11727: 11533: 11350: 11113: 10117: 10112: 7468: 4865: 4497: 1698: 676: 12144: 12078: 11315: 11269: 10744: 12860: 12581:
Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition
12515: 12498: 12281: 11958: 11198: 11181: 9742: 9716: 9600: 5066: 4229:
yields a good approximation to the infinite sum. Thereby, the information in
1046: 58: 12817:
Journal of the Royal Statistical Society. Series B (Statistical Methodology)
12562: 12389: 12372: 12260:
Journal of the Royal Statistical Society. Series B (Statistical Methodology)
12042: 12025: 11871: 11856:"Bayesian nonparametric functional data analysis through density estimation" 11650: 11567:
Journal of the Royal Statistical Society, Series B (Statistical Methodology)
11482: 10992:
Ramsay, JO; Dalzell, CJ. (1991). "Some tools for functional data analysis".
10907: 12211: 12010: 11889: 11789: 11781: 6536:
is the corresponding functional slopes with same domain, respectively, and
4469:{\displaystyle X_{i}^{(K)}(t)=\mu (t)+\sum _{k=1}^{K}A_{ik}\varphi _{k}(t)} 3900:{\displaystyle X_{i}(t)=\mu (t)+\sum _{k=1}^{\infty }A_{ik}\varphi _{k}(t)} 1159:). The mean and covariance functions are defined in a pointwise manner as 896: 11563:"Functional clustering and identifying substructures of longitudinal data" 11510:
Müller HG; Stadmüller, U. (2005). "Generalized Functional Linear Models".
11027:
Malfait, N; Ramsay, JO. (2003). "The historical functional linear model".
10559:, Wiley series in probability and statistics, John Wiley & Sons, Ltd, 9212:{\displaystyle \mathbb {E} (Y|X)=\mathbb {E} (Y)+\int _{0}^{1}g(t,X(t))dt} 3406:(FPCA) is the most prevalent tool in FDA, partly because FPCA facilitates 3253:
3. Sparsely sampled functions with noisy measurements (longitudinal data)
924: 12026:"Functional linear discriminant analysis for irregularly sampled curves" 10408: 5908: 12837: 12828: 12793: 12748: 12644:
Journal of the Royal Statistical Society. Series C (Applied Statistics)
12530:
Gasser, T; Kneip, A (1995). "Searching for structure in curve sample".
11623: 11048: 10872: 10832:
Journal of the Royal Statistical Society. Series C (Applied Statistics)
10701: 10684: 10633: 10608: 9112:. Another form of FAM is the continuously additive model, expressed as, 4013:{\displaystyle A_{ik}=\int _{0}^{1}(X_{i}(t)-\mu (t))\varphi _{k}(t)dt} 2548:
for the i-th subject. The corresponding i-th observation is denoted as
20: 12440: 11412: 11385: 11158: 11524: 3414:
is achieved by expanding the underlying observed random trajectories
57:
Random functions can be viewed as random elements taking values in a
11615: 11040: 9380:
An obvious and direct extension of FLMs with scalar responses (see (
4875: 12784: 12739: 12675: 12611: 12499:"Statistical tools to analyze data representing a sample of curves" 12431: 12304: 12239: 10994:
Journal of the Royal Statistical Society, Series B (Methodological)
10944: 10571:
Annual Review of Statistics and Its Application, Vol. 2, 321 - 359,
10503: 9701: 9482:{\displaystyle \eta =\beta _{0}+\int _{0}^{1}X^{c}(t)\beta (t)\,dt} 7471:
are three special cases of functional nonlinear regression models.
5069: 4869: 3033:
2. Densely sampled functions with noisy measurements (dense design)
2913:
1. Fully observed functions without noise at arbitrarily dense grid
11402: 11149: 10863: 10580:
Annual Review of Statistics and Its Application, Vol. 3, 257-295,
9266:{\displaystyle g:\times \mathbb {R} \longrightarrow \mathbb {R} } 8739:
One form of FAMs is obtained by replacing the linear function of
2862:{\displaystyle {\textrm {Var}}(\epsilon _{ij})=\sigma _{ij}^{2}} 12641: 11366:
Yao, F; Müller, HG. (2010). "Functional quadratic regression".
8588:{\displaystyle X^{c}(t)=\sum _{k=1}^{\infty }x_{k}\phi _{k}(t)} 7865:{\displaystyle X^{c}(\cdot )=X(\cdot )-\mathbb {E} (X(\cdot ))} 7257:
are the coefficient functions defined on the same interval and
12416: 8108: 5065:
and replacing the inner product in Euclidean space by that in
4500:
and wavelet bases. Important applications of FPCA include the
12464:
IEEE Transactions on Acoustics, Speech, and Signal Processing
10345:
The range set of the stochastic process may be extended from
10337:, for example the data could be a sample of random surfaces. 4024:
in the sense that the partial sum converges uniformly, i.e.,
3733:, where the series convergence is absolute and uniform, and 2144:
is continuous, the Karhunen-Loève expansion above holds for
12256:"Achieving near perfect classification for functional data" 11386:"A test of significance in functional quadratic regression" 3800:, the FPCA expansion of an underlying random trajectory is 665:{\displaystyle {\mathcal {C}}h=\mathbb {E} ,\qquad h\in H,} 12351: 7474: 4176:{\displaystyle \sup _{t\in }\mathbb {E} ^{2}\rightarrow 0} 3398: 2427:. The realizations of the process for the i-th subject is 1444:
are continuous functions and then the covariance function
12578: 10685:"Peter Hall, functional data analysis and random objects" 7450: 2629:{\displaystyle {\textbf {X}}_{i}=(X_{i1},...,X_{iN_{i}})} 1135:
indexed by the unit interval (or more generally interval
10583:
https://doi.org/10.1146/annurev-statistics-041715-033624
10574:
https://doi.org/10.1146/annurev-statistics-010814-020413
7250:{\displaystyle \beta _{0},\beta _{1},\ldots ,\beta _{p}} 11810: 11336: 10193:
So far we considered scalar valued stochastic process,
6762:
as a constant function yields a special case of model (
2420:{\displaystyle \Sigma (s,t)={\textrm {Cov}}(X(s),X(t))} 11902: 11853: 11182:"Statistical estimation in varying coefficient models" 9375: 8630:
A functional linear model with scalar responses (see (
4507: 3328:{\displaystyle Y_{ij}=X_{i}(T_{ij})+\varepsilon _{ij}} 3108:{\displaystyle Y_{ij}=X_{i}(T_{ij})+\varepsilon _{ij}} 11509: 10371: 10351: 10316: 10287: 10255: 10199: 10142: 10073: 10014: 9994: 9946: 9898: 9792: 9765: 9668: 9648: 9628: 9608: 9554: 9500: 9406: 9340: 9279: 9225: 9118: 9087: 9060: 9032: 8972: 8945: 8836: 8809: 8772: 8745: 8642: 8601: 8512: 8470: 8419: 8385: 8355: 8329: 8303: 8119: 8091: 8071: 8044: 7994: 7962: 7927: 7898: 7878: 7801: 7597: 7570: 7538: 7509: 7489: 7381: 7321: 7292: 7263: 7204: 7172: 7126: 6944: 6774: 6732: 6658: 6629: 6609: 6571: 6542: 6500: 6468: 6383: 6345: 6309: 6097: 6054: 6016: 5980: 5948: 5919: 5909:
Functional regression models with functional response
5861: 5829: 5802: 5717: 5685: 5653: 5626: 5606: 5571: 5419: 5381: 5323: 5272: 5266:) can be extended to multiple functional covariates, 5108: 5075: 5033: 4998: 4932: 4906: 4886: 4850: 4815: 4780: 4740: 4593: 4553: 4525: 4364: 4282: 4262: 4235: 4215: 4189: 4030: 3913: 3806: 3766: 3739: 3632: 3597: 3573: 3543:{\displaystyle {\mathcal {C}}:L^{2}\rightarrow L^{2}} 3476: 3456: 3420: 3395:
Real life example: CD4 count data for AIDS patients.
3371: 3341: 3263: 3212: 3153: 3121: 3043: 2975: 2923: 2895: 2875: 2811: 2764: 2734: 2704: 2642: 2554: 2489: 2469: 2433: 2355: 2300: 2268: 2241: 2185: 2150: 2104: 2084: 1908: 1874: 1848: 1821: 1734: 1707: 1657: 1633: 1511: 1470: 1450: 1430: 1410: 1369: 1316: 1168: 1141: 1078: 1055: 1009: 989: 969: 933: 905: 874: 786: 760: 685: 585: 546: 523: 469: 438: 357: 328: 308: 188: 146: 123: 103: 83: 12129:"Functional data classification: a wavelet approach" 11468: 8408: 2176:
Functional data are considered as realizations of a
12064: 10757: 7564:and the corresponding centered predictor processes 3017:{\displaystyle t\in {\mathcal {I}},\,i=1,\ldots ,n} 12814: 12724: 12377:Journal of the Royal Statistical Society, Series B 10848: 10384: 10357: 10340: 10329: 10302: 10270: 10237: 10159: 10103: 10056: 10000: 9980: 9932: 9884: 9778: 9689: 9654: 9634: 9614: 9585: 9540: 9481: 9364: 9326: 9265: 9211: 9104: 9073: 9054:. This constraint on the general smooth functions 9046: 9018: 8958: 8931: 8822: 8795: 8758: 8731: 8619: 8587: 8498: 8456: 8397: 8367: 8341: 8315: 8289: 8097: 8077: 8057: 8030: 7980: 7948: 7913: 7884: 7864: 7787: 7583: 7556: 7524: 7495: 7439: 7367: 7307: 7278: 7249: 7190: 7158: 7096: 6917: 6754: 6704: 6644: 6615: 6595: 6557: 6528: 6486: 6454: 6369: 6331: 6279: 6078: 6040: 6002: 5966: 5934: 5885: 5847: 5815: 5788: 5703: 5671: 5639: 5612: 5592: 5541: 5400: 5367: 5309: 5236: 5088: 5057: 5019: 4984: 4912: 4892: 4856: 4836: 4801: 4758: 4710: 4574: 4539: 4468: 4345: 4268: 4248: 4221: 4201: 4175: 4012: 3899: 3788: 3752: 3725: 3618: 3583: 3542: 3462: 3442: 3384: 3357: 3327: 3231: 3198: 3137: 3107: 3016: 2961: 2901: 2881: 2861: 2797: 2750: 2720: 2690: 2628: 2540: 2475: 2455: 2419: 2341: 2286: 2254: 2227: 2156: 2136: 2090: 2067: 1887: 1860: 1834: 1804: 1717: 1689: 1643: 1599: 1492: 1456: 1436: 1416: 1393: 1355: 1299: 1151: 1124: 1061: 1037: 995: 975: 943: 915: 887: 857: 766: 742: 664: 568: 529: 505: 455: 414: 340: 314: 294: 174: 137:is a separable Hilbert space such as the space of 129: 109: 89: 11740: 10925:. Springer Series in Statistics. Springer-Verlag. 6455:{\displaystyle X_{j}^{c}(t)=X_{j}(t)-\mu _{j}(t)} 5789:{\displaystyle X_{j}^{c}(t)=X_{j}(t)-\mu _{j}(t)} 4876:Functional regression models with scalar response 4515:can be viewed as an extension of the traditional 3026:Often unrealistic but mathematically convenient. 77:In the Hilbert space viewpoint, one considers an 12858: 11134: 10609:"Stochastic processes and statistical inference" 9702:Clustering and classification of functional data 6721: 4032: 1910: 12663:Journal of the American Statistical Association 12599:Journal of the American Statistical Association 12532:Journal of the American Statistical Association 12309:Annual Review of Statistics and Its Application 12302: 11854:Rodríguez, A; Dunson, DB; Gelfand, AE. (2009). 11639:Journal of the American Statistical Association 11601: 11304:Journal of the American Statistical Association 11102:Journal of the American Statistical Association 10949:Annual Review of Statistics and Its Application 10942: 10923:Inference for functional data with applications 10920: 10893: 10799: 10733:Mathematische Operationsforschung und Statistik 10543:Inference for Functional Data with Applications 6928: 4504:and functional principal component regression. 12175: 12126: 10829: 10238:{\displaystyle \{X(t)\}_{t\in {\mathcal {T}}}} 7440:{\displaystyle \{X_{j}(t):t\leq s\}_{j=1}^{p}} 5711:is the centered functional covariate given by 5317:, also including additional vector covariates 2798:{\displaystyle \mathbb {E} (\epsilon _{ij})=0} 983:as a function at all, since common choices of 72: 12705: 11383: 11339:Journal of Statistical Planning and Inference 11301: 11099: 11026: 10991: 10819:. Wiley Series in Probability and Statistics. 10655: 10578:Wang et al. (2016) Functional Data Analysis, 9981:{\displaystyle h_{i}{\overset {iid}{\sim }}h} 9933:{\displaystyle X_{i}{\overset {iid}{\sim }}X} 9394:(GLM). 
The three components of the GFLM are: 8457:{\displaystyle \{\phi _{k}\}_{k=1}^{\infty }} 4759:{\displaystyle \langle \cdot ,\cdot \rangle } 4209:and thus the partial sum with a large enough 506:{\displaystyle \mathbb {E} \|X\|_{L^{2}}^{2}} 12253: 12224: 12102:Computational Statistics & Data Analysis 11743:Computational Statistics & Data Analysis 11716:Computational Statistics & Data Analysis 11713: 11671: 11546:: CS1 maint: multiple names: authors list ( 11495:: CS1 maint: multiple names: authors list ( 11069: 10216: 10200: 10008:are assumed to be invertible and to satisfy 9019:{\displaystyle \mathbb {E} (f_{k}(x_{k}))=0} 8434: 8420: 7417: 7382: 7345: 7322: 6682: 6659: 5438: 5426: 5287: 5273: 5147: 5128: 5097:, one arrives at the functional linear model 4753: 4741: 4625: 4613: 1095: 1079: 839: 820: 625: 607: 482: 475: 393: 381: 375: 363: 201: 194: 12596: 12529: 12496: 12099: 12023: 11282: 11255: 10938: 10936: 10934: 10932: 10814: 10718:Zur Spektraltheorie stochastischer Prozesse 8109:Functional single and multiple index models 5593:{\displaystyle \theta \in \mathbb {R^{q}} } 1690:{\displaystyle (\lambda _{j},\varphi _{j})} 743:{\displaystyle {\mathcal {C}}=\mathbb {E} } 12660: 12548: 12461: 11979: 11937: 11767: 11636: 10758:Dauxois, J; Pousse, A; Romain, Y. (1982). 10245:, defined on one dimensional time domain. 9386:)) is to add a link function leading to a 4837:{\displaystyle \beta \in \mathbb {R} ^{p}} 4802:{\displaystyle \beta _{0}\in \mathbb {R} } 4256:is reduced from infinite dimensional to a 3029:Real life example: Tecator spectral data. 2463:, and the sample is assumed to consist of 2342:{\displaystyle \mu (t)=\mathbb {E} (X(t))} 2171: 182:. Under the integrability condition that 12836: 12783: 12769: 12738: 12674: 12610: 12514: 12430: 12388: 12328: 12303:Wang, JL; Chiou, JM; Müller, HG. (2016). 12271: 12238: 12201: 12152: 12041: 12000: 11879: 11578: 11523: 11411: 11401: 11362: 11360: 11197: 11148: 10968: 10943:Wang, JL; Chiou, JM; Müller, HG. (2016). 10862: 10775: 10720:. Annales Academiae scientiarum Fennicae. 10700: 10632: 10606: 10569:Morris, J. (2015) Functional Regression, 10057:{\displaystyle \mathbb {E} (h^{-1}(t))=t} 10016: 9562: 9541:{\displaystyle {\text{Var}}(Y|X)=V(\mu )} 9472: 9281: 9259: 9251: 9145: 9120: 9089: 9040: 8974: 8863: 8838: 8669: 8644: 8275: 8206: 8121: 7840: 7778: 7771: 7675: 7599: 6220: 5584: 5580: 5523: 5218: 4824: 4795: 4562: 4533: 4346:{\displaystyle A_{i}=(A_{i1},...,A_{iK})} 4060: 3199:{\displaystyle T_{i1},\ldots ,T_{iN_{i}}} 2992: 2766: 2317: 2262:process on a bounded and closed interval 2130: 2078:Finally, under the extra assumption that 1585: 1318: 1185: 697: 600: 576:that is uniquely defined by the relation 471: 463:. Under the integrability condition that 446: 359: 226: 190: 52: 12370: 12330:10.1146/annurev-statistics-041715-033624 12030:Journal of the Royal Statistical Society 11905:Journal of the Royal Statistical Society 11179: 10970:10.1146/annurev-statistics-041715-033624 10929: 10715: 10658:Journal of the Royal Statistical Society 10179: 9741: 9219:for a bivariate smooth additive surface 6652:, depends on the entire trajectories of 4844:denote the regression coefficients, and 3232:{\displaystyle N_{i}\rightarrow \infty } 42:functional principal components analysis 12296: 12176:Zhu, H; Brown, PJ; Morris, JS. (2012). 
11560: 11428: 10474:Functional principal component analysis 10170: 7956:are coefficient functions with domains 7475:Functional polynomial regression models 5823:is regression coefficient function for 5368:{\displaystyle Z=(Z_{1},\cdots ,Z_{q})} 3404:Functional principal component analysis 3399:Functional principal component analysis 1356:{\displaystyle \mathbb {E} <\infty } 958: 953:functional principal component analysis 36:into eigencomponents, now known as the 12859: 12127:Chang, C; Chen, Y; Ogden, RT. (2014). 11357: 10730: 10682: 9586:{\displaystyle \mu =\mathbb {E} (Y|X)} 9372:, in order to ensure identifiability. 7949:{\displaystyle \gamma (\cdot ,\cdot )} 7872:is the centered functional covariate, 7451:Functional nonlinear regression models 7368:{\displaystyle \{X_{j}(s)\}_{j=1}^{p}} 6705:{\displaystyle \{X_{j}(t)\}_{j=1}^{p}} 6462:is a centered functional covariate on 3619:{\displaystyle \Sigma (\cdot ,\cdot )} 12592: 12590: 12574: 12572: 12412: 12410: 12408: 11095: 11093: 10104:{\displaystyle h(t)=\delta +\gamma t} 9706:For vector-valued multivariate data, 4985:{\displaystyle X^{c}(t)=X(t)-\mu (t)} 4926:) by a centered functional covariate 4575:{\displaystyle X\in \mathbb {R} ^{p}} 2541:{\displaystyle T_{i1},...,T_{iN_{i}}} 1493:{\displaystyle {\mathcal {C}}:H\to H} 569:{\displaystyle {\mathcal {C}}:H\to H} 11220: 10795: 10793: 10791: 10789: 10787: 10541:Horvath, L. and Kokoszka, P. (2012) 10132:(DTW) used for applications such as 9885:{\displaystyle Y_{i}(t)=X_{i},t\in } 6935: 6088: 5410: 5260:The simple functional linear model ( 5099: 4584: 4355: 4202:{\displaystyle K\rightarrow \infty } 3239:applies to typical functional data. 2691:{\displaystyle X_{ij}=X_{i}(T_{ij})} 1502: 1069:as a collection of random variables 429:but the mean can also be defined as 11734: 10494:Generalized functional linear model 9940:is a latent amplitude function and 9388:generalized functional linear model 9376:Generalized functional linear model 8375:, this model is problematic due to 7159:{\displaystyle X_{1},\ldots ,X_{p}} 5974:and multiple functional covariates 5310:{\displaystyle \{X_{j}\}_{j=1}^{p}} 4508:Functional linear regression models 3470:. Consider the covariance operator 2558: 923:, corresponding to the nonnegative 13: 12587: 12569: 12542: 12405: 12218: 11235:10.1111/j.1751-5823.2002.tb00176.x 11090: 11029:The Canadian Journal of Statistics 11006:10.1111/j.2517-6161.1991.tb01844.x 10800:Ramsay, J; Silverman, BW. (2005). 10670:10.1111/j.2517-6161.1991.tb01821.x 10514: 10395: 10228: 10160:{\displaystyle {\mathcal {L}}^{2}} 10146: 8895: 8701: 8636:)) can thus be written as follows, 8551: 8449: 4196: 3860: 3670: 3633: 3598: 3576: 3479: 3392:per subject is random and finite. 3365:are random times and their number 3226: 2984: 2698:. In addition, the measurement of 2356: 2137:{\displaystyle X:\to \mathbb {R} } 2059: 1948: 1824: 1761: 1737: 1710: 1636: 1587: 1555: 1517: 1473: 1451: 1431: 1404:Under the mean square continuity, 1350: 1205: 1144: 936: 908: 815: 688: 588: 549: 456:{\displaystyle \mu =\mathbb {E} X} 289: 14: 12883: 10921:Horváth, L; Kokoszka, P. (2012). 10784: 10649: 9047:{\displaystyle k\in \mathbb {N} } 8409:Functional additive models (FAMs) 6339:is the functional intercept, for 5905:) have been studied extensively. 
4540:{\displaystyle Y\in \mathbb {R} } 3558:compact operator on Hilbert space 12273:10.1111/j.1467-9868.2011.01003.x 12194:10.1111/j.1541-0420.2012.01765.x 11917:10.1111/j.1467-9868.2009.00708.x 11580:10.1111/j.1467-9868.2007.00605.x 11223:International Statistical Review 11137:Journal of Multivariate Analysis 11072:Journal of Multivariate Analysis 10802:Functional Data Analysis, 2nd ed 10764:Journal of Multivariate Analysis 10555:Hsing, T. and Eubank, R. (2015) 9622:connecting the conditional mean 8766:in the above expression ( i.e., 7465:single and multiple index models 7315:depends on the current value of 6718:) has been studied extensively. 6529:{\displaystyle \alpha _{j}(s,t)} 5020:{\displaystyle \beta =\beta (t)} 3591:, i.e., the covariance function 3145:are recorded on a regular grid, 2728:is assumed to have random noise 1125:{\displaystyle \{X(t)\}_{t\in }} 12808: 12763: 12718: 12699: 12654: 12635: 12523: 12490: 12455: 12364: 12345: 12247: 12169: 12120: 12093: 12058: 12024:James, GM; Hastie, TJ. (2001). 12017: 11973: 11931: 11896: 11847: 11804: 11761: 11707: 11665: 11630: 11595: 11554: 11503: 11462: 11445: 11384:Horváth, L; Reeder, R. (2013). 11377: 11330: 11295: 11276: 11249: 11214: 11173: 11128: 11063: 11020: 10985: 10914: 10887: 10842: 10531:, 2nd ed., New York: Springer, 10341:Multivariate stochastic process 9732: 9105:{\displaystyle \mathbb {E} (Y)} 8803:) by a general smooth function 8796:{\displaystyle \beta _{k}x_{k}} 7279:{\displaystyle \varepsilon (s)} 6558:{\displaystyle \varepsilon (s)} 5913:Consider a functional response 4880:Replacing the vector covariate 4353:with the approximated process: 3789:{\displaystyle \varphi _{k}(t)} 2962:{\displaystyle Y_{it}=X_{i}(t)} 2869:, which are independent across 2052: 1835:{\displaystyle {\mathcal {C}}f} 1269: 1204: 649: 399: 11672:Jacques, J; Preda, C. (2013). 10823: 10808: 10751: 10724: 10709: 10676: 10600: 10297: 10291: 10265: 10259: 10212: 10206: 10167:distance and elastic warping. 
10083: 10077: 10045: 10042: 10036: 10020: 9879: 9867: 9855: 9852: 9846: 9825: 9809: 9803: 9737: 9684: 9678: 9580: 9573: 9566: 9535: 9529: 9520: 9513: 9506: 9469: 9463: 9457: 9451: 9359: 9347: 9327:{\displaystyle \mathbb {E} =0} 9315: 9312: 9309: 9303: 9291: 9285: 9255: 9244: 9232: 9200: 9197: 9191: 9179: 9155: 9149: 9138: 9131: 9124: 9099: 9093: 9007: 9004: 8991: 8978: 8923: 8910: 8873: 8867: 8856: 8849: 8842: 8679: 8673: 8662: 8655: 8648: 8614: 8602: 8582: 8576: 8529: 8523: 8493: 8481: 8413:For a given orthonormal basis 8272: 8266: 8253: 8247: 8203: 8197: 8184: 8178: 8139: 8132: 8125: 8065:and the coefficient functions 8025: 8013: 8007: 7995: 7975: 7963: 7943: 7931: 7914:{\displaystyle \beta (\cdot )} 7908: 7902: 7859: 7856: 7850: 7844: 7833: 7827: 7818: 7812: 7768: 7762: 7749: 7743: 7730: 7718: 7672: 7666: 7653: 7647: 7617: 7610: 7603: 7551: 7539: 7519: 7513: 7401: 7395: 7341: 7335: 7302: 7296: 7273: 7267: 7185: 7173: 7088: 7076: 7053: 7047: 7038: 7032: 7019: 7013: 6976: 6970: 6954: 6948: 6909: 6897: 6874: 6868: 6859: 6853: 6806: 6800: 6784: 6778: 6749: 6743: 6678: 6672: 6639: 6633: 6590: 6578: 6552: 6546: 6523: 6511: 6481: 6469: 6449: 6443: 6427: 6421: 6405: 6399: 6332:{\displaystyle \alpha _{0}(s)} 6326: 6320: 6274: 6262: 6239: 6233: 6217: 6211: 6193: 6181: 6129: 6123: 6107: 6101: 6035: 6023: 5997: 5991: 5961: 5949: 5929: 5923: 5783: 5777: 5761: 5755: 5739: 5733: 5666: 5654: 5600:is regression coefficient for 5520: 5514: 5501: 5495: 5362: 5330: 5215: 5209: 5203: 5197: 5052: 5040: 5014: 5008: 4979: 4973: 4964: 4958: 4949: 4943: 4463: 4457: 4407: 4401: 4392: 4386: 4381: 4375: 4340: 4296: 4193: 4167: 4158: 4154: 4148: 4098: 4092: 4083: 4077: 4064: 4054: 4042: 4001: 3995: 3982: 3979: 3973: 3964: 3958: 3945: 3894: 3888: 3838: 3832: 3823: 3817: 3783: 3777: 3720: 3714: 3701: 3695: 3648: 3636: 3613: 3601: 3584:{\displaystyle {\mathcal {C}}} 3537: 3525: 3512: 3509: 3497: 3437: 3431: 3306: 3290: 3223: 3086: 3070: 2956: 2950: 2835: 2819: 2786: 2770: 2751:{\displaystyle \epsilon _{ij}} 2685: 2669: 2623: 2572: 2450: 2444: 2414: 2411: 2405: 2396: 2390: 2384: 2371: 2359: 2336: 2333: 2327: 2321: 2310: 2304: 2281: 2269: 2222: 2210: 2195: 2189: 2126: 2123: 2111: 2056: 2043: 2035: 2029: 2016: 2010: 1963: 1951: 1938: 1926: 1718:{\displaystyle {\mathcal {C}}} 1684: 1658: 1644:{\displaystyle {\mathcal {C}}} 1582: 1576: 1570: 1558: 1534: 1528: 1525: 1512: 1484: 1464:defines a covariance operator 1388: 1376: 1344: 1335: 1328: 1322: 1294: 1282: 1263: 1260: 1254: 1245: 1239: 1233: 1220: 1208: 1198: 1192: 1178: 1172: 1152:{\displaystyle {\mathcal {T}}} 1117: 1105: 1091: 1085: 1032: 1020: 944:{\displaystyle {\mathcal {C}}} 916:{\displaystyle {\mathcal {C}}} 737: 734: 722: 716: 704: 701: 643: 640: 628: 604: 560: 283: 267: 262: 256: 249: 230: 169: 157: 1: 12685:10.1080/01621459.2016.1273115 12621:10.1080/01621459.2021.1917416 12254:Delaigle, A; Hall, P (2012). 12114:10.1016/S0167-9473(03)00032-X 12002:10.1093/bioinformatics/bti742 11980:Leng, X; Müller, HG. (2006). 11813:Journal of Applied Statistics 11084:10.1016/S0047-259X(02)00056-8 10593: 10414: 10188: 9690:{\displaystyle \mu =g(\eta )} 9273:which is required to satisfy 7483:model. 
For a scalar response 7166:are functional covariates on 6755:{\displaystyle X_{j}(\cdot )} 6722:Function-on-scalar regression 6370:{\displaystyle j=1,\ldots ,p} 6079:{\displaystyle j=1,\ldots ,p} 5886:{\displaystyle j=1,\ldots ,p} 3626:, has spectral decomposition 2456:{\displaystyle X_{i}(\cdot )} 2166:Kolmogorov continuity theorem 302:, one can define the mean of 12497:Kneip, A; Gasser, T (1992). 11938:Heinzl, F; Tutz, G. (2013). 11833:10.1080/02664763.2011.578620 11693:10.1016/j.neucom.2012.11.042 10815:Hsing, T; Eubank, R (2015). 10777:10.1016/0047-259X(82)90088-4 10589:Category:Regression analysis 9708:k-means partitioning methods 6929:Concurrent regression models 4857:{\displaystyle \varepsilon } 4492:Other popular bases include 3753:{\displaystyle \lambda _{k}} 2228:{\displaystyle X(t),\ t\in } 1888:{\displaystyle \varphi _{j}} 888:{\displaystyle \varphi _{i}} 776:Karhunen-Loève decomposition 38:Karhunen-Loève decomposition 7: 12371:Ramsay, JO; Li, X. (1998). 11561:Chiou, JM; Li, PL. (2007). 10467: 10249:Multidimensional domain of 10128:Earlier approaches include 9382: 8632: 7503:and a functional covariate 7110: 6764: 6714: 6293: 5901: 5895: 5555: 5262: 5250: 4922: 4900:and the coefficient vector 4724: 4482: 3552: 1613: 139:square-integrable functions 73:Hilbertian random variables 10: 12888: 12476:10.1109/TASSP.1978.1163055 12305:"Functional Data Analysis" 11755:10.1016/j.csda.2013.04.001 11728:10.1016/j.csda.2012.12.004 11534:10.1214/009053604000001156 11351:10.1016/j.jspi.2009.06.017 11180:Fan, J; Zhang, W. (1999). 11114:10.1198/016214504000001745 10945:"Functional data analysis" 10123: 9753:Berkeley Growth Study Data 5816:{\displaystyle \beta _{j}} 4517:multivariate linear models 3244:Berkeley Growth Study Data 27: 12145:10.1007/s00180-014-0503-4 12079:10.1198/00401700152404273 11316:10.1198/jasa.2010.tm09228 10745:10.1080/02331887308801137 10303:{\displaystyle X(\cdot )} 10271:{\displaystyle X(\cdot )} 9642:and the linear predictor 9390:(GFLM) in analogy to the 8105:in an orthonormal basis. 7892:is a scalar coefficient, 7525:{\displaystyle X(\cdot )} 7375:only and not the history 5848:{\displaystyle X_{j}^{c}} 5704:{\displaystyle X_{j}^{c}} 4992:and coefficient function 3567:, the kernel function of 341:{\displaystyle \mu \in H} 40:. A rigorous analysis of 12772:The Annals of Statistics 12354:The Annals of Statistics 12133:Computational Statistics 11959:10.1177/1471082X12471372 11512:The Annals of Statistics 11436:The Annals of Statistics 11186:The Annals of Statistics 10529:Functional data analysis 9392:generalized linear model 7457:generalized linear model 6003:{\displaystyle X_{j}(t)} 4513:Functional linear models 3443:{\displaystyle X_{i}(t)} 2349:and covariance function 425:This formulation is the 45:Analysis" was coined by 17:Functional data analysis 12390:10.1111/1467-9868.00129 12043:10.1111/1467-9868.00297 11651:10.1198/016214503000189 11453:he Annals of Statistics 11270:10.1093/biomet/89.1.111 10545:, New York: Springer, 9712:hierarchical clustering 8377:curse of dimensionality 8098:{\displaystyle \gamma } 8031:{\displaystyle \times } 7885:{\displaystyle \alpha } 7461:curse of dimensionality 6933:This model is given by, 5401:{\displaystyle Z_{1}=1} 2172:Functional data designs 1457:{\displaystyle \Sigma } 1437:{\displaystyle \Sigma } 97:-valued random element 67:mean-squared continuity 12867:Statistical data types 12516:10.1214/aos/1176348769 11782:10.1002/bimj.201200111 11199:10.1214/aos/1017939139 10607:Grenander, U. (1950). 
10479:Karhunen–Loève theorem 10386: 10359: 10331: 10304: 10272: 10239: 10161: 10105: 10058: 10002: 9982: 9934: 9886: 9780: 9747: 9691: 9656: 9636: 9616: 9587: 9542: 9483: 9366: 9328: 9267: 9213: 9106: 9075: 9048: 9020: 8960: 8933: 8899: 8824: 8797: 8760: 8733: 8705: 8621: 8589: 8555: 8500: 8458: 8399: 8398:{\displaystyle p>1} 8369: 8368:{\displaystyle p>1} 8343: 8342:{\displaystyle p>1} 8317: 8291: 8099: 8079: 8078:{\displaystyle \beta } 8059: 8032: 7982: 7950: 7915: 7886: 7866: 7789: 7585: 7558: 7526: 7497: 7441: 7369: 7309: 7280: 7251: 7192: 7160: 7098: 7002: 6919: 6832: 6756: 6726:In particular, taking 6706: 6646: 6617: 6597: 6559: 6530: 6488: 6456: 6371: 6333: 6281: 6155: 6080: 6042: 6004: 5968: 5936: 5887: 5849: 5817: 5790: 5705: 5673: 5641: 5614: 5594: 5543: 5464: 5402: 5369: 5311: 5238: 5090: 5059: 5021: 4986: 4914: 4913:{\displaystyle \beta } 4894: 4858: 4838: 4803: 4760: 4712: 4576: 4541: 4470: 4433: 4347: 4270: 4250: 4223: 4203: 4177: 4124: 4014: 3901: 3864: 3798:Karhunen–Loève theorem 3790: 3754: 3727: 3674: 3620: 3585: 3544: 3464: 3444: 3386: 3359: 3358:{\displaystyle T_{ij}} 3329: 3233: 3200: 3139: 3138:{\displaystyle T_{ij}} 3109: 3018: 2963: 2903: 2883: 2863: 2799: 2752: 2722: 2721:{\displaystyle X_{ij}} 2692: 2630: 2542: 2477: 2457: 2421: 2343: 2288: 2256: 2229: 2158: 2138: 2092: 2069: 1989: 1889: 1862: 1861:{\displaystyle f\in H} 1842:is continuous for all 1836: 1806: 1765: 1719: 1691: 1651:, yielding eigenpairs 1645: 1601: 1494: 1458: 1438: 1418: 1395: 1357: 1301: 1153: 1126: 1063: 1039: 997: 977: 945: 917: 889: 859: 819: 768: 744: 666: 570: 531: 507: 457: 416: 342: 322:as the unique element 316: 296: 176: 131: 111: 91: 53:Mathematical formalism 12563:10.1093/biomet/asn047 11947:Statistical Modelling 11872:10.1093/biomet/asn054 11483:10.1093/biomet/ast004 10908:10.1093/biomet/asv062 10489:Functional regression 10387: 10385:{\displaystyle R^{p}} 10360: 10332: 10330:{\displaystyle R^{p}} 10305: 10273: 10240: 10180:Landmark registration 10162: 10106: 10059: 10003: 9983: 9935: 9887: 9781: 9779:{\displaystyle Y_{i}} 9745: 9692: 9657: 9655:{\displaystyle \eta } 9637: 9617: 9588: 9543: 9484: 9367: 9365:{\displaystyle t\in } 9329: 9268: 9214: 9107: 9076: 9074:{\displaystyle f_{k}} 9049: 9021: 8961: 8959:{\displaystyle f_{k}} 8934: 8879: 8825: 8823:{\displaystyle f_{k}} 8798: 8761: 8759:{\displaystyle x_{k}} 8734: 8685: 8622: 8590: 8535: 8501: 8499:{\displaystyle L^{2}} 8459: 8400: 8370: 8344: 8318: 8292: 8100: 8080: 8060: 8058:{\displaystyle X^{c}} 8033: 7983: 7951: 7916: 7887: 7867: 7790: 7586: 7584:{\displaystyle X^{c}} 7559: 7527: 7498: 7481:polynomial regression 7442: 7370: 7310: 7281: 7252: 7193: 7161: 7099: 6982: 6920: 6812: 6757: 6707: 6647: 6618: 6598: 6596:{\displaystyle s\in } 6560: 6531: 6489: 6457: 6372: 6334: 6282: 6135: 6081: 6043: 6041:{\displaystyle t\in } 6005: 5969: 5937: 5888: 5850: 5818: 5791: 5706: 5674: 5642: 5640:{\displaystyle X_{j}} 5615: 5595: 5544: 5444: 5403: 5370: 5312: 5239: 5091: 5089:{\displaystyle L^{2}} 5060: 5058:{\displaystyle t\in } 5022: 4987: 4915: 4895: 4859: 4839: 4804: 4761: 4713: 4577: 4547:and vector covariate 4542: 4471: 4413: 4348: 4271: 4251: 4249:{\displaystyle X_{i}} 4224: 4204: 4178: 4104: 4015: 3902: 3844: 3791: 3755: 3728: 3654: 3621: 3586: 3545: 3465: 3445: 3387: 3385:{\displaystyle N_{i}} 3360: 3330: 3234: 3201: 3140: 3110: 3019: 2964: 2904: 2884: 2864: 2800: 2753: 2723: 2693: 2631: 2543: 2478: 2458: 2422: 2344: 2289: 2257: 2255:{\displaystyle L^{2}} 2230: 2159: 2139: 2093: 2070: 1969: 1890: 1863: 1837: 1807: 1745: 1720: 1692: 
Open-source software for functional data analysis is available in R, including the packages fda, fda.usc, fdapace, fdasrvf, FDboost, refund, classiFunc and dtw, and in Python through the scikit-fda package.

See also: Lp space, Modes of variation, Stochastic processes, Variance function.

Further reading: Ramsay, J. O. and Silverman, B. W. (2005). Functional Data Analysis (2nd ed.). Springer.

Index

statistics
stochastic process
Karhunen-Loève decomposition
functional principal component analysis
James O. Ramsay
Hilbert space
square-integrable functions
Pettis integral
Bochner integral
covariance operator
linear operator
tensor
spectral theorem
eigenvectors
eigenvalues
Sobolev spaces
tensor product
Mercer's theorem
Kolmogorov continuity theorem
Berkeley Growth Study Data
Stock data
dimension reduction
