{\displaystyle {\begin{aligned}\operatorname {Var} \left({\tilde {\beta }}\right)&=\operatorname {Var} (Cy)\\&=C\operatorname {Var} (y)C^{\operatorname {T} }\\&=\sigma ^{2}CC^{\operatorname {T} }\\&=\sigma ^{2}\left((X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }+D\right)\left(X(X^{\operatorname {T} }X)^{-1}+D^{\operatorname {T} }\right)\\&=\sigma ^{2}\left((X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }X(X^{\operatorname {T} }X)^{-1}+(X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }D^{\operatorname {T} }+DX(X^{\operatorname {T} }X)^{-1}+DD^{\operatorname {T} }\right)\\&=\sigma ^{2}(X^{\operatorname {T} }X)^{-1}+\sigma ^{2}(X^{\operatorname {T} }X)^{-1}(DX)^{\operatorname {T} }+\sigma ^{2}DX(X^{\operatorname {T} }X)^{-1}+\sigma ^{2}DD^{\operatorname {T} }\\&=\sigma ^{2}(X^{\operatorname {T} }X)^{-1}+\sigma ^{2}DD^{\operatorname {T} }&&DX=0\\&=\operatorname {Var} \left({\widehat {\beta }}\right)+\sigma ^{2}DD^{\operatorname {T} }&&\sigma ^{2}(X^{\operatorname {T} }X)^{-1}=\operatorname {Var} \left({\widehat {\beta }}\right)\end{aligned}}}
{\displaystyle {\begin{aligned}\operatorname {Var} \left(\ell ^{\operatorname {T} }{\tilde {\beta }}\right)&=\ell ^{\operatorname {T} }\operatorname {Var} \left({\tilde {\beta }}\right)\ell \\&=\sigma ^{2}\ell ^{\operatorname {T} }(X^{\operatorname {T} }X)^{-1}\ell +\sigma ^{2}\ell ^{\operatorname {T} }DD^{\operatorname {T} }\ell \\&=\operatorname {Var} \left(\ell ^{\operatorname {T} }{\widehat {\beta }}\right)+\sigma ^{2}(D^{\operatorname {T} }\ell )^{\operatorname {T} }(D^{\operatorname {T} }\ell )&&\sigma ^{2}\ell ^{\operatorname {T} }(X^{\operatorname {T} }X)^{-1}\ell =\operatorname {Var} \left(\ell ^{\operatorname {T} }{\widehat {\beta }}\right)\\&=\operatorname {Var} \left(\ell ^{\operatorname {T} }{\widehat {\beta }}\right)+\sigma ^{2}\|D^{\operatorname {T} }\ell \|^{2}\\&\geq \operatorname {Var} \left(\ell ^{\operatorname {T} }{\widehat {\beta }}\right)\end{aligned}}}
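The decomposition Var(β̃) = Var(β̂) + σ²DDᵀ above can be checked numerically. The following is a minimal numpy sketch (all variable names and the random data are illustrative, not from the article): it builds a competing linear unbiased estimator C = (XᵀX)⁻¹Xᵀ + D with DX = 0 and confirms that the excess variance is exactly σ²DDᵀ, a positive semidefinite matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 50, 3
X = rng.normal(size=(n, k))
sigma2 = 2.0

XtX_inv = np.linalg.inv(X.T @ X)
# Annihilator matrix M_X = I - X (X'X)^{-1} X' satisfies M_X X = 0
M_X = np.eye(n) - X @ XtX_inv @ X.T
D = rng.normal(size=(k, n)) @ M_X          # any such D has D X = 0
assert np.allclose(D @ X, 0)

C = XtX_inv @ X.T + D                      # competing linear unbiased estimator
var_ols = sigma2 * XtX_inv                 # Var(beta_hat)
var_tilde = sigma2 * C @ C.T               # Var(beta_tilde)

# Var(beta_tilde) = Var(beta_hat) + sigma^2 D D', since the cross terms vanish
assert np.allclose(var_tilde, var_ols + sigma2 * D @ D.T)
# sigma^2 D D' is positive semidefinite: no coordinate variance shrinks
assert np.all(np.diag(var_tilde) >= np.diag(var_ols) - 1e-12)
print("Var(beta_tilde) - Var(beta_hat) is PSD, as the theorem predicts")
```

Any choice of D whose rows lie in the left null space of X yields another linear unbiased estimator, so the check covers the whole competing class up to that construction.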
. Autocorrelation can be visualized on a data plot: a given observation is more likely to lie above the fitted regression line if adjacent observations also lie above it. Autocorrelation is common in time series data, where a data series may experience "inertia": a dependent variable takes a while to fully absorb a shock. Spatial autocorrelation can also occur, since nearby geographic areas are likely to have similar errors. Autocorrelation may be the result of misspecification, such as choosing the wrong functional form. In these cases, correcting the specification is one possible way to deal with autocorrelation.
{\displaystyle {\begin{aligned}\ell ^{\operatorname {T} }{\tilde {\beta }}&=\ell ^{\operatorname {T} }\left(((X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }+D)Y\right)&&{\text{ from above}}\\&=\ell ^{\operatorname {T} }(X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }Y+\ell ^{\operatorname {T} }DY\\&=\ell ^{\operatorname {T} }{\widehat {\beta }}+(D^{\operatorname {T} }\ell )^{\operatorname {T} }Y\\&=\ell ^{\operatorname {T} }{\widehat {\beta }}&&D^{\operatorname {T} }\ell =0\end{aligned}}}
{\displaystyle {\begin{aligned}\operatorname {E} \left[{\tilde {\beta }}\right]&=\operatorname {E} [Cy]\\&=\operatorname {E} \left[\left((X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }+D\right)(X\beta +\varepsilon )\right]\\&=\left((X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }+D\right)X\beta +\left((X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }+D\right)\operatorname {E} [\varepsilon ]\\&=\left((X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }+D\right)X\beta &&\operatorname {E} [\varepsilon ]=0\\&=(X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }X\beta +DX\beta \\&=(I_{K}+DX)\beta .\\\end{aligned}}}
{\displaystyle {\begin{bmatrix}k_{1}&\cdots &k_{p+1}\end{bmatrix}}{\begin{bmatrix}\mathbf {v_{1}} \\\vdots \\\mathbf {v} _{p+1}\end{bmatrix}}{\begin{bmatrix}\mathbf {v_{1}} &\cdots &\mathbf {v} _{p+1}\end{bmatrix}}{\begin{bmatrix}k_{1}\\\vdots \\k_{p+1}\end{bmatrix}}=\mathbf {k} ^{\operatorname {T} }{\mathcal {H}}\mathbf {k} =\lambda \mathbf {k} ^{\operatorname {T} }\mathbf {k} >0}

{\displaystyle {\begin{aligned}{\frac {d}{d{\boldsymbol {\beta }}}}f&=-2X^{\operatorname {T} }\left(\mathbf {y} -X{\boldsymbol {\beta }}\right)\\&=-2{\begin{bmatrix}\sum _{i=1}^{n}(y_{i}-\dots -\beta _{p}x_{ip})\\\sum _{i=1}^{n}x_{i1}(y_{i}-\dots -\beta _{p}x_{ip})\\\vdots \\\sum _{i=1}^{n}x_{ip}(y_{i}-\dots -\beta _{p}x_{ip})\end{bmatrix}}\\&=\mathbf {0} _{p+1},\end{aligned}}}

{\displaystyle {\mathcal {H}}=2{\begin{bmatrix}n&\sum _{i=1}^{n}x_{i1}&\cdots &\sum _{i=1}^{n}x_{ip}\\\sum _{i=1}^{n}x_{i1}&\sum _{i=1}^{n}x_{i1}^{2}&\cdots &\sum _{i=1}^{n}x_{i1}x_{ip}\\\vdots &\vdots &\ddots &\vdots \\\sum _{i=1}^{n}x_{ip}&\sum _{i=1}^{n}x_{ip}x_{i1}&\cdots &\sum _{i=1}^{n}x_{ip}^{2}\end{bmatrix}}=2X^{\operatorname {T} }X}
in other words, it is the expectation of the square of the weighted sum (across parameters) of the differences between the estimators and the corresponding parameters to be estimated. (Since we are considering the case in which all the parameter estimates are unbiased, this mean squared error is the
{\displaystyle \operatorname {E} [\,{\boldsymbol {\varepsilon }}{\boldsymbol {\varepsilon }}^{\operatorname {T} }\mid \mathbf {X} \,]=\operatorname {Var} [\,{\boldsymbol {\varepsilon }}\mid \mathbf {X} \,]={\begin{bmatrix}\sigma ^{2}&0&\cdots &0\\0&\sigma ^{2}&\cdots &0\\\vdots &\vdots &\ddots &\vdots \\0&0&\cdots &\sigma ^{2}\end{bmatrix}}=\sigma ^{2}\mathbf {I} \quad {\text{with }}\sigma ^{2}>0}
occurs when the amount of error is correlated with an independent variable. For example, in a regression on food expenditure and income, the error is correlated with income. Low income people generally spend a similar amount on food, while high income people may spend a very large amount or as
8763:
The dependent variable is assumed to be a linear function of the variables specified in the model. The specification must be linear in its parameters. This does not mean that there must be a linear relationship between the independent and dependent variables. The independent variables can take
, i.e. some explanatory variables are linearly dependent. One scenario in which this occurs is called the "dummy variable trap": a base dummy variable is not omitted, resulting in perfect correlation between the dummy variables and the constant term.
Multicollinearity (as long as it is not "perfect") can be present, resulting in a less efficient but still unbiased estimate. The estimates will be less precise and highly sensitive to particular sets of data. Multicollinearity can be detected from
little as low income people spend. Heteroskedasticity can also be caused by changes in measurement practices. For example, as statistical offices improve their data, measurement error decreases, so the error term declines over time.
544:, although Gauss' work significantly predates Markov's. But while Gauss derived the result under the assumption of independence and normality, Markov reduced the assumptions to the form stated above. A further generalization to
2889:
{\displaystyle X={\begin{bmatrix}1&x_{11}&\cdots &x_{1p}\\1&x_{21}&\cdots &x_{2p}\\&&\vdots \\1&x_{n1}&\cdots &x_{np}\end{bmatrix}}\in \mathbb {R} ^{n\times (p+1)};\qquad n\geq p+1}
{\displaystyle \mathbf {X} ={\begin{bmatrix}\mathbf {x} _{1}^{\operatorname {T} }&\mathbf {x} _{2}^{\operatorname {T} }&\cdots &\mathbf {x} _{n}^{\operatorname {T} }\end{bmatrix}}^{\operatorname {T} }}
{\displaystyle \operatorname {E} [\,{\boldsymbol {\varepsilon }}\mid \mathbf {X} \,]={\begin{bmatrix}\operatorname {E} [\,\varepsilon _{1}\mid \mathbf {X} \,]\\\operatorname {E} [\,\varepsilon _{2}\mid \mathbf {X} \,]\\\vdots \\\operatorname {E} [\,\varepsilon _{n}\mid \mathbf {X} \,]\end{bmatrix}}=\mathbf {0} }
One should be aware, however, that the parameters that minimize the residuals of the transformed equation do not necessarily minimize the residuals of the original equation.
with finite variance). The requirement that the estimator be unbiased cannot be dropped, since biased estimators exist with lower variance. See, for example, the
3089:
The main idea of the proof is that the least-squares estimator is uncorrelated with every linear unbiased estimator of zero, i.e., with every linear combination
estimators, minimum mean squared error implies minimum variance. The goal is therefore to show that such an estimator has a variance no smaller than that of
{\displaystyle \sum _{i=1}^{n}\left(y_{i}-{\widehat {y}}_{i}\right)^{2}=\sum _{i=1}^{n}\left(y_{i}-\sum _{j=1}^{K}{\widehat {\beta }}_{j}X_{ij}\right)^{2}.}
{\displaystyle y=X\beta +\varepsilon ,\quad (y,\varepsilon \in \mathbb {R} ^{n},\beta \in \mathbb {R} ^{K}{\text{ and }}X\in \mathbb {R} ^{n\times K})}
, extends the Gauss–Markov theorem to the case where the error vector has a non-scalar covariance matrix. The Aitken estimator is also a BLUE.
are assumed to be fixed in repeated samples. This assumption is considered inappropriate for a predominantly nonexperimental science like
as sample responses, are observable, the following statements and arguments, including assumptions and proofs, are made under the
3373:{\displaystyle f(\beta _{0},\beta _{1},\dots ,\beta _{p})=\sum _{i=1}^{n}(y_{i}-\beta _{0}-\beta _{1}x_{i1}-\dots -\beta _{p}x_{ip})^{2}}
7422:
5660:{\displaystyle \mathbf {v} ,\mathbf {v} ^{\operatorname {T} }X^{\operatorname {T} }X\mathbf {v} =\|\mathbf {X} \mathbf {v} \|^{2}\geq 0}
11202:
11207:
10072:
7618:
442:
10407:
{\displaystyle \mathbf {k} \neq \mathbf {0} \implies \left(k_{1}\mathbf {v_{1}} +\dots +k_{p+1}\mathbf {v} _{p+1}\right)^{2}>0}
This assumption also covers specification issues: assuming that the proper functional form has been selected and there are no
{\displaystyle \ln Y=\ln A+\alpha \ln L+(1-\alpha )\ln K+\varepsilon =\beta _{0}+\beta _{1}\ln L+\beta _{2}\ln K+\varepsilon }
{\displaystyle \mathbf {x} _{i}={\begin{bmatrix}x_{i1}&x_{i2}&\cdots &x_{ik}\end{bmatrix}}^{\operatorname {T} }}
{\displaystyle \operatorname {Var} \left({\widetilde {\beta }}\right)-\operatorname {Var} \left({\widehat {\beta }}\right)}
4606:{\displaystyle X={\begin{bmatrix}\mathbf {v_{1}} &\mathbf {v_{2}} &\cdots &\mathbf {v} _{p+1}\end{bmatrix}}}
2232:
11515:
11165:
10521:
When the spherical errors assumption is violated, the generalized least squares estimator can be shown to be BLUE.
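A minimal sketch of generalized least squares for the simplest non-spherical case, a known diagonal error covariance (heteroskedasticity): GLS is just OLS applied to data whitened by Ω^(−1/2). The data and parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
# Heteroskedastic errors: Var(eps_i) = sigma_i^2, a known diagonal Omega
sigma = rng.uniform(0.5, 3.0, n)
y = X @ beta_true + sigma * rng.normal(size=n)

# GLS = OLS on whitened data; with diagonal Omega, whitening divides
# each observation by its own sigma_i
Xw, yw = X / sigma[:, None], y / sigma
beta_gls = np.linalg.solve(Xw.T @ Xw, Xw.T @ yw)
print(beta_gls)  # close to [1.0, 2.0]
```

After whitening, the transformed errors are homoskedastic and uncorrelated, so the Gauss–Markov theorem applies to the transformed model, which is exactly why the GLS estimator is BLUE here.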
{\displaystyle \operatorname {Var} \left({\tilde {\beta }}\right)-\operatorname {Var} \left({\widehat {\beta }}\right)}
are called the "disturbance", "noise" or simply "error" (will be contrasted with "residual" later in the article; see
5427:{\displaystyle \mathbf {k} ^{\operatorname {T} }\mathbf {k} =\sum _{i=1}^{p+1}k_{i}^{2}>0\implies \lambda >0}
{\displaystyle k_{1}\mathbf {v_{1}} +\dots +k_{p+1}\mathbf {v} _{p+1}=\mathbf {0} \iff k_{1}=\dots =k_{p+1}=0}
1471:). Note that to include a constant in the model above, one can choose to introduce the constant as a variable
8929:. An equation with a parameter dependent on an independent variable does not qualify as linear, for example
is a positive semidefinite matrix is equivalent to the property that the best linear unbiased estimator of
{\displaystyle \ell ^{\operatorname {T} }{\tilde {\beta }}=\ell ^{\operatorname {T} }{\widehat {\beta }}}
{\displaystyle {\boldsymbol {\beta }}=\left(X^{\operatorname {T} }X\right)^{-1}X^{\operatorname {T} }Y}
11321:
{\displaystyle y_{i}=\sum _{j=1}^{K}\beta _{j}X_{ij}+\varepsilon _{i}\quad \forall i=1,2,\ldots ,n}
9985:, where causality flows back and forth between both the dependent and independent variable.
The theorem now states that the OLS estimator is a best linear unbiased estimator (BLUE).
{\displaystyle {\widehat {\beta }}=(X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }y}
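The OLS estimator β̂ = (XᵀX)⁻¹Xᵀy can be sketched numerically as follows (random illustrative data; the variable names are not from the article). Solving the normal equations and calling a generic least-squares solver give the same answer.

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 100, 3
X = rng.normal(size=(n, k))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=n)

# OLS from the normal equations: (X'X) beta_hat = X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Same answer from an SVD-based least-squares solver
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
assert np.allclose(beta_hat, beta_lstsq)
print(beta_hat)
```

As a design note, forming XᵀX squares the condition number of the problem, so numerical libraries usually prefer QR- or SVD-based solvers over the explicit normal equations.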
observations, the expectation—conditional on the regressors—of the error term is zero:
{\displaystyle \mathbf {k} =(k_{1},\dots ,k_{p+1})^{T}\in \mathbb {R} ^{(p+1)\times 1}}
David, F. N.; Neyman, J. (1938). "Extension of the Markoff theorem on least squares".
Aitken, A. C. (1935). "On Least Squares and Linear Combinations of Observations".
{\displaystyle \operatorname {Cov} (\varepsilon _{i},\varepsilon _{j})=0,\forall i\neq j.}

{\displaystyle {\hat {Y}}=\sigma _{y}{\frac {(X-\mu _{x})}{\sigma _{x}}}+\mu _{y}}
8733:. Instead, the assumptions of the Gauss–Markov theorem are stated conditional on
3187:
the sum of squares of residuals may proceed as follows with a calculation of the
2136:, since these data are observable. (The dependence of the coefficients on each
is a positive semi-definite matrix for every other linear unbiased estimator
{\displaystyle \operatorname {Var} (\varepsilon _{i})=\sigma ^{2}<\infty }
of linear combination parameters. This is equivalent to the condition that
are often used to convert an equation into a linear form. For example, the
In most treatments of OLS, the regressors (parameters of interest) in the
2106:, since those are not observable, but are allowed to depend on the values
Davidson, James (2000). "Statistical Analysis of the Regression Model".
{\displaystyle C=(X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }+D}
{\displaystyle {\widehat {\beta }}_{j}=c_{1j}y_{1}+\cdots +c_{kj}y_{k}}
{\displaystyle \mu _{\hat {Y}}=\mu _{Y},\sigma _{\hat {Y}}=\sigma _{Y}}
non-linear forms as long as the parameters are linear. The equation
{\displaystyle \operatorname {Var} \left({\widehat {\beta }}\right)}
are non-random and observable (called the "explanatory variables"),
11092:
7654:(best in the sense that it has minimum variance). To see this, let
{\displaystyle \operatorname {Var} \left({\tilde {\beta }}\right)}
Proof of the Gauss Markov theorem for multiple linear regression
10396:. If this assumption is violated, OLS is still unbiased, but
{\displaystyle \mathbf {X} ^{\operatorname {T} }\mathbf {X} }
This assumption is violated if the explanatory variables are
{\displaystyle \ell ^{\operatorname {T} }{\widehat {\beta }}}
is one with the smallest mean squared error for every vector
1076:
Suppose we have, in matrix notation, the linear relationship
10633:(1949). "A Historical Note on the Method of Least Squares".
{\displaystyle \operatorname {Var} [\,{\boldsymbol {\varepsilon }}\mid \mathbf {X} \,]=\sigma ^{2}\mathbf {I} }
is not invertible and the OLS estimator cannot be computed.
505:
and expectation value of zero. The errors do not need to be
{\displaystyle Y=AL^{\alpha }K^{1-\alpha }e^{\varepsilon }}
which gives the uniqueness of the OLS estimator as a BLUE.
{\displaystyle \ell ^{\operatorname {T} }{\tilde {\beta }}}
with a newly introduced last column of X being unity i.e.,
"BLUE" redirects here. For the queue management algorithm, see
Earliest Known Uses of Some of the Words of Mathematics: G
10597:(1971). "Best Linear Unbiased Estimation and Prediction".
be some linear combination of the coefficients. Then the
2079:
are not allowed to depend on the underlying coefficients
10531:
Independent and identically distributed random variables
{\displaystyle \operatorname {E} [\,\mathbf {x} _{i}\varepsilon _{i}\,]=\operatorname {E} [\,\mathbf {x} _{i}\operatorname {E} [\,\varepsilon _{i}\mid \mathbf {x} _{i}\,]\,]=0.}
is typically nonlinear; the estimator is linear in each
11059:(1971). "Least Squares and the Standard Linear Model".
10885:(Fifth international ed.). South-Western. p.
3155:
whose coefficients do not depend upon the unobservable
1677:
assumptions concern the set of error random variables,
in the multivariate normal density, then the equation
techniques are commonly used to address this problem.
and that we want to find the best linear estimator of
{\displaystyle \operatorname {rank} (\mathbf {X} )=k}
But it can be expressed in linear form by taking the
same as the variance of the linear combination.) The
centered at μ with radius σ in n-dimensional space.
. So the Hessian is positive definite if X has full column rank.
11029:Goldberger, Arthur (1991). "Classical Regression".
10919:(Second ed.). New York: McGraw-Hill. pp.
10849:(Second ed.). New York: McGraw-Hill. pp.
8616:This proves that the equality holds if and only if
759:would have the same mean and standard deviation as
11105:A Proof of the Gauss Markov theorem using geometry
10388:This implies the error term has uniform variance (
{\textstyle \sum _{j=1}^{K}\lambda _{j}\beta _{j}}
{\displaystyle \operatorname {E} \left[{\widehat {\beta }}_{j}\right]=\beta _{j}}
Suppose we are given two random variable vectors,
{\displaystyle y=\beta _{0}+\beta _{1}(x)\cdot x}
{\displaystyle X{\text{, }}Y\in \mathbb {R} ^{k}}
11037:. Cambridge: Harvard University Press. pp.
10400:. The term "spherical errors" will describe the
7716:{\displaystyle \ell ^{\operatorname {T} }\beta }
7608:{\displaystyle \ell ^{\operatorname {T} }\beta }
7513:As it has been stated before, the condition of
5008:In terms of vector multiplication, this means
Gauss–Markov theorem as stated in econometrics
10694:Proceedings of the Royal Society of Edinburgh
8877:can be transformed to be linear by replacing
{\displaystyle y=\beta _{0}+\beta _{1}x^{2},}
{\displaystyle D^{\operatorname {T} }\ell =0}
{\displaystyle a_{1}y_{1}+\cdots +a_{n}y_{n}}
, that is, all have the same finite variance:
436:
11067:. New York: John Wiley & Sons. pp.
10730:. New York: John Wiley & Sons. pp.
10687:
10685:
10605:. New York: John Wiley & Sons. pp.
Geometrically, this assumption implies that
{\displaystyle y=\beta _{0}+\beta _{1}^{2}x}
1048:has the same mean and standard deviation as
11095:(brief history and explanation of the name)
5458:was arbitrary, it means all eigenvalues of
3191:and showing that it is positive definite.
882:has respective mean and standard deviation
10953:. Princeton University Press. p. 10.
10873:
10764:. Princeton University Press. p. 13.
10514:This assumption is violated when there is
10819:. Princeton University Press. p. 7.
10682:
10581:Applied multivariate statistical analysis
9543:is the data vector of regressors for the
5847:non-zero matrix. As we're restricting to
3194:The MSE function we want to minimize is
3175:but whose expected value is always zero.
16:Theorem related to ordinary least squares
8239:Moreover, equality holds if and only if
{\displaystyle \operatorname {E} \left[\left(\sum _{j=1}^{K}\lambda _{j}\left({\widehat {\beta }}_{j}-\beta _{j}\right)\right)^{2}\right],}
{\displaystyle {\hat {Y}}=\alpha X+\mu }
11641:Numerical smoothing and differentiation
10789:. New York: W. W. Norton. p. 275.
10134:of the error vector must be spherical.
9056:—often used in economics—is nonlinear:
{\displaystyle DD^{\operatorname {T} }}
{\displaystyle X^{\operatorname {T} }X}
1840:Distinct error terms are uncorrelated:
511:independent and identically distributed
3809:{\displaystyle X^{\operatorname {T} }}
2845:{\displaystyle X^{\operatorname {T} }}
2703:ordinary least squares estimator (OLS)
11014:. Oxford: Blackwell. pp. 17–36.
10579:Johnson, R.A.; Wichern, D.W. (2002).
9655:is the data matrix or design matrix.
7693:another linear unbiased estimator of
3380:for a multiple regression model with
2691:{\displaystyle {\widetilde {\beta }}}
{\displaystyle \operatorname {E} [\varepsilon _{i}]=0.}
922:, the best linear estimator would be
11176:Iteratively reweighted least squares
9722:(i.e., their cross moment) is zero.
9293:
5876:{\displaystyle {\widehat {\beta }},}
5706:{\displaystyle {\tilde {\beta }}=Cy}
3384:variables. The first derivative is
915:{\displaystyle \mu _{x},\sigma _{x}}
10978:Statistical Methods in Econometrics
10559:Minimum-variance unbiased estimator
10125:
9981:. Endogeneity can be the result of
8684:Generalized least squares estimator
7505:by a positive semidefinite matrix.
7419:is a positive semidefinite matrix,
5313:is the eigenvalue corresponding to
2389:of the corresponding estimation is
13:
11194:Pearson product-moment correlation
11003:
10726:Regression and Econometric Methods
10106:A violation of this assumption is
Or, just see that for all vectors
1469:errors and residuals in statistics
1282:
643:, using the best linear estimator
467:for some authors) states that the
{\displaystyle f(\varepsilon )=c}
9547:th observation, and consequently
6457:{\displaystyle {\tilde {\beta }}}
5883:the OLS estimator. We calculate:
4493:are linearly independent so that
1440:are random. The random variables
28:Blue (queue management algorithm)
11674:
10451:
10430:
10402:multivariate normal distribution
9707:{\displaystyle \varepsilon _{i}}
9680:{\displaystyle \mathbf {x} _{i}}
2220:.) The estimator is said to be
1697:{\displaystyle \varepsilon _{i}}
1460:{\displaystyle \varepsilon _{i}}
1406:{\displaystyle \varepsilon _{i}}
10787:An Introduction to Econometrics
10553:Best linear unbiased prediction
5713:be another linear estimator of
471:(OLS) estimator has the lowest
8902:{\displaystyle \beta _{1}^{2}}
{\displaystyle {\mathcal {H}}}

{\displaystyle {\mathcal {H}}}
{\displaystyle {\mathcal {H}}}
2522:best linear unbiased estimator
525:(which also drops linearity),
1:
11101:(makes use of matrix algebra)
10583:. Vol. 5. Prentice hall.
10565:
9718:to each other, so that their
9019:{\displaystyle \beta _{1}(x)}
5506:is positive definite. Thus,
2209:{\displaystyle \varepsilon ,}
177:Nonlinear mixed-effects model
11664:Regression analysis category
11554:Response surface methodology
10664:Statistical Research Memoirs
10012:{\displaystyle \mathbf {X} }
9992:
8758:
8748:{\displaystyle \mathbf {X} }
8722:{\displaystyle \mathbf {X} }
5577:is indeed a global minimum.
5451:{\displaystyle \mathbf {k} }
5328:{\displaystyle \mathbf {k} }
2295:regardless of the values of
1545:{\displaystyle X_{i(K+1)}=1}
1497:{\displaystyle \beta _{K+1}}
1071:
536:The theorem was named after
7:
11536:Frisch–Waugh–Lovell theorem
11506:Mean and predicted response
10982:. Academic Press. pp.
10524:
6464:is unbiased if and only if
379:Mean and predicted response
10:
11728:
11186:Correlation and dependence
11063:Principles of Econometrics
10601:Principles of Econometrics
8909:by another parameter, say
8823:qualifies as linear while
4038:of second derivatives is
3183:Proof that the OLS indeed
2564:{\displaystyle \beta _{j}}
2099:{\displaystyle \beta _{j}}
2049:in which the coefficients
1942:{\displaystyle \beta _{j}}
1345:{\displaystyle \beta _{j}}
1041:{\displaystyle {\hat {Y}}}
752:{\displaystyle {\hat {Y}}}
172:Linear mixed-effects model
25:
18:
10881:Introductory Econometrics
10706:10.1017/S0370164600014346
10547:Other unbiased statistics
10120:variance inflation factor
10108:perfect multicollinearity
8690:generalized least squares
5840:{\displaystyle K\times n}
3178:
2883:(misprediction amounts):
2193:and hence in each random
862:Therefore, if the vector
509:, nor do they need to be
338:Least absolute deviations
11033:A Course in Econometrics
10722:Huang, David S. (1970).
5670:
5482:are positive, therefore
5436:Finally, as eigenvector
5306:{\displaystyle \lambda }
4473:Assuming the columns of
2584:{\displaystyle \lambda }
1949:is a linear combination
86:Generalized linear model
19:Not to be confused with
10785:Walters, A. A. (1970).
10541:Measurement uncertainty
9997:The sample data matrix
8922:{\displaystyle \gamma }
1632:{\displaystyle X_{ij},}
1356:observable parameters,
727:are both real numbers.
700:{\displaystyle \alpha }
495:linear regression model
must have full column
6424:{\displaystyle \beta }
6402:
5877:
5841:
5815:
5795:
5727:
5726:{\displaystyle \beta }
3816:is the design matrix
3168:{\displaystyle \beta }
2537:{\displaystyle \beta }
2318:{\displaystyle X_{ij}}
2159:{\displaystyle X_{ij}}
2130:
2129:{\displaystyle X_{ij}}
2100:
2073:
2072:{\displaystyle c_{ij}}
1664:{\displaystyle y_{i}.}
1633:
1596:
1595:{\displaystyle y_{i},}
1566:
1546:
1498:
1461:
1434:
1407:
1380:
1379:{\displaystyle X_{ij}}
1346:
1316:
1244:
1192:
1062:
1042:
1011:
916:
876:
853:
773:
753:
721:
701:
681:
637:
617:
597:
469:ordinary least squares
417:Mathematics portal
343:Iteratively reweighted
11646:System identification
11610:Chebyshev polynomials
11595:Numerical integration
11546:Design of experiments
11490:Regression validation
11317:Polynomial regression
11242:Partial least squares
10500:is the formula for a
10495:
10460:
10380:
10122:, among other tests.
10098:
10061:
10014:
9987:Instrumental variable
9965:
The Gauss–Markov assumptions concern the set of error random variables \varepsilon_i:

- They have mean zero: \operatorname{E}[\varepsilon_i] = 0.
- They are homoscedastic, that is, they all have the same finite variance: \operatorname{Var}(\varepsilon_i) = \sigma^2 < \infty for all i.
- Distinct error terms are uncorrelated: \operatorname{Cov}(\varepsilon_i, \varepsilon_j) = 0 for all i \neq j.

A linear estimator of \beta_j is a linear combination

  \widehat{\beta}_j = c_{1j} y_1 + \cdots + c_{nj} y_n

in which the coefficients c_{ij} are not allowed to depend on the underlying coefficients \beta_j, since those are not observable, but are allowed to depend on the values X_{ij}, since these data are observable. The estimator is linear in each y_i, and hence in each random error \varepsilon_i, which is why this is "linear" regression.
The estimator is said to be unbiased if and only if

  \operatorname{E}[\widehat{\beta}_j] = \beta_j

regardless of the values of X_{ij}. Now, let \sum_{j=1}^{K} \lambda_j \beta_j be some linear combination of the coefficients. Then the mean squared error of the corresponding estimation is

  \operatorname{E}\left[\left(\sum_{j=1}^{K} \lambda_j (\widehat{\beta}_j - \beta_j)\right)^2\right];

in other words, it is the expectation of the square of the weighted sum (across parameters) of the differences between the estimators and the corresponding parameters to be estimated. (Since we are considering the case in which all the parameter estimates are unbiased, this mean squared error is the same as the variance of the linear combination.) The best linear unbiased estimator (BLUE) of the vector \beta of parameters \beta_j is the one with the smallest mean squared error for every vector \lambda of linear-combination parameters. This is equivalent to the condition that

  \operatorname{Var}(\tilde{\beta}) - \operatorname{Var}(\widehat{\beta})

is a positive semi-definite matrix for every other linear unbiased estimator \tilde{\beta}.
The ordinary least squares estimator (OLS) is the function

  \widehat{\beta} = (X^{T}X)^{-1} X^{T} y

of y and X (where X^{T} denotes the transpose of X) that minimizes the sum of squares of residuals (misprediction amounts):

  \sum_{i=1}^{n} (y_i - \widehat{y}_i)^2 = \sum_{i=1}^{n} \left(y_i - \sum_{j=1}^{K} \widehat{\beta}_j X_{ij}\right)^2.

The theorem now states that the OLS estimator is a best linear unbiased estimator (BLUE).
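As a minimal numerical sketch of the estimator above (the data, sample size, and true coefficients here are invented for illustration), the OLS formula can be evaluated directly with NumPy; solving the normal equations X^{T}X \widehat{\beta} = X^{T}y is numerically preferable to forming the inverse explicitly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated design matrix: n = 200 observations, K = 3 regressors (incl. intercept).
n, K = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, K - 1))])
beta = np.array([1.0, 2.0, -0.5])             # hypothetical true coefficients
y = X @ beta + rng.normal(scale=0.3, size=n)  # errors: mean zero, equal variance

# OLS estimator: solve (X^T X) beta_hat = X^T y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)  # close to [1.0, 2.0, -0.5]
```

With uncorrelated homoscedastic errors as here, the Gauss–Markov assumptions hold and beta_hat is the BLUE of beta.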
Proof

Let \tilde{\beta} = Cy be another linear estimator of \beta with C = (X^{T}X)^{-1}X^{T} + D, where D is a K \times n non-zero matrix. As we are restricting attention to unbiased estimators, minimum mean squared error implies minimum variance. The goal is therefore to show that such an estimator has a variance no smaller than that of \widehat{\beta}, the OLS estimator. The expectation of \tilde{\beta} is

  \operatorname{E}[Cy] = \operatorname{E}\left[((X^{T}X)^{-1}X^{T} + D)(X\beta + \varepsilon)\right] = ((X^{T}X)^{-1}X^{T} + D)X\beta = (I_K + DX)\beta,

using \operatorname{E}[\varepsilon] = 0. Therefore, since \beta is unobservable, \tilde{\beta} is unbiased if and only if DX = 0. Expanding \operatorname{Var}(\tilde{\beta}) = \operatorname{Var}(Cy) = C\operatorname{Var}(y)C^{T} = \sigma^2 CC^{T} and using DX = 0 gives

  \operatorname{Var}(\tilde{\beta}) = \sigma^2 (X^{T}X)^{-1} + \sigma^2 DD^{T} = \operatorname{Var}(\widehat{\beta}) + \sigma^2 DD^{T}.

Since DD^{T} is a positive semidefinite matrix, \tilde{\beta} exceeds \widehat{\beta} in variance by a positive semidefinite matrix, which proves that \widehat{\beta} is BLUE.
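The construction in the proof can be checked numerically. A sketch (with an invented design matrix): any D whose rows lie in the orthogonal complement of the column space of X satisfies DX = 0, and for such a D the covariance of \tilde{\beta} = Cy exceeds that of OLS by the positive semidefinite matrix \sigma^2 DD^{T}:

```python
import numpy as np

rng = np.random.default_rng(1)
n, K = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, K - 1))])

XtX_inv = np.linalg.inv(X.T @ X)
# I - X (X^T X)^{-1} X^T projects onto the orthogonal complement of the
# column space of X, so D X = 0 by construction (the unbiasedness condition).
M = rng.normal(size=(K, n))
D = M @ (np.eye(n) - X @ XtX_inv @ X.T)
assert np.allclose(D @ X, 0)

sigma2 = 1.0
var_ols = sigma2 * XtX_inv           # Var(beta_hat)
C = XtX_inv @ X.T + D
var_alt = sigma2 * (C @ C.T)         # Var(beta_tilde)

# Var(beta_tilde) - Var(beta_hat) = sigma^2 D D^T is positive semidefinite:
diff = var_alt - var_ols
print(np.linalg.eigvalsh(diff).min() >= -1e-10)  # True
```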
Remarks on the proof

As stated before, the condition that \operatorname{Var}(\tilde{\beta}) - \operatorname{Var}(\widehat{\beta}) is positive semidefinite is equivalent to the property that the best linear unbiased estimator of \ell^{T}\beta is \ell^{T}\widehat{\beta} for every vector \ell of coefficients. Indeed, for any such \ell,

  \operatorname{Var}(\ell^{T}\tilde{\beta}) = \ell^{T}\operatorname{Var}(\tilde{\beta})\ell = \operatorname{Var}(\ell^{T}\widehat{\beta}) + \|D^{T}\ell\|^2 \geq \operatorname{Var}(\ell^{T}\widehat{\beta}),

with equality if and only if D^{T}\ell = 0.
Generalized least squares estimator

The generalized least squares (GLS), developed by Aitken, extends the Gauss–Markov theorem to the case where the error vector has a non-scalar covariance matrix. The Aitken estimator is also a BLUE.
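A sketch of the Aitken estimator \widehat{\beta}_{GLS} = (X^{T}\Omega^{-1}X)^{-1}X^{T}\Omega^{-1}y for a known error covariance \Omega; the diagonal \Omega, sample size, and coefficients here are invented for illustration (with diagonal \Omega this reduces to weighted least squares):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta = np.array([0.5, 1.5])  # hypothetical true coefficients

# Heteroscedastic errors with known, non-scalar covariance Omega = diag(w).
w = rng.uniform(0.1, 4.0, size=n)
y = X @ beta + rng.normal(scale=np.sqrt(w))

# Aitken / GLS estimator: (X^T Omega^{-1} X)^{-1} X^T Omega^{-1} y.
Omega_inv = np.diag(1.0 / w)
beta_gls = np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y)
print(beta_gls)  # close to [0.5, 1.5]
```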
Gauss–Markov theorem as stated in econometrics

In most treatments of OLS, the regressors in the design matrix X are assumed to be fixed in repeated samples. This assumption is considered inappropriate for a predominantly non-experimental science like econometrics. Instead, the theorem is stated conditional on X. The assumptions are then:

Linearity. The dependent variable is assumed to be a linear function of the variables specified in the model. The specification must be linear in its parameters; this does not require a linear relationship between the dependent and independent variables, since the regressors may be nonlinear transformations of the data. For example, the Cobb–Douglas function can be rendered linear by taking the natural logarithm of both sides, and data transformations of this kind are often used to convert an equation into a linear form.

Strict exogeneity. The errors have conditional mean zero: \operatorname{E}[\varepsilon \mid X] = 0. This assumption is violated if the explanatory variables are measured with error or are endogenous, for instance because of omitted variables; instrumental variable techniques are commonly used to address such problems.

Full rank. The sample data matrix X must have full column rank; otherwise X^{T}X is not invertible and the OLS estimator cannot be computed. Near violations of this assumption (multicollinearity) can be detected through the condition number of the matrix, among other tests.

Spherical errors. The error vector must have a scalar covariance matrix: \operatorname{Var}(\varepsilon \mid X) = \sigma^2 I. This requires homoscedasticity (equal variances) and the absence of serial correlation between the errors. When this assumption is violated, as under heteroskedasticity or autocorrelation, OLS remains unbiased but is no longer efficient, and generalized least squares can be used instead.
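The full-rank assumption can be checked directly on a data matrix. A small sketch with invented data, where one regressor is an exact multiple of another (perfect multicollinearity):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 30
x1 = rng.normal(size=n)
x2 = 2.0 * x1  # perfectly collinear with x1: violates full column rank
X_bad = np.column_stack([np.ones(n), x1, x2])
X_ok = np.column_stack([np.ones(n), x1, rng.normal(size=n)])

print(np.linalg.matrix_rank(X_bad))  # 2 < 3: X^T X is singular, OLS undefined
print(np.linalg.matrix_rank(X_ok))   # 3: full column rank
print(np.linalg.cond(X_ok))          # condition number flags near-collinearity
```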
11725:
11715:
11714:
11697:
11696:
11694:
11693:
11688:
11683:
11671:
11666:
11660:
11657:
11656:
11654:
11653:
11648:
11643:
11638:
11633:
11627:
11625:
11621:
11620:
11618:
11617:
11612:
11607:
11602:
11597:
11592:
11587:
11581:
11579:
11570:
11569:
11567:
11566:
11561:
11559:Optimal design
11556:
11550:
11548:
11542:
11541:
11539:
11538:
11533:
11528:
11523:
11518:
11513:
11508:
11502:
11500:
11496:
11495:
11493:
11492:
11487:
11482:
11481:
11480:
11475:
11470:
11465:
11454:
11448:
11446:
11442:
11441:
11439:
11438:
11433:
11428:
11422:
11420:
11414:
11413:
11410:
11409:
11407:
11406:
11401:
11396:
11391:
11385:
11383:
11379:
11378:
11376:
11375:
11370:
11365:
11360:
11358:Semiparametric
11355:
11350:
11344:
11342:
11338:
11337:
11335:
11334:
11329:
11324:
11319:
11313:
11311:
11307:
11306:
11304:
11303:
11298:
11293:
11288:
11283:
11277:
11275:
11266:
11258:
11257:
11255:
11254:
11249:
11244:
11239:
11233:
11231:
11225:
11224:
11222:
11221:
11216:
11211:
11205:
11203:Spearman's rho
11196:
11190:
11188:
11182:
11181:
11179:
11178:
11173:
11168:
11163:
11157:
11155:
11149:
11148:
11137:
11136:
11129:
11122:
11114:
11108:
11107:
11102:
11096:
11088:
11087:External links
11085:
11084:
11083:
11077:
11053:
11047:
11026:
11020:
11005:
11002:
11000:
10999:
10992:
10966:
10959:
10945:Hayashi, Fumio
10936:
10929:
10911:Johnston, John
10902:
10895:
10866:
10859:
10841:Johnston, John
10832:
10825:
10811:Hayashi, Fumio
10802:
10795:
10777:
10770:
10756:Hayashi, Fumio
10747:
10740:
10711:
10681:
10654:
10622:
10615:
10586:
10569:
10567:
10564:
10563:
10562:
10556:
10548:
10545:
10544:
10543:
10538:
10533:
10526:
10523:
10489:
10486:
10483:
10480:
10477:
10474:
10453:
10447:
10443:
10439:
10436:
10432:
10428:
10424:
10419:
10416:
10413:
10386:
10385:
10374:
10371:
10366:
10362:
10351:
10345:
10341:
10337:
10332:
10324:
10320:
10316:
10314:
10311:
10309:
10306:
10304:
10301:
10300:
10297:
10294:
10292:
10289:
10287:
10284:
10282:
10279:
10278:
10275:
10272:
10270:
10267:
10263:
10259:
10255:
10253:
10250:
10249:
10246:
10243:
10241:
10238:
10236:
10233:
10229:
10225:
10221:
10220:
10218:
10213:
10210:
10206:
10202:
10198:
10193:
10190:
10187:
10184:
10181:
10177:
10173:
10168:
10163:
10157:
10152:
10149:
10146:
10127:
10124:
10091:
10085:
10080:
10067:
10066:
10055:
10052:
10049:
10045:
10041:
10038:
10035:
10007:
9994:
9991:
9971:
9970:
9959:
9956:
9953:
9950:
9947:
9937:
9933:
9928:
9922:
9916:
9912:
9908:
9903:
9900:
9895:
9889:
9886:
9883:
9880:
9879:
9876:
9873:
9872:
9869:
9863:
9859:
9855:
9850:
9847:
9842:
9836:
9833:
9830:
9827:
9826:
9823:
9817:
9813:
9809:
9804:
9801:
9796:
9790:
9787:
9784:
9781:
9780:
9778:
9773:
9770:
9764:
9760:
9756:
9751:
9746:
9740:
9737:
9734:
9701:
9697:
9674:
9669:
9642:
9636:
9628:
9623:
9618:
9613:
9611:
9608:
9604:
9599:
9594:
9589:
9585:
9580:
9575:
9570:
9569:
9567:
9561:
9557:
9530:
9524:
9516:
9513:
9509:
9505:
9503:
9500:
9496:
9493:
9489:
9485:
9481:
9478:
9474:
9470:
9469:
9467:
9461:
9456:
9451:
9438:
9437:
9426:
9423:
9420:
9415:
9410:
9405:
9402:
9399:
9394:
9389:
9384:
9379:
9375:
9370:
9367:
9364:
9361:
9358:
9354:
9350:
9345:
9341:
9336:
9333:
9330:
9307:
9295:
9292:
9281:
9280:
9269:
9266:
9263:
9260:
9257:
9252:
9248:
9244:
9241:
9238:
9235:
9230:
9226:
9222:
9217:
9213:
9209:
9206:
9203:
9200:
9197:
9194:
9191:
9188:
9185:
9182:
9179:
9176:
9173:
9170:
9167:
9164:
9161:
9158:
9155:
9152:
9149:
9146:
9143:
9140:
9122:
9121:
9108:
9104:
9098:
9095:
9092:
9088:
9082:
9078:
9074:
9071:
9068:
9035:
9015:
9012:
9009:
9004:
9000:
8979:
8976:
8973:
8970:
8967:
8962:
8958:
8954:
8949:
8945:
8941:
8938:
8918:
8896:
8891:
8887:
8866:
8861:
8856:
8852:
8848:
8843:
8839:
8835:
8832:
8812:
8807:
8803:
8797:
8793:
8789:
8784:
8780:
8776:
8773:
8760:
8757:
8743:
8717:
8701:
8698:
8685:
8682:
8666:
8663:
8655:
8651:
8647:
8641:
8638:
8630:
8626:
8614:
8613:
8598:
8595:
8592:
8587:
8583:
8579:
8573:
8570:
8562:
8558:
8554:
8551:
8549:
8547:
8544:
8539:
8535:
8531:
8526:
8522:
8518:
8515:
8509:
8506:
8498:
8494:
8490:
8487:
8485:
8483:
8480:
8477:
8472:
8468:
8464:
8461:
8456:
8452:
8446:
8443:
8439:
8435:
8430:
8426:
8422:
8417:
8413:
8409:
8406:
8404:
8402:
8394:
8390:
8386:
8383:
8380:
8377:
8372:
8368:
8362:
8359:
8355:
8351:
8346:
8342:
8338:
8335:
8331:
8325:
8321:
8317:
8314:
8312:
8307:
8304:
8296:
8292:
8288:
8287:
8264:
8261:
8258:
8253:
8249:
8237:
8236:
8220:
8213:
8210:
8202:
8198:
8193:
8189:
8186:
8183:
8180:
8178:
8176:
8173:
8170:
8165:
8161:
8157:
8154:
8150:
8143:
8140:
8132:
8128:
8123:
8119:
8116:
8113:
8110:
8108:
8106:
8102:
8095:
8092:
8084:
8080:
8075:
8071:
8068:
8065:
8062:
8057:
8054:
8050:
8046:
8041:
8037:
8033:
8028:
8024:
8018:
8014:
8010:
8007:
8004:
7999:
7995:
7991:
7986:
7982:
7978:
7973:
7969:
7965:
7962:
7958:
7951:
7948:
7940:
7936:
7931:
7927:
7924:
7921:
7918:
7916:
7914:
7911:
7906:
7902:
7898:
7893:
7889:
7885:
7882:
7877:
7874:
7870:
7866:
7861:
7857:
7853:
7848:
7844:
7838:
7834:
7830:
7827:
7825:
7823:
7820:
7816:
7810:
7807:
7801:
7797:
7794:
7789:
7785:
7781:
7778:
7776:
7773:
7766:
7763:
7755:
7751:
7746:
7742:
7739:
7736:
7735:
7712:
7707:
7703:
7679:
7676:
7668:
7664:
7640:
7637:
7629:
7625:
7604:
7599:
7595:
7573:
7567:
7564:
7558:
7554:
7551:
7548:
7544:
7538:
7535:
7529:
7525:
7522:
7510:
7507:
7493:
7487:
7484:
7478:
7474:
7471:
7450:
7444:
7441:
7435:
7431:
7428:
7406:
7402:
7398:
7387:
7386:
7370:
7364:
7361:
7355:
7351:
7348:
7345:
7340:
7337:
7333:
7329:
7324:
7320:
7316:
7311:
7307:
7303:
7298:
7294:
7290:
7285:
7281:
7277:
7273:
7267:
7264:
7258:
7254:
7251:
7248:
7245:
7243:
7241:
7238:
7235:
7232:
7229:
7226:
7221:
7217:
7213:
7208:
7204:
7200:
7195:
7192:
7188:
7184:
7179:
7175:
7171:
7166:
7162:
7158:
7155:
7153:
7151:
7146:
7142:
7138:
7133:
7129:
7125:
7120:
7117:
7113:
7109:
7104:
7100:
7096:
7093:
7090:
7085:
7081:
7077:
7072:
7068:
7064:
7061:
7058:
7053:
7050:
7046:
7042:
7037:
7033:
7029:
7024:
7020:
7016:
7011:
7008:
7004:
7000:
6995:
6991:
6987:
6982:
6978:
6974:
6971:
6969:
6967:
6963:
6957:
6953:
6949:
6946:
6941:
6938:
6934:
6930:
6925:
6921:
6917:
6914:
6911:
6908:
6903:
6899:
6893:
6889:
6883:
6880:
6876:
6872:
6867:
6863:
6859:
6856:
6851:
6848:
6844:
6840:
6835:
6831:
6827:
6824:
6819:
6815:
6809:
6806:
6802:
6798:
6793:
6789:
6785:
6781:
6775:
6771:
6767:
6764:
6762:
6760:
6756:
6750:
6746:
6742:
6737:
6734:
6730:
6726:
6721:
6717:
6713:
6710:
6706:
6701:
6697:
6694:
6689:
6685:
6679:
6676:
6672:
6668:
6663:
6659:
6655:
6651:
6645:
6641:
6637:
6634:
6632:
6630:
6625:
6621:
6617:
6612:
6608:
6604:
6601:
6599:
6597:
6592:
6588:
6584:
6581:
6578:
6570:
6567:
6564:
6562:
6560:
6557:
6554:
6551:
6548:
6545:
6542:
6539:
6536:
6534:
6531:
6525:
6522:
6516:
6512:
6509:
6506:
6505:
6482:
6479:
6476:
6473:
6450:
6447:
6420:
6409:
6408:
In statistics, the Gauss–Markov theorem (or simply Gauss theorem for some authors) states that the ordinary least squares (OLS) estimator has the lowest sampling variance within the class of linear unbiased estimators, if the errors in the linear regression model are uncorrelated, have equal variances and expectation value of zero. The errors do not need to be normal, nor do they need to be independent and identically distributed (only uncorrelated with mean zero and homoscedastic with finite variance). The requirement that the estimator be unbiased cannot be dropped, since biased estimators exist with lower variance, for example any degenerate estimator. The theorem is named after Carl Friedrich Gauss and Andrey Markov. A further generalization to non-spherical errors was given by Alexander Aitken.
For every vector of coefficients {\displaystyle \ell }, the variance of {\displaystyle \ell ^{\operatorname {T} }{\tilde {\beta }}} is therefore at least that of {\displaystyle \ell ^{\operatorname {T} }{\widehat {\beta }}}, with equality if and only if {\displaystyle D^{\operatorname {T} }\ell =0}. Since {\displaystyle \ell } was arbitrary, {\displaystyle {\widehat {\beta }}} is the best linear unbiased estimator (BLUE). The generalized least squares (GLS) estimator, developed by Alexander Aitken, extends the Gauss–Markov theorem to the case where the error vector has a non-scalar covariance matrix; the Aitken estimator is accordingly BLUE in that setting.
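The BLUE property can be illustrated numerically. The following sketch (pure Python; the design matrix, the matrix D with DX = 0, and the sample sizes are all illustrative choices, not from the article) simulates many draws of the model and compares the sampling variance of the OLS slope with that of a competing linear unbiased estimator of the form C = (XᵀX)⁻¹Xᵀ + D:

```python
import random

# Monte-Carlo sketch of the BLUE property: OLS and a competing linear
# unbiased estimator (C = (X^T X)^{-1} X^T + D with D X = 0) are both
# unbiased, but OLS has the smaller sampling variance for the slope.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def inv2x2(M):
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

random.seed(0)
X = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]]   # intercept + regressor
D = [[1.0, -2.0, 1.0, 0.0], [0.0, 1.0, -2.0, 1.0]]     # rows orthogonal to X's columns, so D X = 0

Xt = transpose(X)
P = matmul(inv2x2(matmul(Xt, X)), Xt)                  # OLS coefficient matrix (X^T X)^{-1} X^T
C = [[p + d for p, d in zip(pr, dr)] for pr, dr in zip(P, D)]

slopes_ols, slopes_alt = [], []
for _ in range(2000):
    # true model y = 2 + 3x + noise
    y = [[2.0 + 3.0 * row[1] + random.gauss(0.0, 1.0)] for row in X]
    slopes_ols.append(matmul(P, y)[1][0])
    slopes_alt.append(matmul(C, y)[1][0])

def var(v):
    m = sum(v) / len(v)
    return sum((x - m) ** 2 for x in v) / len(v)

print(var(slopes_ols) < var(slopes_alt))  # OLS slope varies less → True
```

Both estimators center on the true slope 3, but the variance of the competing estimator exceeds that of OLS by the corresponding entry of σ²DDᵀ, exactly as the theorem predicts.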
See also
Minimum-variance unbiased estimator (MVUE)
Best linear unbiased prediction (BLUP)
The spherical errors assumption states that the error vector has a scalar covariance matrix:

{\displaystyle \operatorname {Var} [\varepsilon \mid X]=\operatorname {E} [\varepsilon \varepsilon ^{\operatorname {T} }\mid X]={\begin{bmatrix}\sigma ^{2}&0&\cdots &0\\0&\sigma ^{2}&\cdots &0\\\vdots &\vdots &\ddots &\vdots \\0&0&\cdots &\sigma ^{2}\end{bmatrix}}=\sigma ^{2}I} with {\displaystyle \sigma ^{2}>0}.

The assumption is called "spherical" because when the errors are in addition normally distributed, their density is constant on spheres: geometrically, {\displaystyle f(\varepsilon )=c} describes a ball.
The expectation of {\displaystyle {\tilde {\beta }}} is:

{\displaystyle {\begin{aligned}\operatorname {E} \left[{\tilde {\beta }}\right]&=\operatorname {E} [Cy]\\&=\operatorname {E} \left[\left((X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }+D\right)(X\beta +\varepsilon )\right]\\&=\left((X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }+D\right)X\beta +\left((X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }+D\right)\operatorname {E} [\varepsilon ]\\&=\left((X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }+D\right)X\beta &&\operatorname {E} [\varepsilon ]=0\\&=(X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }X\beta +DX\beta \\&=(I_{K}+DX)\beta .\end{aligned}}}

Therefore, since {\displaystyle \beta } is unobservable, {\displaystyle {\tilde {\beta }}} is unbiased if and only if {\displaystyle DX=0}.
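The condition DX = 0 can be checked concretely. In this sketch (pure Python; the particular X and D are illustrative choices, not from the article), the rows of D are chosen orthogonal to both columns of X, so DX vanishes and CX = (XᵀX)⁻¹XᵀX + DX reduces to the identity, which is exactly what makes the estimator unbiased:

```python
# Numerical illustration that beta_tilde = C y with
# C = (X^T X)^{-1} X^T + D is unbiased exactly when D X = 0
# (then C X = I, so E[beta_tilde] = C X beta = beta).

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def inv2x2(M):
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

X = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]]
# Each row of D is orthogonal to both columns of X, hence D X = 0.
D = [[1.0, -2.0, 1.0, 0.0], [0.0, 1.0, -2.0, 1.0]]

Xt = transpose(X)
C = [[p + d for p, d in zip(prow, drow)]
     for prow, drow in zip(matmul(inv2x2(matmul(Xt, X)), Xt), D)]

print([[round(v, 10) for v in row] for row in matmul(D, X)])  # → [[0.0, 0.0], [0.0, 0.0]]
print([[round(v, 10) for v in row] for row in matmul(C, X)])  # → [[1.0, 0.0], [0.0, 1.0]]
```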
The least squares estimator of {\displaystyle \beta } is

{\displaystyle {\widehat {\beta }}=(X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }Y.}
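The closed form can be verified on a tiny noiseless example (pure Python; the data are an illustrative choice, not from the article). With y = 2 + 3x exactly, the formula must recover the intercept 2 and slope 3:

```python
# Check of the closed-form OLS estimator beta_hat = (X^T X)^{-1} X^T Y
# on a 2-parameter example (intercept and slope), without any noise.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def inv2x2(M):
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

xs = [1.0, 2.0, 3.0, 4.0]
X = [[1.0, x] for x in xs]           # design matrix with an intercept column
Y = [[2.0 + 3.0 * x] for x in xs]    # response y = 2 + 3x as a column vector

Xt = transpose(X)
beta_hat = matmul(inv2x2(matmul(Xt, X)), matmul(Xt, Y))
print([round(b[0], 10) for b in beta_hat])  # → [2.0, 3.0]
```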
3478:2
3472:=
3461:)
3452:X
3445:y
3440:(
3434:T
3430:X
3426:2
3420:=
3413:f
3402:d
3398:d
3382:p
3366:2
3362:)
3356:p
3353:i
3349:x
3343:p
3324:1
3321:i
3317:x
3311:1
3298:0
3285:i
3281:y
3277:(
3272:n
3267:1
3264:=
3261:i
3253:=
3250:)
3245:p
3237:,
3231:,
3226:1
3218:,
3213:0
3205:(
3202:f
3141:n
3137:y
3131:n
3127:a
3123:+
3117:+
3112:1
3108:y
3102:1
3098:a
3071:.
3066:2
3061:)
3055:j
3052:i
3048:X
3042:j
3023:K
3018:1
3015:=
3012:j
2999:i
2995:y
2990:(
2983:n
2978:1
2975:=
2972:i
2964:=
2959:2
2954:)
2948:i
2938:y
2926:i
2922:y
2917:(
2910:n
2905:1
2902:=
2899:i
2864:X
2838:T
2834:X
2813:X
2793:y
2770:y
2765:T
2761:X
2755:1
2748:)
2744:X
2739:T
2735:X
2731:(
2728:=
2653:)
2638:(
2624:)
2609:(
2557:j
2504:,
2500:]
2495:2
2490:)
2485:)
2479:j
2466:j
2448:(
2442:j
2432:K
2427:1
2424:=
2421:j
2412:(
2407:[
2400:E
2369:j
2359:j
2349:K
2344:1
2341:=
2338:j
2311:j
2308:i
2304:X
2278:j
2270:=
2266:]
2261:j
2244:[
2237:E
2204:,
2179:i
2175:y
2152:j
2149:i
2145:X
2122:j
2119:i
2115:X
2092:j
2065:j
2062:i
2058:c
2032:k
2028:y
2022:j
2019:k
2015:c
2011:+
2005:+
2000:1
1996:y
1990:j
1987:1
1983:c
1979:=
1974:j
1935:j
1903:.
1900:j
1894:i
1888:,
1885:0
1882:=
1879:)
1874:j
1866:,
1861:i
1853:(
1825:i
1797:2
1789:=
1786:)
1781:i
1773:(
1738:=
1735:]
1730:i
1722:[
1716:E
1690:i
1659:.
1654:i
1650:y
1627:,
1622:j
1619:i
1615:X
1590:,
1585:i
1581:y
1560:i
1540:1
1537:=
1532:)
1529:1
1526:+
1523:K
1520:(
1517:i
1513:X
1490:1
1487:+
1484:K
1453:i
1426:i
1422:y
1399:i
1372:j
1369:i
1365:X
1338:j
1310:n
1307:,
1301:,
1298:2
1295:,
1292:1
1289:=
1286:i
1277:i
1269:+
1264:j
1261:i
1257:X
1251:j
1241:K
1236:1
1233:=
1230:j
1222:=
1217:i
1213:y
1186:)
1181:K
1175:n
1170:R
1162:X
1152:K
1147:R
1136:,
1131:n
1126:R
1115:,
1112:y
1109:(
1105:,
1099:+
1093:X
1090:=
1087:y
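The closed-form OLS estimator {\displaystyle {\widehat {\beta }}=(X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }y} can be checked numerically. A minimal sketch in pure Python for one regressor plus an intercept; the data values and the helper name `ols_simple` are illustrative assumptions, not part of the article:

```python
# Ordinary least squares for a single regressor with intercept:
# beta-hat = (X^T X)^{-1} X^T y, via the 2x2 normal equations.

def ols_simple(x, y):
    """Return (b0, b1) minimizing sum((y_i - b0 - b1*x_i)^2)."""
    n = len(x)
    sx = sum(x)
    sxx = sum(v * v for v in x)
    sy = sum(y)
    sxy = sum(u * v for u, v in zip(x, y))
    # X^T X = [[n, sx], [sx, sxx]];  X^T y = [sy, sxy]
    det = n * sxx - sx * sx  # nonzero iff the columns of X are independent
    b0 = (sxx * sy - sx * sxy) / det
    b1 = (n * sxy - sx * sy) / det
    return b0, b1

# Noise-free data generated from y = 2 + 3x, so OLS recovers (2, 3) exactly.
x = [0.0, 1.0, 2.0, 3.0]
y = [2.0, 5.0, 8.0, 11.0]
b0, b1 = ols_simple(x, y)
print(b0, b1)  # -> 2.0 3.0
```

With noisy data the recovered coefficients would differ from the generating ones; the theorem concerns the variance of that error, not its absence.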
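The first- and second-order conditions in the minimization argument — a vanishing gradient {\displaystyle -2X^{\operatorname {T} }(y-X\beta )} at the minimizer and a positive-definite Hessian {\displaystyle 2X^{\operatorname {T} }X} — can be verified numerically. A pure-Python sketch; the small design matrix and data are illustrative assumptions:

```python
# Check that the gradient -2 X^T (y - X beta) vanishes at the OLS solution
# and that the Hessian H = 2 X^T X is positive definite for full-rank X.
# The design matrix and data below are illustrative assumptions.

def transpose(A):
    return [list(col) for col in zip(*A)]

def matvec(A, v):
    return [sum(a * b for a, b in zip(row, v)) for row in A]

X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]  # intercept + 1 regressor
y = [2.0, 5.0, 8.0, 11.0]   # noise-free data from y = 2 + 3x
beta_hat = [2.0, 3.0]       # OLS solution for this data

# Gradient of the residual sum of squares at beta_hat.
resid = [yi - fi for yi, fi in zip(y, matvec(X, beta_hat))]
grad = [-2.0 * g for g in matvec(transpose(X), resid)]

# Hessian H = 2 X^T X. For a symmetric 2x2 matrix, positive definiteness
# is equivalent to H[0][0] > 0 and det(H) > 0 (Sylvester's criterion).
Xt = transpose(X)
H = [[2.0 * sum(a * b for a, b in zip(r1, r2)) for r2 in Xt] for r1 in Xt]
det_H = H[0][0] * H[1][1] - H[0][1] * H[1][0]

print(max(abs(g) for g in grad) < 1e-9)  # gradient vanishes -> True
print(H[0][0] > 0 and det_H > 0)         # Hessian positive definite -> True
```

For higher dimensions one would test all leading principal minors (or all eigenvalues), exactly as the eigenvector argument in the text does in general.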
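The perturbation step of the proof can also be illustrated numerically: for any non-zero {\displaystyle D} with {\displaystyle DX=0}, the competing estimator's weight matrix {\displaystyle C=(X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }+D} satisfies {\displaystyle CC^{\operatorname {T} }=(X^{\operatorname {T} }X)^{-1}+DD^{\operatorname {T} }}, so its variance {\displaystyle \sigma ^{2}CC^{\operatorname {T} }} can only exceed that of OLS. A pure-Python sketch; the particular {\displaystyle X} and {\displaystyle D} are illustrative assumptions:

```python
# Show C C^T - (X^T X)^{-1} = D D^T (a positive semi-definite excess)
# for C = (X^T X)^{-1} X^T + D with D X = 0.
# The specific X and D below are illustrative assumptions.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]  # 4x2, full column rank
Xt = transpose(X)

# (X^T X)^{-1}, computed with the 2x2 inverse formula.
XtX = matmul(Xt, X)
det = XtX[0][0] * XtX[1][1] - XtX[0][1] * XtX[1][0]
XtX_inv = [[ XtX[1][1] / det, -XtX[0][1] / det],
           [-XtX[1][0] / det,  XtX[0][0] / det]]

# A non-zero D whose rows are orthogonal to both columns of X, so D X = 0.
D = [[1.0, -1.0, -1.0, 1.0], [0.0, 0.0, 0.0, 0.0]]
assert all(abs(v) < 1e-12 for row in matmul(D, X) for v in row)

P = matmul(XtX_inv, Xt)  # the OLS weights (X^T X)^{-1} X^T
C = [[P[i][j] + D[i][j] for j in range(4)] for i in range(2)]
CCt = matmul(C, transpose(C))

# Excess variance per unit sigma^2: C C^T - (X^T X)^{-1} = D D^T >= 0.
excess = [[CCt[i][j] - XtX_inv[i][j] for j in range(2)] for i in range(2)]
print(round(excess[0][0], 9))  # ||first row of D||^2 = 4.0
```

The excess is exactly {\displaystyle DD^{\operatorname {T} }} because the cross terms {\displaystyle (X^{\operatorname {T} }X)^{-1}(DX)^{\operatorname {T} }} and {\displaystyle DX(X^{\operatorname {T} }X)^{-1}} vanish when {\displaystyle DX=0}, mirroring the aligned derivation of {\displaystyle \operatorname {Var} ({\tilde {\beta }})} in the text.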
Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.