model, and as a general rule, their exact distributions cannot be derived analytically. For finite samples, FGLS may be less efficient than OLS in some cases. Thus, while GLS can be made feasible, it is not always wise to apply this method when the sample is small. A method used to improve the accuracy of the estimators in finite samples is to iterate; that is, to take the residuals from FGLS to update the errors' covariance estimator and then update the FGLS estimation, applying the same idea iteratively until the estimators vary less than some tolerance. However, this method does not necessarily improve the efficiency of the estimator very much if the original sample was small.
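The iteration described above can be sketched as follows. This is a minimal illustration, assuming pure heteroscedasticity and a log-linear parametric model for the error variances; the data-generating process and all numbers are assumptions for illustration, not part of the text.

```python
import numpy as np

# Illustrative data: heteroscedastic errors whose s.d. depends on a regressor.
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([1.0, 2.0])
sigma = np.exp(0.5 * x)                      # error s.d. varies with x
y = X @ beta_true + sigma * rng.normal(size=n)

def wls(X, y, var):
    """Weighted least squares with Omega = diag(var)."""
    Xw = X / var[:, None]                    # rows scaled by 1 / sigma_i^2
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)

beta = np.linalg.lstsq(X, y, rcond=None)[0]  # start from OLS
for _ in range(50):
    resid = y - X @ beta
    # Parametric variance model (assumption): log sigma_i^2 linear in regressors.
    gamma = np.linalg.lstsq(X, np.log(resid**2 + 1e-12), rcond=None)[0]
    var_hat = np.exp(X @ gamma)
    beta_new = wls(X, y, var_hat)
    if np.max(np.abs(beta_new - beta)) < 1e-10:  # stop once estimates settle
        beta = beta_new
        break
    beta = beta_new
```

The weights are invariant to the overall scale of the estimated variances, so the known bias of the log-squared-residual regression does not affect the coefficient estimates.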
{\displaystyle {\begin{aligned}{\hat {\boldsymbol {\beta }}}&={\underset {\mathbf {b} }{\operatorname {argmin} }}\,(\mathbf {y} -\mathbf {X} \mathbf {b} )^{\mathrm {T} }\mathbf {\Omega } ^{-1}(\mathbf {y} -\mathbf {X} \mathbf {b} )\\&={\underset {\mathbf {b} }{\operatorname {argmin} }}\,\mathbf {y} ^{\mathrm {T} }\,\mathbf {\Omega } ^{-1}\mathbf {y} +(\mathbf {X} \mathbf {b} )^{\mathrm {T} }\mathbf {\Omega } ^{-1}\mathbf {X} \mathbf {b} -\mathbf {y} ^{\mathrm {T} }\mathbf {\Omega } ^{-1}\mathbf {X} \mathbf {b} -(\mathbf {X} \mathbf {b} )^{\mathrm {T} }\mathbf {\Omega } ^{-1}\mathbf {y} \,,\end{aligned}}}
{\displaystyle {\hat {\boldsymbol {\beta }}}={\underset {\mathbf {b} }{\operatorname {argmin} }}\,\mathbf {y} ^{\mathrm {T} }\,\mathbf {\Omega } ^{-1}\mathbf {y} +\mathbf {b} ^{\mathrm {T} }\mathbf {X} ^{\mathrm {T} }\mathbf {\Omega } ^{-1}\mathbf {X} \mathbf {b} -2\mathbf {b} ^{\mathrm {T} }\mathbf {X} ^{\mathrm {T} }\mathbf {\Omega } ^{-1}\mathbf {y} ,}
variance of the estimator robust to heteroscedasticity or serial correlation. However, for large samples, FGLS is preferred over OLS under heteroscedasticity or serial correlation. A cautionary note is that the FGLS estimator is not always consistent; for example, it can be inconsistent when there are individual-specific fixed effects.
{\displaystyle {\hat {\boldsymbol {\beta }}}={\underset {\mathbf {b} }{\operatorname {argmax} }}\;p(\mathbf {b} |{\boldsymbol {\varepsilon }})={\underset {\mathbf {b} }{\operatorname {argmax} }}\;\log p(\mathbf {b} |{\boldsymbol {\varepsilon }})={\underset {\mathbf {b} }{\operatorname {argmax} }}\;\log p({\boldsymbol {\varepsilon }}|\mathbf {b} ),}
{\displaystyle \mathbf {y} ^{*}=\mathbf {X} ^{*}{\boldsymbol {\beta }}+{\boldsymbol {\varepsilon }}^{*},\quad {\text{where}}\quad \mathbf {y} ^{*}=\mathbf {C} ^{-1}\mathbf {y} ,\quad \mathbf {X} ^{*}=\mathbf {C} ^{-1}\mathbf {X} ,\quad {\boldsymbol {\varepsilon }}^{*}=\mathbf {C} ^{-1}{\boldsymbol {\varepsilon }}.}
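A small numerical check of this whitening transformation; all matrices below are arbitrary illustrative values, not from the text.

```python
import numpy as np

# Check that the transformed errors have identity covariance and that the
# plain OLS objective on (y*, X*) equals the GLS objective on (y, X).
rng = np.random.default_rng(1)
n, k = 6, 2
X = rng.normal(size=(n, k))
y = rng.normal(size=n)
b = rng.normal(size=k)                  # any candidate coefficient vector

A = rng.normal(size=(n, n))
Omega = A @ A.T + n * np.eye(n)         # a positive-definite covariance matrix
C = np.linalg.cholesky(Omega)           # Omega = C C^T
Cinv = np.linalg.inv(C)

y_star, X_star = Cinv @ y, Cinv @ X
eps_star_cov = Cinv @ Omega @ Cinv.T    # covariance of the transformed errors

lhs = (y_star - X_star @ b) @ (y_star - X_star @ b)      # OLS objective
rhs = (y - X @ b) @ np.linalg.solve(Omega, y - X @ b)    # GLS objective
```

Here `eps_star_cov` is the identity matrix up to floating-point error, and `lhs` agrees with `rhs` for every candidate `b`.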
The model is estimated by OLS or another consistent (but inefficient) estimator, and the residuals are used to build a consistent estimator of the errors covariance matrix (to do so, one often needs to examine the model adding additional constraints; for example, if the errors follow a time series
In general, this estimator has different properties than GLS. For large samples (i.e., asymptotically), all properties are (under appropriate conditions) common with respect to GLS, but for finite samples, the properties of FGLS estimators are unknown: they vary dramatically with each particular
more efficient (provided the errors covariance matrix is consistently estimated), but for a small to medium-sized sample, it can be actually less efficient than OLS. This is why some authors prefer to use OLS and reformulate their inferences by simply considering an alternative estimator for the
{\displaystyle \left(\mathbf {y} ^{*}-\mathbf {X} ^{*}\mathbf {b} \right)^{\mathrm {T} }(\mathbf {y} ^{*}-\mathbf {X} ^{*}\mathbf {b} )=(\mathbf {y} -\mathbf {X} \mathbf {b} )^{\mathrm {T} }\,\mathbf {\Omega } ^{-1}(\mathbf {y} -\mathbf {X} \mathbf {b} ).}
{\displaystyle \mathbf {X} \equiv {\begin{pmatrix}1&x_{12}&x_{13}&\cdots &x_{1k}\\1&x_{22}&x_{23}&\cdots &x_{2k}\\\vdots &\vdots &\vdots &\ddots &\vdots \\1&x_{n2}&x_{n3}&\cdots &x_{nk}\end{pmatrix}},}
{\displaystyle p({\boldsymbol {\varepsilon }}|\mathbf {b} )={\frac {1}{\sqrt {(2\pi )^{n}\det {\boldsymbol {\Omega }}}}}\exp \left(-{\frac {1}{2}}{\boldsymbol {\varepsilon }}^{\mathrm {T} }{\boldsymbol {\Omega }}^{-1}{\boldsymbol {\varepsilon }}\right).}
can be used instead. This approach is much safer, and it is the appropriate path to take unless the sample is large, where "large" is sometimes a slippery issue (e.g., if the error distribution is asymmetric the required sample will be much larger).
{\displaystyle \log p(\mathbf {b} |{\boldsymbol {\varepsilon }})=\log p({\boldsymbol {\varepsilon }}|\mathbf {b} )+\cdots =-{\frac {1}{2}}{\boldsymbol {\varepsilon }}^{\mathrm {T} }{\boldsymbol {\Omega }}^{-1}{\boldsymbol {\varepsilon }}+\cdots ,}
{\displaystyle {\hat {\boldsymbol {\beta }}}={\underset {\mathbf {b} }{\operatorname {argmin} }}\;{\frac {1}{2}}(\mathbf {y} -\mathbf {X} \mathbf {b} )^{\mathrm {T} }{\boldsymbol {\Omega }}^{-1}(\mathbf {y} -\mathbf {X} \mathbf {b} ).}
A special case of GLS, called weighted least squares (WLS), occurs when all the off-diagonal entries of Ω are 0. This situation arises when the variances of the observed values are unequal or when
for the residuals. If this is unknown, estimating the covariance matrix gives the method of feasible generalized least squares (FGLS). However, FGLS provides fewer guarantees of improvement.
{\displaystyle {\hat {\boldsymbol {\beta }}}=\left(\mathbf {X} ^{\mathrm {T} }\mathbf {\Omega } ^{-1}\mathbf {X} \right)^{-1}\mathbf {X} ^{\mathrm {T} }\mathbf {\Omega } ^{-1}\mathbf {y} .}
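The estimator can be evaluated directly on toy data; the heteroscedastic setup below is an assumption for illustration.

```python
import numpy as np

# GLS estimate and its covariance for a known (here diagonal) Omega.
rng = np.random.default_rng(2)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([0.5, -2.0])
variances = rng.uniform(0.5, 3.0, size=n)      # known error variances
y = X @ beta_true + np.sqrt(variances) * rng.normal(size=n)

Omega_inv = np.diag(1.0 / variances)           # Omega^{-1}
XtOiX = X.T @ Omega_inv @ X
beta_gls = np.linalg.solve(XtOiX, X.T @ Omega_inv @ y)
cov_gls = np.linalg.inv(XtOiX)                 # Cov of the GLS estimator
```

Solving the linear system rather than forming `(X^T Omega^{-1} X)^{-1}` explicitly is the numerically preferred way to evaluate the formula.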
that the errors are independent and normally distributed with zero mean and common variance. In GLS, the prior is generalized to the case where errors may not be independent and may have
{\displaystyle {\widehat {\Omega }}_{FGLS1}=\operatorname {diag} ({\widehat {\sigma }}_{FGLS1,1}^{2},{\widehat {\sigma }}_{FGLS1,2}^{2},\dots ,{\widehat {\sigma }}_{FGLS1,n}^{2})}
{\displaystyle \operatorname {E} [{\hat {\boldsymbol {\beta }}}\mid \mathbf {X} ]={\boldsymbol {\beta }},\quad {\text{and}}\quad \operatorname {Cov} [{\hat {\boldsymbol {\beta }}}\mid \mathbf {X} ]=(\mathbf {X} ^{\mathrm {T} }{\boldsymbol {\Omega }}^{-1}\mathbf {X} )^{-1}.}
{\displaystyle {\widehat {\beta }}_{FGLS1}=(X^{\operatorname {T} }{\widehat {\Omega }}_{\text{OLS}}^{-1}X)^{-1}X^{\operatorname {T} }{\widehat {\Omega }}_{\text{OLS}}^{-1}y}
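The two FGLS steps can be sketched in code for the heteroscedastic, uncorrelated case; the data-generating process below is an assumption for illustration.

```python
import numpy as np

# Illustrative data with per-observation variances depending on a regressor.
rng = np.random.default_rng(3)
n = 300
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
sigma2 = 0.5 + x**2                          # true per-observation variances
y = X @ np.array([1.0, 2.0]) + np.sqrt(sigma2) * rng.normal(size=n)

# Step 1: OLS and the fitted residuals u_j = (y - X beta_OLS)_j.
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
u = y - X @ beta_ols

# Step 2: Omega_hat_OLS = diag(u_1^2, ..., u_n^2), then re-estimate beta by
# weighted least squares with weights 1 / u_j^2.
w = 1.0 / np.maximum(u**2, 1e-10)            # clip to avoid division by zero
Xw = X * w[:, None]
beta_fgls1 = np.linalg.solve(Xw.T @ X, Xw.T @ y)
```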
of the error vector is diagonal, or equivalently that errors from distinct observations are uncorrelated. Then each diagonal entry may be estimated by the fitted residuals
{\displaystyle \mathbf {y} =\mathbf {X} {\boldsymbol {\beta }}+{\boldsymbol {\varepsilon }},\quad \operatorname {E} [{\boldsymbol {\varepsilon }}\mid \mathbf {X} ]=0,\quad \operatorname {Cov} [{\boldsymbol {\varepsilon }}\mid \mathbf {X} ]={\boldsymbol {\Omega }},}
It is important to notice that the squared residuals cannot be used in the previous expression; an estimator of the errors' variances is needed. To do so, a parametric
Under regularity conditions, the FGLS estimator (or the estimator of its iterations, if a finite number of iterations are conducted) is asymptotically distributed as:
{\displaystyle {\widehat {\Omega }}_{\text{OLS}}=\operatorname {diag} ({\widehat {\sigma }}_{1}^{2},{\widehat {\sigma }}_{2}^{2},\dots ,{\widehat {\sigma }}_{n}^{2}).}
(which is inconsistent in this framework) and instead use a HAC (Heteroskedasticity and Autocorrelation Consistent) estimator. In the context of autocorrelation, the
{\displaystyle p(\mathbf {b} |{\boldsymbol {\varepsilon }})={\frac {p({\boldsymbol {\varepsilon }}|\mathbf {b} )p(\mathbf {b} )}{p({\boldsymbol {\varepsilon }})}}.}
{\displaystyle {\widehat {\beta }}_{FGLS2}=(X^{\operatorname {T} }{\widehat {\Omega }}_{FGLS1}^{-1}X)^{-1}X^{\operatorname {T} }{\widehat {\Omega }}_{FGLS1}^{-1}y}
{\displaystyle 2\mathbf {X} ^{\mathrm {T} }\mathbf {\Omega } ^{-1}\mathbf {X} {\mathbf {b} }-2\mathbf {X} ^{\mathrm {T} }\mathbf {\Omega } ^{-1}\mathbf {y} =0,}
process, a statistician generally needs some theoretical assumptions on this process to ensure that a consistent estimator is available).
{\displaystyle \operatorname {Var} [{\boldsymbol {\varepsilon }}^{*}\mid \mathbf {X} ]=\mathbf {C} ^{-1}\mathbf {\Omega } \left(\mathbf {C} ^{-1}\right)^{\mathrm {T} }=\mathbf {I} }
For simplicity, consider the model for heteroscedastic and non-autocorrelated errors. Assume that the variance-covariance matrix
This transformation effectively standardizes the scale of the errors and de-correlates them. When OLS is used on data with
{\displaystyle {\sqrt {n}}({\hat {\beta }}_{FGLS}-\beta )\ \xrightarrow {d} \ {\mathcal {N}}\!\left(0,\,V\right)}
A reasonable option when samples are not too large is to apply OLS but discard the classical variance estimator
can be efficiently estimated by applying OLS to the transformed data, which requires minimizing the objective,
Then, using the consistent estimator of the covariance matrix of the errors, one can implement GLS ideas.
is a vector of unknown constants, called "regression coefficients", which are estimated from the data.
{\displaystyle {\widehat {\beta }}_{\text{OLS}}=(X^{\operatorname {T} }X)^{-1}X^{\operatorname {T} }y}
is independent of additive terms in the objective function which do not involve the optimization variable. Substituting
2450:{\displaystyle \mathbf {y} =\mathbf {X} {\boldsymbol {\beta }}+{\boldsymbol {\varepsilon }}}
is present, but no correlations exist among the observed variances. The weight for unit
{\displaystyle \mathbf {y} \equiv {\begin{pmatrix}y_{1}\\\vdots \\y_{n}\end{pmatrix}},}
(OLS) to a linearly transformed version of the data. This can be seen by factoring
and reduce the risk of drawing erroneous inferences, as compared to conventional
{\displaystyle V=\operatorname {p-lim} (X^{\operatorname {T} }\Omega ^{-1}X/n)}
{\displaystyle {\widehat {u}}_{j}=(Y-X{\widehat {\beta }}_{\text{OLS}})_{j}}
where the optimization problem has been re-written using the fact that the
is proportional to the reciprocal of the variance of the response for unit
{\displaystyle \mathbf {\Omega } =\mathbf {C} \mathbf {C} ^{\mathrm {T} }}
{\displaystyle {\widehat {u}}_{FGLS1}=Y-X{\widehat {\beta }}_{FGLS1}}
problem. The stationary point of the objective function occurs when
(MLE), which is equivalent to the optimization problem from above,
{\displaystyle \log p({\boldsymbol {\varepsilon }}|\mathbf {b} )}
The procedure can be iterated. The first iteration is given by:
k − 1 predictor values and one response value each.
590:{\displaystyle \{y_{i},x_{ij}\}_{i=1,\dots ,n,j=2,\dots ,k}}
{\displaystyle \sigma ^{2}(X^{\operatorname {T} }X)^{-1}}
{\displaystyle {\boldsymbol {\beta }}\in \mathbb {R} ^{k}}
is a method used to estimate the unknown parameters in a
, this is not true for FGLS. The feasible estimator is
where the hidden terms are those that do not depend on
, using an implementable version of GLS known as the
{\displaystyle \mathbf {y} -\mathbf {X} \mathbf {b} }
{\displaystyle \mathbf {y} -\mathbf {X} \mathbf {b} }
in the regression model. GLS is employed to improve
predictor variables (including a constant) for the
can be used, and in heteroscedastic contexts, the
is a marginal distribution, it does not depend on
{\displaystyle {\widehat {\Omega }}_{\text{OLS}}}
is unknown, one can get a consistent estimate of
. The generalized least squares method estimates
. It is used when there is a non-zero amount of
{\displaystyle p({\boldsymbol {\varepsilon }})}
model or nonparametric estimator can be used.
and the property that the argument solving an
Whereas GLS is more efficient than OLS under
{\displaystyle {\boldsymbol {\varepsilon }}}
In FGLS, modeling proceeds in two stages:
Derivation by maximum likelihood estimation
and the predictor values are placed in the
The response values are placed in a vector,
{\displaystyle {\widehat {\Omega }}_{OLS}}
conditional probability density function
{\displaystyle {\boldsymbol {\beta }}}
{\displaystyle {\boldsymbol {\beta }}}
{\displaystyle {\boldsymbol {\beta }}}
{\displaystyle {\boldsymbol {\beta }}}
(also spelled heteroskedasticity) or
{\displaystyle {\widehat {\Omega }}}
{\displaystyle {\widehat {\Omega }}}
applies, so the GLS estimate is the
), a generalization of the diagonal
yields an equivalent linear model:
methods. It was first described by
5820:
5613:
5607:
5592:
5589:
5586:
5580:
5504:
5407:
5346:
5337:
5281:
5272:
5030:
4883:
4874:
4830:
4821:
4740:
4553:
4508:
{\displaystyle {\widehat {u}}_{j}}
(OLS) estimator is calculated by:
feasible generalized least squares
Feasible generalized least squares
. Therefore the log-probability is
{\displaystyle \mathbf {\Omega } }
where each row is a vector of the
{\displaystyle \mathbf {C} ^{-1}}
2414:. Left-multiplying both sides of
can be iterated to convergence.
If the covariance of the errors
of the errors are assumed to be:
and estimates of the residuals
{\displaystyle p(\mathbf {b} )}
generalized least squares (GLS)
{\displaystyle {\text{p-lim}}}
{\displaystyle \beta _{FGLS1}}
best linear unbiased estimator
GLS is equivalent to applying
maximum likelihood estimation
It requires knowledge of the
Effective degrees of freedom
strictly increasing function
{\displaystyle \mathbf {b} }
{\displaystyle \mathbf {b} }
{\displaystyle \mathbf {b} }
{\displaystyle \mathbf {I} }
{\displaystyle \mathbf {b} }
is a candidate estimate for
{\displaystyle \mathbf {b} }
{\displaystyle \mathbf {X} }
{\displaystyle \mathbf {X} }
{\displaystyle \mathbf {X} }
{\displaystyle \mathbf {y} }
maximum likelihood estimate
(MAP) estimate is then the
. For given fit parameters
to be a linear function of
The model assumes that the
by minimizing the squared
models, one observes data
and that the conditional
Prais–Winsten estimation
is the sample size, and
uniform (improper) prior
of this residual vector:
of the error term given
generalized linear model
Not to be confused with
may be constructed by:
{\displaystyle \Omega }
{\displaystyle \Omega }
{\displaystyle \Omega }
using a method such as
linear regression model
weighted least squares
ordinary least squares
Eicker–White estimator
can be interpreted as
Ordinary least squares
Weighted least squares
Cholesky decomposition
ordinary least squares
which is equivalent to
weighted least squares
statistical efficiency
asymptotically normal
The GLS estimator is
quadratic programming
limit in probability
Newey–West estimator
optimization problem
maximum a posteriori
Gauss–Markov theorem
This estimation of
differing variances
so the estimator is
heteroskedasticity
heteroscedasticity
heteroscedasticity
Mahalanobis length
Confidence region
are constructed.
dispersion matrix
covariance matrix
{\displaystyle i}
{\displaystyle k}
statistical units
linear regression
covariance matrix
precision matrix
is known as the
conditional mean
Alexander Aitken
5818:
5817:
5792:
5788:
5783:
5776:
5771:
5764:
5756:
5742:
5738:
5715:
5711:
5706:
5684:
5659:
5657:
5654:
5653:
5628:
5616:
5612:
5606:
5602:
5579:
5571:
5568:
5567:
5548:
5545:
5544:
5514:
5510:
5503:
5502:
5462:
5451:
5450:
5449:
5439:
5437:
5434:
5433:
5405:
5404:
5402:
5399:
5398:
5372:
5355:
5344:
5343:
5336:
5332:
5323:
5319:
5307:
5290:
5279:
5278:
5271:
5267:
5243:
5232:
5231:
5230:
5228:
5225:
5224:
5201:
5178:
5167:
5166:
5150:
5127:
5116:
5115:
5105:
5082:
5071:
5070:
5039:
5028:
5027:
5026:
5024:
5021:
5020:
4988:
4977:
4976:
4975:
4945:
4934:
4933:
4932:
4930:
4927:
4926:
4897:
4892:
4881:
4880:
4873:
4869:
4860:
4856:
4844:
4839:
4828:
4827:
4820:
4816:
4792:
4781:
4780:
4779:
4777:
4774:
4773:
4749:
4738:
4737:
4736:
4734:
4731:
4730:
4701:
4697:
4695:
4692:
4691:
4658:
4653:
4642:
4641:
4625:
4620:
4609:
4608:
4598:
4593:
4582:
4581:
4562:
4551:
4550:
4549:
4547:
4544:
4543:
4517:
4506:
4505:
4504:
4502:
4499:
4498:
4481:
4470:
4469:
4468:
4466:
4463:
4462:
4446:
4443:
4442:
4422:
4418:
4412:
4401:
4400:
4399:
4378:
4367:
4366:
4365:
4363:
4360:
4359:
4336:
4332:
4323:
4319:
4310:
4306:
4294:
4283:
4282:
4281:
4279:
4276:
4275:
4233:
4229:
4220:
4216:
4204:
4200:
4198:
4195:
4194:
autocorrelation
2815:identity matrix
2798:
2796:
2793:
2792:
2776:
2766:
2765:
2752:
2747:
2746:
2742:
2741:
2736:
2727:
2722:
2721:
2710:
2701:
2696:
2695:
2684:
2681:
2680:
2679:In this model,
2661:
2652:
2647:
2646:
2637:
2632:
2631:
2622:
2613:
2608:
2607:
2598:
2593:
2592:
2583:
2574:
2569:
2568:
2559:
2554:
2553:
2547:
2537:
2532:
2531:
2523:
2517:
2512:
2511:
2502:
2497:
2496:
2494:
2491:
2490:
2470:
2465:
2464:
2462:
2459:
2458:
2442:
2434:
2429:
2421:
2419:
2416:
2415:
2393:
2392:
2387:
2386:
2381:
2373:
2371:
2368:
2367:
2340:
2336:
2331:
2322:
2317:
2316:
2309:
2308:
2303:
2302:
2288:
2274:
2273:
2258:
2249:
2238:
2224:
2223:
2212:
2209:
2208:
2189:
2150:
2145:
2144:
2142:
2139:
2138:
2119:
2110:
2105:
2104:
2097:
2096:
2091:
2090:
2081:
2071:
2062:
2057:
2056:
2049:
2048:
2043:
2042:
2041:
2037:
2036:
2022:
2021:
2019:
2016:
2015:
1990:
1981:
1976:
1975:
1968:
1967:
1962:
1961:
1949:
1948:
1943:
1934:
1929:
1928:
1921:
1920:
1915:
1914:
1909:
1906:
1905:
1882:
1873:
1868:
1867:
1860:
1859:
1854:
1853:
1846:
1845:
1840:
1839:
1828:
1823:
1814:
1809:
1808:
1801:
1800:
1795:
1794:
1787:
1786:
1781:
1780:
1772:
1763:
1758:
1757:
1749:
1748:
1743:
1742:
1734:
1729:
1715:
1714:
1712:
1709:
1708:
1694:
1693:
1684:
1675:
1670:
1669:
1662:
1661:
1657:
1652:
1647:
1636:
1631:
1622:
1617:
1616:
1609:
1608:
1603:
1602:
1594:
1589:
1580:
1575:
1574:
1567:
1566:
1562:
1557:
1552:
1541:
1532:
1527:
1526:
1518:
1517:
1512:
1511:
1503:
1498:
1489:
1488:
1480:
1475:
1467:
1455:
1450:
1449:
1442:
1441:
1437:
1432:
1427:
1419:
1408:
1403:
1396:
1385:
1384:
1380:
1378:
1375:
1374:
1354:
1352:
1349:
1348:
1332:
1327:
1319:
1317:
1314:
1313:
1297:
1295:
1292:
1291:
1271:
1269:
1266:
1265:
1249:
1247:
1244:
1243:
1223:
1218:
1217:
1209:
1207:
1204:
1203:
1184:
1173:
1165:
1138:
1130:
1112:
1104:
1099:
1091:
1089:
1086:
1085:
1069:
1067:
1064:
1063:
1040:
1038:
1035:
1034:
1014:
1012:
1009:
1008:
992:
990:
987:
986:
970:
968:
965:
964:
956:th data point.
941:
938:
937:
921:
918:
917:
896:
895:
886:
882:
880:
875:
866:
862:
860:
851:
847:
845:
839:
838:
833:
828:
823:
818:
812:
811:
802:
798:
796:
791:
785:
781:
779:
773:
769:
767:
761:
760:
751:
747:
745:
740:
734:
730:
728:
722:
718:
716:
706:
705:
697:
695:
692:
691:
666:
665:
659:
655:
652:
651:
645:
644:
638:
634:
627:
626:
618:
616:
613:
612:
539:
535:
526:
522:
513:
509:
504:
501:
500:
493:
438:
398:
378:Goodness of fit
85:Discrete choice
24:
17:
12:
11:
5:
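The closed-form GLS estimator $(\mathbf{X}^{\mathrm{T}}\mathbf{\Omega}^{-1}\mathbf{X})^{-1}\mathbf{X}^{\mathrm{T}}\mathbf{\Omega}^{-1}\mathbf{y}$ can be computed directly; the following NumPy sketch (an illustration with made-up data, not part of the original article) solves linear systems rather than forming $\mathbf{\Omega}^{-1}$ explicitly:

```python
import numpy as np

def gls(X, y, Omega):
    """GLS estimate for y = X b + e with known error covariance Omega."""
    Oinv_X = np.linalg.solve(Omega, X)   # Omega^{-1} X without explicit inversion
    Oinv_y = np.linalg.solve(Omega, y)   # Omega^{-1} y
    return np.linalg.solve(X.T @ Oinv_X, X.T @ Oinv_y)

# Toy data with serially correlated errors (values chosen for illustration only).
X = np.column_stack([np.ones(4), [0.0, 1.0, 2.0, 3.0]])
beta_true = np.array([1.0, 2.0])
Omega = np.array([[2.0, 1.0, 0.0, 0.0],
                  [1.0, 2.0, 1.0, 0.0],
                  [0.0, 1.0, 2.0, 1.0],
                  [0.0, 0.0, 1.0, 2.0]])
y = X @ beta_true   # noise-free response: GLS recovers beta_true exactly
print(gls(X, y, Omega))
```

Solving with `np.linalg.solve` is both faster and numerically safer than computing `np.linalg.inv(Omega)` when $\mathbf{\Omega}$ is large.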
Properties

The GLS estimator is unbiased, consistent, efficient, and asymptotically normal, with

$$\operatorname{E}[\hat{\boldsymbol{\beta}} \mid \mathbf{X}] = \boldsymbol{\beta} \qquad \text{and} \qquad \operatorname{Cov}[\hat{\boldsymbol{\beta}} \mid \mathbf{X}] = (\mathbf{X}^{\mathrm{T}} \mathbf{\Omega}^{-1} \mathbf{X})^{-1}.$$

GLS is equivalent to applying ordinary least squares (OLS) to a linearly transformed version of the data. To see this, factor $\mathbf{\Omega} = \mathbf{C}\mathbf{C}^{\mathrm{T}}$, for instance using the Cholesky decomposition. Pre-multiplying both sides of $\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}$ by $\mathbf{C}^{-1}$ yields the equivalent linear model

$$\mathbf{y}^{*} = \mathbf{X}^{*}\boldsymbol{\beta} + \boldsymbol{\varepsilon}^{*}, \qquad \text{where} \quad \mathbf{y}^{*} = \mathbf{C}^{-1}\mathbf{y}, \quad \mathbf{X}^{*} = \mathbf{C}^{-1}\mathbf{X}, \quad \boldsymbol{\varepsilon}^{*} = \mathbf{C}^{-1}\boldsymbol{\varepsilon}.$$

In this model, $\operatorname{Var}[\boldsymbol{\varepsilon}^{*} \mid \mathbf{X}] = \mathbf{C}^{-1}\mathbf{\Omega}(\mathbf{C}^{-1})^{\mathrm{T}} = \mathbf{I}$, where $\mathbf{I}$ is the identity matrix. Then, $\boldsymbol{\beta}$ can be efficiently estimated by applying OLS to the transformed data, which requires minimizing

$$\lVert \mathbf{y}^{*} - \mathbf{X}^{*}\mathbf{b} \rVert^{2} = (\mathbf{y} - \mathbf{X}\mathbf{b})^{\mathrm{T}} \mathbf{\Omega}^{-1} (\mathbf{y} - \mathbf{X}\mathbf{b}).$$

This transformation standardizes the scale of the errors and "de-correlates" them. Since OLS is applied to data with homoscedastic errors, the Gauss–Markov theorem applies, and therefore the GLS estimate is the best linear unbiased estimator for $\boldsymbol{\beta}$.
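GLS coincides with OLS applied to data "whitened" by a factor of $\mathbf{\Omega}$. A minimal NumPy sketch of this equivalence (synthetic data, assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=n)])
Omega = np.diag(1.0 + rng.random(n))   # assumed heteroscedastic error covariance
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)

# Whiten: factor Omega = C C^T (Cholesky), transform the model by C^{-1},
# then run ordinary least squares on the transformed data.
C = np.linalg.cholesky(Omega)
X_star = np.linalg.solve(C, X)         # X* = C^{-1} X
y_star = np.linalg.solve(C, y)         # y* = C^{-1} y
beta_whitened_ols = np.linalg.lstsq(X_star, y_star, rcond=None)[0]

# Direct GLS formula for comparison.
Oinv = np.linalg.inv(Omega)
beta_gls = np.linalg.solve(X.T @ Oinv @ X, X.T @ Oinv @ y)
print(np.allclose(beta_whitened_ols, beta_gls))   # → True
```

The equality follows from $\mathbf{\Omega}^{-1} = (\mathbf{C}^{-1})^{\mathrm{T}}\mathbf{C}^{-1}$, so the OLS objective on the starred data is exactly the GLS quadratic form.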
Derivation by maximum likelihood estimation

Ordinary least squares can be interpreted as maximum likelihood estimation with the prior that the errors are independent and normally distributed with zero mean and common variance. In GLS, this is generalized: the errors are assumed jointly normal with mean zero and known covariance matrix $\mathbf{\Omega}$, so that for given fit parameters $\mathbf{b}$ the conditional probability density of the errors is

$$p(\boldsymbol{\varepsilon} \mid \mathbf{b}) = \frac{1}{\sqrt{(2\pi)^{n} \det \mathbf{\Omega}}}\, \exp\!\left(-\tfrac{1}{2}\, \boldsymbol{\varepsilon}^{\mathrm{T}} \mathbf{\Omega}^{-1} \boldsymbol{\varepsilon}\right).$$

By Bayes' theorem,

$$p(\mathbf{b} \mid \boldsymbol{\varepsilon}) = \frac{p(\boldsymbol{\varepsilon} \mid \mathbf{b})\, p(\mathbf{b})}{p(\boldsymbol{\varepsilon})}.$$

In GLS, a flat (uninformative) prior is taken for $p(\mathbf{b})$, and as $p(\boldsymbol{\varepsilon})$ is a marginal distribution, it does not depend on $\mathbf{b}$. Therefore the log-probability is

$$\log p(\mathbf{b} \mid \boldsymbol{\varepsilon}) = \log p(\boldsymbol{\varepsilon} \mid \mathbf{b}) + \cdots = -\tfrac{1}{2}\, \boldsymbol{\varepsilon}^{\mathrm{T}} \mathbf{\Omega}^{-1} \boldsymbol{\varepsilon} + \cdots,$$

where the hidden terms are those that do not depend on $\mathbf{b}$, and $\log p(\mathbf{b} \mid \boldsymbol{\varepsilon})$ is the log-likelihood. The maximum a posteriori estimate is then the maximum likelihood estimate,

$$\hat{\boldsymbol{\beta}} = \underset{\mathbf{b}}{\operatorname{argmax}}\; p(\mathbf{b} \mid \boldsymbol{\varepsilon}) = \underset{\mathbf{b}}{\operatorname{argmax}}\; \log p(\mathbf{b} \mid \boldsymbol{\varepsilon}) = \underset{\mathbf{b}}{\operatorname{argmax}}\; \log p(\boldsymbol{\varepsilon} \mid \mathbf{b}),$$

since the logarithm is a strictly increasing function. Substituting $\mathbf{y} - \mathbf{X}\mathbf{b}$ for $\boldsymbol{\varepsilon}$ recovers the optimization problem from above:

$$\hat{\boldsymbol{\beta}} = \underset{\mathbf{b}}{\operatorname{argmin}}\; \tfrac{1}{2}\, (\mathbf{y} - \mathbf{X}\mathbf{b})^{\mathrm{T}} \mathbf{\Omega}^{-1} (\mathbf{y} - \mathbf{X}\mathbf{b}).$$
Feasible generalized least squares

If the covariance of the errors $\mathbf{\Omega}$ is unknown, one can get a consistent estimate of $\mathbf{\Omega}$, say $\hat{\mathbf{\Omega}}$, using an implementable version of GLS known as the feasible generalized least squares (FGLS) estimator. In FGLS, modeling proceeds in two stages: (1) the model is estimated by OLS or another consistent (but inefficient) estimator, and the residuals are used to build a consistent estimator of the errors' covariance matrix (to do so, one often needs to examine the model adding additional constraints; for example, if the errors follow a time-series process, some theoretical assumptions on that process are generally needed to ensure that a consistent estimator is available); and (2) using the consistent estimator of the covariance matrix of the errors, one implements GLS ideas.

Whereas GLS is more efficient than OLS under heteroscedasticity (also spelled heteroskedasticity) or autocorrelation, this is not true for FGLS. The feasible estimator is, provided the errors' covariance matrix is consistently estimated, asymptotically more efficient; but for a small or medium-sized sample, it can actually be less efficient than OLS.

For simplicity, consider the model for heteroscedastic, non-autocorrelated errors: the variance–covariance matrix $\mathbf{\Omega}$ of the error vector is diagonal, or equivalently, errors from distinct observations are uncorrelated. Then each diagonal entry may be estimated from the fitted residuals. The OLS estimator is calculated as usual by

$$\hat{\boldsymbol{\beta}}_{\mathrm{OLS}} = (\mathbf{X}^{\mathrm{T}} \mathbf{X})^{-1} \mathbf{X}^{\mathrm{T}} \mathbf{y},$$

and estimates of the residuals $\hat{u}_{j} = (\mathbf{y} - \mathbf{X}\hat{\boldsymbol{\beta}}_{\mathrm{OLS}})_{j}$ are constructed. From these, construct

$$\hat{\mathbf{\Omega}}_{\mathrm{OLS}} = \operatorname{diag}(\hat{\sigma}_{1}^{2}, \hat{\sigma}_{2}^{2}, \dots, \hat{\sigma}_{n}^{2}).$$

Note that the raw squared residuals cannot themselves be used in the previous expression; an estimator of the errors' variances is needed, for instance from a parametric heteroskedasticity model or a nonparametric estimator. Once this step is fulfilled, estimate $\boldsymbol{\beta}_{\mathrm{FGLS1}}$ using $\hat{\mathbf{\Omega}}_{\mathrm{OLS}}$ by weighted least squares:

$$\hat{\boldsymbol{\beta}}_{\mathrm{FGLS1}} = (\mathbf{X}^{\mathrm{T}} \hat{\mathbf{\Omega}}_{\mathrm{OLS}}^{-1} \mathbf{X})^{-1} \mathbf{X}^{\mathrm{T}} \hat{\mathbf{\Omega}}_{\mathrm{OLS}}^{-1} \mathbf{y}.$$

The procedure can be iterated. The first iteration is given by

$$\hat{u}_{\mathrm{FGLS1}} = \mathbf{y} - \mathbf{X} \hat{\boldsymbol{\beta}}_{\mathrm{FGLS1}},$$
$$\hat{\mathbf{\Omega}}_{\mathrm{FGLS1}} = \operatorname{diag}(\hat{\sigma}_{\mathrm{FGLS1},1}^{2}, \hat{\sigma}_{\mathrm{FGLS1},2}^{2}, \dots, \hat{\sigma}_{\mathrm{FGLS1},n}^{2}),$$
$$\hat{\boldsymbol{\beta}}_{\mathrm{FGLS2}} = (\mathbf{X}^{\mathrm{T}} \hat{\mathbf{\Omega}}_{\mathrm{FGLS1}}^{-1} \mathbf{X})^{-1} \mathbf{X}^{\mathrm{T}} \hat{\mathbf{\Omega}}_{\mathrm{FGLS1}}^{-1} \mathbf{y},$$

and this estimation of $\hat{\mathbf{\Omega}}$ can be iterated to convergence.

Under regularity conditions, the FGLS estimator (or that of any of its iterations, if we iterate a finite number of times) is asymptotically distributed as

$$\sqrt{n}\,(\hat{\boldsymbol{\beta}}_{\mathrm{FGLS}} - \boldsymbol{\beta}) \;\xrightarrow{d}\; \mathcal{N}(0, V), \qquad V = \left(\operatorname*{p\text{-}lim}\, \mathbf{X}^{\mathrm{T}} \mathbf{\Omega}^{-1} \mathbf{X} / n\right)^{-1},$$

where $n$ is the sample size and p-lim denotes the limit in probability.
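A two-step FGLS under heteroscedasticity can be sketched as follows. The variance model $\sigma_i^2 = \exp(\gamma_0 + \gamma_1 x_i)$, fit by regressing log squared residuals on the regressors, is an illustrative assumption (one of many possible parametric heteroskedasticity models), not part of the original article:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.uniform(0, 3, size=n)
X = np.column_stack([np.ones(n), x])
sigma2 = np.exp(0.5 + 0.8 * x)                       # true (unknown) variances
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=np.sqrt(sigma2))

# Step 1: OLS, then fit the assumed variance model to the log squared residuals
# (raw squared residuals alone are not a valid variance estimator).
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
u = y - X @ beta_ols
g = np.linalg.lstsq(X, np.log(u**2), rcond=None)[0]
sigma2_hat = np.exp(X @ g)

# Step 2: GLS with Omega_hat = diag(sigma2_hat), i.e. weighted least squares.
W = X / sigma2_hat[:, None]                          # Omega_hat^{-1} X (diagonal case)
beta_fgls = np.linalg.solve(X.T @ W, W.T @ y)        # (X'O^{-1}X)^{-1} X'O^{-1}y
```

Iterating would recompute `u`, `sigma2_hat`, and `beta_fgls` from the current fit until the coefficients stabilize.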
References

Aitken, A. C. (1935). "On least squares and linear combinations of observations". Proceedings of the Royal Society of Edinburgh. 55: 42–48.
Amemiya, Takeshi (1985). Advanced Econometrics. Harvard University Press. ISBN 0-674-00560-0.
Johnston, John (1972). Econometric Methods (2nd ed.). New York: McGraw-Hill.
Kmenta, Jan (1986). Elements of Econometrics (2nd ed.). ISBN 0-472-10886-7.
Beck, Nathaniel; Katz, Jonathan N. (1995). "What to do (and not to do) with time-series cross-section data". American Political Science Review. 89 (3): 634–647. JSTOR 2082979.
Hansen, Christian B. (2007). "Generalized least squares inference in panel and multilevel models with serial correlation and fixed effects". Journal of Econometrics. 140 (2): 670–694.