{\displaystyle {\begin{aligned}{\hat {\theta }}&={\underset {\theta }{\operatorname {arg\,max} }}\,L_{P_{\theta }}(\mathbf {y} )={\underset {\theta }{\operatorname {arg\,max} }}\,P_{\theta }(\mathbf {y} )={\underset {\theta }{\operatorname {arg\,max} }}\,P(\mathbf {y} \mid \theta )\\&={\underset {\theta }{\operatorname {arg\,max} }}\,\prod _{i=1}^{n}P(y_{i}\mid \theta )={\underset {\theta }{\operatorname {arg\,max} }}\,\sum _{i=1}^{n}\log P(y_{i}\mid \theta )\\&={\underset {\theta }{\operatorname {arg\,max} }}\,\left(\sum _{i=1}^{n}\log P(y_{i}\mid \theta )-\sum _{i=1}^{n}\log P(y_{i}\mid \theta _{0})\right)={\underset {\theta }{\operatorname {arg\,max} }}\,\sum _{i=1}^{n}\left(\log P(y_{i}\mid \theta )-\log P(y_{i}\mid \theta _{0})\right)\\&={\underset {\theta }{\operatorname {arg\,max} }}\,\sum _{i=1}^{n}\log {\frac {P(y_{i}\mid \theta )}{P(y_{i}\mid \theta _{0})}}={\underset {\theta }{\operatorname {arg\,min} }}\,\sum _{i=1}^{n}\log {\frac {P(y_{i}\mid \theta _{0})}{P(y_{i}\mid \theta )}}={\underset {\theta }{\operatorname {arg\,min} }}\,{\frac {1}{n}}\sum _{i=1}^{n}\log {\frac {P(y_{i}\mid \theta _{0})}{P(y_{i}\mid \theta )}}\\&={\underset {\theta }{\operatorname {arg\,min} }}\,{\frac {1}{n}}\sum _{i=1}^{n}h_{\theta }(y_{i})\quad {\underset {n\to \infty }{\longrightarrow }}\quad {\underset {\theta }{\operatorname {arg\,min} }}\,\operatorname {E} \left[h_{\theta }(y)\right]\\&={\underset {\theta }{\operatorname {arg\,min} }}\,\int P_{\theta _{0}}(y)h_{\theta }(y)dy={\underset {\theta }{\operatorname {arg\,min} }}\,\int P_{\theta _{0}}(y)\log {\frac {P(y\mid \theta _{0})}{P(y\mid \theta )}}dy\\&={\underset {\theta }{\operatorname {arg\,min} }}\,D_{\text{KL}}(P_{\theta _{0}}\parallel P_{\theta })\end{aligned}}}
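The chain of equalities above can be checked numerically on a toy Bernoulli model. The data, the parameter grid, and the reference point p0 below are illustrative assumptions, not values from the text; the point is only that maximizing the log-likelihood and minimizing the empirical KL-style average select the same parameter:

```python
import math

# Illustrative Bernoulli sample: 7 successes in 10 trials.
y = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]

def log_lik(p):
    # sum_i log P(y_i | p)
    return sum(math.log(p if yi == 1 else 1 - p) for yi in y)

def mean_log_ratio(p, p0):
    # (1/n) sum_i log[ P(y_i | p0) / P(y_i | p) ]
    return (log_lik(p0) - log_lik(p)) / len(y)

grid = [i / 100 for i in range(1, 100)]
p0 = 0.5                                   # arbitrary fixed reference parameter
p_mle = max(grid, key=log_lik)
p_kl = min(grid, key=lambda p: mean_log_ratio(p, p0))
# Both criteria select the empirical frequency 7/10.
```

Since the p0 term is a constant in p, subtracting it cannot move the optimizer, which is why any reference parameter works.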
2387:{\displaystyle \mathbf {H} \left({\widehat {\theta \,}}\right)={\begin{bmatrix}\left.{\frac {\partial ^{2}\ell }{\partial \theta _{1}^{2}}}\right|_{\theta ={\widehat {\theta \,}}}&\left.{\frac {\partial ^{2}\ell }{\partial \theta _{1}\,\partial \theta _{2}}}\right|_{\theta ={\widehat {\theta \,}}}&\dots &\left.{\frac {\partial ^{2}\ell }{\partial \theta _{1}\,\partial \theta _{k}}}\right|_{\theta ={\widehat {\theta \,}}}\\\left.{\frac {\partial ^{2}\ell }{\partial \theta _{2}\,\partial \theta _{1}}}\right|_{\theta ={\widehat {\theta \,}}}&\left.{\frac {\partial ^{2}\ell }{\partial \theta _{2}^{2}}}\right|_{\theta ={\widehat {\theta \,}}}&\dots &\left.{\frac {\partial ^{2}\ell }{\partial \theta _{2}\,\partial \theta _{k}}}\right|_{\theta ={\widehat {\theta \,}}}\\\vdots &\vdots &\ddots &\vdots \\\left.{\frac {\partial ^{2}\ell }{\partial \theta _{k}\,\partial \theta _{1}}}\right|_{\theta ={\widehat {\theta \,}}}&\left.{\frac {\partial ^{2}\ell }{\partial \theta _{k}\,\partial \theta _{2}}}\right|_{\theta ={\widehat {\theta \,}}}&\dots &\left.{\frac {\partial ^{2}\ell }{\partial \theta _{k}^{2}}}\right|_{\theta ={\widehat {\theta \,}}}\end{bmatrix}}~,}
{\displaystyle {\begin{aligned}\operatorname {\mathbb {P} } {\bigl [}\,\mathrm {H} =49\mid p={\tfrac {1}{3}}\,{\bigr ]}&={\binom {80}{49}}({\tfrac {1}{3}})^{49}(1-{\tfrac {1}{3}})^{31}\approx 0.000,\\\operatorname {\mathbb {P} } {\bigl [}\,\mathrm {H} =49\mid p={\tfrac {1}{2}}\,{\bigr ]}&={\binom {80}{49}}({\tfrac {1}{2}})^{49}(1-{\tfrac {1}{2}})^{31}\approx 0.012,\\\operatorname {\mathbb {P} } {\bigl [}\,\mathrm {H} =49\mid p={\tfrac {2}{3}}\,{\bigr ]}&={\binom {80}{49}}({\tfrac {2}{3}})^{49}(1-{\tfrac {2}{3}})^{31}\approx 0.054~.\end{aligned}}}
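These three values can be reproduced with a few lines of Python (`math.comb` is the standard-library binomial coefficient):

```python
from math import comb

def likelihood(p, n=80, heads=49):
    # Binomial probability of the observed 49 heads in 80 tosses.
    return comb(n, heads) * p**heads * (1 - p)**(n - heads)

L_13 = likelihood(1 / 3)
L_12 = likelihood(1 / 2)
L_23 = likelihood(2 / 3)
# p = 2/3 gives the observed data the largest probability of the three coins.
```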
Maximum-likelihood estimators have no optimum properties for finite samples, in the sense that (when evaluated on finite samples) other estimators may have greater concentration around the true parameter value. However, like other estimation methods, maximum likelihood estimation possesses a number of attractive limiting properties.
The maximum likelihood estimator selects the parameter value which gives the observed data the largest possible probability (or probability density, in the continuous case). If the parameter consists of a number of components, then we define their separate maximum likelihood estimators, as the
The identification condition establishes that the log-likelihood has a unique global maximum. Compactness implies that the likelihood cannot approach the maximum value arbitrarily closely at some other point (as demonstrated, for example, in the picture on the right).
Bayesian decision theory is about designing a classifier that minimizes total expected risk. In particular, when the costs (the loss function) associated with different decisions are equal, the classifier minimizes the error over the whole distribution.
The gradient descent method requires calculating the gradient at the rth iteration, but does not require calculating the inverse of the second-order derivative, i.e., the Hessian matrix. It is therefore computationally faster than the Newton–Raphson method.
{\displaystyle {\begin{aligned}0&={\frac {\partial }{\partial p}}\left({\binom {80}{49}}p^{49}(1-p)^{31}\right)~,\\0&=49p^{48}(1-p)^{31}-31p^{49}(1-p)^{30}\\&=p^{48}(1-p)^{30}\left[\,49(1-p)-31p\,\right]\\&=p^{48}(1-p)^{30}\left[\,49-80p\,\right]~.\end{aligned}}}
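Setting the bracketed factor 49 − 80p to zero gives the interior root p̂ = 49/80 = 0.6125; a quick numerical check that the likelihood really peaks there (the ±0.01 probe points are arbitrary choices):

```python
from math import comb

def L(p):
    # Binomial likelihood of 49 heads in 80 tosses.
    return comb(80, 49) * p**49 * (1 - p)**31

p_hat = 49 / 80   # root of the factor 49 - 80p
```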
14129:{\displaystyle {\begin{aligned}0&={\frac {\partial }{\partial \sigma }}\log {\Bigl (}{\mathcal {L}}(\mu ,\sigma ^{2}){\Bigr )}=-{\frac {\,n\,}{\sigma }}+{\frac {1}{\sigma ^{3}}}\sum _{i=1}^{n}(\,x_{i}-\mu \,)^{2}.\end{aligned}}}
7716:{\displaystyle \operatorname {\mathbb {P} } (\theta \mid x_{1},x_{2},\ldots ,x_{n})={\frac {f(x_{1},x_{2},\ldots ,x_{n}\mid \theta )\operatorname {\mathbb {P} } (\theta )}{\operatorname {\mathbb {P} } (x_{1},x_{2},\ldots ,x_{n})}}}
13158:{\displaystyle f(x_{1},\ldots ,x_{n}\mid \mu ,\sigma ^{2})=\prod _{i=1}^{n}f(x_{i}\mid \mu ,\sigma ^{2})=\left({\frac {1}{2\pi \sigma ^{2}}}\right)^{n/2}\exp \left(-{\frac {\sum _{i=1}^{n}(x_{i}-\mu )^{2}}{2\sigma ^{2}}}\right).}
of partial derivatives. Naturally, if the constraints are not binding at the maximum, the Lagrange multipliers should be zero. This in turn allows for a statistical test of the "validity" of the constraint, known as the Lagrange multiplier test.
{\displaystyle b_{h}\;\equiv \;\operatorname {\mathbb {E} } {\biggl [}\,({\widehat {\theta \,}}_{\text{mle}}-\theta _{0})_{h}\,{\biggr ]}\;=\;{\frac {1}{\,n\,}}\,\sum _{i,j,k=1}^{m}\;{\mathcal {I}}^{hi}\;{\mathcal {I}}^{jk}\left({\frac {1}{\,2\,}}\,K_{ijk}\;+\;J_{j,ik}\right)}
11480:. The coins have lost their labels, so which one it was is unknown. Using maximum likelihood estimation, the coin that has the largest likelihood can be found, given the data that were observed. By using the
159:. The goal of maximum likelihood estimation is to determine the parameters for which the observed data have the highest joint probability. We write the parameters governing the joint distribution as a vector
of the likelihood, the values which maximize the likelihood will also maximize its logarithm (the log-likelihood itself is not necessarily strictly increasing). The log-likelihood can be written as follows:
17170:{\displaystyle f(x_{1},x_{2},\ldots ,x_{m}\mid p_{1},p_{2},\ldots ,p_{m})={\frac {n!}{\prod x_{i}!}}\prod p_{i}^{x_{i}}={\binom {n}{x_{1},x_{2},\ldots ,x_{m}}}p_{1}^{x_{1}}p_{2}^{x_{2}}\cdots p_{m}^{x_{m}}}
that maximizes the likelihood function is called the maximum likelihood estimate. The logic of maximum likelihood is both intuitive and flexible, and as such the method has become a dominant means of statistical inference.
generate an identical distribution of the observable data. Then we would not be able to distinguish between these two parameters even with an infinite amount of data—these parameters would have been observationally equivalent.
19872:: provides a means of estimating the size and shape of the region of roughly equally-probable estimates for the population's parameter values, using the information from a single sample, using a
13679:{\displaystyle {\begin{aligned}0&={\frac {\partial }{\partial \mu }}\log {\Bigl (}{\mathcal {L}}(\mu ,\sigma ^{2}){\Bigr )}=0-{\frac {\;-2n({\bar {x}}-\mu )\;}{2\sigma ^{2}}}.\end{aligned}}}
{\displaystyle w={\underset {w}{\operatorname {arg\;min} }}\;\int _{-\infty }^{\infty }\operatorname {\mathbb {P} } ({\text{ error}}\mid x)\operatorname {\mathbb {P} } (x)\,\operatorname {d} x~}
14520:{\displaystyle {\widehat {\sigma }}^{2}={\frac {1}{n}}\sum _{i=1}^{n}(x_{i}-{\bar {x}})^{2}={\frac {1}{n}}\sum _{i=1}^{n}x_{i}^{2}-{\frac {1}{n^{2}}}\sum _{i=1}^{n}\sum _{j=1}^{n}x_{i}x_{j}.}
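The two expressions above are algebraically identical; a check on a small illustrative sample (the numbers are arbitrary):

```python
x = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]   # arbitrary sample
n = len(x)
xbar = sum(x) / n

# Centered form: (1/n) * sum (x_i - xbar)^2
var_centered = sum((xi - xbar) ** 2 for xi in x) / n
# Expanded form: (1/n) * sum x_i^2 - (1/n^2) * sum_i sum_j x_i * x_j
var_expanded = sum(xi ** 2 for xi in x) / n - sum(xi * xj for xi in x for xj in x) / n ** 2
```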
1479:{\displaystyle {\frac {\partial \ell }{\partial \theta _{1}}}=0,\quad {\frac {\partial \ell }{\partial \theta _{2}}}=0,\quad \ldots ,\quad {\frac {\partial \ell }{\partial \theta _{k}}}=0~,}
18961:{\displaystyle \mathbf {H} _{k+1}=\left(I-\gamma _{k}y_{k}s_{k}^{\mathsf {T}}\right)\mathbf {H} _{k}\left(I-\gamma _{k}s_{k}y_{k}^{\mathsf {T}}\right)+\gamma _{k}y_{k}y_{k}^{\mathsf {T}},}
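A property of this update worth verifying is the secant condition: assuming γ_k = 1/(y_kᵀs_k), the updated matrix maps the step s_k onto the gradient difference y_k, i.e. H_{k+1}s_k = y_k. A minimal stdlib sketch, with arbitrary illustrative values for H, s, and y:

```python
# Hand-rolled 2x2 linear algebra, standard library only.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B))) for j in range(len(B[0]))]
            for i in range(len(A))]

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def outer(u, v):
    return [[ui * vj for vj in v] for ui in u]

def sub(A, B):
    return [[A[i][j] - B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

def add(A, B):
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

def scale(c, A):
    return [[c * a for a in row] for row in A]

I = [[1.0, 0.0], [0.0, 1.0]]
H = [[2.0, 0.5], [0.5, 1.0]]   # current symmetric approximation (illustrative)
s = [1.0, 2.0]                 # parameter step s_k (illustrative)
y = [3.0, 1.0]                 # gradient difference y_k (illustrative)
gamma = 1.0 / sum(yi * si for yi, si in zip(y, s))

# H_{k+1} = (I - gamma*y*s^T) H_k (I - gamma*s*y^T) + gamma*y*y^T
left = sub(I, scale(gamma, outer(y, s)))
right = sub(I, scale(gamma, outer(s, y)))
H_next = add(matmul(matmul(left, H), right), scale(gamma, outer(y, y)))

Hs = matvec(H_next, s)   # should reproduce y
```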
that defines P), but even if they are not and the model we use is misspecified, the MLE will still give us the "closest" distribution (within the restriction of a model Q that depends on
matrix, which is provided by a theorem proven by Fisher. Wilks continued to improve on the generality of the theorem throughout his life, with his most general proof published in 1962.
19423:{\displaystyle B_{k+1}=B_{k}+{\frac {y_{k}y_{k}^{\mathsf {T}}}{y_{k}^{\mathsf {T}}s_{k}}}-{\frac {B_{k}s_{k}s_{k}^{\mathsf {T}}B_{k}^{\mathsf {T}}}{s_{k}^{\mathsf {T}}B_{k}s_{k}}}\ ,}
{\displaystyle f(y_{1},\ldots ,y_{n})={\frac {1}{(2\pi )^{n/2}{\sqrt {\det({\mathit {\Sigma }})}}}}\exp \left(-{\frac {1}{2}}\left[y_{1}-\mu _{1},\ldots ,y_{n}-\mu _{n}\right]{\mathit {\Sigma }}^{-1}\left[y_{1}-\mu _{1},\ldots ,y_{n}-\mu _{n}\right]^{\mathrm {T} }\right)}
8821:{\displaystyle \operatorname {\mathbb {P} } (w_{i}\mid x)={\frac {\operatorname {\mathbb {P} } (x\mid w_{i})\operatorname {\mathbb {P} } (w_{i})}{\operatorname {\mathbb {P} } (x)}}}
14784:{\displaystyle {\widehat {\sigma }}^{2}={\frac {1}{n}}\sum _{i=1}^{n}(\mu -\delta _{i})^{2}-{\frac {1}{n^{2}}}\sum _{i=1}^{n}\sum _{j=1}^{n}(\mu -\delta _{i})(\mu -\delta _{j}).}
(at least within the curved exponential family), meaning that it has minimal mean squared error among all second-order bias-corrected estimators, up to the terms of the order 1/n².
15497:{\displaystyle \log {\Bigl (}{\mathcal {L}}({\widehat {\mu }},{\widehat {\sigma }}){\Bigr )}={\frac {\,-n\;\;}{2}}{\bigl (}\,\log(2\pi {\widehat {\sigma }}^{2})+1\,{\bigr )}}
13495:{\displaystyle \log {\Bigl (}{\mathcal {L}}(\mu ,\sigma ^{2}){\Bigr )}=-{\frac {\,n\,}{2}}\log(2\pi \sigma ^{2})-{\frac {1}{2\sigma ^{2}}}\sum _{i=1}^{n}(\,x_{i}-\mu \,)^{2}}
Therefore, it is important to assess the validity of the obtained solution to the likelihood equations, by verifying that the Hessian, evaluated at the solution, is both negative definite and well-conditioned.
18372:{\displaystyle \mathbf {d} _{r}\left({\widehat {\theta }}\right)=-\mathbf {H} _{r}^{-1}\left({\widehat {\theta }}\right)\mathbf {s} _{r}\left({\widehat {\theta }}\right)}
are counts in cells / boxes 1 up to m; each box has a different probability (think of the boxes being bigger or smaller) and we fix the number of balls that fall to be
6306:{\displaystyle {\sqrt {n\,}}\,\left({\widehat {\theta \,}}_{\text{mle}}-\theta _{0}\right)\ \ \xrightarrow {d} \ \ {\mathcal {N}}\left(0,\ {\mathcal {I}}^{-1}\right)~,}
4585:. Thus, true consistency does not occur in practical applications. Nevertheless, consistency is often considered to be a desirable property for an estimator to have.
It is possible to continue this process, that is, to derive the third-order bias-correction term, and so on. However, the maximum likelihood estimator is not third-order efficient.
The goal of maximum likelihood estimation is to find the values of the model parameters that maximize the likelihood function over the parameter space, that is
of the likelihood equations is indeed a (local) maximum depends on whether the matrix of second-order partial and cross-partial derivatives, the so-called Hessian matrix, is negative semi-definite.
that maximizes some function will also be the one that maximizes some monotonic transformation of that function (e.g., adding a constant, or multiplying by a positive constant).
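This invariance is easy to verify on the binomial likelihood from the coin-toss example: the grid point maximizing L is the same one maximizing log L (the grid resolution is an arbitrary choice):

```python
from math import comb, log

def L(p):
    # Binomial likelihood for 49 heads in 80 tosses.
    return comb(80, 49) * p**49 * (1 - p)**31

grid = [i / 1000 for i in range(1, 1000)]
best_L = max(grid, key=L)
best_log = max(grid, key=lambda p: log(L(p)))
```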
The identification condition is absolutely necessary for the ML estimator to be consistent. When this condition holds, the limiting likelihood function
Because of the equivariance of the maximum likelihood estimator, the properties of the MLE apply to the restricted estimates also. For instance, in a
for finding maxima can be applied. In some cases, the first-order conditions of the likelihood function can be solved analytically; for instance, the
For example, the MLE parameters of the log-normal distribution are the same as those of the normal distribution fitted to the logarithm of the data.
5562:{\displaystyle {\sqrt {n}}\left({\widehat {\theta \,}}_{\mathrm {mle} }-\theta _{0}\right)\ \xrightarrow {d} \ {\mathcal {N}}\left(0,\,I^{-1}\right)}
In this case the MLEs could be obtained individually. In general this may not be the case, and the MLEs would have to be obtained simultaneously.
The theorem shows that the error in the logarithm of likelihood values for estimates from multiple independent observations is asymptotically χ²-distributed.
Maximizing the log-likelihood, with or without constraints, may have no closed-form solution, in which case iterative procedures must be used.
5379:{\displaystyle \sup _{\theta \in \Theta }\left\|\;{\widehat {\ell \,}}(\theta \mid x)-\ell (\theta )\;\right\|\ \xrightarrow {\text{a.s.}} \ 0.}
and if we further assume the zero-or-one loss function, which assigns the same loss to all errors, the Bayes decision rule can be reformulated as:
The DFP formula finds a solution that is symmetric, positive-definite, and closest to the current approximate value of the second-order derivative:
12861:{\displaystyle f(x\mid \mu ,\sigma ^{2})={\frac {1}{{\sqrt {2\pi \sigma ^{2}}}\ }}\exp \left(-{\frac {(x-\mu )^{2}}{2\sigma ^{2}}}\right),}
5164:{\displaystyle \sup _{\theta \in \Theta }\left|{\widehat {\ell \,}}(\theta \mid x)-\ell (\theta )\,\right|\ {\xrightarrow {\text{p}}}\ 0.}
825:{\displaystyle {\hat {\theta }}={\underset {\theta \in \Theta }{\operatorname {arg\;max} }}\,{\mathcal {L}}_{n}(\theta \,;\mathbf {y} )~.}
Compactness is only a sufficient condition and not a necessary condition. Compactness can be replaced by some other conditions, such as:
19847:: a measure of how 'good' an estimator of a distributional parameter is (be it the maximum likelihood estimator or some other estimator)
+ 1, ...}, rather than somewhere in the "middle" of the range of possible values, which would result in less bias.) The
In practice, restrictions are usually imposed using the method of
Lagrange which, given the constraints as defined above, leads to the
are independent only if their joint probability density function is the product of the individual probability density functions, i.e.
17542:{\displaystyle L(p_{1},p_{2},\ldots ,p_{m},\lambda )=\ell (p_{1},p_{2},\ldots ,p_{m})+\lambda \left(1-\sum _{i=1}^{m}p_{i}\right)}
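Solving this Lagrangian yields p̂_i = x_i/n, the observed proportions. A quick numerical check with illustrative counts (the log-likelihood below omits the constant multinomial coefficient):

```python
import math

x = [10, 5, 5]                 # illustrative counts in m = 3 boxes
n = sum(x)
p_hat = [xi / n for xi in x]   # constrained maximizer: observed proportions

def log_lik(p):
    # Multinomial log-likelihood, up to the constant multinomial coefficient.
    return sum(xi * math.log(pi) for xi, pi in zip(x, p))

# A few other probability vectors on the simplex, all strictly worse:
alternatives = [[0.45, 0.30, 0.25], [0.40, 0.30, 0.30], [1/3, 1/3, 1/3]]
```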
3438:{\displaystyle {\frac {\partial \ell }{\partial \theta }}-{\frac {\partial h(\theta )^{\mathsf {T}}}{\partial \theta }}\lambda =0}
18198:{\displaystyle \mathbf {d} _{r}\left({\widehat {\theta }}\right)=\nabla \ell \left({\widehat {\theta }}_{r};\mathbf {y} \right)}
{\displaystyle \mathbf {d} _{r}\left({\widehat {\theta }}\right)=-\left[{\frac {1}{n}}\sum _{t=1}^{n}{\frac {\partial \ln f(y_{t}\mid \theta )}{\partial \theta }}\left({\frac {\partial \ln f(y_{t}\mid \theta )}{\partial \theta }}\right)^{\mathsf {T}}\right]^{-1}\mathbf {s} _{r}\left({\widehat {\theta }}\right)}
17956:{\displaystyle {\widehat {\theta }}_{r+1}={\widehat {\theta }}_{r}+\eta _{r}\mathbf {d} _{r}\left({\widehat {\theta }}\right)}
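With the Newton direction d_r = −ℓ″(θ̂_r)⁻¹ℓ′(θ̂_r) and step size η_r = 1, this update can be run on the coin-toss example (49 heads in 80 tosses); the starting point 0.5 is an arbitrary choice:

```python
# Newton-Raphson for l(p) = 49*log(p) + 31*log(1 - p), the coin-toss
# log-likelihood: p_{r+1} = p_r - l''(p_r)^{-1} * l'(p_r).
def score(p):
    return 49 / p - 31 / (1 - p)            # first derivative l'(p)

def hessian(p):
    return -49 / p**2 - 31 / (1 - p)**2     # second derivative l''(p)

p = 0.5                                     # arbitrary starting guess
for _ in range(20):
    p = p - score(p) / hessian(p)           # eta_r = 1
# p now agrees with the analytic MLE 49/80
```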
In this and other cases where a joint density function exists, the likelihood function is defined as above, in the section "
5059:, the dominance condition together with continuity establish the uniform convergence in probability of the log-likelihood:
but in general no closed-form solution to the maximization problem is known or available, and an MLE can only be found via numerical optimization.
observations. In the non-i.i.d. case, the uniform convergence in probability can be checked by showing that the sequence
4678:{\displaystyle \theta \neq \theta _{0}\quad \Leftrightarrow \quad f(\cdot \mid \theta )\neq f(\cdot \mid \theta _{0}).}
is a model, often in idealized form, of the process generated by the data. It is a common aphorism in statistics that all models are wrong.
around any estimate of the parameters. The only difficult part of Wilks' proof depends on the expected value of the
To calculate its expected value, it is convenient to rewrite the expression in terms of zero-mean random variables (statistical errors).
20583:), the relationship between maximizing the likelihood and minimizing the cross-entropy, URL (version: 2019-11-06):
8644:{\displaystyle \;\operatorname {\mathbb {P} } ({\text{ error}}\mid x)=\operatorname {\mathbb {P} } (w_{2}\mid x)\;}
521:{\displaystyle {\mathcal {L}}_{n}(\theta )={\mathcal {L}}_{n}(\theta ;\mathbf {y} )=f_{n}(\mathbf {y} ;\theta )\;,}
17355:{\displaystyle \ell (p_{1},p_{2},\ldots ,p_{m})=\log n!-\sum _{i=1}^{m}\log x_{i}!+\sum _{i=1}^{m}x_{i}\log p_{i}}
correspond to different distributions within the model. If this condition did not hold, there would be some value
giving us the Fisher scoring algorithm. This procedure is standard in the estimation of many methods, such as generalized linear models.
8531:{\displaystyle \operatorname {\mathbb {P} } ({\text{ error}}\mid x)=\operatorname {\mathbb {P} } (w_{1}\mid x)~}
8052:. Thus the Bayesian estimator coincides with the maximum likelihood estimator for a uniform prior distribution
Using these formulae it is possible to estimate the second-order bias of the maximum likelihood estimator, and correct for that bias by subtracting it:
20414:(1994). "Chapter 36: Large sample estimation and hypothesis testing". In Engle, Robert; McFadden, Dan (eds.).
7391:{\displaystyle {\widehat {\theta \,}}_{\text{mle}}^{*}={\widehat {\theta \,}}_{\text{mle}}-{\widehat {b\,}}~.}
As the sample size increases to infinity, sequences of maximum likelihood estimators have these properties:
Intuitively, this selects the parameter values that make the observed data most probable. The specific value
and suppose the coin was taken from a box containing three coins: one which gives heads with probability 1/3, one which gives heads with probability 1/2, and another which gives heads with probability 2/3.
near an optimum. However, BFGS can have acceptable performance even for non-smooth optimization instances
5046:{\displaystyle {\Bigl |}\ln f(x\mid \theta ){\Bigr |}<D(x)\quad {\text{ for all }}\theta \in \Theta .}
8235:{\displaystyle ~\operatorname {\mathbb {P} } (w_{1}|x)\;>\;\operatorname {\mathbb {P} } (w_{2}|x)~;~}
is one to one and does not depend on the parameters to be estimated, then the density functions satisfy
3845:{\displaystyle {\widehat {\ell \,}}(\theta \,;x)={\frac {1}{n}}\sum _{i=1}^{n}\ln f(x_{i}\mid \theta ),}
Reviews of the development of maximum likelihood estimation have been provided by a number of authors.
are available, but the most commonly used ones are algorithms based on an updating formula of the form
are predictions of different classes. From a perspective of minimizing error, it can also be stated as
{\displaystyle {\frac {1}{\,2\,}}\,K_{ijk}\;+\;J_{j,ik}\;=\;\operatorname {\mathbb {E} } \,{\biggl [}\,{\frac {1}{2}}{\frac {\partial ^{3}\ln f_{\theta }(y_{t})}{\partial \theta _{i}\,\partial \theta _{j}\,\partial \theta _{k}}}+{\frac {\partial \ln f_{\theta }(y_{t})}{\partial \theta _{j}}}\,{\frac {\partial ^{2}\ln f_{\theta }(y_{t})}{\partial \theta _{i}\,\partial \theta _{k}}}\,{\biggr ]}~.}
sometimes need to be incorporated into the estimation process. The parameter space can be expressed as
712:{\displaystyle f_{n}(\mathbf {y} ;\theta )=\prod _{k=1}^{n}\,f_{k}^{\mathsf {univar}}(y_{k};\theta )~.}
Other quasi-Newton methods use more elaborate secant updates to give an approximation of the Hessian matrix.
Each box taken separately against all the other boxes is a binomial and this is an extension thereof.
{\displaystyle f(y_{1},y_{2})={\frac {1}{2\pi \sigma _{1}\sigma _{2}{\sqrt {1-\rho ^{2}}}}}\exp \left[-{\frac {1}{2(1-\rho ^{2})}}\left({\frac {(y_{1}-\mu _{1})^{2}}{\sigma _{1}^{2}}}-{\frac {2\rho (y_{1}-\mu _{1})(y_{2}-\mu _{2})}{\sigma _{1}\sigma _{2}}}+{\frac {(y_{2}-\mu _{2})^{2}}{\sigma _{2}^{2}}}\right)\right]}
19900:(MAP) estimator: for a contrast in the way to calculate estimators when prior knowledge is postulated
is the probability of the data averaged over all parameters. Since the denominator is independent of
when the sample size tends to infinity. This means that no consistent estimator has lower asymptotic
19862:: yields a process for finding the best possible unbiased estimator (in the sense of having minimal
in the place of 80 to represent the number of
Bernoulli trials. Exactly the same calculation yields
is a uniform distribution, the
Bayesian estimator is obtained by maximizing the likelihood function
11492:(the "probability of success"), the likelihood function (defined below) takes one of three values:
4483:{\displaystyle {\widehat {\theta \,}}_{\mathrm {mle} }\ {\xrightarrow {\text{a.s.}}}\ \theta _{0}.}
It may be the case that variables are correlated, that is, not independent. Two random variables
then, as a practical matter, means to find the maximum of the likelihood function subject to the
known as the likelihood equations. For some models, these equations can be explicitly solved for
Introduction to Statistical Inference | Stanford (Lecture 16 — MLE under model misspecification)
4391:{\displaystyle {\widehat {\theta \,}}_{\mathrm {mle} }\ {\xrightarrow {\text{p}}}\ \theta _{0}.}
20241:"Why we always put log() before the joint pdf when we use MLE (Maximum likelihood Estimation)?"
15331:{\displaystyle {\widehat {\theta \,}}=\left({\widehat {\mu }},{\widehat {\sigma }}^{2}\right).}
+ 1)/2. As a result, with a sample size of 1, the maximum likelihood estimator for
9135:, to the real probability distribution from which our data were generated (i.e., generated by
7931:{\displaystyle f(x_{1},x_{2},\ldots ,x_{n}\mid \theta )\operatorname {\mathbb {P} } (\theta )}
1164:{\displaystyle \ell (\theta \,;\mathbf {y} )=\ln {\mathcal {L}}_{n}(\theta \,;\mathbf {y} )~.}
however, between 1912 and 1922, who singlehandedly created the modern version of the method.
13282:{\displaystyle {\mathcal {L}}(\mu ,\sigma ^{2})=f(x_{1},\ldots ,x_{n}\mid \mu ,\sigma ^{2})}
5432:, then under certain conditions, it can also be shown that the maximum likelihood estimator
21304:"On the history of maximum likelihood in relation to inverse probability and least squares"
20985:"On the History of Maximum Likelihood in Relation to Inverse Probability and Least Squares"
Schwallie, Daniel P. (1985). "Positive definite maximum likelihood covariance estimators".
2557:{\displaystyle \Theta =\left\{\theta :\theta \in \mathbb {R} ^{k},\;h(\theta )=0\right\}~,}
then under certain conditions, it can also be shown that the maximum likelihood estimator
21016:"The large-sample distribution of the likelihood ratio for testing composite hypotheses"
{\displaystyle h_{\text{Bayes}}={\underset {w}{\operatorname {arg\;max} }}\,{\bigl [}\operatorname {\mathbb {P} } (x\mid w)\operatorname {\mathbb {P} } (w){\bigr ]}\;,}
8091:
23965:
23776:
23630:
23526:
23475:
23351:
23248:
23232:
23209:
22986:
22720:
22703:
22663:
22574:
22469:
22431:
22402:
22362:
22322:
22268:
22185:
21871:
21866:
21525:
21473:
21392:
21325:
21204:
21184:
21166:
21125:
20920:
20881:
20821:
20598:
20552:
20483:
19863:
19844:
19838:
19832:
19813:
19790:
19752:
18430:
18213:
16602:
16132:
14237:{\displaystyle {\widehat {\sigma }}^{2}={\frac {1}{n}}\sum _{i=1}^{n}(x_{i}-\mu )^{2}.}
with sample size equal to 80, number successes equal to 49 but for different values of
corresponding component of the MLE of the complete parameter. Consistent with this, if
than the MLE (or other estimators attaining this bound), which also means that MLE has
{\displaystyle \operatorname {\mathbb {E} } {\bigl [}\,{\widehat {\sigma }}^{2}\,{\bigr ]}={\frac {\,n-1\,}{n}}\sigma ^{2}.}
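This bias is easy to see in a Monte Carlo experiment; the sample size, replication count, and seed below are illustrative choices:

```python
import random

random.seed(0)                 # illustrative, for reproducibility
n, reps = 5, 20000             # small samples, many replications

def sigma2_mle():
    # MLE variance of one standard-normal sample of size n (true sigma^2 = 1).
    x = [random.gauss(0.0, 1.0) for _ in range(n)]
    xbar = sum(x) / n
    return sum((xi - xbar) ** 2 for xi in x) / n

avg = sum(sigma2_mle() for _ in range(reps)) / reps
# avg is close to (n - 1)/n * sigma^2 = 0.8, not 1.0: the MLE is biased.
```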
21145:"F. Y. Edgeworth and R. A. Fisher on the efficiency of maximum likelihood estimation"
12156:{\displaystyle L(p)=f_{D}(\mathrm {H} =49\mid p)={\binom {80}{49}}p^{49}(1-p)^{31}~,}
The MLE is also equivariant with respect to certain transformations of the data. If
11074:{\displaystyle h_{\theta }(x)=\log {\frac {P(x\mid \theta _{0})}{P(x\mid \theta )}}}
5274:, then a stronger condition of uniform convergence almost surely has to be imposed:
19945:: a variation using a likelihood function calculated from a transformed set of data
19853:: a method to estimate parameters of a mathematical model given data that contains
that is not necessarily a local or global maximum, but rather a local minimum or a saddle point.
13810:{\displaystyle {\widehat {\mu }}={\bar {x}}=\sum _{i=1}^{n}{\frac {\,x_{i}\,}{n}}.}
5883:{\displaystyle {\bar {L}}(\alpha )=\sup _{\theta :\alpha =g(\theta )}L(\theta ).\,}
18065:(Note: here it is a maximization problem, so the sign before gradient is flipped)
17678:{\displaystyle {\frac {\partial \ell (\theta ;\mathbf {y} )}{\partial \theta }}=0}
This is indeed the maximum of the function, since it is the only turning point in
3209:{\displaystyle \;\phi _{i}=h_{i}(\theta _{1},\theta _{2},\ldots ,\theta _{k})~.}
902:{\displaystyle ~{\hat {\theta }}={\hat {\theta }}_{n}(\mathbf {y} )\in \Theta ~}
The BFGS method is not guaranteed to converge unless the function has a quadratic Taylor expansion
The constraint has to be taken into account by using Lagrange multipliers:
{\displaystyle {\mathcal {I}}_{jk}=\operatorname {\mathbb {E} } \,{\biggl [}\,{\frac {\partial \ln f_{\theta }(y_{t})}{\partial \theta _{j}}}\,{\frac {\partial \ln f_{\theta }(y_{t})}{\partial \theta _{k}}}\,{\biggr ]}~.}
problem is the method of substitution, that is "filling out" the restrictions
Method of estimating the parameters of a statistical model, given observations
19888:: methods related to the likelihood equation in maximum likelihood estimation
Suppose the coin is tossed 80 times: i.e. the sample might be something like
3951:: the sequence of MLEs converges in probability to the value being estimated.
the likelihood function may increase without ever reaching a supremum value.
Unifying Political Methodology: The Likelihood Theory of Statistical Inference
Setting all the derivatives to zero yields the most natural estimate
19841:: information matrix, its relationship to covariance matrix of ML estimates
The normal log-likelihood at its maximum takes a particularly simple form:
is constant, then the MLE is also asymptotically minimizing cross entropy.
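The equivalence can be seen numerically in a small Bernoulli sketch in Python (the simulated data and grid are illustrative assumptions): minimizing the empirical cross-entropy over a grid picks out the same parameter as the closed-form MLE, the sample frequency.

```python
import math
import random

random.seed(1)
data = [1 if random.random() < 0.7 else 0 for _ in range(5000)]  # simulated Bernoulli(0.7)

def empirical_cross_entropy(p):
    # Average negative log-likelihood of the data under Bernoulli(p).
    return -sum(x * math.log(p) + (1 - x) * math.log(1.0 - p)
                for x in data) / len(data)

grid = [i / 1000.0 for i in range(1, 1000)]
p_min_ce = min(grid, key=empirical_cross_entropy)  # minimize cross-entropy
p_mle = sum(data) / len(data)                      # closed-form MLE: sample frequency
```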
{\displaystyle {\mathcal {I}}(\theta )=\operatorname {\mathbb {E} } \left[-\mathbf {H} (\theta )\right]}
in the place of 49 to represent the observed number of 'successes' of our
9169:). In an ideal world, P and Q are the same (and the only thing unknown is
that maximizes the likelihood is asymptotically equivalent to finding the
model maximizes the likelihood when the random errors are assumed to have
https://stats.stackexchange.com/users/22311/sycorax-says-reinstate-monica
20077:(2nd ed.). Cambridge: Cambridge University Press. pp. 651–655.
Chambers, Raymond L.; Steel, David G.; Wang, Suojin; Welsh, Alan (2012).
This maximum log-likelihood can be shown to be the same for more general
Under the conditions outlined below, the maximum likelihood estimator is
Statistics on the table: the history of statistical concepts and methods
Multinomial Probit: The Theory and its Application to Demand Forecasting
7834:{\displaystyle \operatorname {\mathbb {P} } (x_{1},x_{2},\ldots ,x_{n})}
The continuity here can be replaced with a slightly weaker condition of upper semi-continuity.
of the log-likelihood function and compactness of some (nonempty) upper
Press, W.H.; Flannery, B.P.; Teukolsky, S.A.; Vetterling, W.T. (1992).
17736:{\displaystyle {\widehat {\theta }}={\widehat {\theta }}(\mathbf {y} )}
Nonparametric maximum likelihood estimation can be performed using the empirical likelihood.
Mathematical Statistics: An Introduction to Likelihood Based Inference
estimator: an MLE estimator that is misspecified, but still consistent
18483:{\displaystyle \mathbf {H} _{r}^{-1}\left({\widehat {\theta }}\right)}
-consistent and asymptotically efficient, meaning that it reaches the Cramér–Rao bound.
differ only by a factor that does not depend on the model parameters.
the log-likelihood function is less than the maximum by at least some
1530:. Another problem is that in finite samples, there may exist multiple
The history of statistics: the measurement of uncertainty before 1900
13289:, over both parameters simultaneously, or if possible, individually.
Application of maximum-likelihood estimation in Bayes decision theory
of the maximum likelihood estimator is equal to zero up to the order 1/√n.
5785:{\displaystyle {\widehat {\alpha }}=g(\,{\widehat {\theta \,}}\,).\,}
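A small Python sketch of this invariance property (the simulated data are an illustrative assumption): the MLE of the standard deviation is obtained by applying g(v) = √v to the MLE of the variance.

```python
import math
import random

random.seed(2)
data = [random.gauss(0.0, 3.0) for _ in range(2000)]  # simulated normal sample
n = len(data)

mu_hat = sum(data) / n
# MLE of the variance uses the 1/n normalization (not the unbiased 1/(n-1)).
var_hat = sum((x - mu_hat) ** 2 for x in data) / n

# Invariance: for g(v) = sqrt(v), the MLE of sigma = g(sigma^2) is g(var_hat).
sigma_hat = math.sqrt(var_hat)
```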
19522:{\displaystyle y_{k}=\nabla \ell (x_{k}+s_{k})-\nabla \ell (x_{k}),}
BFGS also gives a solution that is symmetric and positive-definite:
19060:{\displaystyle y_{k}=\nabla \ell (x_{k}+s_{k})-\nabla \ell (x_{k}),}
Relation to minimizing Kullback–Leibler divergence and cross entropy
3936:, where this expectation is taken with respect to the true density.
994:{\displaystyle \;{\hat {\theta }}_{n}:\mathbb {R} ^{n}\to \Theta \;}
is called the maximum likelihood estimate. Further, if the function
th iteration. But because the calculation of the Hessian matrix is computationally costly
This result is easily generalized by substituting a letter such as
To establish consistency, the following conditions are sufficient.
with arbitrary precision. In mathematical terms this means that as
2997:{\displaystyle \;h_{1},h_{2},\ldots ,h_{r},h_{r+1},\ldots ,h_{k}\;}
303:{\displaystyle \;\{f(\cdot \,;\theta )\mid \theta \in \Theta \}\;,}
. This is often used in determining likelihood-based approximate confidence intervals and confidence regions.
We now compute the derivatives of this log-likelihood as follows.
20682:(Fourth ed.). College Station: Stata Press. pp. 13–20.
Myung, I.J. (2003). "Tutorial on maximum likelihood Estimation".
18008:{\displaystyle \mathbf {d} _{r}\left({\widehat {\theta }}\right)}
to itself, and reparameterize the likelihood function by setting
condition for its existence is for the likelihood function to be
19917:: another popular method for finding parameters of distributions
Likelihood function for proportion value of a binomial process (
For simplicity of notation, let's assume that P=Q. Let there be
Additionally, if (as assumed above) the data were generated by
Maximum Likelihood for Social Science: Strategies for Analysis
Maximum Likelihood for Social Science: Strategies for Analysis
Similarly we differentiate the log-likelihood with respect to
which is the maximum likelihood estimator for any sequence of
this being the sample analogue of the expected log-likelihood
21053:. London, UK; Boca Raton, FL: Chapman & Hall; CRC Press.
Numerical Recipes in FORTRAN: The Art of Scientific Computing
{\displaystyle \operatorname {\mathbb {E} } {\bigl [}\,{\widehat {\mu }}\,{\bigr ]}=\mu ,\,}
4150:. The invariance property holds for arbitrary transformation
21345:"R.A. Fisher and the making of maximum likelihood 1912–1922"
20541:(lecture). Bayesian Decision Theory - CS 7616. Georgia Tech.
This is a product of three terms. The first term is 0 when
Under slightly stronger conditions, the estimator converges
{\displaystyle \ell (\theta )=\operatorname {\mathbb {E} } {\bigl [}\,\ln f(x_{i}\mid \theta )\,{\bigr ]}}
410:{\displaystyle \;\mathbf {y} =(y_{1},y_{2},\ldots ,y_{n})\;}
20555:), Kullback–Leibler divergence, URL (version: 2017-11-18):
Simplifying the expression above, utilizing the facts that
4224:. The consistency means that if the data were generated by
342:. Evaluating the joint density at the observed data sample
21649:"maxLik: A package for maximum likelihood estimation in R"
Another popular method is to replace the Hessian with the Fisher information matrix,
case, the joint probability density function is given by:
and the second derivative is strictly less than zero. Its
for the likelihood equations. Whether the identified root
20799:. Englewood Cliffs, NJ: Prentice-Hall. pp. 293–294.
19906:: a related method that is more robust in many situations
19866:); the MLE is often a good starting place for the process
19835:: a more general class of estimators to which MLE belongs
19829:: a criterion to compare statistical models, based on MLE
Although popular, quasi-Newton methods may converge to a stationary point
above). Suppose the outcome is 49 heads and 31 tails.
on the drawn ticket, and therefore the expected value of
of the distribution of this estimator, it turns out that
3285:{\displaystyle \;\Sigma =\Gamma ^{\mathsf {T}}\Gamma \;,}
Maximum likelihood vs least squares in linear regression
20604:(Second ed.). New York, NY: John Wiley & Sons.
20503:"Third-order efficiency implies fourth-order efficiency"
19128:{\displaystyle \gamma _{k}={\frac {1}{y_{k}^{T}s_{k}}},}
18506:, numerous alternatives have been proposed. The popular
. The solution that maximizes the likelihood is clearly p = 49/80.
11105:. The first several transitions have to do with laws of
9339:{\displaystyle \mathbf {y} =(y_{1},y_{2},\ldots ,y_{n})}
8045:{\displaystyle f(x_{1},x_{2},\ldots ,x_{n}\mid \theta )}
However, when we consider the higher-order terms in the
{\displaystyle \operatorname {\mathbb {P} } {\Bigl [}\lim _{n\to \infty }{\widehat {\theta \,}}_{n}=\theta _{0}{\Bigr ]}=1.}
and we have a sufficiently large number of observations
independent and identically distributed random variables
17840:{\displaystyle \left\{{\widehat {\theta }}_{r}\right\}}
are not independent, the joint probability of a vector
{\displaystyle \operatorname {\mathbb {E} } {\bigl [}\,\delta _{i}\,{\bigr ]}=0}
The dominance condition can be employed in the case of
21489:. Amsterdam, NL: VU University Press. pp. 53–68.
Econometric Applications of Maximum Likelihood Methods
A history of mathematical statistics from 1750 to 1930
Gould, William; Pitblado, Jeffrey; Poi, Brian (2010).
18422:{\displaystyle \mathbf {s} _{r}({\widehat {\theta }})}
{\displaystyle \operatorname {E} {\bigl [}\,\delta _{i}^{2}\,{\bigr ]}=\sigma ^{2}}
12602: = 1 result in a likelihood of 0). Thus the
8081:{\displaystyle \operatorname {\mathbb {P} } (\theta )}
7971:{\displaystyle \operatorname {\mathbb {P} } (\theta )}
7755:{\displaystyle \operatorname {\mathbb {P} } (\theta )}
In practical applications, data is never generated by
In practice, it is often convenient to work with the
https://stats.stackexchange.com/users/177679/cmplx96
Numerical Methods for Nonlinear Estimating Equations
14580:. Expressing the estimate in these variables yields
9020:{\displaystyle \;\operatorname {\mathbb {P} } (w)\;}
This estimator is unbiased up to the terms of order 1/n,
6037:{\displaystyle f_{Y}(y)={\frac {f_{X}(x)}{|g'(x)|}}}
5232:. If one wants to demonstrate that the ML estimator
4136:{\displaystyle {\hat {\alpha }}=g({\hat {\theta }})}
139:, with the objective function being the likelihood.
20218:. New York, NY: John Wiley & Sons. p. 14.
of the log-likelihood function, both evaluated at the
17622:Except for special cases, the likelihood equations
Continuous distribution, continuous parameter space
Suppose one wishes to determine just how biased an unfair coin is.
are placed in a box and one is selected at random (
7845:, the Bayesian estimator is obtained by maximizing
5221:{\displaystyle {\widehat {\ell \,}}(\theta \mid x)}
Compactness: the parameter space Θ of the model is compact.
1346:for the occurrence of a maximum (or a minimum) are
21189:Journal of the Royal Statistical Society, Series A
21187:(1978). "Francis Ysidro Edgeworth, statistician".
20475:Journal of the Royal Statistical Society, Series B
20343:. Amsterdam: VU University Press. pp. 64–65.
19766:Maximum-likelihood estimation finally transcended
19521:
15818:. The joint probability density function of these
13903:which means that the maximum likelihood estimator
11225:is unknown, then the maximum likelihood estimator
A maximum likelihood estimator coincides with the most probable Bayesian estimator given a uniform prior distribution on the parameters.
4211:Second-order efficiency after correction for bias.
21414:Maximum Likelihood Estimation: Logic and Practice
20071:"Least Squares as a Maximum Likelihood Estimator"
19923:, a variation of the maximum likelihood technique
17604:{\displaystyle {\hat {p}}_{i}={\frac {x_{i}}{n}}}
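This closed form is just the vector of relative frequencies, as in the following minimal Python sketch (the category draws are a made-up illustrative sample):

```python
from collections import Counter

draws = list("aabacbbacaabbbca")  # hypothetical draws from three categories
counts = Counter(draws)
n = len(draws)

# Multinomial MLE: each cell probability is its relative frequency x_i / n.
p_hat = {cat: counts[cat] / n for cat in counts}
```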
15672:{\displaystyle f(y_{1},y_{2})=f(y_{1})f(y_{2})\,}
13168:This family of distributions has two parameters:
12166:and the maximisation is over all possible values
12020:Now suppose that there was only one coin but its
12016:Discrete distribution, continuous parameter space
6542:Second-order efficiency after correction for bias
3722:{\displaystyle {\widehat {\ell \,}}(\theta \,;x)}
2840:Theoretically, the most natural approach to this
1211:{\displaystyle \;\ell (\theta \,;\mathbf {y} )\;}
20739:. Cambridge: Harvard University Press. pp.
20559:(at the youtube video, look at minutes 13 to 25)
20443:. New York: John Wiley & Sons. p. 223.
20018:Maximum Likelihood Estimation for Sample Surveys
19967:. New York: John Wiley & Sons. p. 227.
(Note: the log-likelihood is closely related to information entropy and Fisher information.)
12707:{\displaystyle {\mathcal {N}}(\mu ,\sigma ^{2})}
11302:occurs at the lower extreme of possible values {
6098:As assumed above, if the data were generated by
2464:While the domain of the likelihood function—the
1013:, i.e. taking a given sample as its argument. A
61:, given some observed data. This is achieved by
21456:(1990). "Maximum likelihood: An Introduction".
20901:"On the probable errors of frequency-constants"
20862:"On the probable errors of frequency-constants"
20215:Geometrical Foundations of Asymptotic Inference
11466:and another which gives heads with probability
11411: = T, and the count of the number of
11298:. Note that the maximum likelihood estimate of
3249:; this restriction can be imposed by replacing
{\displaystyle \;\theta =\left[\theta _{1},\,\theta _{2},\,\ldots ,\,\theta _{k}\right]^{\mathsf {T}}\;}
21660:. Mathematical Sciences / College of Science.
20127:Gourieroux, Christian; Monfort, Alain (1995).
19785:, which enables convenient determination of a
captures the "step length," also known as the learning rate.
11262:on the drawn ticket. (The likelihood is 0 for
{\displaystyle ~\lambda =\left[\lambda _{1},\lambda _{2},\ldots ,\lambda _{r}\right]^{\mathsf {T}}~}
1009:. It is generally a function defined over the
21485:Magnus, Jan R. (2017). "Maximum Likelihood".
20771:. Oxford: Basil Blackwell. pp. 161–169.
20767:(1988). "Methods of Numerical Optimization".
20706:Machine Learning: A Probabilistic Perspective
20472:(1968). "A general definition of residuals".
18103:{\displaystyle \eta _{r}\in \mathbb {R} ^{+}}
17803:), one seeks to obtain a convergent sequence
17688:cannot be solved explicitly for an estimator
16592:{\displaystyle X_{1},\ X_{2},\ldots ,\ X_{m}}
11367:Discrete distribution, finite parameter space
11149:plus KL divergence, and since the entropy of
2898:{\displaystyle \;h_{1},h_{2},\ldots ,h_{r}\;}
—is generally a finite-dimensional subset of Euclidean space
1307:{\displaystyle \ell (\theta \,;\mathbf {y} )}
21647:Toomet, Ott; Henningsen, Arne (2019-05-19).
21397:. New York, NY: Cambridge University Press.
20769:Lecture Notes on Advanced Econometric Theory
20657:. New York: Academic Press. pp. 61–78.
20269:. London, UK: Chapman and Hall. p. 79.
20191:. Oxford University Press. pp. 74–124.
20183:Small, Christoper G.; Wang, Jinfang (2003).
16868:is called the multinomial and has the form:
14573:{\displaystyle \delta _{i}\equiv \mu -x_{i}}
12882:normal random variables (the likelihood) is
7762:is the prior distribution for the parameter
4945:integrable with respect to the distribution
4190:is restricted to one-to-one transformations.
4055:, then the maximum likelihood estimator for
21506:Maximum Likelihood Estimation and Inference
20796:Nonlinear Programming: Analysis and Methods
20632:(Second ed.). New York, NY: Springer.
19993:Econometric Modeling: A Likelihood Approach
16861:{\displaystyle x_{1},\ x_{2},\ldots ,x_{m}}
16769:{\displaystyle p_{1}+p_{2}+\cdots +p_{m}=1}
16677:{\displaystyle x_{1}+x_{2}+\cdots +x_{m}=n}
15783:{\displaystyle (\mu _{1},\ldots ,\mu _{n})}
12032:The likelihood function to be maximised is
9131:) that has a minimal distance, in terms of
8107:Thus, the Bayes Decision Rule is stated as
7423:bias-corrected maximum likelihood estimator
4688:In other words, different parameter values
4271:, then it is possible to find the value of
3654:Nonparametric maximum likelihood estimation
1519:{\displaystyle \,{\widehat {\theta \,}}\,,}
1005:, then it is called the maximum likelihood
576:{\displaystyle f_{n}(\mathbf {y} ;\theta )}
147:We model a set of observations as a random
21487:Introduction to the Theory of Econometrics
20341:Introduction to the Theory of Econometrics
20239:Papadopoulos, Alecos (25 September 2013).
20168:. New York, NY: Harper & Row. p.
19743:Early users of maximum likelihood include
19203:Broyden–Fletcher–Goldfarb–Shanno algorithm
15246:{\displaystyle \theta =(\mu ,\sigma ^{2})}
6143:{\displaystyle ~f(\cdot \,;\theta _{0})~,}
4143:. This property is less commonly known as
21550:. New York, NY: Oxford University Press.
19995:. Princeton: Princeton University Press.
19932:Partial likelihood methods for panel data
18110:that is small enough for convergence and
15738:, where each variable has means given by
11452:, one which gives heads with probability
9097:that defines a probability distribution (
7494:given the data, given by Bayes' theorem:
3674:obtained by maximizing, as a function of
248:so that this distribution falls within a
20905:Journal of the Royal Statistical Society
20866:Journal of the Royal Statistical Society
20680:Maximum Likelihood Estimation with Stata
20585:https://stats.stackexchange.com/q/364237
20557:https://stats.stackexchange.com/q/314472
20507:Journal of the Japan Statistical Society
20418:. Elsevier Science. pp. 2111–2245.
20385:. Elsevier Science. pp. 2111–2245.
20048:. New York: Cambridge University Press.
17796:{\displaystyle {\widehat {\theta }}_{1}}
15686:Gaussian vector out of random variables
15165:{\displaystyle {\widehat {\sigma }}^{2}}
15053:{\displaystyle {\widehat {\sigma }}^{2}}
12566: = 1. The third is zero when
12188:One way to maximize this function is by
11375:is. Call the probability of tossing a '
9415:that will maximize the likelihood using
6571:. This bias is equal to (componentwise)
5436:to a normal distribution. Specifically,
3986:is the maximum likelihood estimator for
155:which is expressed in terms of a set of
21078:. New York, NY: John Wiley & Sons.
20832:. London, UK: Academic Press. pp.
20292:"Does the MLE maximize the likelihood?"
19894:: an approach used in robust statistics
6047:and hence the likelihood functions for
5425:{\displaystyle f(\cdot \,;\theta _{0})}
4572:{\displaystyle f(\cdot \,;\theta _{0})}
4529:{\displaystyle f(\cdot \,;\theta _{0})}
4260:{\displaystyle f(\cdot \,;\theta _{0})}
3731:independent and identically distributed
1079:of the likelihood function, called the
909:that maximizes the likelihood function
21531:. Norwich: W. H. Hutchins & Sons.
21527:An Introduction to Likelihood Analysis
20212:Kass, Robert E.; Vos, Paul W. (1997).
20135:. Cambridge University Press. p.
justification in a proof published by Samuel S. Wilks in 1938
14275:{\displaystyle \mu ={\widehat {\mu }}}
12562: = 0. The second is 0 when
7942:. If we further assume that the prior
5616:{\displaystyle {\widehat {\theta \,}}}
5257:{\displaystyle {\widehat {\theta \,}}}
4307:{\displaystyle {\widehat {\theta \,}}}
2426:{\displaystyle {\widehat {\theta \,}}}
1269:{\displaystyle \,{\mathcal {L}}_{n}~.}
935:{\displaystyle \,{\mathcal {L}}_{n}\,}
112:distributions with the same variance.
20709:. Cambridge: MIT Press. p. 247.
15731:{\displaystyle (y_{1},\ldots ,y_{n})}
11383:. The goal then becomes to determine
9386:, that we try to estimate by finding
9379:{\displaystyle y\sim P_{\theta _{0}}}
A maximum likelihood estimator is an extremum estimator
3338:{\displaystyle \Gamma ^{\mathsf {T}}}
121:maximum a posteriori (MAP) estimation
20164:Economic Statistics and Econometrics
19585:{\displaystyle s_{k}=x_{k+1}-x_{k}.}
19191:{\displaystyle s_{k}=x_{k+1}-x_{k}.}
18514:of the expected gradient, such that
17747:: starting from an initial guess of
15194:{\displaystyle {\widehat {\sigma }}}
15109:{\displaystyle {\widehat {\sigma }}}
2744:{\displaystyle \;\mathbb {R} ^{r}~.}
2708:{\displaystyle \,\mathbb {R} ^{k}\,}
{\displaystyle \;h(\theta )=\left[h_{1}(\theta ),h_{2}(\theta ),\ldots ,h_{r}(\theta )\right]\;}
20441:The Theory of Statistical Inference
15811:{\displaystyle {\mathit {\Sigma }}}
12880:independent identically distributed
11221:); thus, the sample size is 1. If
11103:law of the unconscious statistician
6915:{\displaystyle {\mathcal {I}}^{-1}}
6873:{\displaystyle {\mathcal {I}}^{jk}}
4170:, although the proof simplifies if
{\displaystyle \;h^{\ast }=\left[h_{r+1},h_{r+2},\ldots ,h_{k}\right]\;}
20899:Edgeworth, Francis Y. (Dec 1908).
20131:Statistics and Econometrics Models
20101:Journal of Mathematical Psychology
18510:approximates the Hessian with the
18508:Berndt–Hall–Hall–Hausman algorithm
. Instead, they need to be solved iteratively
11355:will systematically underestimate
9124:{\displaystyle Q_{\hat {\theta }}}
In many practical applications in machine learning
7490:that maximizes the probability of
5720:{\displaystyle \alpha =g(\theta )}
4768:of the log-likelihood function, or
4083:{\displaystyle \alpha =g(\theta )}
will be the product of univariate density functions:
21021:Annals of Mathematical Statistics
20820:Gill, Philip E.; Murray, Walter;
20600:Practical Methods of Optimization
16684:. The probability of each box is
13925:{\displaystyle {\widehat {\mu }}}
13183:; so we maximize the likelihood,
11990:The likelihood is maximized when
8311:{\displaystyle \;w_{1}\,,w_{2}\;}
This bias-corrected estimator is second-order efficient,
7302:for that bias by subtracting it:
6880:(with superscripts) denotes the (
6510:In particular, it means that the
4728:|·) has unique global maximum at
338:, a finite-dimensional subset of
119:, MLE is generally equivalent to
21548:Likelihood Methods in Statistics
. Many methods for this kind of optimization problem are available,
15824:multivariate normal distribution
15822:random variables then follows a
15682:Suppose one constructs an order-
11131:{\displaystyle {\hat {\theta }}}
9408:{\displaystyle {\hat {\theta }}}
9211:{\displaystyle {\hat {\theta }}}
9090:{\displaystyle {\hat {\theta }}}
9061:{\displaystyle {\hat {\theta }}}
8982:{\displaystyle h_{\text{Bayes}}}
6339:{\displaystyle ~{\mathcal {I}}~}
6154:to a normal distribution. It is
3979:{\displaystyle {\hat {\theta }}}
3218:multivariate normal distribution
3114:{\displaystyle \mathbb {R} ^{k}}
21633:"Maximum Likelihood Estimation"
21621:"Maximum Likelihood Estimation"
20579:Sycorax says Reinstate Monica (
20416:Handbook of Econometrics, Vol.4
20383:Handbook of Econometrics, Vol.4
18761:Davidon–Fletcher–Powell formula
17183:The log-likelihood of this is:
11176:{\displaystyle P_{\theta _{0}}}
11085:helps see how we are using the
9245:{\displaystyle P_{\theta _{0}}}
9162:{\displaystyle P_{\theta _{0}}}
4282:goes to infinity the estimator
3477:{\displaystyle h(\theta )=0\;,}
3354:restricted likelihood equations
2831:{\displaystyle ~h(\theta )=0~.}
21572:. Cambridge University Press.
21438:. Cambridge University Press.
19915:Method of moments (statistics)
16776:. This is a case in which the
15024:This means that the estimator
12653:Bernoulli trials resulting in
11340:{\displaystyle {\widehat {n}}}
11247:{\displaystyle {\widehat {n}}}
11122:
11065:
11053:
11045:
11026:
11008:
11002:
10965:
10932:
10871:
10859:
10851:
10832:
10817:
10811:
10750:
10744:
10731:
10725:
10663:
10660:
10654:
10641:
10599:
10592:
10586:
10573:
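The normal-distribution example can be sketched in a few lines of code. This is an illustrative example with made-up data, not taken from the article; it computes the two maximum likelihood estimates and, for contrast, the familiar unbiased variance estimator.

```python
# Hypothetical data set.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(data)

mu_hat = sum(data) / n                              # MLE of the mean: sample mean
var_hat = sum((x - mu_hat) ** 2 for x in data) / n  # MLE of the variance: divisor n

# The unbiased estimator divides by n - 1 instead:
var_unbiased = sum((x - mu_hat) ** 2 for x in data) / (n - 1)

print(mu_hat, var_hat, var_unbiased)  # 5.0 4.0 4.571428571428571
```

The MLE of the variance is smaller than the unbiased estimator by the factor (n − 1)/n, which is exactly its finite-sample bias.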
Relation to Bayesian inference

A maximum likelihood estimate coincides with the most probable Bayesian estimator, the maximum a posteriori (MAP) estimate, given a uniform prior distribution on the parameters. The MAP estimate is the parameter θ that maximizes the posterior, which by Bayes' theorem is proportional to the likelihood times the prior probability; when the prior is constant, this reduces to maximizing the likelihood alone.
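A minimal sketch of this equivalence, with hypothetical data (not from the article): under a flat prior on (0, 1), the grid point maximizing the log-posterior of a Bernoulli parameter is the same as the one maximizing the log-likelihood.

```python
import math

# Hypothetical Bernoulli sample and a grid of candidate parameters.
sample = [1, 0, 1, 1, 0, 1]
grid = [i / 200 for i in range(1, 200)]

def log_lik(p):
    return sum(x * math.log(p) + (1 - x) * math.log(1 - p) for x in sample)

def log_uniform_prior(p):
    return 0.0  # log of a constant prior density on (0, 1)

mle = max(grid, key=log_lik)
map_est = max(grid, key=lambda p: log_lik(p) + log_uniform_prior(p))
print(mle, map_est)  # identical: a flat prior does not move the maximizer
```

With any non-uniform prior the two estimates would in general differ, which is the point of the equivalence holding only for a uniform prior.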
Consistency and asymptotic normality

Estimating the true parameter θ₀ by maximum likelihood is justified asymptotically: under the conditions outlined below, as the sample size goes to infinity the estimator converges in probability to its true value, i.e. it is consistent. Suitably normalized, it also converges in distribution: √n(θ̂ − θ₀) converges to a normal distribution with mean zero and variance equal to the inverse of the Fisher information matrix. Consequently the MLE is asymptotically efficient, i.e. it achieves the Cramér–Rao lower bound when the sample size tends to infinity, and no consistent estimator has lower asymptotic mean squared error. The MLE has bias of order 1/n, and after correction for this bias it is second-order efficient.
Evaluating the joint probability distribution of the observed data sample at a candidate parameter gives a real-valued function, called the likelihood function, over the parameter space Θ. If the likelihood function is differentiable, the derivative test can be used to locate maxima. In some cases the first-order conditions can be solved explicitly (for instance, ordinary least squares maximizes the likelihood of a linear regression model with normally distributed errors); in general, however, no closed-form maximizer exists and numerical optimization must be used.
Restricted parameter space
(Not to be confused with restricted maximum likelihood.)

If the parameter space is restricted by constraints of the form h(θ) = 0, the maximum likelihood estimate is found by constrained optimization, for example with Lagrange multipliers, which leads to the restricted likelihood equations. Whether restricted or not, a candidate maximizer θ̂ must satisfy the likelihood equations, and the Hessian matrix of the log-likelihood evaluated at θ̂ must be negative semi-definite, as this indicates local concavity. Conveniently, most common probability distributions, in particular the exponential family, are logarithmically concave, so that a stationary point of the log-likelihood is a global maximum.
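The second-order condition is easy to verify in the scalar case. The sketch below (illustrative numbers, not prescribed by the article) evaluates the second derivative of a Bernoulli log-likelihood at the MLE and confirms it is negative, so the stationary point is a maximum.

```python
# s successes in n Bernoulli trials; the MLE is the sample proportion.
s, n = 49, 80
p_hat = s / n

def d2_log_lik(p):
    # d^2/dp^2 of s*log(p) + (n - s)*log(1 - p)
    return -s / p**2 - (n - s) / (1 - p) ** 2

print(d2_log_lik(p_hat))  # negative: local concavity at the MLE
```

Here the second derivative is negative for every p in (0, 1), reflecting the log-concavity of the Bernoulli likelihood, so the stationary point is in fact the global maximum.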
Functional invariance

The maximum likelihood estimator selects the parameter value that gives the observed data the largest possible probability (or probability density). If g(θ) is any transformation of θ, then the MLE for α = g(θ) is by definition α̂ = g(θ̂), and it maximizes the so-called profile likelihood. In particular, the MLE of a one-to-one function of the parameter is obtained by applying that function to the MLE of the parameter itself, and the property extends to transformations that are not one-to-one. Since the logarithm is a monotonic function, the maximum of the log-likelihood occurs at the same value of θ as does the maximum of the likelihood, so the two may be used interchangeably.
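Functional invariance can be demonstrated directly. In this sketch (hypothetical data, not from the article), the MLE of the odds θ/(1 − θ) of a Bernoulli parameter is computed two ways: by transforming the MLE of θ, and by maximizing the likelihood reparameterized in the odds.

```python
import math

# Hypothetical Bernoulli sample: 7 ones out of 10.
sample = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
n, s = len(sample), sum(sample)
p_hat = s / n
odds_by_invariance = p_hat / (1 - p_hat)  # g(theta_hat)

def log_lik_odds(w):
    # Likelihood reparameterized in the odds w, where p = w / (1 + w).
    p = w / (1 + w)
    return s * math.log(p) + (n - s) * math.log(1 - p)

grid = [i / 1000 for i in range(1, 10000)]  # odds from 0.001 to 9.999
best_w = max(grid, key=log_lik_odds)
print(odds_by_invariance, best_w)  # the grid maximizer agrees with g(theta_hat)
```

The reparameterized likelihood is unimodal because the odds map is strictly increasing, so the grid search lands next to the transformed estimate, as invariance predicts.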
Conditions for consistency

Consistency of the maximum likelihood estimator rests on the following regularity conditions on the model:

Identification: distinct parameter values must correspond to distinct distributions; otherwise two values are observationally equivalent and cannot be distinguished from the data.
Compactness: the existence of a compact parameter space Θ containing the true value (or conditions, such as concavity of the objective, that substitute for it).
Continuity: the function log f(x ∣ θ) is continuous in θ for almost all values of x.
Dominance: there exists an integrable function that bounds |log f(x ∣ θ)| uniformly in θ, so that a uniform law of large numbers applies; the sample average of the log-likelihood is then stochastically equicontinuous and converges uniformly to its expectation.
Iterative procedures

Except in special cases, the likelihood equations cannot be solved explicitly, and an estimator can only be found by numerical optimization, via iterations of the form θ̂(r+1) = θ̂(r) + η_r d_r(θ̂), where d_r is the direction vector of the r-th "step" and the scalar η_r is the step size (also called the learning rate).

Gradient descent takes d_r to be the gradient of the log-likelihood; it requires no second derivatives but converges comparatively slowly.
The Newton–Raphson method takes d_r = −H_r⁻¹ s_r, the inverse Hessian times the score evaluated at the current iterate; it converges quickly near the optimum, but computing and inverting the Hessian can be computationally costly.
Quasi-Newton methods, such as the Davidon–Fletcher–Powell (DFP) formula and the Broyden–Fletcher–Goldfarb–Shanno (BFGS) update, avoid this by building an approximation to the (inverse) Hessian from successive gradient evaluations.
Fisher's scoring replaces the Hessian with its expected value, the negative Fisher information matrix, a substitution justified by a Taylor expansion of the score function.
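As an illustration of the Newton–Raphson iteration (a sketch with hypothetical data, not an example from the article), consider the Poisson log-likelihood ℓ(λ) = S·log(λ) − n·λ + const, whose MLE has the known closed form λ̂ = S/n, the sample mean; the iterates converge to it.

```python
# Newton-Raphson for the Poisson rate. Score: l'(lam) = S/lam - n;
# second derivative: l''(lam) = -S/lam**2.
data = [3, 1, 4, 1, 5, 9, 2, 6]  # hypothetical counts
n, S = len(data), sum(data)

lam = 1.0  # starting value
for _ in range(50):
    score = S / lam - n
    hess = -S / lam ** 2
    lam = lam - score / hess  # Newton step

print(lam, S / n)  # the iterates converge to the sample mean
```

For this one-parameter problem the closed form makes Newton's method unnecessary, but the same loop structure, with the Hessian matrix in place of the scalar second derivative, carries over to multi-parameter likelihoods.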
History

Early users of maximum likelihood include Carl Friedrich Gauss, Pierre-Simon Laplace, Thorvald N. Thiele, and Francis Ysidro Edgeworth, but its widespread use rose between 1912 and 1922, when Ronald Fisher recommended, widely popularized, and carefully analyzed the method. Samuel S. Wilks proved in 1938 the result now called Wilks' theorem, which shows that the log-likelihood-ratio statistic is asymptotically chi-squared distributed and thereby allows convenient construction of confidence regions for maximum likelihood estimates.

See also

Related concepts: Akaike information criterion, extremum estimator, Fisher information, mean squared error, Rao–Blackwell theorem, Wilks' theorem.
Other estimation methods: generalized method of moments, maximum a posteriori (MAP) estimator, maximum spacing estimation, maximum entropy estimation, method of moments, method of support, minimum-distance estimation, quasi-maximum likelihood, restricted maximum likelihood.
12154:
12147:
12146:
12145:
12124:
12123:
12114:
12113:
12112:
12099:
12074:
12066:
12065:
12031:
12029:
12023:
12011:
12003:
12002:
11998:
11993:
11986:
11984:
11983:
11978:
11976:
11967:
11960:
11959:
11950:
11941:
11929:
11928:
11919:
11910:
11904:
11903:
11902:
11889:
11875:
11874:
11867:
11858:
11840:
11834:
11833:
11824:
11823:
11804:
11803:
11794:
11785:
11773:
11772:
11763:
11754:
11748:
11747:
11746:
11733:
11719:
11718:
11711:
11702:
11684:
11678:
11677:
11668:
11667:
11648:
11647:
11638:
11629:
11617:
11616:
11607:
11598:
11592:
11591:
11590:
11577:
11563:
11562:
11555:
11546:
11528:
11522:
11521:
11512:
11511:
11479:
11478:
11474:
11465:
11464:
11460:
11451:
11450:
11446:
11397: = H,
11346:
11344:
11343:
11338:
11336:
11335:
11327:
11281:
11280:
11274:
11266: <
11253:
11251:
11250:
11245:
11243:
11242:
11234:
11182:
11180:
11179:
11174:
11172:
11171:
11170:
11169:
11137:
11135:
11134:
11129:
11127:
11126:
11118:
11101:of it using the
11080:
11078:
11077:
11072:
11070:
11068:
11048:
11044:
11043:
11021:
11001:
11000:
10981:
10979:
10978:
10973:
10971:
10964:
10963:
10951:
10950:
10949:
10948:
10931:
10930:
10927:
10920:
10915:
10886:
10876:
10874:
10854:
10850:
10849:
10827:
10810:
10809:
10808:
10807:
10789:
10784:
10743:
10742:
10724:
10723:
10722:
10721:
10703:
10698:
10669:
10653:
10652:
10636:
10631:
10607:
10605:
10585:
10584:
10572:
10571:
10561:
10556:
10541:
10533:
10530:
10525:
10496:
10492:
10490:
10480:
10479:
10463:
10459:
10458:
10446:
10445:
10429:
10420:
10415:
10400:
10392:
10389:
10384:
10358:
10356:
10346:
10345:
10329:
10325:
10324:
10312:
10311:
10295:
10286:
10281:
10265:
10260:
10234:
10232:
10228:
10227:
10215:
10214:
10198:
10188:
10187:
10171:
10162:
10157:
10141:
10136:
10107:
10103:
10099:
10095:
10094:
10082:
10081:
10048:
10047:
10020:
10015:
9999:
9994:
9968:
9964:
9960:
9959:
9947:
9946:
9924:
9919:
9892:
9891:
9869:
9864:
9843:
9838:
9809:
9796:
9795:
9773:
9768:
9752:
9747:
9712:
9711:
9695:
9690:
9674:
9669:
9640:
9627:
9615:
9610:
9581:
9573:
9572:
9562:
9557:
9528:
9520:
9519:
9518:
9517:
9502:
9497:
9467:
9466:
9458:
9441:
9439:
9438:
9433:
9431:
9430:
9414:
9412:
9411:
9406:
9404:
9403:
9395:
9385:
9383:
9382:
9377:
9375:
9374:
9373:
9372:
9345:
9343:
9342:
9337:
9332:
9331:
9313:
9312:
9300:
9299:
9284:
9255:
9251:
9249:
9248:
9243:
9241:
9240:
9239:
9238:
9217:
9215:
9214:
9209:
9207:
9206:
9198:
9188:
9186:
9185:
9180:
9168:
9166:
9165:
9160:
9158:
9157:
9156:
9155:
9130:
9128:
9127:
9122:
9120:
9119:
9118:
9110:
9096:
9094:
9093:
9088:
9086:
9085:
9077:
9067:
9065:
9064:
9059:
9057:
9056:
9048:
9026:
9024:
9023:
9018:
9003:
9002:
8988:
8986:
8985:
8980:
8978:
8977:
8974:
8958:
8956:
8955:
8950:
8944:
8943:
8924:
8923:
8898:
8897:
8890:
8889:
8882:
8877:
8851:
8850:
8847:
8827:
8825:
8824:
8819:
8817:
8815:
8802:
8801:
8794:
8790:
8789:
8774:
8773:
8764:
8763:
8742:
8741:
8734:
8720:
8719:
8704:
8703:
8682:
8680:
8679:
8674:
8668:
8667:
8650:
8648:
8647:
8642:
8630:
8629:
8614:
8613:
8595:
8592:
8584:
8583:
8569:
8567:
8566:
8561:
8558:
8557:
8537:
8535:
8534:
8529:
8525:
8515:
8514:
8499:
8498:
8480:
8477:
8469:
8468:
8449:
8447:
8446:
8441:
8437:
8414:
8413:
8398:
8395:
8387:
8386:
8379:
8374:
8361:
8356:
8317:
8315:
8314:
8309:
8306:
8305:
8292:
8291:
8270:
8268:
8267:
8262:
8259:
8258:
8241:
8239:
8238:
8233:
8229:
8223:
8216:
8211:
8210:
8195:
8194:
8177:
8172:
8171:
8156:
8155:
8147:
8139:
8137:
8136:
8131:
8128:
8127:
8098:machine learning
8087:
8085:
8084:
8079:
8065:
8064:
8051:
8049:
8048:
8043:
8032:
8031:
8013:
8012:
8000:
7999:
7977:
7975:
7974:
7969:
7955:
7954:
7941:
7938:with respect to
7937:
7935:
7934:
7929:
7915:
7914:
7899:
7898:
7880:
7879:
7867:
7866:
7844:
7840:
7838:
7837:
7832:
7827:
7826:
7808:
7807:
7795:
7794:
7779:
7778:
7765:
7761:
7759:
7758:
7753:
7739:
7738:
7722:
7720:
7719:
7714:
7712:
7710:
7706:
7705:
7687:
7686:
7674:
7673:
7658:
7657:
7650:
7637:
7636:
7621:
7620:
7602:
7601:
7589:
7588:
7572:
7564:
7563:
7545:
7544:
7532:
7531:
7510:
7509:
7493:
7489:
7451:
7449:
7448:
7446:
7441:
7438:
7420:
7418:
7417:
7415:
7410:
7407:
7397:
7395:
7394:
7389:
7382:
7381:
7380:
7375:
7369:
7363:
7362:
7359:
7357:
7356:
7351:
7345:
7337:
7332:
7329:
7327:
7326:
7321:
7315:
7294:
7292:
7291:
7286:
7279:
7278:
7277:
7270:
7268:
7267:
7266:
7253:
7252:
7239:
7234:
7233:
7221:
7220:
7219:
7218:
7198:
7197:
7186:
7183:
7181:
7180:
7179:
7166:
7161:
7160:
7148:
7147:
7146:
7145:
7120:
7115:
7113:
7112:
7111:
7098:
7097:
7084:
7083:
7070:
7066:
7065:
7053:
7052:
7051:
7050:
7030:
7029:
7019:
7017:
7009:
7006:
7005:
6998:
6997:
6986:
6985:
6962:
6961:
6945:
6943:
6933:
6921:
6919:
6918:
6913:
6911:
6910:
6902:
6901:
6879:
6877:
6876:
6871:
6869:
6868:
6860:
6859:
6842:
6840:
6839:
6834:
6832:
6828:
6827:
6826:
6803:
6802:
6786:
6784:
6774:
6767:
6766:
6758:
6757:
6749:
6748:
6740:
6739:
6730:
6725:
6697:
6695:
6685:
6678:
6677:
6670:
6669:
6664:
6660:
6659:
6658:
6646:
6645:
6644:
6632:
6631:
6623:
6612:
6611:
6602:
6601:
6590:
6589:
6570:
6569:
6568:
6563:
6558:
6537:
6535:
6534:
6533:
6532:
6530:
6523:
6520:
6506:
6504:
6503:
6498:
6491:
6490:
6489:
6482:
6480:
6479:
6478:
6465:
6464:
6451:
6447:
6446:
6434:
6433:
6432:
6431:
6411:
6410:
6400:
6394:
6393:
6386:
6385:
6376:
6375:
6367:
6366:
6345:
6343:
6342:
6337:
6333:
6332:
6331:
6323:
6312:
6310:
6309:
6304:
6297:
6296:
6292:
6291:
6290:
6282:
6281:
6272:
6260:
6259:
6251:
6248:
6238:
6235:
6232:
6231:
6227:
6226:
6225:
6213:
6212:
6209:
6207:
6206:
6201:
6195:
6185:
6179:
6168:. Specifically,
6166:Cramér–Rao bound
6163:
6162:
6149:
6147:
6146:
6141:
6134:
6130:
6129:
6105:
6086:
6084:
6083:
6078:
6066:
6064:
6063:
6058:
6043:
6041:
6040:
6035:
6033:
6031:
6030:
6016:
6008:
6002:
5992:
5991:
5981:
5967:
5966:
5947:
5945:
5944:
5939:
5927:
5925:
5924:
5919:
5889:
5887:
5886:
5881:
5862:
5820:
5819:
5811:
5791:
5789:
5788:
5783:
5773:
5772:
5767:
5761:
5748:
5747:
5739:
5726:
5724:
5723:
5718:
5691:
5689:
5688:
5683:
5671:
5669:
5668:
5663:
5642:
5640:
5639:
5634:
5622:
5620:
5619:
5614:
5612:
5611:
5606:
5600:
5577:
5568:
5566:
5565:
5560:
5558:
5554:
5553:
5552:
5528:
5527:
5519:
5509:
5506:
5505:
5501:
5500:
5499:
5487:
5486:
5485:
5473:
5472:
5467:
5461:
5452:
5447:
5431:
5429:
5428:
5423:
5418:
5417:
5385:
5383:
5382:
5377:
5370:
5365:
5360:
5357:
5356:
5352:
5320:
5319:
5314:
5308:
5298:
5263:
5261:
5260:
5255:
5253:
5252:
5247:
5241:
5227:
5225:
5224:
5219:
5202:
5201:
5196:
5190:
5170:
5168:
5167:
5162:
5155:
5154:
5149:
5144:
5140:
5139:
5135:
5103:
5102:
5097:
5091:
5082:
5052:
5050:
5049:
5044:
5030:
5027:
5009:
5008:
4978:
4977:
4962:
4944:
4924:
4922:
4921:
4916:
4908:
4907:
4891:
4890:
4851:
4850:
4841:
4840:
4825:
4821:
4817:
4795:
4793:
4788:
4781:
4777:
4684:
4682:
4681:
4676:
4668:
4667:
4620:
4619:
4578:
4576:
4575:
4570:
4565:
4564:
4535:
4533:
4532:
4527:
4522:
4521:
4489:
4487:
4486:
4481:
4476:
4475:
4464:
4463:
4458:
4453:
4449:
4448:
4447:
4446:
4434:
4433:
4428:
4422:
4397:
4395:
4394:
4389:
4384:
4383:
4372:
4371:
4366:
4361:
4357:
4356:
4355:
4354:
4342:
4341:
4336:
4330:
4313:
4311:
4310:
4305:
4303:
4302:
4297:
4291:
4266:
4264:
4263:
4258:
4253:
4252:
4189:
4187:
4186:
4181:
4169:
4167:
4166:
4161:
4142:
4140:
4139:
4134:
4129:
4128:
4120:
4108:
4107:
4099:
4089:
4087:
4086:
4081:
4054:
4052:
4051:
4046:
4034:
4032:
4031:
4026:
4005:
4003:
4002:
3997:
3985:
3983:
3982:
3977:
3975:
3974:
3966:
3935:
3933:
3932:
3927:
3912:
3911:
3883:
3882:
3851:
3849:
3848:
3843:
3829:
3828:
3806:
3801:
3786:
3778:
3757:
3756:
3751:
3745:
3733:, then we have
3728:
3726:
3725:
3720:
3702:
3701:
3696:
3690:
3641:
3637:
3635:
3634:
3629:
3626:
3624:
3616:
3615:
3614:
3613:
3590:
3576:
3574:
3573:
3568:
3564:
3563:
3562:
3561:
3555:
3551:
3550:
3549:
3531:
3530:
3518:
3517:
3494:
3483:
3481:
3480:
3475:
3444:
3442:
3441:
3436:
3425:
3423:
3415:
3414:
3413:
3412:
3389:
3384:
3382:
3374:
3366:
3344:
3342:
3341:
3336:
3334:
3333:
3332:
3311:
3309:
3308:
3303:
3291:
3289:
3288:
3283:
3274:
3273:
3272:
3244:
3242:
3241:
3236:
3215:
3213:
3212:
3207:
3200:
3196:
3195:
3177:
3176:
3164:
3163:
3151:
3150:
3138:
3137:
3120:
3118:
3117:
3112:
3110:
3109:
3104:
3087:
3085:
3084:
3079:
3076:
3072:
3071:
3070:
3052:
3051:
3039:
3038:
3021:
3020:
3003:
3001:
3000:
2995:
2992:
2991:
2973:
2972:
2954:
2953:
2935:
2934:
2922:
2921:
2904:
2902:
2901:
2896:
2893:
2892:
2874:
2873:
2861:
2860:
2837:
2835:
2834:
2829:
2822:
2801:
2790:
2788:
2787:
2782:
2770:
2768:
2767:
2762:
2750:
2748:
2747:
2742:
2735:
2734:
2733:
2728:
2714:
2712:
2711:
2706:
2703:
2702:
2697:
2679:
2677:
2676:
2671:
2668:
2664:
2654:
2653:
2626:
2625:
2604:
2603:
2563:
2561:
2560:
2555:
2548:
2547:
2543:
2520:
2519:
2514:
2432:
2430:
2429:
2424:
2422:
2421:
2416:
2410:
2393:
2391:
2390:
2385:
2378:
2377:
2376:
2369:
2368:
2367:
2366:
2361:
2355:
2345:
2341:
2339:
2337:
2332:
2319:
2315:
2314:
2304:
2291:
2290:
2289:
2288:
2283:
2277:
2267:
2263:
2261:
2260:
2259:
2246:
2245:
2232:
2228:
2227:
2217:
2209:
2208:
2207:
2206:
2201:
2195:
2185:
2181:
2179:
2178:
2177:
2164:
2163:
2150:
2146:
2145:
2135:
2103:
2102:
2101:
2100:
2095:
2089:
2079:
2075:
2073:
2072:
2071:
2058:
2057:
2044:
2040:
2039:
2029:
2016:
2015:
2014:
2013:
2008:
2002:
1992:
1988:
1986:
1984:
1979:
1966:
1962:
1961:
1951:
1943:
1942:
1941:
1940:
1935:
1929:
1919:
1915:
1913:
1912:
1911:
1898:
1897:
1884:
1880:
1879:
1869:
1859:
1858:
1857:
1856:
1851:
1845:
1835:
1831:
1829:
1828:
1827:
1814:
1813:
1800:
1796:
1795:
1785:
1772:
1771:
1770:
1769:
1764:
1758:
1748:
1744:
1742:
1741:
1740:
1727:
1726:
1713:
1709:
1708:
1698:
1690:
1689:
1688:
1687:
1682:
1676:
1666:
1662:
1660:
1658:
1653:
1640:
1636:
1635:
1625:
1608:
1604:
1603:
1598:
1592:
1585:
1567:
1565:
1564:
1559:
1556:
1555:
1550:
1544:
1525:
1523:
1522:
1517:
1511:
1510:
1505:
1499:
1485:
1483:
1482:
1477:
1470:
1463:
1461:
1460:
1459:
1446:
1438:
1419:
1417:
1416:
1415:
1402:
1394:
1382:
1380:
1379:
1378:
1365:
1357:
1342:
1340:
1339:
1334:
1313:
1311:
1310:
1305:
1300:
1275:
1273:
1272:
1267:
1260:
1259:
1258:
1253:
1252:
1237:
1235:
1234:
1229:
1217:
1215:
1214:
1209:
1203:
1170:
1168:
1167:
1162:
1155:
1151:
1136:
1135:
1130:
1129:
1110:
1071:
1069:
1068:
1063:
1042:
1040:
1039:
1034:
1000:
998:
997:
992:
983:
982:
977:
968:
967:
962:
961:
953:
941:
939:
938:
933:
930:
929:
924:
923:
908:
906:
905:
900:
896:
886:
878:
877:
872:
871:
863:
856:
855:
847:
842:
831:
829:
828:
823:
816:
812:
797:
796:
791:
790:
782:
780:
769:
743:
742:
734:
718:
716:
715:
710:
703:
693:
692:
679:
678:
657:
646:
641:
614:
606:
605:
582:
580:
579:
574:
563:
555:
554:
527:
525:
524:
519:
504:
496:
495:
480:
466:
465:
460:
459:
440:
439:
434:
433:
416:
414:
413:
408:
402:
401:
383:
382:
370:
369:
354:
331:
329:
328:
323:
309:
307:
306:
301:
247:
245:
244:
239:
236:
235:
234:
228:
224:
223:
222:
202:
201:
188:
187:
151:from an unknown
104:estimator for a
21:
24037:
24036:
24032:
24031:
24030:
24028:
24027:
24026:
24002:
24001:
24000:
23995:
23958:
23929:
23891:
23828:
23814:quality control
23781:
23763:Clinical trials
23740:
23715:
23699:
23687:Hazard function
23681:
23635:
23597:
23581:
23544:
23540:Breusch–Godfrey
23528:
23505:
23445:
23420:Factor analysis
23366:
23347:Graphical model
23319:
23286:
23253:
23239:
23219:
23173:
23140:
23102:
23065:
23064:
23033:
22977:
22964:
22956:
22948:
22932:
22917:
22896:Rank statistics
22890:
22869:Model selection
22857:
22815:Goodness of fit
22809:
22786:
22760:
22732:
22685:
22630:
22619:Median unbiased
22547:
22458:
22391:Order statistic
22353:
22332:
22299:
22273:
22225:
22180:
22123:
22121:Data collection
22102:
22014:
21969:
21943:
21921:
21881:
21833:
21750:Continuous data
21740:
21727:
21709:
21704:
21667:
21665:
21629:Sargent, Thomas
21600:
21590:
21580:
21558:
21539:
21516:
21497:
21470:10.2307/1403464
21446:
21424:
21405:
21384:
21382:Further reading
21379:
21378:
21341:
21337:
21300:
21296:
21289:
21272:
21268:
21261:
21247:
21243:
21236:
21220:
21216:
21201:10.2307/2344804
21182:
21178:
21141:
21137:
21097:
21093:
21086:
21072:
21068:
21061:
21047:
21043:
21012:
21008:
20981:
20977:
20962:
20936:
20932:
20917:10.2307/2339378
20897:
20893:
20878:10.2307/2339293
20855:
20851:
20844:
20818:
20814:
20807:
20790:
20786:
20779:
20762:
20758:
20751:
20728:
20724:
20717:
20701:
20697:
20690:
20676:
20672:
20665:
20651:
20647:
20640:
20623:
20619:
20612:
20594:
20590:
20578:
20574:
20567:
20563:
20550:
20546:
20538:
20532:
20528:
20499:
20495:
20470:Snell, E. Joyce
20462:
20458:
20451:
20437:
20433:
20426:
20407:
20400:
20393:
20374:
20370:
20362:
20358:
20351:
20337:
20333:
20310:
20306:
20294:
20288:
20284:
20277:
20261:
20254:
20237:
20233:
20226:
20210:
20206:
20199:
20181:
20177:
20158:
20154:
20147:
20125:
20121:
20096:
20092:
20085:
20067:
20063:
20056:
20039:
20035:
20028:
20014:
20010:
20003:
19986:
19982:
19975:
19961:
19957:
19952:
19882:
19823:
19812:
19805:
19802:
19772:Samuel S. Wilks
19730:
19671:
19670:
19666:
19660:
19655:
19654:
19653:
19649:
19640:
19639:
19621:
19620:
19618:
19615:
19614:
19607:
19573:
19569:
19554:
19550:
19541:
19537:
19535:
19532:
19531:
19507:
19503:
19482:
19478:
19469:
19465:
19447:
19443:
19441:
19438:
19437:
19405:
19401:
19395:
19391:
19384:
19383:
19378:
19373:
19365:
19364:
19359:
19348:
19347:
19342:
19332:
19328:
19322:
19318:
19317:
19315:
19303:
19299:
19292:
19291:
19286:
19281:
19273:
19272:
19267:
19257:
19253:
19252:
19250:
19241:
19237:
19222:
19218:
19216:
19213:
19212:
19206:
19179:
19175:
19160:
19156:
19147:
19143:
19141:
19138:
19137:
19113:
19109:
19103:
19098:
19093:
19088:
19079:
19075:
19073:
19070:
19069:
19045:
19041:
19020:
19016:
19007:
19003:
18985:
18981:
18979:
18976:
18975:
18948:
18947:
18942:
18932:
18928:
18922:
18918:
18903:
18902:
18897:
18887:
18883:
18877:
18873:
18866:
18862:
18856:
18851:
18850:
18838:
18837:
18832:
18822:
18818:
18812:
18808:
18801:
18797:
18782:
18777:
18776:
18774:
18771:
18770:
18764:
18754:
18721:
18720:
18716:
18710:
18705:
18704:
18695:
18683:
18682:
18668:
18659:
18643:
18641:
18637:
18636:
18626:
18617:
18601:
18599:
18593:
18582:
18568:
18567:
18563:
18562:
18541:
18540:
18536:
18530:
18525:
18524:
18522:
18519:
18518:
18501:
18465:
18464:
18460:
18451:
18446:
18441:
18438:
18435:
18434:
18405:
18404:
18395:
18390:
18389:
18387:
18384:
18383:
18354:
18353:
18349:
18343:
18338:
18337:
18322:
18321:
18317:
18308:
18303:
18298:
18276:
18275:
18271:
18265:
18260:
18259:
18257:
18254:
18253:
18230:
18226:
18224:
18221:
18220:
18217:
18185:
18176:
18165:
18164:
18163:
18162:
18158:
18134:
18133:
18129:
18123:
18118:
18117:
18115:
18112:
18111:
18094:
18089:
18088:
18079:
18075:
18073:
18070:
18069:
18063:
18034:
18030:
18028:
18025:
18024:
18022:
17990:
17989:
17985:
17979:
17974:
17973:
17971:
17968:
17967:
17938:
17937:
17933:
17927:
17922:
17921:
17915:
17911:
17902:
17891:
17890:
17889:
17874:
17863:
17862:
17861:
17859:
17856:
17855:
17827:
17816:
17815:
17814:
17810:
17808:
17805:
17804:
17787:
17776:
17775:
17774:
17772:
17769:
17768:
17752:
17749:
17748:
17725:
17711:
17710:
17696:
17695:
17693:
17690:
17689:
17659:
17650:
17634:
17632:
17630:
17627:
17626:
17620:
17590:
17586:
17584:
17575:
17564:
17563:
17562:
17560:
17557:
17556:
17528:
17524:
17518:
17507:
17496:
17492:
17477:
17473:
17458:
17454:
17445:
17441:
17417:
17413:
17398:
17394:
17385:
17381:
17373:
17370:
17369:
17346:
17342:
17330:
17326:
17320:
17309:
17293:
17289:
17277:
17266:
17235:
17231:
17216:
17212:
17203:
17199:
17191:
17188:
17187:
17159:
17155:
17154:
17149:
17134:
17130:
17129:
17124:
17112:
17108:
17107:
17102:
17091:
17083:
17079:
17064:
17060:
17051:
17047:
17046:
17037:
17036:
17035:
17024:
17020:
17019:
17014:
16995:
16991:
16987:
16979:
16977:
16965:
16961:
16946:
16942:
16933:
16929:
16920:
16916:
16901:
16897:
16888:
16884:
16876:
16873:
16872:
16852:
16848:
16833:
16829:
16817:
16813:
16811:
16808:
16807:
16787:
16783:
16781:
16778:
16777:
16754:
16750:
16735:
16731:
16722:
16718:
16716:
16713:
16712:
16695:
16691:
16689:
16686:
16685:
16662:
16658:
16643:
16639:
16630:
16626:
16624:
16621:
16620:
16604:
16601:
16600:
16583:
16579:
16561:
16557:
16545:
16541:
16539:
16536:
16535:
16533:
16492:
16487:
16476:
16472:
16466:
16462:
16453:
16449:
16445:
16443:
16431:
16427:
16421:
16417:
16416:
16406:
16402:
16393:
16389:
16377:
16373:
16364:
16360:
16350:
16348:
16337:
16332:
16321:
16317:
16311:
16307:
16298:
16294:
16290:
16288:
16287:
16283:
16271:
16267:
16254:
16249:
16245:
16241:
16224:
16220:
16212:
16206:
16202:
16196:
16192:
16185:
16180:
16168:
16164:
16155:
16151:
16143:
16140:
16139:
16106:
16105:
16094:
16090:
16081:
16077:
16062:
16058:
16049:
16045:
16044:
16040:
16039:
16030:
16024:
16023:
16022:
16011:
16007:
15998:
15994:
15979:
15975:
15966:
15962:
15961:
15957:
15947:
15943:
15939:
15919:
15918:
15910:
15900:
15896:
15892:
15882:
15877:
15865:
15861:
15846:
15842:
15834:
15831:
15830:
15802:
15801:
15799:
15796:
15795:
15771:
15767:
15752:
15748:
15743:
15740:
15739:
15719:
15715:
15700:
15696:
15691:
15688:
15687:
15659:
15655:
15640:
15636:
15618:
15614:
15605:
15601:
15593:
15590:
15589:
15569:
15565:
15563:
15560:
15559:
15542:
15538:
15536:
15533:
15532:
15529:
15488:
15487:
15471:
15460:
15459:
15458:
15436:
15435:
15419:
15417:
15408:
15407:
15393:
15392:
15378:
15377:
15368:
15367:
15361:
15360:
15352:
15349:
15348:
15314:
15303:
15302:
15301:
15287:
15286:
15285:
15281:
15266:
15264:
15263:
15261:
15258:
15257:
15234:
15230:
15213:
15210:
15209:
15180:
15179:
15177:
15174:
15173:
15156:
15145:
15144:
15143:
15141:
15138:
15137:
15121:
15118:
15117:
15095:
15094:
15092:
15089:
15088:
15071:
15067:
15065:
15062:
15061:
15044:
15033:
15032:
15031:
15029:
15026:
15025:
15002:
14998:
14980:
14978:
14969:
14968:
14961:
14950:
14949:
14948:
14941:
14940:
14931:
14930:
14928:
14925:
14924:
14904:
14900:
14891:
14890:
14883:
14878:
14867:
14866:
14858:
14855:
14854:
14831:
14830:
14823:
14819:
14812:
14811:
14802:
14801:
14799:
14796:
14795:
14769:
14765:
14747:
14743:
14728:
14717:
14707:
14696:
14684:
14680:
14675:
14666:
14662:
14656:
14652:
14637:
14626:
14612:
14603:
14592:
14591:
14590:
14588:
14585:
14584:
14564:
14560:
14545:
14541:
14539:
14536:
14535:
14508:
14504:
14498:
14494:
14488:
14477:
14467:
14456:
14444:
14440:
14435:
14426:
14421:
14411:
14400:
14386:
14377:
14373:
14362:
14361:
14352:
14348:
14339:
14328:
14314:
14305:
14294:
14293:
14292:
14290:
14287:
14286:
14261:
14260:
14252:
14249:
14248:
14225:
14221:
14209:
14205:
14196:
14185:
14171:
14162:
14151:
14150:
14149:
14147:
14144:
14143:
14123:
14122:
14113:
14109:
14096:
14092:
14082:
14071:
14059:
14055:
14050:
14035:
14033:
14021:
14020:
14011:
14007:
13992:
13991:
13985:
13984:
13968:
13963:
13956:
13949:
13947:
13944:
13943:
13936:
13911:
13910:
13908:
13905:
13904:
13874:
13873:
13861:
13860:
13853:
13852:
13843:
13842:
13840:
13837:
13836:
13829:
13821:
13791:
13787:
13785:
13783:
13777:
13766:
13748:
13747:
13733:
13732:
13730:
13727:
13726:
13697:
13696:
13694:
13691:
13690:
13673:
13672:
13660:
13656:
13652:
13630:
13629:
13615:
13613:
13598:
13597:
13588:
13584:
13569:
13568:
13562:
13561:
13545:
13540:
13533:
13526:
13524:
13521:
13520:
13486:
13482:
13469:
13465:
13455:
13444:
13431:
13427:
13423:
13418:
13406:
13402:
13375:
13373:
13361:
13360:
13351:
13347:
13332:
13331:
13325:
13324:
13316:
13313:
13312:
13270:
13266:
13251:
13247:
13232:
13228:
13210:
13206:
13191:
13190:
13188:
13185:
13184:
13169:
13138:
13134:
13130:
13123:
13119:
13107:
13103:
13094:
13083:
13078:
13076:
13072:
13068:
13052:
13048:
13035:
13031:
13024:
13019:
13015:
13014:
13002:
12998:
12983:
12979:
12967:
12956:
12940:
12936:
12921:
12917:
12902:
12898:
12890:
12887:
12886:
12876:
12841:
12837:
12833:
12826:
12822:
12809:
12807:
12803:
12799:
12779:
12775:
12767:
12766:
12761:
12749:
12745:
12725:
12722:
12721:
12695:
12691:
12676:
12675:
12673:
12670:
12669:
12663:
12654:
12650:
12644:
12643:
12638:
12637:
12636:
12632:
12624:
12616:
12612:
12611:
12607:
12599:
12595:
12590:
12586:
12585:
12581:
12576:
12572:
12571:
12567:
12563:
12559:
12542:
12541:
12517:
12513:
12507:
12503:
12485:
12481:
12472:
12471:
12438:
12434:
12428:
12424:
12406:
12402:
12393:
12392:
12386:
12382:
12364:
12360:
12348:
12344:
12326:
12322:
12312:
12306:
12305:
12288:
12284:
12266:
12262:
12255:
12242:
12241:
12240:
12239:
12235:
12225:
12220:
12213:
12206:
12204:
12201:
12200:
12193:
12190:differentiating
12181:
12169:
12167:
12141:
12137:
12119:
12115:
12108:
12095:
12094:
12093:
12070:
12061:
12057:
12040:
12037:
12036:
12027:
12025:
12021:
12018:
12009:
12000:
11996:
11995:
11991:
11974:
11973:
11955:
11951:
11939:
11924:
11920:
11908:
11898:
11885:
11884:
11883:
11876:
11870:
11869:
11856:
11836:
11829:
11828:
11819:
11818:
11815:
11814:
11799:
11795:
11783:
11768:
11764:
11752:
11742:
11729:
11728:
11727:
11720:
11714:
11713:
11700:
11680:
11673:
11672:
11663:
11662:
11659:
11658:
11643:
11639:
11627:
11612:
11608:
11596:
11586:
11573:
11572:
11571:
11564:
11558:
11557:
11544:
11524:
11517:
11516:
11507:
11506:
11502:
11500:
11497:
11496:
11476:
11472:
11471:
11462:
11458:
11457:
11448:
11444:
11443:
11410:
11403:
11396:
11369:
11326:
11325:
11323:
11320:
11319:
11276:
11272:
11271:
11233:
11232:
11230:
11227:
11226:
11204:
11198:
11193:
11165:
11161:
11160:
11156:
11154:
11151:
11150:
11117:
11116:
11114:
11111:
11110:
11049:
11039:
11035:
11022:
11020:
10996:
10992:
10990:
10987:
10986:
10969:
10968:
10959:
10955:
10944:
10940:
10939:
10935:
10926:
10922:
10895:
10893:
10884:
10883:
10855:
10845:
10841:
10828:
10826:
10803:
10799:
10798:
10794:
10764:
10762:
10738:
10734:
10717:
10713:
10712:
10708:
10678:
10676:
10667:
10666:
10648:
10644:
10611:
10609:
10595:
10590:
10580:
10576:
10567:
10563:
10557:
10546:
10532:
10505:
10503:
10494:
10493:
10475:
10471:
10464:
10454:
10450:
10441:
10437:
10430:
10428:
10416:
10405:
10391:
10364:
10362:
10341:
10337:
10330:
10320:
10316:
10307:
10303:
10296:
10294:
10282:
10271:
10240:
10238:
10223:
10219:
10210:
10206:
10199:
10183:
10179:
10172:
10170:
10158:
10147:
10116:
10114:
10105:
10104:
10090:
10086:
10077:
10073:
10043:
10039:
10026:
10022:
10016:
10005:
9974:
9972:
9955:
9951:
9942:
9938:
9920:
9909:
9887:
9883:
9865:
9854:
9849:
9845:
9818:
9816:
9807:
9806:
9791:
9787:
9769:
9758:
9727:
9725:
9707:
9703:
9691:
9680:
9649:
9647:
9638:
9637:
9623:
9590:
9588:
9577:
9568:
9564:
9537:
9535:
9524:
9513:
9509:
9508:
9504:
9477:
9475:
9468:
9457:
9456:
9452:
9450:
9447:
9446:
9426:
9422:
9420:
9417:
9416:
9394:
9393:
9391:
9388:
9387:
9368:
9364:
9363:
9359:
9351:
9348:
9347:
9327:
9323:
9308:
9304:
9295:
9291:
9280:
9278:
9275:
9274:
9234:
9230:
9229:
9225:
9223:
9220:
9219:
9197:
9196:
9194:
9191:
9190:
9174:
9171:
9170:
9151:
9147:
9146:
9142:
9140:
9137:
9136:
9109:
9108:
9104:
9102:
9099:
9098:
9076:
9075:
9073:
9070:
9069:
9047:
9046:
9044:
9041:
9040:
9037:
8998:
8997:
8994:
8991:
8990:
8973:
8969:
8967:
8964:
8963:
8939:
8938:
8919:
8918:
8893:
8892:
8885:
8884:
8857:
8855:
8846:
8842:
8840:
8837:
8836:
8797:
8796:
8795:
8785:
8781:
8769:
8768:
8759:
8755:
8737:
8736:
8735:
8733:
8715:
8711:
8699:
8698:
8696:
8693:
8692:
8663:
8659:
8656:
8653:
8652:
8625:
8621:
8609:
8608:
8591:
8579:
8578:
8575:
8572:
8571:
8553:
8549:
8546:
8543:
8542:
8510:
8506:
8494:
8493:
8476:
8464:
8463:
8461:
8458:
8457:
8409:
8408:
8394:
8382:
8381:
8375:
8367:
8336:
8334:
8326:
8323:
8322:
8301:
8297:
8287:
8283:
8280:
8277:
8276:
8254:
8250:
8247:
8244:
8243:
8212:
8206:
8202:
8190:
8189:
8173:
8167:
8163:
8151:
8150:
8145:
8142:
8141:
8123:
8119:
8116:
8113:
8112:
8094:
8060:
8059:
8057:
8054:
8053:
8027:
8023:
8008:
8004:
7995:
7991:
7983:
7980:
7979:
7950:
7949:
7947:
7944:
7943:
7939:
7910:
7909:
7894:
7890:
7875:
7871:
7862:
7858:
7850:
7847:
7846:
7842:
7822:
7818:
7803:
7799:
7790:
7786:
7774:
7773:
7771:
7768:
7767:
7763:
7734:
7733:
7731:
7728:
7727:
7701:
7697:
7682:
7678:
7669:
7665:
7653:
7652:
7651:
7632:
7631:
7616:
7612:
7597:
7593:
7584:
7580:
7573:
7571:
7559:
7555:
7540:
7536:
7527:
7523:
7505:
7504:
7502:
7499:
7498:
7491:
7487:
7462:
7444:
7442:
7439:
7436:
7435:
7433:
7413:
7411:
7408:
7405:
7404:
7402:
7370:
7368:
7367:
7358:
7346:
7344:
7343:
7342:
7333:
7328:
7316:
7314:
7313:
7310:
7307:
7306:
7273:
7272:
7262:
7258:
7248:
7244:
7240:
7229:
7225:
7214:
7210:
7209:
7205:
7193:
7189:
7187:
7185:
7175:
7171:
7167:
7156:
7152:
7141:
7137:
7136:
7132:
7121:
7119:
7107:
7103:
7093:
7089:
7079:
7075:
7071:
7061:
7057:
7046:
7042:
7041:
7037:
7025:
7021:
7020:
7018:
7008:
7001:
7000:
6993:
6992:
6972:
6968:
6951:
6947:
6937:
6932:
6930:
6927:
6926:
6903:
6897:
6896:
6895:
6893:
6890:
6889:
6861:
6855:
6854:
6853:
6851:
6848:
6847:
6813:
6809:
6792:
6788:
6778:
6773:
6772:
6768:
6759:
6753:
6752:
6751:
6741:
6735:
6734:
6733:
6726:
6703:
6689:
6684:
6673:
6672:
6665:
6654:
6650:
6634:
6633:
6622:
6621:
6620:
6619:
6615:
6614:
6607:
6606:
6597:
6596:
6585:
6581:
6579:
6576:
6575:
6566:
6565:
6561:
6560:
6557:
6551:
6544:
6528:
6527:
6525:
6524:
6521:
6518:
6517:
6515:
6485:
6484:
6474:
6470:
6460:
6456:
6452:
6442:
6438:
6427:
6423:
6422:
6418:
6406:
6402:
6401:
6399:
6389:
6388:
6381:
6380:
6368:
6362:
6361:
6360:
6358:
6355:
6354:
6327:
6326:
6321:
6318:
6317:
6283:
6277:
6276:
6275:
6265:
6261:
6255:
6254:
6221:
6217:
6208:
6196:
6194:
6193:
6192:
6191:
6187:
6178:
6176:
6173:
6172:
6157:
6155:
6125:
6121:
6103:
6100:
6099:
6096:
6072:
6069:
6068:
6052:
6049:
6048:
6026:
6009:
6004:
6003:
5987:
5983:
5982:
5980:
5962:
5958:
5956:
5953:
5952:
5933:
5930:
5929:
5898:
5895:
5894:
5837:
5810:
5809:
5807:
5804:
5803:
5762:
5760:
5759:
5738:
5737:
5735:
5732:
5731:
5697:
5694:
5693:
5677:
5674:
5673:
5648:
5645:
5644:
5628:
5625:
5624:
5623:is the MLE for
5601:
5599:
5598:
5596:
5593:
5592:
5588:
5573:
5545:
5541:
5533:
5529:
5523:
5522:
5495:
5491:
5475:
5474:
5462:
5460:
5459:
5458:
5457:
5453:
5446:
5444:
5441:
5440:
5413:
5409:
5394:
5391:
5390:
5309:
5307:
5306:
5304:
5300:
5288:
5282:
5279:
5278:
5270:
5242:
5240:
5239:
5237:
5234:
5233:
5191:
5189:
5188:
5186:
5183:
5182:
5175:
5143:
5092:
5090:
5089:
5088:
5084:
5072:
5066:
5063:
5062:
5026:
5004:
5003:
4973:
4972:
4970:
4967:
4966:
4960:
4946:
4935:
4903:
4902:
4886:
4882:
4846:
4845:
4836:
4835:
4833:
4830:
4829:
4823:
4819:
4803:
4799:
4791:
4790:
4786:
4784:
4779:
4775:
4734:
4718:
4712:
4705:
4698:
4663:
4659:
4615:
4611:
4603:
4600:
4599:
4560:
4556:
4541:
4538:
4537:
4517:
4513:
4498:
4495:
4494:
4471:
4467:
4452:
4436:
4435:
4423:
4421:
4420:
4419:
4417:
4414:
4413:
4379:
4375:
4360:
4344:
4343:
4331:
4329:
4328:
4327:
4325:
4322:
4321:
4292:
4290:
4289:
4287:
4284:
4283:
4277:
4248:
4244:
4229:
4226:
4225:
4218:
4175:
4172:
4171:
4155:
4152:
4151:
4119:
4118:
4098:
4097:
4095:
4092:
4091:
4060:
4057:
4056:
4040:
4037:
4036:
4011:
4008:
4007:
3991:
3988:
3987:
3965:
3964:
3962:
3959:
3958:
3907:
3903:
3878:
3877:
3860:
3857:
3856:
3824:
3820:
3802:
3791:
3777:
3746:
3744:
3743:
3741:
3738:
3737:
3691:
3689:
3688:
3686:
3683:
3682:
3668:
3656:
3643:Jacobian matrix
3639:
3617:
3609:
3608:
3604:
3591:
3589:
3586:
3583:
3582:
3557:
3556:
3545:
3541:
3526:
3522:
3513:
3509:
3508:
3504:
3503:
3492:
3489:
3488:
3450:
3447:
3446:
3416:
3408:
3407:
3403:
3390:
3388:
3375:
3367:
3365:
3363:
3360:
3359:
3328:
3327:
3323:
3321:
3318:
3317:
3297:
3294:
3293:
3268:
3267:
3263:
3254:
3251:
3250:
3228:
3225:
3224:
3191:
3187:
3172:
3168:
3159:
3155:
3146:
3142:
3133:
3129:
3126:
3123:
3122:
3105:
3100:
3099:
3097:
3094:
3093:
3066:
3062:
3047:
3043:
3034:
3030:
3029:
3025:
3016:
3012:
3009:
3006:
3005:
2987:
2983:
2962:
2958:
2949:
2945:
2930:
2926:
2917:
2913:
2910:
2907:
2906:
2888:
2884:
2869:
2865:
2856:
2852:
2849:
2846:
2845:
2799:
2796:
2795:
2776:
2773:
2772:
2756:
2753:
2752:
2729:
2724:
2723:
2720:
2717:
2716:
2698:
2693:
2692:
2689:
2686:
2685:
2649:
2645:
2621:
2617:
2599:
2595:
2594:
2590:
2572:
2569:
2568:
2515:
2510:
2509:
2496:
2492:
2484:
2481:
2480:
2470:Euclidean space
2466:parameter space
2462:
2455:
2411:
2409:
2408:
2406:
2403:
2402:
2371:
2370:
2356:
2354:
21785:
21783:
21780:
21778:
21775:
21773:
21770:
21769:
21768:
21765:
21764:
21762:
21760:
21756:
21753:
21751:
21747:
21743:
21739:
21734:
21730:
21724:
21721:
21719:
21716:
21715:
21712:
21708:
21701:
21696:
21694:
21689:
21687:
21682:
21681:
21678:
21664:. El Paso, TX
21663:
21659:
21654:
21650:
21645:
21641:
21640:
21634:
21630:
21626:
21622:
21617:
21613:
21609:
21608:
21603:
21599:
21596:
21592:
21591:
21581:
21575:
21571:
21567:
21563:
21559:
21557:0-19-850650-3
21553:
21549:
21544:
21540:
21538:0-86094-190-6
21534:
21529:
21528:
21521:
21517:
21511:
21507:
21502:
21498:
21492:
21488:
21483:
21479:
21475:
21471:
21467:
21463:
21459:
21455:
21451:
21447:
21445:0-521-36697-6
21441:
21437:
21433:
21429:
21425:
21423:0-8039-4107-2
21419:
21415:
21410:
21406:
21404:0-521-25317-9
21400:
21396:
21395:
21390:
21386:
21385:
21372:
21368:
21363:
21358:
21354:
21350:
21346:
21339:
21331:
21327:
21322:
21317:
21313:
21309:
21305:
21298:
21290:
21284:
21280:
21276:
21270:
21262:
21256:
21252:
21245:
21237:
21231:
21227:
21226:
21218:
21210:
21206:
21202:
21198:
21194:
21190:
21186:
21180:
21172:
21168:
21163:
21158:
21154:
21150:
21146:
21139:
21131:
21127:
21122:
21117:
21113:
21109:
21105:
21101:
21095:
21087:
21081:
21077:
21070:
21062:
21056:
21052:
21045:
21036:
21031:
21027:
21023:
21022:
21017:
21010:
21002:
20998:
20994:
20990:
20986:
20979:
20971:
20967:
20963:
20957:
20953:
20949:
20945:
20941:
20934:
20926:
20922:
20918:
20914:
20910:
20906:
20902:
20895:
20887:
20883:
20879:
20875:
20871:
20867:
20863:
20859:
20853:
20845:
20843:0-12-283950-1
20839:
20835:
20830:
20829:
20823:
20816:
20808:
20802:
20798:
20797:
20788:
20780:
20778:0-631-14956-2
20774:
20770:
20766:
20765:Sargan, Denis
20760:
20752:
20750:0-674-00560-0
20746:
20742:
20738:
20737:
20732:
20726:
20718:
20712:
20708:
20707:
20699:
20691:
20685:
20681:
20674:
20666:
20664:0-12-201150-3
20660:
20656:
20649:
20641:
20639:0-387-30303-0
20635:
20631:
20627:
20621:
20613:
20611:0-471-91547-5
20607:
20602:
20601:
20592:
20586:
20582:
20576:
20570:
20565:
20558:
20554:
20548:
20537:
20530:
20521:
20516:
20512:
20508:
20504:
20497:
20489:
20485:
20481:
20477:
20476:
20471:
20467:
20466:Cox, David R.
20460:
20452:
20450:0-471-98103-6
20446:
20442:
20435:
20427:
20421:
20417:
20413:
20405:
20403:
20394:
20388:
20384:
20380:
20372:
20365:
20360:
20352:
20346:
20342:
20335:
20327:
20323:
20319:
20315:
20308:
20300:
20293:
20286:
20278:
20276:0-412-13820-4
20272:
20268:
20267:
20259:
20257:
20248:
20247:
20242:
20235:
20227:
20225:0-471-82668-5
20221:
20217:
20216:
20208:
20200:
20198:0-19-850688-0
20194:
20190:
20186:
20179:
20171:
20166:
20165:
20156:
20148:
20146:0-521-40551-3
20142:
20138:
20133:
20132:
20123:
20115:
20111:
20108:(1): 90–100.
20107:
20103:
20102:
20094:
20086:
20084:0-521-43064-X
20080:
20076:
20072:
20065:
20057:
20051:
20047:
20043:
20037:
20029:
20023:
20019:
20012:
20004:
19998:
19994:
19990:
19984:
19976:
19970:
19966:
19959:
19955:
19944:
19941:
19938:
19935:
19933:
19930:
19928:
19925:
19922:
19919:
19916:
19913:
19911:
19908:
19905:
19902:
19899:
19896:
19893:
19890:
19887:
19884:
19883:
19875:
19871:
19868:
19865:
19861:
19858:
19856:
19852:
19849:
19846:
19843:
19840:
19837:
19834:
19831:
19828:
19825:
19824:
19815:
19809:
19804:
19797:
19794:
19792:
19788:
19784:
19782:
19777:
19773:
19769:
19764:
19762:
19761:Ronald Fisher
19758:
19754:
19750:
19746:
19738:
19737:Ronald Fisher
19734:
19725:
19723:
19719:
19715:
19711:
19706:
19704:
19687:
19682:
19676:
19673:
19667:
19661:
19650:
19646:
19636:
19630:
19612:
19605:
19600:
19598:
19579:
19574:
19570:
19566:
19561:
19558:
19555:
19551:
19547:
19542:
19538:
19530:
19516:
19508:
19504:
19497:
19491:
19483:
19479:
19475:
19470:
19466:
19459:
19453:
19448:
19444:
19436:
19435:
19434:
19417:
19406:
19402:
19396:
19392:
19379:
19375:
19360:
19356:
19343:
19339:
19333:
19329:
19323:
19319:
19312:
19304:
19300:
19287:
19283:
19268:
19264:
19258:
19254:
19247:
19242:
19238:
19234:
19229:
19226:
19223:
19219:
19211:
19210:
19209:
19204:
19185:
19180:
19176:
19172:
19167:
19164:
19161:
19157:
19153:
19148:
19144:
19136:
19122:
19114:
19110:
19104:
19099:
19095:
19090:
19085:
19080:
19076:
19068:
19054:
19046:
19042:
19035:
19029:
19021:
19017:
19013:
19008:
19004:
18997:
18991:
18986:
18982:
18974:
18973:
18972:
18955:
18943:
18939:
18933:
18929:
18923:
18919:
18915:
18911:
18898:
18894:
18888:
18884:
18878:
18874:
18870:
18867:
18863:
18857:
18846:
18833:
18829:
18823:
18819:
18813:
18809:
18805:
18802:
18798:
18794:
18789:
18786:
18783:
18769:
18768:
18767:
18762:
18757:
18752:
18732:
18726:
18723:
18717:
18711:
18699:
18696:
18691:
18678:
18672:
18656:
18653:
18647:
18638:
18630:
18614:
18611:
18605:
18594:
18589:
18586:
18583:
18579:
18573:
18570:
18564:
18559:
18556:
18552:
18546:
18543:
18537:
18531:
18517:
18516:
18515:
18513:
18512:outer product
18509:
18505:
18497:
18493:
18476:
18470:
18467:
18461:
18455:
18452:
18447:
18432:
18410:
18407:
18396:
18365:
18359:
18356:
18350:
18344:
18333:
18327:
18324:
18318:
18312:
18309:
18304:
18294:
18291:
18287:
18281:
18278:
18272:
18266:
18239:
18236:
18231:
18227:
18219:
18218:
18215:
18210:
18191:
18182:
18177:
18170:
18167:
18159:
18155:
18149:
18145:
18139:
18136:
18130:
18124:
18095:
18085:
18080:
18076:
18068:
18067:
18066:
18060:
18055:
18053:
18052:learning rate
18035:
18031:
18018:
18001:
17995:
17992:
17986:
17980:
17949:
17943:
17940:
17934:
17928:
17916:
17912:
17908:
17903:
17896:
17893:
17886:
17881:
17878:
17875:
17868:
17865:
17854:
17853:
17852:
17850:
17833:
17828:
17821:
17818:
17811:
17788:
17781:
17778:
17754:
17746:
17716:
17713:
17707:
17701:
17698:
17672:
17669:
17663:
17647:
17644:
17638:
17625:
17624:
17623:
17615:
17596:
17591:
17587:
17581:
17576:
17566:
17555:
17554:
17553:
17535:
17529:
17525:
17519:
17514:
17511:
17508:
17504:
17500:
17497:
17493:
17489:
17486:
17478:
17474:
17470:
17467:
17464:
17459:
17455:
17451:
17446:
17442:
17435:
17432:
17426:
17423:
17418:
17414:
17410:
17407:
17404:
17399:
17395:
17391:
17386:
17382:
17375:
17368:
17367:
17366:
17347:
17343:
17339:
17336:
17331:
17327:
17321:
17316:
17313:
17310:
17306:
17302:
17299:
17294:
17290:
17286:
17283:
17278:
17273:
17270:
17267:
17263:
17259:
17256:
17253:
17250:
17247:
17244:
17236:
17232:
17228:
17225:
17222:
17217:
17213:
17209:
17204:
17200:
17193:
17186:
17185:
17184:
17181:
17160:
17156:
17150:
17146:
17142:
17135:
17131:
17125:
17121:
17113:
17109:
17103:
17099:
17084:
17080:
17076:
17073:
17070:
17065:
17061:
17057:
17052:
17048:
17043:
17032:
17025:
17021:
17015:
17011:
17007:
17001:
16996:
16992:
16988:
16983:
16980:
16974:
16966:
16962:
16958:
16955:
16952:
16947:
16943:
16939:
16934:
16930:
16926:
16921:
16917:
16913:
16910:
16907:
16902:
16898:
16894:
16889:
16885:
16878:
16871:
16870:
16869:
16853:
16849:
16845:
16842:
16839:
16834:
16830:
16823:
16818:
16814:
16805:
16788:
16784:
16763:
16760:
16755:
16751:
16747:
16744:
16741:
16736:
16732:
16728:
16723:
16719:
16696:
16692:
16671:
16668:
16663:
16659:
16655:
16652:
16649:
16644:
16640:
16636:
16631:
16627:
16606:
16584:
16580:
16573:
16570:
16567:
16562:
16558:
16551:
16546:
16542:
16528:
16526:
16506:
16501:
16493:
16488:
16484:
16477:
16467:
16463:
16459:
16454:
16450:
16440:
16432:
16428:
16422:
16418:
16407:
16403:
16399:
16394:
16390:
16378:
16374:
16370:
16365:
16361:
16354:
16351:
16345:
16338:
16333:
16329:
16322:
16312:
16308:
16304:
16299:
16295:
16284:
16272:
16268:
16264:
16261:
16255:
16251:
16246:
16242:
16238:
16235:
16225:
16221:
16217:
16214:
16207:
16203:
16197:
16193:
16189:
16186:
16182:
16177:
16169:
16165:
16161:
16156:
16152:
16145:
16138:
16137:
16136:
16134:
16114:
16101:
16095:
16091:
16087:
16082:
16078:
16074:
16071:
16068:
16063:
16059:
16055:
16050:
16046:
16041:
16034:
16031:
16018:
16012:
16008:
16004:
15999:
15995:
15991:
15988:
15985:
15980:
15976:
15972:
15967:
15963:
15958:
15952:
15949:
15944:
15940:
15936:
15933:
15905:
15901:
15897:
15889:
15886:
15879:
15874:
15866:
15862:
15858:
15855:
15852:
15847:
15843:
15836:
15829:
15828:
15827:
15825:
15821:
15793:
15772:
15768:
15764:
15761:
15758:
15753:
15749:
15720:
15716:
15712:
15709:
15706:
15701:
15697:
15685:
15660:
15656:
15649:
15641:
15637:
15630:
15627:
15619:
15615:
15611:
15606:
15602:
15595:
15588:
15587:
15586:
15570:
15566:
15543:
15539:
15524:
15522:
15518:
15514:
15510:
15509:least squares
15483:
15480:
15472:
15465:
15462:
15455:
15452:
15446:
15443:
15430:
15424:
15421:
15414:
15398:
15395:
15389:
15383:
15380:
15357:
15354:
15347:
15346:
15345:
15342:
15325:
15321:
15315:
15308:
15305:
15298:
15292:
15289:
15282:
15278:
15272:
15267:
15256:
15255:
15254:
15235:
15231:
15227:
15224:
15218:
15215:
15207:
15202:
15185:
15182:
15157:
15150:
15147:
15123:
15100:
15097:
15072:
15068:
15045:
15038:
15035:
15008:
15003:
14999:
14993:
14988:
14985:
14982:
14975:
14962:
14955:
14952:
14937:
14923:
14922:
14921:
14905:
14901:
14897:
14884:
14879:
14875:
14863:
14840:
14837:
14824:
14820:
14808:
14778:
14770:
14766:
14762:
14759:
14748:
14744:
14740:
14737:
14729:
14724:
14721:
14718:
14714:
14708:
14703:
14700:
14697:
14693:
14685:
14681:
14677:
14672:
14667:
14657:
14653:
14649:
14646:
14638:
14633:
14630:
14627:
14623:
14617:
14614:
14609:
14604:
14597:
14594:
14583:
14582:
14581:
14565:
14561:
14557:
14554:
14551:
14546:
14542:
14533:
14514:
14509:
14505:
14499:
14495:
14489:
14484:
14481:
14478:
14474:
14468:
14463:
14460:
14457:
14453:
14445:
14441:
14437:
14432:
14427:
14422:
14418:
14412:
14407:
14404:
14401:
14397:
14391:
14388:
14383:
14378:
14364:
14358:
14353:
14349:
14340:
14335:
14332:
14329:
14325:
14319:
14316:
14311:
14306:
14299:
14296:
14285:
14284:
14283:
14266:
14263:
14257:
14254:
14231:
14226:
14218:
14215:
14210:
14206:
14197:
14192:
14189:
14186:
14182:
14176:
14173:
14168:
14163:
14156:
14153:
14142:
14141:
14140:
14119:
14114:
14105:
14102:
14097:
14093:
14083:
14078:
14075:
14072:
14068:
14060:
14056:
14052:
14047:
14042:
14037:
14030:
14027:
14012:
14008:
14004:
14001:
13981:
13978:
13972:
13960:
13958:
13953:
13942:
13941:
13940:
13933:
13932:is unbiased.
13916:
13913:
13886:
13883:
13880:
13866:
13863:
13849:
13835:
13834:
13833:
13827:
13804:
13799:
13792:
13788:
13778:
13773:
13770:
13767:
13763:
13759:
13750:
13744:
13738:
13735:
13725:
13724:
13723:
13721:
13699:
13669:
13661:
13657:
13653:
13644:
13641:
13632:
13623:
13620:
13617:
13610:
13607:
13604:
13589:
13585:
13581:
13578:
13558:
13555:
13549:
13537:
13535:
13530:
13519:
13518:
13517:
13514:
13512:
13508:
13487:
13478:
13475:
13470:
13466:
13456:
13451:
13448:
13445:
13441:
13432:
13428:
13424:
13420:
13415:
13407:
13403:
13399:
13396:
13390:
13387:
13382:
13377:
13370:
13367:
13352:
13348:
13344:
13341:
13321:
13318:
13311:
13310:
13309:
13306:
13302:
13299:
13295:
13290:
13271:
13267:
13263:
13260:
13257:
13252:
13248:
13244:
13241:
13238:
13233:
13229:
13222:
13219:
13211:
13207:
13203:
13200:
13180:
13176:
13172:
13152:
13148:
13139:
13135:
13131:
13124:
13116:
13113:
13108:
13104:
13095:
13090:
13087:
13084:
13080:
13073:
13069:
13065:
13062:
13057:
13053:
13049:
13044:
13036:
13032:
13028:
13025:
13021:
13016:
13011:
13003:
12999:
12995:
12992:
12989:
12984:
12980:
12973:
12968:
12963:
12960:
12957:
12953:
12949:
12941:
12937:
12933:
12930:
12927:
12922:
12918:
12914:
12911:
12908:
12903:
12899:
12892:
12885:
12884:
12883:
12881:
12874:
12855:
12851:
12842:
12838:
12834:
12827:
12819:
12816:
12813:
12804:
12800:
12796:
12793:
12780:
12776:
12772:
12769:
12763:
12758:
12750:
12746:
12742:
12739:
12736:
12733:
12727:
12720:
12719:
12718:
12717:
12696:
12692:
12688:
12685:
12668:
12658:
12657:'successes'.
12630:
12621:
12605:
12584: =
12570: =
12538:
12531:
12527:
12524:
12521:
12518:
12514:
12508:
12500:
12497:
12494:
12486:
12482:
12478:
12476:
12467:
12463:
12460:
12457:
12451:
12448:
12445:
12439:
12435:
12429:
12421:
12418:
12415:
12407:
12403:
12399:
12397:
12387:
12379:
12376:
12373:
12365:
12361:
12357:
12354:
12349:
12341:
12338:
12335:
12327:
12323:
12319:
12316:
12314:
12309:
12302:
12295:
12289:
12281:
12278:
12275:
12267:
12263:
12251:
12248:
12236:
12229:
12217:
12215:
12210:
12199:
12198:
12197:
12191:
12178:
12174:
12150:
12142:
12134:
12131:
12128:
12120:
12116:
12104:
12101:
12090:
12084:
12081:
12078:
12075:
12062:
12058:
12054:
12048:
12042:
12035:
12034:
12033:
12013:
12007:
11994: =
11970:
11964:
11961:
11956:
11945:
11942:
11936:
11933:
11925:
11914:
11911:
11894:
11891:
11880:
11878:
11862:
11859:
11853:
11850:
11847:
11844:
11841:
11825:
11811:
11808:
11805:
11800:
11789:
11786:
11780:
11777:
11769:
11758:
11755:
11738:
11735:
11724:
11722:
11706:
11703:
11697:
11694:
11691:
11688:
11685:
11669:
11655:
11652:
11649:
11644:
11633:
11630:
11624:
11621:
11613:
11602:
11599:
11582:
11579:
11568:
11566:
11550:
11547:
11541:
11538:
11535:
11532:
11529:
11513:
11495:
11494:
11493:
11491:
11487:
11483:
11470: =
11469:
11456: =
11455:
11442: =
11441:
11437:
11433:
11429:
11425:
11421:
11416:
11414:
11407:
11400:
11393:
11388:
11386:
11382:
11378:
11374:
11364:
11362:
11358:
11354:
11350:
11331:
11328:
11317:
11313:
11309:
11305:
11301:
11297:
11294: =
11293:
11289:
11286: ≥
11285:
11279:
11269:
11265:
11261:
11257:
11238:
11235:
11224:
11220:
11219:
11213:
11209:
11203:
11184:
11166:
11162:
11157:
11148:
11144:
11143:cross entropy
11139:
11119:
11108:
11104:
11100:
11096:
11092:
11088:
11084:
11062:
11059:
11056:
11050:
11040:
11036:
11032:
11029:
11023:
11017:
11014:
11011:
11005:
10997:
10993:
10960:
10956:
10952:
10945:
10941:
10936:
10923:
10916:
10890:
10888:
10880:
10877:
10868:
10865:
10862:
10856:
10846:
10842:
10838:
10835:
10829:
10823:
10820:
10814:
10804:
10800:
10795:
10791:
10785:
10759:
10756:
10753:
10747:
10739:
10735:
10728:
10718:
10714:
10709:
10705:
10699:
10673:
10671:
10657:
10649:
10645:
10638:
10632:
10596:
10581:
10577:
10568:
10564:
10558:
10553:
10550:
10547:
10543:
10537:
10534:
10526:
10500:
10498:
10484:
10481:
10476:
10472:
10465:
10455:
10451:
10447:
10442:
10438:
10431:
10425:
10422:
10417:
10412:
10409:
10406:
10402:
10396:
10393:
10385:
10359:
10350:
10347:
10342:
10338:
10331:
10321:
10317:
10313:
10308:
10304:
10297:
10291:
10288:
10283:
10278:
10275:
10272:
10268:
10261:
10235:
10224:
10220:
10216:
10211:
10207:
10200:
10192:
10189:
10184:
10180:
10173:
10167:
10164:
10159:
10154:
10151:
10148:
10144:
10137:
10111:
10109:
10100:
10091:
10087:
10083:
10078:
10074:
10067:
10064:
10061:
10058:
10052:
10049:
10044:
10040:
10033:
10030:
10027:
10023:
10017:
10012:
10009:
10006:
10002:
9995:
9969:
9965:
9956:
9952:
9948:
9943:
9939:
9932:
9929:
9926:
9921:
9916:
9913:
9910:
9906:
9902:
9896:
9893:
9888:
9884:
9877:
9874:
9871:
9866:
9861:
9858:
9855:
9851:
9846:
9839:
9813:
9811:
9800:
9797:
9792:
9788:
9781:
9778:
9775:
9770:
9765:
9762:
9759:
9755:
9748:
9722:
9716:
9713:
9708:
9704:
9697:
9692:
9687:
9684:
9681:
9677:
9670:
9644:
9642:
9631:
9628:
9617:
9611:
9585:
9569:
9565:
9558:
9532:
9514:
9510:
9505:
9498:
9472:
9470:
9459:
9445:
9444:
9443:
9427:
9423:
9396:
9369:
9365:
9360:
9356:
9353:
9328:
9324:
9320:
9317:
9314:
9309:
9305:
9301:
9296:
9292:
9285:
9273:data samples
9272:
9269:
9264:
9263:
9260:
9257:
9256:
9253:
9235:
9231:
9226:
9199:
9176:
9152:
9148:
9143:
9134:
9111:
9105:
9078:
9049:
9032:
9030:
9010:
9004:
8970:
8946:
8931:
8925:
8911:
8908:
8905:
8899:
8878:
8852:
8843:
8835:
8834:
8833:
8809:
8803:
8786:
8782:
8775:
8760:
8756:
8752:
8749:
8743:
8730:
8724:
8721:
8716:
8712:
8705:
8691:
8690:
8689:
8688:
8683:
8670:
8664:
8660:
8651:if we decide
8634:
8631:
8626:
8622:
8615:
8605:
8599:
8596:
8585:
8554:
8550:
8541:if we decide
8519:
8516:
8511:
8507:
8500:
8490:
8484:
8481:
8470:
8456:
8455:
8454:
8434:
8431:
8421:
8415:
8402:
8399:
8388:
8368:
8364:
8357:
8331:
8328:
8321:
8320:
8319:
8302:
8298:
8294:
8288:
8284:
8255:
8251:
8226:
8217:
8207:
8203:
8196:
8185:
8178:
8168:
8164:
8157:
8124:
8120:
8110:
8109:
8108:
8105:
8101:
8099:
8089:
8072:
8066:
8036:
8033:
8028:
8024:
8020:
8017:
8014:
8009:
8005:
8001:
7996:
7992:
7985:
7962:
7956:
7922:
7916:
7903:
7900:
7895:
7891:
7887:
7884:
7881:
7876:
7872:
7868:
7863:
7859:
7852:
7823:
7819:
7815:
7812:
7809:
7804:
7800:
7796:
7791:
7787:
7780:
7746:
7740:
7702:
7698:
7694:
7691:
7688:
7683:
7679:
7675:
7670:
7666:
7659:
7644:
7638:
7625:
7622:
7617:
7613:
7609:
7606:
7603:
7598:
7594:
7590:
7585:
7581:
7574:
7568:
7560:
7556:
7552:
7549:
7546:
7541:
7537:
7533:
7528:
7524:
7520:
7517:
7511:
7497:
7496:
7495:
7485:
7481:
7477:
7474:
7470:
7467:
7466:most probable
7457:
7455:
7429:
7426:
7424:
7385:
7376:
7371:
7364:
7352:
7347:
7339:
7334:
7322:
7317:
7305:
7304:
7303:
7301:
7282:
7263:
7259:
7249:
7245:
7230:
7226:
7215:
7211:
7206:
7202:
7199:
7194:
7176:
7172:
7157:
7153:
7142:
7138:
7133:
7129:
7126:
7116:
7108:
7104:
7094:
7090:
7080:
7076:
7062:
7058:
7047:
7043:
7038:
7034:
7031:
7026:
7013:
7010:
6988:
6982:
6979:
6976:
6973:
6969:
6964:
6958:
6955:
6952:
6948:
6939:
6934:
6925:
6924:
6923:
6907:
6904:
6887:
6883:
6865:
6862:
6829:
6823:
6820:
6817:
6814:
6810:
6805:
6799:
6796:
6793:
6789:
6780:
6775:
6769:
6763:
6760:
6745:
6742:
6727:
6722:
6719:
6716:
6713:
6710:
6707:
6704:
6700:
6691:
6686:
6680:
6666:
6661:
6655:
6651:
6647:
6627:
6624:
6616:
6603:
6592:
6586:
6582:
6574:
6573:
6572:
6554:
6549:
6539:
6513:
6494:
6475:
6471:
6461:
6457:
6443:
6439:
6428:
6424:
6419:
6415:
6412:
6407:
6396:
6377:
6372:
6369:
6353:
6352:
6351:
6349:
6300:
6293:
6287:
6284:
6269:
6266:
6262:
6243:
6239:
6228:
6222:
6218:
6214:
6202:
6197:
6188:
6180:
6171:
6170:
6169:
6167:
6160:
6153:
6137:
6126:
6122:
6118:
6114:
6108:
6091:
6088:
6074:
6054:
6020:
6013:
6010:
5996:
5988:
5984:
5977:
5971:
5963:
5959:
5951:
5950:
5949:
5935:
5912:
5906:
5903:
5900:
5876:
5870:
5864:
5856:
5850:
5847:
5844:
5841:
5838:
5830:
5824:
5812:
5802:
5801:
5800:
5798:
5778:
5768:
5763:
5752:
5749:
5743:
5740:
5730:
5729:
5728:
5711:
5705:
5702:
5699:
5679:
5656:
5650:
5630:
5607:
5602:
5583:
5581:
5576:
5555:
5549:
5546:
5542:
5537:
5534:
5530:
5514:
5510:
5502:
5496:
5492:
5488:
5468:
5463:
5454:
5448:
5439:
5438:
5437:
5435:
5414:
5410:
5406:
5402:
5396:
5373:
5361:
5345:
5339:
5336:
5330:
5327:
5324:
5315:
5310:
5292:
5289:
5277:
5276:
5275:
5273:
5272:almost surely
5267:
5264:converges to
5248:
5243:
5231:
5212:
5209:
5206:
5197:
5192:
5180:
5158:
5145:
5136:
5128:
5122:
5119:
5113:
5110:
5107:
5098:
5093:
5085:
5076:
5073:
5061:
5060:
5058:
5040:
5034:
5031:
5019:
5013:
5010:
4997:
4994:
4991:
4985:
4982:
4979:
4965:
4964:
4957:
4954: |
4953:
4949:
4942:
4938:
4933:
4930:
4912:
4909:
4887:
4883:
4878:
4871:
4868:
4865:
4859:
4856:
4853:
4842:
4828:
4827:
4815:
4812: |
4811:
4807:
4801:
4774:
4770:
4767:
4763:
4759:
4758:
4756:
4753:
4746:
4741:
4737:
4731:
4727:
4723:
4719:
4716:
4709:
4702:
4695:
4691:
4672:
4664:
4660:
4656:
4653:
4647:
4644:
4638:
4635:
4632:
4626:
4616:
4612:
4608:
4605:
4598:
4597:
4595:
4592:
4591:
4589:
4586:
4584:
4583:
4561:
4557:
4553:
4549:
4543:
4518:
4514:
4510:
4506:
4500:
4477:
4472:
4468:
4454:
4429:
4424:
4412:
4411:
4410:
4408:
4404:
4403:almost surely
4385:
4380:
4376:
4362:
4337:
4332:
4320:
4319:
4318:
4316:
4298:
4293:
4281:
4274:
4270:
4249:
4245:
4241:
4237:
4231:
4223:
4210:
4207:
4203:
4199:
4195:
4192:
4177:
4157:
4149:
4148:
4121:
4112:
4109:
4100:
4074:
4068:
4065:
4062:
4042:
4019:
4013:
3993:
3967:
3956:
3953:
3950:
3947:
3946:
3945:
3943:
3937:
3916:
3913:
3908:
3904:
3897:
3894:
3891:
3884:
3874:
3868:
3862:
3839:
3833:
3830:
3825:
3821:
3814:
3811:
3808:
3803:
3798:
3795:
3792:
3788:
3782:
3779:
3774:
3768:
3765:
3761:
3752:
3747:
3736:
3735:
3734:
3732:
3713:
3710:
3706:
3697:
3692:
3681:
3677:
3673:
3663:
3661:
3651:
3649:
3644:
3621:
3601:
3595:
3580:
3552:
3546:
3542:
3538:
3535:
3532:
3527:
3523:
3519:
3514:
3510:
3505:
3500:
3497:
3471:
3467:
3464:
3458:
3452:
3432:
3429:
3426:
3420:
3400:
3394:
3385:
3379:
3371:
3358:
3357:
3356:
3355:
3350:
3348:
3315:
3279:
3260:
3248:
3223:
3219:
3203:
3192:
3188:
3184:
3181:
3178:
3173:
3169:
3165:
3160:
3156:
3147:
3143:
3139:
3134:
3130:
3106:
3091:
3073:
3067:
3063:
3059:
3056:
3053:
3048:
3044:
3040:
3035:
3031:
3026:
3022:
3017:
3013:
2988:
2984:
2980:
2977:
2974:
2969:
2966:
2963:
2959:
2955:
2950:
2946:
2942:
2939:
2936:
2931:
2927:
2923:
2918:
2914:
2889:
2885:
2881:
2878:
2875:
2870:
2866:
2862:
2857:
2853:
2843:
2838:
2825:
2819:
2816:
2810:
2804:
2794:
2771:belonging to
2758:
2738:
2730:
2699:
2683:
2665:
2658:
2650:
2646:
2642:
2639:
2636:
2630:
2622:
2618:
2614:
2608:
2600:
2596:
2591:
2587:
2581:
2575:
2551:
2544:
2540:
2537:
2531:
2525:
2521:
2516:
2506:
2503:
2500:
2497:
2493:
2489:
2479:
2478:
2477:
2475:
2472:, additional
2471:
2467:
2460:
2450:
2448:
2444:
2440:
2436:
2417:
2412:
2400:
2381:
2373:
2362:
2357:
2350:
2347:
2342:
2334:
2329:
2325:
2316:
2311:
2294:
2284:
2279:
2272:
2269:
2264:
2256:
2252:
2242:
2238:
2229:
2224:
2202:
2197:
2190:
2187:
2182:
2174:
2170:
2160:
2156:
2147:
2142:
2123:
2118:
2113:
2108:
2096:
2091:
2084:
2081:
2076:
2068:
2064:
2054:
2050:
2041:
2036:
2019:
2009:
2004:
1997:
1994:
1989:
1981:
1976:
1972:
1963:
1958:
1936:
1931:
1924:
1921:
1916:
1908:
1904:
1894:
1890:
1881:
1876:
1852:
1847:
1840:
1837:
1832:
1824:
1820:
1810:
1806:
1797:
1792:
1775:
1765:
1760:
1753:
1750:
1745:
1737:
1733:
1723:
1719:
1710:
1705:
1683:
1678:
1671:
1668:
1663:
1655:
1650:
1646:
1637:
1632:
1614:
1609:
1605:
1599:
1594:
1587:
1574:
1573:
1572:
1571:
1551:
1546:
1533:
1529:
1513:
1506:
1501:
1473:
1467:
1464:
1456:
1452:
1443:
1433:
1430:
1426:
1423:
1420:
1412:
1408:
1399:
1389:
1386:
1383:
1375:
1371:
1362:
1349:
1348:
1347:
1345:
1330:
1317:
1293:
1289:
1283:
1263:
1255:
1225:
1196:
1192:
1186:
1177:
1158:
1144:
1140:
1132:
1120:
1117:
1114:
1103:
1099:
1093:
1086:
1085:
1084:
1082:
1078:
1073:
1050:
1046:
1020:
1016:
1012:
1008:
1004:
979:
969:
964:
954:
926:
890:
874:
864:
857:
848:
819:
805:
801:
793:
774:
771:
744:
735:
725:
724:
723:
706:
697:
694:
689:
685:
654:
650:
643:
638:
635:
632:
628:
624:
618:
615:
602:
598:
590:
589:
588:
586:
567:
564:
551:
547:
538:
534:
515:
508:
505:
492:
488:
484:
473:
470:
462:
450:
444:
436:
420:
419:
418:
398:
394:
390:
387:
384:
379:
375:
371:
366:
362:
355:
341:
337:
336:
297:
287:
284:
281:
275:
272:
268:
262:
251:
225:
219:
215:
210:
207:
203:
198:
194:
189:
184:
180:
175:
170:
167:
158:
154:
150:
140:
138:
134:
130:
126:
122:
118:
113:
111:
107:
103:
99:
95:
90:
88:
84:
80:
76:
75:observed data
72:
68:
64:
60:
56:
52:
48:
44:
40:
33:
19:
24017:M-estimators
23988:
23976:
23957:
23950:
23862:Econometrics
23812: /
23795:Chemometrics
23772:Epidemiology
23765: /
23738:Applications
23580:ARIMA model
23527:Q-statistic
23476:Stationarity
23372:Multivariate
23315: /
23311: /
23309:Multivariate
23307: /
23247: /
23243: /
23017:Bayes factor
22916:Signed rank
22828:
22802:
22794:
22782:
22477:Completeness
22313:Cohort study
22211:Opinion poll
22146:Missing data
22133:Study design
22088:Scatter plot
22010:Scatter plot
22003:Spearman's ρ
21965:Grouped data
21666:. Retrieved
21661:
21636:
21619:Purcell, S.
21605:
21569:
21547:
21526:
21505:
21486:
21461:
21457:
21435:
21413:
21393:
21389:Cramer, J.S.
21352:
21348:
21338:
21311:
21307:
21297:
21278:
21275:Hald, Anders
21269:
21250:
21244:
21224:
21217:
21192:
21188:
21179:
21152:
21148:
21138:
21111:
21107:
21094:
21075:
21069:
21050:
21044:
21025:
21019:
21009:
20992:
20988:
20978:
20939:
20933:
20908:
20904:
20894:
20869:
20865:
20860:(Sep 1908).
20852:
20827:
20815:
20795:
20787:
20768:
20759:
20735:
20725:
20705:
20698:
20679:
20673:
20654:
20648:
20629:
20620:
20599:
20591:
20575:
20564:
20547:
20529:
20510:
20506:
20496:
20479:
20473:
20459:
20440:
20434:
20415:
20382:
20371:
20359:
20340:
20334:
20317:
20313:
20307:
20298:
20285:
20265:
20244:
20234:
20214:
20207:
20188:
20178:
20163:
20155:
20130:
20122:
20105:
20099:
20093:
20074:
20064:
20045:
20036:
20017:
20011:
19992:
19983:
19964:
19958:
19795:
19783:-distributed
19780:
19765:
19742:
19714:saddle point
19707:
19608:
19594:
19432:
19207:
18970:
18765:
18755:
18381:
18207:
18064:
17965:
17687:
17621:
17613:
17551:
17364:
17182:
17179:
16803:
16534:
16522:
16130:
15819:
15683:
15681:
15530:
15506:
15343:
15340:
15205:
15203:
15023:
14793:
14529:
14246:
14138:
13934:
13902:
13819:
13688:
13515:
13504:
13291:
13178:
13174:
13170:
13167:
12870:
12664:
12622:
12603:
12557:
12187:
12165:
12019:
12005:
11989:
11489:
11467:
11453:
11439:
11431:
11427:
11423:
11417:
11405:
11398:
11391:
11389:
11384:
11380:
11370:
11360:
11356:
11352:
11348:
11315:
11307:
11303:
11299:
11295:
11291:
11287:
11283:
11277:
11267:
11263:
11259:
11255:
11222:
11215:
11211:
11207:
11205:
11140:
11094:
11090:
11082:
10984:
9267:
9265:
9258:
9038:
8961:
8831:
8685:By applying
8684:
8540:
8452:
8274:
8106:
8102:
8095:
7725:
7463:
7453:
7427:
7422:
7400:
7299:
7297:
6885:
6881:
6845:
6552:
6545:
6509:
6315:
6158:
6097:
6089:
6046:
5892:
5794:
5589:
5574:
5571:
5388:
5265:
5176:
4955:
4951:
4947:
4940:
4936:
4813:
4809:
4805:
4773:neighborhood
4754:
4750:
4729:
4725:
4721:
4707:
4700:
4693:
4689:
4687:
4587:
4580:
4492:
4406:
4400:
4279:
4272:
4268:
4219:
4147:equivariance
4144:
3938:
3854:
3675:
3669:
3657:
3486:
3353:
3351:
2839:
2566:
2474:restrictions
2463:
2396:
1488:
1173:
1074:
1011:sample space
834:
721:
530:
333:
146:
114:
91:
46:
42:
36:
23990:WikiProject
23905:Cartography
23867:Jurimetrics
23819:Reliability
23550:Time domain
23529:(Ljung–Box)
23451:Time-series
23329:Categorical
23313:Time-series
23305:Categorical
23240:(Bernoulli)
23075:Correlation
23055:Correlation
22851:Jarque–Bera
22823:Chi-squared
22585:M-estimator
22538:Asymptotics
22482:Sufficiency
22249:Interaction
22161:Replication
22141:Effect size
22098:Violin plot
22078:Radar chart
22058:Forest plot
22048:Correlogram
21998:Kendall's τ
20513:: 101–117.
19892:M-estimator
17745:iteratively
15511:, even for
13720:sample mean
11373:unfair coin
8593: error
8478: error
8396: error
4216:Consistency
4145:functional
3949:Consistency
24006:Categories
23857:Demography
23575:ARMA model
23380:Regression
22957:(Friedman)
22918:(Wilcoxon)
22856:Normality
22846:Lilliefors
22793:Student's
22669:Resampling
22543:Robustness
22531:divergence
22521:Efficiency
22459:(monotone)
22454:Likelihood
22371:Population
22204:Stratified
22156:Population
21975:Dependence
21931:Count data
21862:Percentile
21839:Dispersion
21772:Arithmetic
21707:Statistics
21668:2021-03-06
21458:ISI Review
21432:King, Gary
19950:References
16525:principles
15826:given by:
14282:we obtain
13298:continuous
13292:Since the
12714:which has
11099:expectancy
7766:and where
7480:parameters
6094:Efficiency
4963:such that
4766:level sets
4699:such that
4536:. Rather,
4222:consistent
4194:Efficiency
3955:Invariance
3666:Properties
3312:is a real
2793:constraint
1019:continuous
1003:measurable
157:parameters
143:Principles
63:maximizing
55:parameters
51:estimating
39:statistics
23238:Logistic
23005:posterior
22931:Rank sum
22679:Jackknife
22674:Bootstrap
22492:Bootstrap
22427:Parameter
22376:Statistic
22171:Statistic
22083:Run chart
22068:Pie chart
22063:Histogram
22053:Fan chart
22028:Bar chart
21910:L-moments
21797:Geometric
21612:EMS Press
21028:: 60–62.
21001:0883-4237
20551:cmplx96 (
19768:heuristic
19759:. It was
19677:^
19674:θ
19647:
19631:θ
19567:−
19498:ℓ
19495:∇
19492:−
19460:ℓ
19457:∇
19313:−
19173:−
19077:γ
19036:ℓ
19033:∇
19030:−
18998:ℓ
18995:∇
18920:γ
18875:γ
18871:−
18810:γ
18806:−
18727:^
18724:θ
18697:−
18673:θ
18670:∂
18654:θ
18648:ℓ
18645:∂
18631:θ
18628:∂
18612:θ
18606:ℓ
18603:∂
18580:∑
18560:−
18547:^
18544:θ
18471:^
18468:θ
18453:−
18411:^
18408:θ
18360:^
18357:θ
18328:^
18325:θ
18310:−
18295:−
18282:^
18279:θ
18228:η
18171:^
18168:θ
18156:ℓ
18153:∇
18140:^
18137:θ
18086:∈
18077:η
18032:η
17996:^
17993:θ
17944:^
17941:θ
17913:η
17897:^
17894:θ
17869:^
17866:θ
17822:^
17819:θ
17782:^
17779:θ
17755:θ
17717:^
17714:θ
17702:^
17699:θ
17664:θ
17661:∂
17645:θ
17639:ℓ
17636:∂
17570:^
17505:∑
17501:−
17490:λ
17468:…
17436:ℓ
17427:λ
17408:…
17340:
17307:∑
17287:
17264:∑
17260:−
17251:
17226:…
17194:ℓ
17143:⋯
17074:…
17008:∏
16989:∏
16956:…
16927:∣
16911:…
16843:…
16745:⋯
16653:⋯
16571:…
16485:σ
16464:μ
16460:−
16429:σ
16419:σ
16404:μ
16400:−
16375:μ
16371:−
16355:ρ
16346:−
16330:σ
16309:μ
16305:−
16269:ρ
16265:−
16247:−
16239:
16222:ρ
16218:−
16204:σ
16194:σ
16190:π
16133:bivariate
16092:μ
16088:−
16072:…
16060:μ
16056:−
16032:−
16026:Σ
16009:μ
16005:−
15989:…
15977:μ
15973:−
15945:−
15937:
15921:Σ
15890:π
15856:…
15804:Σ
15769:μ
15762:…
15750:μ
15710:…
15466:^
15463:σ
15456:π
15447:
15422:−
15399:^
15396:σ
15384:^
15381:μ
15358:
15309:^
15306:σ
15293:^
15290:μ
15273:^
15268:θ
15232:σ
15225:μ
15216:θ
15186:^
15183:σ
15151:^
15148:σ
15124:σ
15101:^
15098:σ
15069:σ
15039:^
15036:σ
15000:σ
14986:−
14956:^
14953:σ
14938:
14902:σ
14876:δ
14864:
14821:δ
14809:
14767:δ
14763:−
14760:μ
14745:δ
14741:−
14738:μ
14715:∑
14694:∑
14673:−
14654:δ
14650:−
14647:μ
14624:∑
14598:^
14595:σ
14558:−
14555:μ
14552:≡
14543:δ
14475:∑
14454:∑
14433:−
14398:∑
14368:¯
14359:−
14326:∑
14300:^
14297:σ
14267:^
14264:μ
14255:μ
14219:μ
14216:−
14183:∑
14157:^
14154:σ
14106:μ
14103:−
14069:∑
14057:σ
14043:σ
14031:−
14009:σ
14002:μ
13982:
13973:σ
13970:∂
13966:∂
13917:^
13914:μ
13884:μ
13867:^
13864:μ
13850:
13764:∑
13754:¯
13739:^
13736:μ
13703:¯
13658:σ
13645:μ
13642:−
13636:¯
13618:−
13611:−
13586:σ
13579:μ
13559:
13550:μ
13547:∂
13543:∂
13479:μ
13476:−
13442:∑
13429:σ
13416:−
13404:σ
13400:π
13391:
13371:−
13349:σ
13342:μ
13322:
13294:logarithm
13268:σ
13261:μ
13258:∣
13242:…
13208:σ
13201:μ
13173: = (
13136:σ
13117:μ
13114:−
13081:∑
13074:−
13066:
13033:σ
13029:π
13000:σ
12993:μ
12990:∣
12954:∏
12938:σ
12931:μ
12928:∣
12912:…
12839:σ
12820:μ
12817:−
12805:−
12797:
12777:σ
12773:π
12747:σ
12740:μ
12737:∣
12693:σ
12686:μ
12522:−
12498:−
12458:−
12449:−
12419:−
12377:−
12355:−
12339:−
12279:−
12227:∂
12223:∂
12132:−
12082:∣
12008:for
11962:≈
11937:−
11848:∣
11826:
11806:≈
11781:−
11692:∣
11670:
11650:≈
11625:−
11536:∣
11514:
Properties

A maximum likelihood estimator is an extremum estimator obtained by maximizing, as a function of θ, the objective function

  ℓ̂(θ; x) = (1/n) Σ_{i=1}^n ln f(x_i; θ),

the sample analogue of the expected log-likelihood ℓ(θ) = E[ ln f(x_i; θ) ], where the expectation is taken with respect to the true density.

Consistency

Under the conditions below, the maximum likelihood estimator is consistent: as the sample size increases to infinity, the sequence of estimates converges to the value being estimated,

  θ̂_mle → θ_0  in probability,

and under slightly stronger conditions the convergence holds almost surely (strongly),

  θ̂_mle → θ_0  almost surely.

The following conditions are sufficient for consistency:

1. Identification: θ ≠ θ_0 ⇔ f(· | θ) ≠ f(· | θ_0). Distinct parameter values must correspond to distinct distributions; otherwise θ_0 could not be recovered even from an infinite amount of data.
2. Compactness: the parameter space Θ is compact. (Compactness can be replaced by other conditions, such as concavity of the log-likelihood on an open convex Θ.)
3. Continuity: ln f(x | θ) is continuous in θ for almost all x.
4. Dominance: there exists an integrable function D(x) such that | ln f(x | θ) | < D(x) for all θ ∈ Θ.

By the uniform law of large numbers, dominance and continuity give uniform convergence of the sample log-likelihood to its expectation,

  sup_{θ ∈ Θ} | ℓ̂(θ | x) − ℓ(θ) | → 0  in probability,

which together with identification and compactness implies consistency of θ̂.

Functional invariance

The maximum likelihood estimator selects the parameter value that gives the observed data the largest possible probability, and this property carries over under transformations: if α = g(θ), then the maximum likelihood estimator of α is

  α̂ = g(θ̂),

where θ̂ is the MLE of θ. If g is one-to-one the result is immediate; if g is not one-to-one it holds for the induced (profile) likelihood

  L̄(α) = sup_{θ : g(θ) = α} L(θ),

which is maximized at α̂ = g(θ̂).

Efficiency

Under regularity conditions, the maximum likelihood estimator is asymptotically normal:

  √n (θ̂_mle − θ_0) → N(0, I^{−1})  in distribution,

where I = I(θ_0) is the Fisher information matrix with entries

  I_{jk} = E[ −∂² ln f_{θ_0}(X_t) / ∂θ_j ∂θ_k ].

The asymptotic covariance attains the Cramér–Rao lower bound, so no consistent estimator has lower asymptotic mean squared error: the MLE is asymptotically efficient. In finite samples, however, the MLE is in general biased; the bias is of order 1/n and can be estimated from higher-order derivatives of the log-likelihood, and subtracting this estimate, θ̂*_mle = θ̂_mle − b̂, yields an estimator that is efficient to second order.
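Consistency and functional invariance are easy to observe in simulation. A minimal sketch, assuming normally distributed data (the true σ = 1.5, the seed, and the sample sizes are arbitrary choices): the MLE of σ² is computed directly, and by invariance its square root is the MLE of σ, which settles near the truth as n grows.

```python
import math
import random

random.seed(42)
true_sigma = 1.5
estimates = {}

for n in (100, 100_000):
    xs = [random.gauss(0.0, true_sigma) for _ in range(n)]
    mu_hat = sum(xs) / n                              # MLE of the mean
    var_hat = sum((x - mu_hat) ** 2 for x in xs) / n  # MLE of sigma^2
    estimates[n] = math.sqrt(var_hat)                 # invariance: MLE of sigma

print(estimates)  # the estimate of sigma concentrates around 1.5 as n grows
```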
Relation to Bayesian inference

A maximum likelihood estimator coincides with the most probable Bayesian estimator given a uniform prior distribution on the parameters. Indeed, the maximum a posteriori (MAP) estimate is the parameter θ that maximizes the posterior given by Bayes' theorem:

  P(θ | x_1, x_2, …, x_n) = f(x_1, x_2, …, x_n | θ) P(θ) / P(x_1, x_2, …, x_n),

where P(θ) is the prior distribution of the parameter and P(x_1, x_2, …, x_n) is the probability of the data averaged over all parameters. Since the denominator does not depend on θ, the MAP estimate maximizes f(x_1, …, x_n | θ) P(θ); and when the prior P(θ) is uniform ("flat"), this is the same as maximizing the likelihood alone, so the MAP estimate coincides with the maximum likelihood estimate.

In the language of decision theory, consider deciding between two hypotheses w_1 and w_2 given data x. The rule that minimizes the probability of error is "decide w_1 if P(w_1 | x) > P(w_2 | x); otherwise decide w_2", where by Bayes' theorem

  P(w_i | x) = P(x | w_i) P(w_i) / P(x),

and the Bayes decision maximizes h_Bayes = argmax_w [ P(x | w) P(w) ]. With equal prior probabilities P(w_1) = P(w_2), this reduces to comparing the likelihoods P(x | w_1) and P(x | w_2), that is, to a maximum likelihood decision.

Relation to minimizing Kullback–Leibler divergence

Finding θ̂ that maximizes the likelihood is asymptotically equivalent to finding the θ̂ that defines a probability distribution P_θ̂ with minimal Kullback–Leibler divergence from the real probability distribution P_{θ_0} that generated the data.

Proof. Let y = (y_1, y_2, …, y_n) be n i.i.d. observations, y ∼ P_{θ_0}. Then:

  θ̂ = argmax_θ L_{P_θ}(y) = argmax_θ P_θ(y) = argmax_θ P(y | θ)
    = argmax_θ ∏_{i=1}^n P(y_i | θ) = argmax_θ Σ_{i=1}^n log P(y_i | θ)
    = argmax_θ [ Σ_{i=1}^n log P(y_i | θ) − Σ_{i=1}^n log P(y_i | θ_0) ]   (the subtracted sum does not depend on θ)
    = argmax_θ Σ_{i=1}^n [ log P(y_i | θ) − log P(y_i | θ_0) ]
    = argmax_θ Σ_{i=1}^n log [ P(y_i | θ) / P(y_i | θ_0) ]
    = argmin_θ Σ_{i=1}^n log [ P(y_i | θ_0) / P(y_i | θ) ] = argmin_θ (1/n) Σ_{i=1}^n log [ P(y_i | θ_0) / P(y_i | θ) ]
    = argmin_θ (1/n) Σ_{i=1}^n h_θ(y_i)   ⟶ (n → ∞)   argmin_θ E[ h_θ(y) ]
    = argmin_θ ∫ P_{θ_0}(y) h_θ(y) dy = argmin_θ ∫ P_{θ_0}(y) log [ P(y | θ_0) / P(y | θ) ] dy
    = argmin_θ D_KL( P_{θ_0} ‖ P_θ ),

where

  h_θ(x) = log [ P(x | θ_0) / P(x | θ) ].

Using h makes the role of the law of large numbers visible: the average of h_θ(y_i) over a sample drawn from P_{θ_0} converges to its expectation under P_{θ_0}, which is exactly the Kullback–Leibler divergence from P_θ to P_{θ_0}.
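The limiting step in the proof can be checked numerically: for Bernoulli distributions the Kullback–Leibler divergence has a closed form, and the sample average of h_θ(y_i) converges to it. A small Monte Carlo sketch (the particular values θ_0 = 0.7, θ = 0.4, the seed, and the sample size are arbitrary):

```python
import math
import random

random.seed(0)
theta0, theta = 0.7, 0.4   # true and candidate Bernoulli parameters
n = 200_000
ys = [1 if random.random() < theta0 else 0 for _ in range(n)]

def log_pmf(y, q):
    return math.log(q if y == 1 else 1 - q)

# (1/n) * sum of h_theta(y_i): the average log-likelihood ratio
avg_h = sum(log_pmf(y, theta0) - log_pmf(y, theta) for y in ys) / n

# closed-form D_KL(P_theta0 || P_theta) for Bernoulli distributions
kl = (theta0 * math.log(theta0 / theta)
      + (1 - theta0) * math.log((1 - theta0) / (1 - theta)))

print(round(avg_h, 3), round(kl, 3))  # the two agree to a few decimals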
Examples

Discrete distribution, finite parameter space

Consider tossing an unfair coin 80 times, yielding a sample such as x_1 = H, x_2 = T, …, x_80 = T, and suppose the count is 49 heads ('H') and 31 tails ('T'). Suppose further that the probability p of tossing a head is known to be one of 1/3, 1/2, or 2/3, so the parameter space is { 1/3, 1/2, 2/3 }. The probability of observing 49 heads in 80 tosses is

  P[ H = 49 | p ] = C(80, 49) p^49 (1 − p)^31,

where C(n, m) = n! / ( m! (n − m)! ) is the binomial coefficient. Evaluating at the three candidate values gives

  P[ H = 49 | p = 1/3 ] = C(80, 49) (1/3)^49 (1 − 1/3)^31 ≈ 0.000,
  P[ H = 49 | p = 1/2 ] = C(80, 49) (1/2)^49 (1 − 1/2)^31 ≈ 0.012,
  P[ H = 49 | p = 2/3 ] = C(80, 49) (2/3)^49 (1 − 2/3)^31 ≈ 0.054.

The likelihood is maximized at p = 2/3, so p̂ = 2/3 is the maximum likelihood estimate over this parameter space.

Discrete distribution, continuous parameter space

Now suppose only that 0 ≤ p ≤ 1. The likelihood to be maximized is

  L(p) = f_D(H = 49 | p) = C(80, 49) p^49 (1 − p)^31,  0 ≤ p ≤ 1.

Differentiating with respect to p and setting the derivative to zero,

  0 = ∂/∂p [ C(80, 49) p^49 (1 − p)^31 ]
    = C(80, 49) [ 49 p^48 (1 − p)^31 − 31 p^49 (1 − p)^30 ]
    = C(80, 49) p^48 (1 − p)^30 [ 49 (1 − p) − 31 p ].

The solutions are p = 0, p = 1, and p = 49/80; since the likelihood vanishes at the endpoints and is positive between them, the maximum likelihood estimate is p̂ = 49/80. More generally, for n tosses with s heads the MLE is p̂ = s/n.

Continuous distribution, continuous parameter space

For the normal distribution N(μ, σ²), with probability density function

  f(x | μ, σ²) = ( 1 / √(2πσ²) ) exp( −(x − μ)² / (2σ²) ),

the density of an i.i.d. sample of n observations is

  f(x_1, …, x_n | μ, σ²) = ∏_{i=1}^n f(x_i | μ, σ²) = ( 1 / (2πσ²) )^{n/2} exp( − Σ_{i=1}^n (x_i − μ)² / (2σ²) ).

With θ = (μ, σ²), the likelihood is L(μ, σ²) = f(x_1, …, x_n | μ, σ²), and its logarithm is

  log L(μ, σ²) = −(n/2) log(2πσ²) − (1/(2σ²)) Σ_{i=1}^n (x_i − μ)².

Setting the partial derivative with respect to μ to zero,

  0 = ∂/∂μ log L(μ, σ²) = − [ −2n (x̄ − μ) ] / (2σ²),

which is solved by the sample mean

  μ̂ = x̄ = (1/n) Σ_{i=1}^n x_i.

This is indeed the maximum of the function, since it is the only turning point in μ and the second derivative is strictly negative. Its expectation is E[μ̂] = μ, so μ̂ is unbiased. Similarly, differentiating with respect to σ,

  0 = ∂/∂σ log L(μ, σ²) = −n/σ + (1/σ³) Σ_{i=1}^n (x_i − μ)²,

which, after substituting μ̂, is solved by

  σ̂² = (1/n) Σ_{i=1}^n (x_i − x̄)² = (1/n) Σ_{i=1}^n x_i² − (1/n²) Σ_{i=1}^n Σ_{j=1}^n x_i x_j.

Writing δ_i ≡ μ − x_i and using E[δ_i] = 0 and E[δ_i²] = σ², a direct expansion gives

  E[σ̂²] = ((n − 1)/n) σ²,

so σ̂² is a biased estimator of σ²; the bias vanishes as n → ∞, and the estimator is consistent. Formally, the maximum likelihood estimator for θ = (μ, σ²) is θ̂ = (μ̂, σ̂²), and by invariance the MLE of σ is σ̂ = √(σ̂²). Inserting the estimates back gives the maximized log-likelihood

  log L(μ̂, σ̂) = (−n/2) [ log(2π σ̂²) + 1 ].

Non-independent variables

It may be that variables are correlated, that is, not independent: two random variables y_1 and y_2 are independent only if their joint density factorizes, f(y_1, y_2) = f(y_1) f(y_2). Otherwise the likelihood must be built from the joint density. Suppose one constructs an order-n Gaussian vector out of random variables (y_1, …, y_n) with mean vector (μ_1, …, μ_n) and covariance matrix Σ. The joint density is

  f(y_1, …, y_n) = [ 1 / ( (2π)^{n/2} √(det Σ) ) ] exp( −(1/2) [y_1 − μ_1, …, y_n − μ_n] Σ^{−1} [y_1 − μ_1, …, y_n − μ_n]^T ).

In the bivariate case, with correlation coefficient ρ between y_1 and y_2, the joint density is

  f(y_1, y_2) = [ 1 / ( 2π σ_1 σ_2 √(1 − ρ²) ) ] exp[ −(1/(2(1 − ρ²))) ( (y_1 − μ_1)²/σ_1² − 2ρ (y_1 − μ_1)(y_2 − μ_2)/(σ_1 σ_2) + (y_2 − μ_2)²/σ_2² ) ],

and maximum likelihood estimation proceeds with this joint density in place of the product of marginals.

Example. Suppose n items are classified into m categories, with X_i the count in category i and p_i the probability that an item falls in category i, subject to the constraints

  x_1 + x_2 + … + x_m = n  and  p_1 + p_2 + … + p_m = 1.

The joint probability mass function of (X_1, …, X_m) is multinomial,

  f(x_1, x_2, …, x_m | p_1, p_2, …, p_m) = ( n! / ∏_i x_i! ) ∏_i p_i^{x_i} = ( n choose x_1, x_2, …, x_m ) p_1^{x_1} p_2^{x_2} ⋯ p_m^{x_m},

with log-likelihood

  ℓ(p_1, p_2, …, p_m) = log n! − Σ_{i=1}^m log x_i! + Σ_{i=1}^m x_i log p_i.

The constraint is incorporated with a Lagrange multiplier:

  L(p_1, p_2, …, p_m, λ) = ℓ(p_1, p_2, …, p_m) + λ ( 1 − Σ_{i=1}^m p_i ).

Setting ∂L/∂p_i = 0 gives x_i / p_i = λ for every i, and summing over i together with the constraints shows λ = n, so the maximum likelihood estimates are the observed proportions

  p̂_i = x_i / n.
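The coin-tossing numbers above can be reproduced directly. A minimal Python check of the three candidate likelihoods and the continuous-case maximizer (requires Python 3.8+ for `math.comb`):

```python
from math import comb

def likelihood(p, heads=49, tosses=80):
    # binomial probability of exactly `heads` heads in `tosses` tosses
    return comb(tosses, heads) * p**heads * (1 - p)**(tosses - heads)

for p in (1/3, 1/2, 2/3):
    print(f"P(H = 49 | p = {p:.4f}) = {likelihood(p):.3f}")
    # prints approximately 0.000, 0.012, and 0.054 respectively

p_hat = 49 / 80   # closed-form MLE over the full interval [0, 1]
print(p_hat, likelihood(p_hat))  # the unrestricted maximizer beats all three
```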
Iterative procedures

Except in special cases, the likelihood equations

  ∂ℓ(θ; y)/∂θ = 0

cannot be solved explicitly, and the maximum likelihood estimator must be found numerically. Starting from an initial guess of θ (say θ̂_1), one generates a convergent sequence { θ̂_r } by

  θ̂_{r+1} = θ̂_r + η_r d_r(θ̂),

where d_r(θ̂) indicates the descent direction of the rth step and the scalar η_r > 0 is the step size, also known as the learning rate. Common choices of the direction include the following.

Gradient descent. Take d_r(θ̂) = ∇ℓ(θ̂_r; y), the gradient of the log-likelihood. Gradient descent requires only first derivatives, but it typically converges more slowly than methods that use curvature information.

Newton–Raphson. Take

  d_r(θ̂) = −H_r^{−1}(θ̂) s_r(θ̂),

where s_r(θ̂) is the gradient (the score) and H_r^{−1}(θ̂) is the inverse of the Hessian matrix of the log-likelihood, both evaluated at the rth iterate. Each step solves a local quadratic approximation of ℓ, so convergence near a well-behaved maximum is fast, but computing and inverting the exact Hessian can be expensive. A related choice (the Berndt–Hall–Hall–Hausman approach) replaces the Hessian with the negative outer product of the per-observation gradients,

  d_r(θ̂) = − [ (1/n) Σ_{t=1}^n (∂ ln f(θ; y_t)/∂θ) (∂ ln f(θ; y_t)/∂θ)^T ]^{−1} s_r(θ̂).

Quasi-Newton methods avoid the exact Hessian by building an approximation from successive gradients. The Davidon–Fletcher–Powell (DFP) formula updates an approximation H_k of the inverse Hessian:

  H_{k+1} = ( I − γ_k y_k s_k^T ) H_k ( I − γ_k s_k y_k^T ) + γ_k y_k y_k^T,

where

  y_k = ∇ℓ(x_k + s_k) − ∇ℓ(x_k),  γ_k = 1 / (y_k^T s_k),  s_k = x_{k+1} − x_k.

The BFGS method instead updates an approximation B_k of the Hessian itself:

  B_{k+1} = B_k + (y_k y_k^T) / (y_k^T s_k) − (B_k s_k s_k^T B_k^T) / (s_k^T B_k s_k),

with s_k = x_{k+1} − x_k and y_k = ∇ℓ(x_k + s_k) − ∇ℓ(x_k) as above. BFGS is not guaranteed to converge unless the function has a quadratic Taylor expansion near an optimum, but it often performs acceptably even on non-smooth problems.

Fisher scoring. Another popular variant replaces the observed Hessian in the Newton–Raphson step with its expected value, the information matrix I(θ) = E[ H_r(θ̂) ], which is often simpler to compute.
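For a single parameter the Newton–Raphson step is easy to write out by hand. A sketch for the Bernoulli log-likelihood of the 49-heads-in-80-tosses sample (the starting value 0.5 and the iteration count are arbitrary choices):

```python
def score(p, heads=49, tosses=80):
    # first derivative of the Bernoulli log-likelihood
    return heads / p - (tosses - heads) / (1 - p)

def hessian(p, heads=49, tosses=80):
    # second derivative; negative, since the log-likelihood is concave here
    return -heads / p**2 - (tosses - heads) / (1 - p)**2

p = 0.5  # initial guess
for _ in range(10):
    p -= score(p) / hessian(p)  # Newton-Raphson update

print(p)  # agrees with the closed-form MLE 49/80 = 0.6125
```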
History

Maximum-likelihood estimation was recommended, analyzed, and vastly popularized by R. A. Fisher in a series of papers between 1912 and 1922, although the method had been used earlier by Carl Friedrich Gauss, Pierre-Simon Laplace, Thorvald N. Thiele, and Francis Ysidro Edgeworth. Its asymptotic theory was developed further by later authors; in particular, Samuel S. Wilks showed in 1938 that for large samples the log-likelihood ratio is asymptotically χ²-distributed, which makes approximate confidence regions for the parameters possible.

See also

- RANSAC, a method for estimating the parameters of a mathematical model from data that contains outliers.