Empirical Bayes methods enable the use of auxiliary empirical data, from observations of related parameters, in the development of a Bayes estimator. This is done under the assumption that the estimated parameters are obtained from a common prior. For example, if independent observations of different parameters are performed, then the estimation performance of a particular parameter can sometimes be improved by using data from other observations.

, where W is the weighted rating and C is the average rating of all films. So, in simpler terms, the fewer ratings/votes cast for a film, the more that film's Weighted Rating will skew towards the average across all films, while films with many ratings/votes will have a rating approaching its pure arithmetic average rating.
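As a small sketch (our own illustrative code, not IMDb's implementation; the values of C and m follow the ones quoted in this section), the weighted-rating rule can be written as:

```python
def weighted_rating(R, v, C, m):
    """Weighted rating W = (R*v + C*m) / (v + m): the film's mean rating R
    (from v votes) shrunk toward the all-film mean C with prior weight m votes."""
    return (R * v + C * m) / (v + m)

# With few votes the weighted rating stays near the all-film mean C;
# with many votes it approaches the film's own mean R.
few_votes = weighted_rating(R=9.8, v=50, C=7.0, m=25000)
many_votes = weighted_rating(R=9.2, v=500000, C=7.0, m=25000)
```

With v = 0 the rating is exactly C, illustrating the skew toward the global average described above.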
Risk functions are chosen depending on how one measures the distance between the estimate and the unknown parameter. The MSE is the most common risk function in use, primarily due to its simplicity. However, alternative risk functions are also occasionally used. The following are several examples of
For example, if Σ = σ/2, then the deviation of 4 measurements combined matches the deviation of the prior (assuming that errors of measurements are independent). And the weights α, β in the formula for the posterior match this: the weight of the prior is 4 times the weight of the measurement. Combining
The use of an improper prior means that the Bayes risk is undefined (since the prior is not a probability distribution and we cannot take an expectation under it). As a consequence, it is no longer meaningful to speak of a Bayes estimator that minimizes the Bayes risk. Nevertheless, in many cases,
By contrast, generalized Bayes rules often have undefined Bayes risk in the case of improper priors. These rules are often inadmissible and the verification of their admissibility can be difficult. For example, the generalized Bayes estimator of a location parameter θ based on Gaussian samples
Conjugate priors are especially useful for sequential estimation, where the posterior of the current measurement is used as the prior in the next measurement. In sequential estimation, unless a conjugate prior is used, the posterior distribution typically becomes more complex with each added
, for which the resulting posterior distribution also belongs to the same family. This is an important property, since the Bayes estimator, as well as its statistical properties (variance, confidence interval, etc.), can all be derived from the posterior distribution.
{\displaystyle L(\theta ,{\widehat {\theta }})={\begin{cases}a|\theta -{\widehat {\theta }}|,&{\mbox{for }}\theta -{\widehat {\theta }}\geq 0\\b|\theta -{\widehat {\theta }}|,&{\mbox{for }}\theta -{\widehat {\theta }}<0\end{cases}}}
is typically well-defined and finite. Recall that, for a proper prior, the Bayes estimator minimizes the posterior expected loss. When the prior is improper, an estimator which minimizes the posterior expected loss is referred to as a
Compare to the example of the binomial distribution: there the prior has the weight of (σ/Σ)²−1 measurements. One can see that the exact weight does depend on the details of the distribution, but when σ ≫ Σ, the difference becomes small.
, since Bayes' theorem can only be applied when all distributions are proper. However, it is not uncommon for the resulting "posterior" to be a valid probability distribution. In this case, the posterior expected loss
be a sequence of Bayes estimators of θ based on an increasing number of measurements. We are interested in analyzing the asymptotic performance of this sequence of estimators, i.e., the performance of
which is claimed to give "a true
Bayesian estimate". The following Bayesian formula was initially used to calculate a weighted average score for the Top 250, though the formula has since changed:
bits of the new information. In applications, one often knows very little about fine details of the prior distribution; in particular, there is no reason to assume that it coincides with B(
{\displaystyle L(\theta ,{\widehat {\theta }})={\begin{cases}0,&{\mbox{for }}|\theta -{\widehat {\theta }}|<K\\L,&{\mbox{for }}|\theta -{\widehat {\theta }}|\geq K.\end{cases}}}
, the confidence of the average rating surpasses the confidence of the mean vote for all films (C), and the weighted Bayesian rating (W) approaches a straight average (R). The closer
) exactly. In such a case, one possible interpretation of this calculation is: "there is a non-pathological prior distribution with the mean value 0.5 and the standard deviation
, with weights in this weighted average being α = σ², β = Σ². Moreover, the squared posterior deviation is Σ²σ²/(Σ² + σ²). In other words, the prior is combined with the measurement in
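The prior–measurement combination described here can be sketched as follows (a standard conjugate normal update; the function and variable names are ours):

```python
def combine_normal(B, Sigma, b, sigma):
    """Combine a prior N(B, Sigma^2) with one measurement b having noise
    deviation sigma: the posterior mean is the weighted average with
    weights alpha = sigma^2 (on the prior) and beta = Sigma^2 (on the data)."""
    alpha, beta = sigma ** 2, Sigma ** 2
    mean = (alpha * B + beta * b) / (alpha + beta)
    var = (alpha * beta) / (alpha + beta)  # = Sigma^2*sigma^2 / (Sigma^2 + sigma^2)
    return mean, var

# With Sigma = sigma/2, the prior carries 4 times the weight of one measurement:
mean, var = combine_normal(B=0.0, Sigma=0.5, b=1.0, sigma=1.0)
```

Here the prior's weight α/(α+β) is 0.8, four times the measurement's 0.2, matching the 4-measurements intuition in the text.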
IMDb's approach ensures that a film with only a few ratings, all at 10, would not rank above "The Godfather", for example, with a 9.2 average from over 500,000 ratings.
is small, the prior information is still relevant to the decision problem and affects the estimate. To see the relative weight of the prior information, assume that
If a Bayes rule is unique then it is admissible. For example, as stated above, under mean squared error (MSE) the Bayes rule is unique and therefore admissible.
If θ belongs to a continuous (non-discrete) set, and if the risk function R(θ,δ) is continuous in θ for every δ, then all Bayes rules are admissible.
Another example of the same phenomenon is the case when the prior estimate and a measurement are normally distributed. If the prior is centered at
= weight given to the prior estimate (in this case, the number of votes IMDb deemed necessary for the average rating to approach statistical validity)
, i.e., a prior distribution which does not imply a preference for any particular value of the unknown parameter. One can still define a function
, of all real numbers) for which every real number is equally likely. Yet, in some sense, such a "distribution" seems like a natural choice for a
; in particular, the prior plays the same role as 4 measurements made in advance. In general, the prior has the weight of (σ/Σ)² measurements.
; in this case each measurement brings in 1 new bit of information; the formula above shows that the prior information has the same weight as
, the effect of the prior probability on the posterior is negligible. Moreover, if δ is the Bayes estimator under MSE risk, then it is
) where θ denotes the probability for success. Assuming θ is distributed according to the conjugate prior, which in this case is the
{\displaystyle {\widehat {\theta }}(x)={\frac {\sigma ^{2}}{\sigma ^{2}+\tau ^{2}}}\mu +{\frac {\tau ^{2}}{\sigma ^{2}+\tau ^{2}}}x.}
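This shrinkage formula can be sketched directly (our own illustrative code; the parameter names mirror the symbols in the formula):

```python
def normal_posterior_mean(x, mu, tau, sigma):
    """Bayes estimate for x|θ ~ N(θ, sigma^2) with prior θ ~ N(mu, tau^2):
    a weighted average of the prior mean mu and the observation x."""
    w = sigma ** 2 / (sigma ** 2 + tau ** 2)  # weight on the prior mean
    return w * mu + (1 - w) * x
```

When τ is large relative to σ (a diffuse prior), the estimate approaches the observation x itself; when σ is large, it stays near the prior mean μ.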
, or a point close to it depending on the curvature and properties of the posterior distribution. Small values of the parameter
if it minimizes the Bayes risk among all estimators. Equivalently, the estimator which minimizes the posterior expected loss
(MLE). The relations between the maximum likelihood and Bayes estimators can be shown in the following simple example.
However, occasionally this can be a restrictive requirement. For example, there is no distribution (covering the set,
{\displaystyle E=\int {L(a-\theta )p(\theta |x)d\theta }={\frac {1}{p(x)}}\int L(a-\theta )f(x-\theta )d\theta .}
is sometimes chosen for simplicity. A conjugate prior is defined as a prior distribution belonging to some
measurement, and the Bayes estimator cannot usually be calculated without resorting to numerical methods.
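As an illustration (our own sketch, using the Beta–Bernoulli conjugate pair), sequential estimation with a conjugate prior reduces to a trivial parameter update at each step:

```python
def update_beta(a, b, x):
    """One Bernoulli observation x in {0, 1} turns a Beta(a, b) prior into
    the Beta(a + x, b + 1 - x) posterior, which becomes the next prior."""
    return a + x, b + 1 - x

a, b = 1, 1  # start from the uniform prior Beta(1, 1)
for x in [1, 0, 1, 1]:
    a, b = update_beta(a, b, x)
posterior_mean = a / (a + b)  # Bayes estimator under MSE after 4 observations
```

Because the posterior stays in the Beta family, no numerical integration is ever needed, in contrast to the non-conjugate case described above.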
{\displaystyle \int L(a-\theta )f(x_{1}-\theta )d\theta =\int L(a-x_{1}-\theta ')f(-\theta ')d\theta '.}
{\displaystyle {\sqrt {n}}(\delta _{n}-\theta _{0})\to N\left(0,{\frac {1}{I(\theta _{0})}}\right),}
{\displaystyle {\widehat {\sigma }}_{m}^{2}={\frac {1}{n}}\sum {(x_{i}-{\widehat {\mu }}_{m})^{2}}.}
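The moment estimates used here (the marginal mean μ̂ₘ, the marginal variance σ̂ₘ², and the prior variance σ̂π² = σ̂ₘ² − K) can be sketched as follows (our own illustrative code, assuming the sampling variance K is known):

```python
def estimate_prior_moments(xs, K):
    """Moment-based empirical Bayes sketch: estimate the prior's mean and
    variance from the marginal sample xs, given known sampling variance K."""
    n = len(xs)
    mu_m = sum(xs) / n                            # marginal mean
    var_m = sum((x - mu_m) ** 2 for x in xs) / n  # marginal variance
    return mu_m, max(var_m - K, 0.0)              # prior mean, prior variance
```

The `max(..., 0.0)` guards against a negative variance estimate in small samples, a standard practical adjustment.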
The following is a simple example of parametric empirical Bayes estimation. Given past observations
is the most widely used and validated. Other loss functions are used in statistics, particularly in
{\displaystyle p(\theta |x)={\frac {p(x|\theta )p(\theta )}{\int p(x|\theta )p(\theta )d\theta }}.}
{\displaystyle {\widehat {\theta }}(X)={\frac {(a+n)\max {(\theta _{0},x_{1},...,x_{n})}}{a+n-1}}.}
), the posterior distribution is known to be B(a+x,b+n-x). Thus, the Bayes estimator under MSE is
uses a formula for calculating and comparing the ratings of films by its users, including their
{\displaystyle p(\theta |x)={\frac {p(x|\theta )p(\theta )}{p(x)}}={\frac {f(x-\theta )}{p(x)}}}
, and if we assume a normal prior (which is a conjugate prior in this case), we conclude that
, then the posterior is also Pareto distributed, and the Bayes estimator under MSE is given by
{\displaystyle \theta _{n+1}\sim N({\widehat {\mu }}_{\pi },{\widehat {\sigma }}_{\pi }^{2})}
in this case, especially when no other more subjective information is available. This yields
, then the posterior is also Gamma distributed, and the Bayes estimator under MSE is given by
Using the MSE as risk, the Bayes estimate of the unknown parameter is simply the mean of the
If there is no inherent reason to prefer one prior probability distribution over another, a
To this end, it is customary to regard θ as a deterministic parameter whose true value is
from the posterior distribution, and is a generalization of the previous loss function:
, but this would not be a proper probability distribution since it has infinite mass,
{\displaystyle \delta _{n}(x)={\frac {a+b}{a+b+n}}E+{\frac {n}{a+b+n}}\delta _{MLE}.}
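As a sketch (our own illustrative code), the binomial Bayes estimate is the posterior mean, which is exactly this convex combination of the prior mean and the MLE:

```python
def binomial_bayes(x, n, a, b):
    """Posterior mean (a + x)/(a + b + n) for success probability θ, given
    x successes in n trials and a Beta(a, b) prior: a convex mix of the
    prior mean a/(a+b) and the MLE x/n."""
    return (a + x) / (a + b + n)

prior_mean = 1 / 2  # mean of the uniform Beta(1, 1) prior
mle = 7 / 10        # x/n
est = binomial_bayes(x=7, n=10, a=1, b=1)
```

As n grows, the weight n/(a+b+n) on the MLE approaches 1, consistent with the limit described in the text.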
{\displaystyle \sigma _{\pi }^{2}=\sigma _{m}^{2}-\sigma _{f}^{2}=\sigma _{m}^{2}-K.}
{\displaystyle {\frac {\alpha }{\alpha +\beta }}B+{\frac {\beta }{\alpha +\beta }}b}
(described in the "Generalized Bayes estimator" section above) is inadmissible for
), the posterior density of θ is approximately normal. In other words, for large
, then the posterior is also Normal and the Bayes estimator under MSE is given by
such alternatives. We denote the posterior generalized distribution function by
{\displaystyle {\widehat {\sigma }}_{\pi }^{2}={\widehat {\sigma }}_{m}^{2}-K.}
In this case it can be shown that the generalized Bayes estimator has the form
{\displaystyle L(\theta ,{\widehat {\theta }})=a|\theta -{\widehat {\theta }}|}
{\displaystyle {\widehat {\theta }}(x)=E[\theta |x]=\int \theta \,p(\theta |x)\,d\theta .}
We can then use the past observations to determine the mean and variance of
{\displaystyle {\widehat {\theta }}(X)={\frac {n{\overline {X}}+a}{n+b}}.}
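This Poisson–Gamma posterior mean can be sketched as (our own illustrative code; a and b are the Gamma prior's parameters as in the formula):

```python
def poisson_gamma_bayes(xs, a, b):
    """Bayes estimator under MSE for a Poisson rate θ with Gamma(a, b)
    prior: the posterior mean (n*mean(xs) + a) / (n + b)."""
    n = len(xs)
    return (sum(xs) + a) / (n + b)
```

With many observations the estimate approaches the sample mean, while with none it equals the prior mean a/b.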
= average rating for the movie as a number from 1 to 10 (mean) = (Rating)
n → ∞, the Bayes estimator (in the described problem) is close to the MLE.
has thus far been assumed to be a true probability distribution, in that
Bayesian
Estimation and Experimental Design in Linear Regression Models
Admissible decision rule § Bayes rules and generalized Bayes rules
(5th printing ed.). Cambridge: Cambridge University Press. p. 172.
the same way as if it were an extra measurement to take into account.
Another estimator which is asymptotically normal and efficient is the
. The following are some specific examples of admissibility theorems.
Another "linear" loss function, which assigns different "weights"
The most common risk function used for
Bayesian estimation is the
also minimizes the Bayes risk and therefore is a Bayes estimator.
{\displaystyle {\widehat {\mu }}_{m}={\frac {1}{n}}\sum {x_{i}},}
. Under specific conditions, for large samples (large values of
function. An alternative way of formulating an estimator within
{\displaystyle {\widehat {\mu }}_{\pi }={\widehat {\mu }}_{m},}
which depends on unknown parameters. For example, suppose that
are recommended, in order to use the mode as an approximation (
, which are not probability distributions, are referred to as
The following loss function is trickier: it yields either the
{\displaystyle F({\widehat {\theta }}(x)|X)={\frac {a}{a+b}}.}
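The a/(a+b) quantile rule can be sketched numerically (our own illustrative code, approximating the posterior by a set of posterior samples):

```python
def asymmetric_linear_loss_estimate(samples, a, b):
    """Bayes estimate under the asymmetric linear loss with penalty a for
    underestimation and b for overestimation: the a/(a+b) quantile of the
    posterior, here approximated by an empirical quantile of samples."""
    xs = sorted(samples)
    q = a / (a + b)
    return xs[min(int(q * len(xs)), len(xs) - 1)]
```

With a = b this reduces to the posterior median; a larger a pushes the estimate upward, penalizing underestimation more heavily.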
where the expectation is taken over the joint distribution of
then an estimator which minimizes the posterior expected loss
, which are assumed to be known. In particular, suppose that
{\displaystyle F({\widehat {\theta }}(x)|X)={\tfrac {1}{2}}.}
). Equivalently, it maximizes the posterior expectation of a
(the number of ratings for the film) is to zero, the closer
{\displaystyle \delta _{n}=\delta _{n}(x_{1},\ldots ,x_{n})}
{\displaystyle {\widehat {\theta }}={\widehat {\theta }}(x)}
, which yields the posterior median as the Bayes estimate:
which gives the weight of prior information equal to 1/(4
{\displaystyle E_{\pi }(L(\theta ,{\widehat {\theta }}))}
Finally, we obtain the estimated moments of the prior,
{\displaystyle \int {L(\theta ,a)p(\theta |x)d\theta }}
. Chichester: John Wiley & Sons. pp. 38–117.
{\displaystyle \delta _{n}(x)=E[\theta |x]={\frac {a+x}{a+b+n}}.}
Let θ be an unknown random variable, and suppose that
{\displaystyle x_{i}|\theta _{i}\sim N(\theta _{i},1)}
Bias of an estimator § Median-unbiased estimators
= the mean vote across the whole pool (currently 7.0)
Consider the estimator of θ based on binomial sample
6351:
6301:
6267:
6192:
6148:
6098:
6060:
5985:
5952:
5861:
5791:
5711:
5649:
5548:
5502:
5452:
5410:
5359:
5323:
5287:
5142:
5071:
5036:
5005:
4891:
4823:
4770:
4741:
4712:
4689:
4657:
4628:
4608:
4588:
4561:
4528:
4495:
4444:
4398:
4304:
4255:
4222:
4202:
4053:
4023:
3997:
3970:
3943:
3910:
3883:
3870:{\displaystyle \int L(a-\theta )f(x-\theta )d\theta }
3821:
3798:
3769:
3590:
3451:
3413:
3352:
3332:
3297:
3211:
3077:
3037:
2982:
2944:
2889:
2866:
2674:
2644:
2618:
2526:
2325:
2285:
2208:
2129:
2099:
2064:
1914:
1859:
1800:
1745:
1661:
1616:
1563:
1504:
1369:
1317:
1261:
1229:
1088:
1053:
1033:
943:
888:
858:
796:
763:
734:
714:
650:
621:
569:
545:
492:
472:
449:
9331:
Autoregressive conditional heteroskedasticity (ARCH)
2838:
Other loss functions can be conceived, although the
{\displaystyle x|\theta \sim N(\theta ,\sigma ^{2})}
{\displaystyle E(L(\theta ,{\widehat {\theta }})|x)}
{\displaystyle {\frac {4}{4+n}}V+{\frac {n}{4+n}}v}
Bayes rules having finite Bayes risk are typically
{\displaystyle \sigma _{m}^{2}=E_{\pi }[\sigma _{f}^{2}(\theta )]+E_{\pi }[(\mu _{f}(\theta )-\mu _{m})^{2}],}
{\displaystyle \int {p(\theta )d\theta }=\infty .}
: this defines the risk function as a function of
Statistical decision theory and Bayesian Analysis
= number of votes/ratings for the movie = (votes)
Following are some examples of conjugate priors.
are the moments of the conditional distribution
This is a definition, and not an application of
{\displaystyle L(\theta ,{\widehat {\theta }})}
. Thus, the minimizing expression is given by
{\displaystyle x_{i}|\theta \sim U(0,\theta )}
is taken over the probability distribution of
, so that the optimal estimator has the form
The generalized Bayes estimator is the value
{\displaystyle \theta \sim Pa(\theta _{0},a)}
{\displaystyle \theta \sim N(\mu ,\tau ^{2})}
Pilz, Jürgen (1991). "Bayesian estimation".
{\displaystyle x_{i}|\theta \sim P(\theta )}
(2nd ed.). New York: Springer-Verlag.
The MLE in this case is x/n and so we get,
that minimizes this expression for a given
Lehmann and Casella (1998), Theorem 5.2.4.
{\displaystyle \sigma _{f}^{2}(\theta )=K}
approaches to empirical Bayes estimation.
one can define the posterior distribution
{\displaystyle \int p(\theta )d\theta =1.}
{\displaystyle \mu _{\pi }=\mu _{m}\,\!,}
{\displaystyle \mu _{f}(\theta )=\theta }
{\displaystyle p(x|\theta )=f(x-\theta )}
Probability Theory: The Logic of Science
Lehmann and Casella (1998), section 6.8
. It follows that the Bayes estimator δ
It is common to use the improper prior
to overestimation or underestimation. It yields a
. As the number of ratings surpasses
, then all Bayes rules are admissible.
{\displaystyle \mu _{m}=E_{\pi }[\mu _{f}(\theta )]\,\!,}
A Bayes estimator derived through the
This is identical to (1), except that
{\displaystyle \mathrm {MSE} =E\left[({\widehat {\theta }}(x)-\theta )^{2}\right],}
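The MSE can be checked numerically; here is a small Monte Carlo sketch (our own illustrative code, using the sample mean of a unit-variance normal, whose exact MSE is 1/n):

```python
import random

def mse_of_sample_mean(theta=3.0, n=10, trials=4000, seed=0):
    """Monte Carlo estimate of MSE = E[(estimate - theta)^2] for the
    sample-mean estimator of a N(theta, 1) mean; exact value is 1/n."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        estimate = sum(rng.gauss(theta, 1.0) for _ in range(n)) / n
        total += (estimate - theta) ** 2
    return total / trials
```

The returned value should hover around 0.1 for n = 10, matching the theoretical 1/n.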
Lehmann and Casella, Definition 4.2.9
{\displaystyle W={Rv+Cm \over v+m}\ }
Practical example of Bayes estimators
results in the posterior centered at
A typical example is estimation of a
Bayes estimators for conjugate priors
Lehmann, E. L.; Casella, G. (1998).
The last equation implies that, for
, from which the Bayes estimator of
{\displaystyle f(x_{i}|\theta _{i})}
{\displaystyle \sigma _{f}(\theta )}
{\displaystyle f(x_{i}|\theta _{i})}
Posterior median and other quantiles
Minimum mean square error estimation
{\displaystyle {\widehat {\theta }}}
{\displaystyle {\widehat {\theta }}}
{\displaystyle {\widehat {\theta }}}
{\displaystyle x_{1},x_{2},\ldots }
{\displaystyle x_{1},\ldots ,x_{n}}
{\displaystyle \sigma _{\pi }\,\!.}
{\displaystyle x_{1},\ldots ,x_{n}}
. This is equivalent to minimizing
Lehmann and Casella, Theorem 4.1.1
, one is interested in estimating
{\displaystyle \theta \sim G(a,b)}
{\displaystyle \mu _{f}(\theta )}
{\displaystyle a(x)=a_{0}+x.\,\!}
be the value minimizing (1) when
with a loss function of the type
{\displaystyle f(x_{i}|\theta )}
of the marginal distribution of
having conditional distribution
. Then, given a different value
so the posterior expected loss
{\displaystyle \sigma _{m}^{2}}
{\displaystyle \sigma _{m}\,\!}
{\displaystyle \mu _{\pi }\,\!}
(2)
(1)
is a location parameter, i.e.,
A "linear" loss function, with
{\displaystyle x_{1},...,x_{n}}
{\displaystyle x_{1},...,x_{n}}
maximum a posteriori estimation
)-1 bits of new information."
Recursive Bayesian estimation
the posterior is centered at
{\displaystyle \theta _{n+1}}
{\displaystyle \theta _{n+1}}
{\displaystyle a-x_{1}=a_{0}}
, such as squared error. The
Suppose an unknown parameter
Generalized expected utility
maximum likelihood estimator
{\displaystyle \mu _{m}\,\!}
First, we estimate the mean
is normal with unknown mean
{\displaystyle p(\theta )=1}
{\displaystyle L(a-\theta )}
{\displaystyle p(\theta )=1}
Generalized Bayes estimators
(based on some measurements
Encyclopedia of Mathematics
Berger (1980), section 4.5.
{\displaystyle \theta _{0}}
{\displaystyle \delta _{n}}
{\displaystyle \theta _{i}}
generalized Bayes estimator
, and if the prior is normal,
generalized Bayes estimator
(2nd ed.). Springer.
Theory of Point Estimation
measurements with average
in a binomial distribution
Empirical Bayes estimators
{\displaystyle p(\theta )}
Alternative risk functions
6336:converges in distribution
empirical Bayes estimator
{\displaystyle x|\theta }
minimum mean square error
Minimum mean square error
weighted arithmetic mean
On the other hand, when
asymptotically efficient
Admissible decision rule
{\displaystyle \mu _{m}}
law of total expectation
{\displaystyle a,b>0}
. The MSE is defined by
Internet Movie Database
asymptotically unbiased
{\displaystyle x_{n+1}}
's have a common prior
{\displaystyle x_{n+1}}
{\displaystyle a-x_{1}}
{\displaystyle x+a_{0}}
{\displaystyle \theta }
The prior distribution
{\displaystyle \theta }
{\displaystyle \theta }
{\displaystyle \theta }
{\displaystyle \theta }
posterior expected loss
{\displaystyle p>2}
in the following way.
empirical Bayes method
Empirical Bayes method
{\displaystyle L>0}
{\displaystyle K>0}
{\displaystyle a>0}
, and if the prior is
, and if the prior is
posterior distribution
Jaynes, E.T. (2007).
samples with density
Asymptotic efficiency
law of total variance
has been replaced by
{\displaystyle x_{1}}
{\displaystyle a_{0}}
{\displaystyle a_{0}}
non-informative prior
uniformly distributed
This is known as the
7402:"Bayesian estimator"
7142:
7117:
7092:
7067:
7042:
6978:
6969:Top Rated 250 Titles
6893:
6815:
6629:
6526:
6491:Example: estimating
6349:
6299:
6265:
6190:
6146:
6096:
6058:
5983:
5950:
5859:
5789:
5709:
5647:
5546:
5500:
5450:
5408:
5357:
5321:
5285:
5140:
5069:
5034:
5003:
4889:
4821:
4768:
4739:
4710:
4696:{\displaystyle \pi }
4687:
4655:
4626:
4615:{\displaystyle \pi }
4606:
4595:{\displaystyle \pi }
4586:
4559:
4526:
4493:
4442:
4396:
4302:
4253:
4220:
4200:
4051:
4021:
3995:
3968:
3941:
3937:, for some constant
3908:
3881:
3819:
3796:
3785:{\displaystyle a(x)}
3767:
3588:
3449:
3411:
3350:
3330:
3295:
3209:
3075:
3035:
2980:
2942:
2887:
2864:
2672:
2642:
2616:
2524:
2323:
2283:
2206:
2127:
2097:
2062:
1912:
1857:
1798:
1743:
1659:
1614:
1561:
1502:
1367:
1315:
1259:
1227:
1086:
1051:
1031:
941:
886:
856:
794:
761:
732:
712:
648:
619:
567:
543:
490:
479:{\displaystyle \pi }
470:
447:
328:Posterior predictive
297:Evidence lower bound
178:Likelihood principle
148:Bayesian probability
9760:Bayesian estimation
9620:Official statistics
9543:Methods engineering
9224:Seasonal adjustment
8992:Poisson regressions
8912:Bayesian regression
8851:Regression analysis
8831:Partial correlation
8803:Regression analysis
8402:Prediction interval
8397:Likelihood interval
8387:Confidence interval
8379:Interval estimation
8340:Unbiased estimators
8158:Model specification
8038:Up-and-down designs
7726:Partial correlation
7682:Index of dispersion
7600:Interquartile range
which is the weighted arithmetic mean of R and C with weight vector (v, m), where:
W = weighted rating
R = average rating of the film
v = number of votes cast for the film
m = weight given to the prior estimate
C = average rating of all films
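This shrinkage formula can be sketched in a few lines. The function name and the sample numbers below are illustrative assumptions, not values from the article:

```python
def weighted_rating(R, v, m, C):
    """Bayesian 'weighted rating': shrink a film's mean rating R
    (based on v votes) toward the global mean C, with prior weight m."""
    return (v / (v + m)) * R + (m / (v + m)) * C

# A film with few votes is pulled strongly toward the global mean C:
few = weighted_rating(R=9.0, v=10, m=1000, C=6.8)      # ~6.82
# while many votes leave the rating close to its raw average:
many = weighted_rating(R=9.0, v=50000, m=1000, C=6.8)  # ~8.96
```

This matches the behaviour described above: the fewer votes a film has, the more its weighted rating skews toward the average across all films.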
The most common risk function used for Bayesian estimation is the mean squared error (MSE), also called squared error risk: the expected value of the squared difference between the estimator and the parameter.
with deviation σ,
this prior with n measurements with average v results in a posterior centered at the precision-weighted average of the prior mean and v, with the prior receiving the weight of four measurements.
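The precision-weighted combination of a normal prior with normal measurements can be sketched as follows. This is a minimal sketch under the stated assumptions (prior N(μ₀, Σ²), i.i.d. measurements with known noise s.d. σ); the function name is illustrative:

```python
def posterior_mean_normal(mu0, Sigma, xs, sigma):
    """Posterior mean for a normal mean with prior N(mu0, Sigma^2)
    and i.i.d. measurements xs, each with known noise s.d. sigma.
    Prior and data are weighted by their precisions (inverse variances)."""
    n = len(xs)
    prior_prec = 1.0 / Sigma**2
    data_prec = n / sigma**2
    xbar = sum(xs) / n
    return (prior_prec * mu0 + data_prec * xbar) / (prior_prec + data_prec)

# With Sigma = sigma/2 the prior carries the weight of 4 measurements,
# so a prior at 0 and 4 measurements averaging 1 balance exactly at 0.5:
m = posterior_mean_normal(0.0, 0.5, [1.0, 1.0, 1.0, 1.0], 1.0)  # -> 0.5
```

With Σ = σ/2 the prior precision is 4/σ², i.e. four times the precision of a single measurement, which is exactly the weighting described above.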
In estimation theory and decision theory, a Bayes estimator or a Bayes action is an estimator or decision rule that minimizes the posterior expected value of a loss function (i.e., the posterior expected loss). Equivalently, it maximizes the posterior expectation of a utility function.
7437:
7432:
7430:
7425:
7424:
7421:
7413:
7409:
7408:
7403:
7399:
7398:
7388:
7386:0-471-91732-X
7382:
7378:
7373:
7369:
7367:0-387-98502-6
7363:
7359:
7354:
7350:
7346:
7342:
7340:0-387-96098-8
7336:
7332:
7328:
7324:
7323:
7312:
7307:
7298:
7289:
7280:
7272:
7266:
7262:
7255:
7246:
7244:
7234:
7230:
7220:
7217:
7215:
7212:
7211:
7205:
7202:
7200:
7196:
7192:
7188:
7184:
7180:
7176:
7172:
7168:
7145:
7138:
7120:
7113:
7095:
7088:
7070:
7063:
7045:
7038:
7037:
7036:
7013:
7010:
7007:
7002:
6999:
6996:
6993:
6990:
6984:
6981:
6974:
6973:
6972:
6970:
6966:
6956:
6952:
6938:
6932:
6929:
6926:
6922:
6917:
6914:
6908:
6905:
6902:
6898:
6888:
6884:
6878:
6876:
6860:
6854:
6851:
6848:
6844:
6839:
6836:
6830:
6827:
6824:
6820:
6810:
6806:
6802:
6797:
6795:
6791:
6787:
6783:
6779:
6775:
6771:
6767:
6762:
6760:
6741:
6736:
6733:
6730:
6726:
6719:
6716:
6713:
6710:
6707:
6703:
6698:
6692:
6686:
6680:
6677:
6674:
6671:
6668:
6663:
6660:
6657:
6651:
6645:
6637:
6633:
6625:
6624:
6623:
6606:
6600:
6597:
6594:
6591:
6588:
6583:
6580:
6577:
6571:
6565:
6557:
6551:
6548:
6542:
6534:
6530:
6522:
6521:
6520:
6518:
6514:
6510:
6506:
6502:
6494:
6488:
6486:
6481:
6479:
6476:under MSE is
6474:
6465:
6457:
6438:
6434:
6422:
6418:
6411:
6407:
6402:
6399:
6395:
6391:
6380:
6376:
6372:
6367:
6363:
6354:
6345:
6344:
6343:
6341:
6337:
6333:
6329:
6325:
6307:
6303:
6293:
6291:
6273:
6269:
6243:
6239:
6235:
6232:
6229:
6224:
6220:
6211:
6207:
6203:
6198:
6194:
6170:
6160:
6156:
6149:
6141:
6125:
6122:
6117:
6113:
6109:
6104:
6100:
6085:
6083:
6067:
6064:
6061:
6048:
6045:
6041:
6038:
6037:
6036:
6034:
6028:
6021:Admissibility
6013:
5997:
5994:
5991:
5987:
5964:
5961:
5958:
5954:
5928:
5923:
5916:
5913:
5906:
5901:
5894:
5891:
5881:
5878:
5873:
5870:
5867:
5863:
5839:
5836:
5831:
5827:
5820:
5817:
5812:
5808:
5797:
5793:
5769:
5766:
5763:
5758:
5753:
5746:
5743:
5736:
5731:
5726:
5719:
5716:
5705:
5691:
5686:
5679:
5676:
5669:
5664:
5657:
5654:
5643:
5642:
5641:
5624:
5621:
5618:
5613:
5608:
5604:
5600:
5595:
5590:
5586:
5582:
5577:
5572:
5568:
5564:
5559:
5554:
5550:
5542:
5528:
5521:
5517:
5513:
5508:
5504:
5496:
5495:
5494:
5480:
5477:
5471:
5463:
5458:
5454:
5433:
5430:
5424:
5416:
5412:
5386:
5382:
5371:
5367:
5360:
5337:
5329:
5325:
5301:
5293:
5289:
5265:
5257:
5247:
5243:
5239:
5233:
5225:
5221:
5209:
5205:
5201:
5192:
5184:
5179:
5175:
5166:
5162:
5158:
5153:
5148:
5144:
5136:
5122:
5111:
5103:
5099:
5090:
5086:
5082:
5077:
5073:
5065:
5064:
5063:
5047:
5042:
5038:
5029:
5011:
5007:
4998:
4979:
4973:
4963:
4956:
4953:
4946:
4941:
4937:
4929:
4924:
4921:
4916:
4911:
4906:
4899:
4896:
4885:
4871:
4865:
4861:
4856:
4851:
4848:
4843:
4838:
4831:
4828:
4817:
4816:
4815:
4813:
4795:
4791:
4787:
4784:
4781:
4776:
4772:
4747:
4743:
4735:and variance
4718:
4714:
4704:
4690:
4670:
4663:
4659:
4651:and variance
4634:
4630:
4609:
4589:
4567:
4563:
4540:
4537:
4534:
4530:
4507:
4504:
4501:
4497:
4471:
4467:
4456:
4452:
4445:
4423:
4419:
4415:
4412:
4409:
4404:
4400:
4385:
4383:
4379:
4374:
4372:
4369:is called an
4368:
4362:
4336:
4333:
4330:
4325:
4321:
4317:
4311:
4305:
4298:
4297:
4296:
4280:
4276:
4272:
4267:
4263:
4259:
4256:
4234:
4230:
4226:
4223:
4203:
4179:
4175:
4172:
4168:
4161:
4158:
4154:
4148:
4141:
4138:
4134:
4129:
4125:
4121:
4118:
4112:
4109:
4106:
4103:
4100:
4094:
4091:
4086:
4082:
4075:
4069:
4066:
4063:
4057:
4054:
4047:
4046:
4045:
4029:
4025:
4004:
4001:
3998:
3976:
3972:
3949:
3945:
3922:
3918:
3914:
3911:
3887:
3884:
3864:
3861:
3855:
3852:
3849:
3843:
3837:
3834:
3831:
3825:
3822:
3815:
3814:
3813:
3799:
3776:
3770:
3747:
3744:
3741:
3735:
3732:
3729:
3723:
3717:
3714:
3711:
3705:
3702:
3693:
3687:
3683:
3678:
3674:
3671:
3665:
3657:
3651:
3645:
3642:
3639:
3633:
3629:
3626:
3620:
3609:
3606:
3603:
3597:
3591:
3584:
3583:
3582:
3559:
3553:
3545:
3542:
3539:
3533:
3527:
3518:
3512:
3504:
3498:
3492:
3484:
3478:
3472:
3466:
3458:
3452:
3445:
3444:
3443:
3429:
3426:
3420:
3414:
3405:
3388:
3385:
3382:
3376:
3373:
3367:
3359:
3353:
3333:
3310:
3307:
3304:
3298:
3290:
3280:
3278:
3257:
3254:
3248:
3240:
3234:
3228:
3225:
3222:
3216:
3212:
3205:
3204:
3203:
3201:
3182:
3176:
3173:
3167:
3161:
3155:
3147:
3141:
3138:
3130:
3124:
3118:
3110:
3104:
3098:
3092:
3084:
3078:
3071:
3070:
3069:
3065:
3063:
3044:
3038:
3031:
3012:
3006:
3002:
2999:
2993:
2987:
2983:
2976:
2975:
2974:
2960:
2957:
2951:
2945:
2937:
2933:
2914:
2911:
2908:
2905:
2899:
2893:
2890:
2883:
2882:
2881:
2867:
2857:
2847:
2845:
2841:
2815:
2812:
2809:
2798:
2795:
2789:
2786:
2769:
2766:
2759:
2756:
2745:
2742:
2736:
2733:
2716:
2713:
2707:
2702:
2693:
2690:
2684:
2681:
2675:
2668:
2667:
2651:
2648:
2645:
2625:
2622:
2619:
2611:
2607:
2606:
2586:
2580:
2577:
2574:
2570:
2565:
2559:
2548:
2539:
2536:
2527:
2520:
2499:
2496:
2490:
2487:
2481:
2478:
2466:
2455:
2452:
2446:
2443:
2435:
2428:
2425:
2419:
2416:
2410:
2407:
2395:
2384:
2381:
2375:
2372:
2364:
2358:
2353:
2344:
2341:
2335:
2332:
2326:
2319:
2318:
2314:
2298:
2295:
2292:
2289:
2286:
2278:
2277:
2262:
2256:
2253:
2247:
2241:
2230:
2221:
2218:
2209:
2202:
2180:
2177:
2171:
2168:
2160:
2157:
2148:
2145:
2139:
2136:
2130:
2123:
2122:
2106:
2103:
2100:
2092:
2091:
2089:
2079:
2065:
2036:
2030:
2027:
2024:
2021:
2018:
2007:
2003:
1999:
1996:
1993:
1990:
1987:
1982:
1978:
1974:
1969:
1965:
1951:
1948:
1945:
1936:
1930:
1921:
1918:
1908:
1907:
1888:
1885:
1880:
1876:
1869:
1866:
1863:
1860:
1853:
1834:
1831:
1828:
1822:
1819:
1816:
1806:
1802:
1794:
1776:
1772:
1768:
1765:
1762:
1759:
1756:
1751:
1747:
1738:
1737:
1722:
1716:
1713:
1710:
1705:
1702:
1694:
1689:
1683:
1677:
1668:
1665:
1655:
1654:
1635:
1632:
1629:
1623:
1620:
1617:
1610:
1591:
1585:
1582:
1579:
1569:
1565:
1556:
1553:
1535:
1531:
1527:
1524:
1521:
1518:
1515:
1510:
1506:
1497:
1496:
1481:
1478:
1470:
1466:
1462:
1457:
1453:
1446:
1442:
1436:
1433:
1425:
1421:
1417:
1412:
1408:
1401:
1397:
1391:
1385:
1376:
1373:
1363:
1362:
1341:
1337:
1333:
1330:
1324:
1321:
1318:
1293:
1289:
1285:
1282:
1276:
1273:
1270:
1262:
1254:
1238:
1230:
1222:
1221:
1220:
1217:
1213:
1211:
1207:
1201:
1191:
1189:
1170:
1167:
1164:
1157:
1149:
1143:
1139:
1136:
1133:
1127:
1119:
1113:
1110:
1104:
1095:
1092:
1082:
1081:
1080:
1078:
1068:
1054:
1034:
1011:
1007:
1001:
993:
990:
984:
975:
972:
962:
958:
955:
937:
936:
935:
933:
929:
923:
908:
906:
902:
889:
879:
874:
872:
859:
835:
821:
818:
812:
809:
803:
797:
789:
770:
767:
741:
738:
715:
707:
682:
679:
673:
670:
664:
656:
652:
628:
625:
614:
610:
609:loss function
588:
585:
579:
576:
570:
562:
546:
523:
514:
511:
505:
499:
496:
473:
466:
450:
436:
434:
430:
426:
422:
418:
417:loss function
414:
411:
407:
406:decision rule
403:
399:
395:
391:
387:
375:
370:
368:
363:
361:
356:
355:
353:
352:
347:
342:
337:
336:
335:
334:
329:
326:
324:
321:
319:
316:
315:
314:
313:
309:
308:
303:
300:
298:
295:
294:
293:
292:
288:
287:
282:
279:
277:
274:
272:
269:
268:
267:
266:
262:
261:
256:
253:
251:
248:
246:
243:
241:
238:
236:
233:
232:
231:
230:
226:
225:
220:
217:
215:
212:
210:
207:
205:
202:
201:
200:
199:
195:
194:
189:
186:
184:
181:
179:
176:
174:
171:
169:
168:Cox's theorem
166:
164:
161:
159:
156:
154:
151:
149:
146:
144:
141:
140:
139:
138:
134:
133:
130:
126:
122:
118:
115:
114:
110:
106:
105:
102:
99:
98:
94:
93:
84:
81:
73:
70:November 2009
63:
59:
53:
52:
46:
41:
32:
31:
19:
W = (Rv + Cm) / (v + m)

where W is the weighted rating, R is the film's average rating, v is the number of votes for the film, m is the minimum number of votes required, and C is the average rating across all films.
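The pull-toward-the-mean behaviour of the weighted rating W = (Rv + Cm)/(v + m) can be sketched numerically; the ratings and vote counts below are invented for illustration.

```python
# Sketch of the weighted-rank formula W = (Rv + Cm) / (v + m).
def weighted_rating(R, v, C, m):
    """R: film's mean rating, v: its number of votes,
    C: mean vote across all films, m: minimum votes required."""
    return (R * v + C * m) / (v + m)

# A film with few votes is pulled toward the global mean C...
few = weighted_rating(R=9.0, v=10, C=7.0, m=1000)
# ...while a heavily voted film keeps nearly its own mean R.
many = weighted_rating(R=9.0, v=100_000, C=7.0, m=1000)
print(few, many)
```

The fewer votes a film has relative to m, the closer W sits to C; as v grows, W approaches R.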
Combining this prior with n measurements with average v results in the posterior centered at

(4/(4 + n)) V + (n/(4 + n)) v,

so the prior carries the same weight as four actual measurements.
Example: consider a binomial observation x | θ ~ b(θ, n) with unknown success probability θ, and the conjugate prior θ ~ B(a, b) (a Beta distribution). The Bayes estimator of θ under MSE is the posterior mean,

δ_n(x) = E[θ | x] = (a + x) / (a + b + n),

which can be rewritten as a weighted average of the prior mean and the maximum likelihood estimate x/n:

δ_n(x) = ((a + b)/(a + b + n)) E[θ] + (n/(a + b + n)) MLE.
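The identity between the posterior mean (a + x)/(a + b + n) and the prior-mean/MLE weighted average can be checked directly; the parameter values below are illustrative.

```python
from math import isclose

# Posterior mean for x successes in n Bernoulli trials with a Beta(a, b)
# prior on the success probability theta.
def bayes_estimate(x, n, a, b):
    return (a + x) / (a + b + n)

a, b, n, x = 2.0, 3.0, 20, 12
post_mean = bayes_estimate(x, n, a, b)

# The same number as a weighted average of prior mean and MLE:
prior_mean = a / (a + b)
mle = x / n
mixed = (a + b) / (a + b + n) * prior_mean + n / (a + b + n) * mle
assert isclose(post_mean, mixed)
print(post_mean)
```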
√n (δ_n − θ₀) → N(0, 1/I(θ₀)) in distribution,

where I(θ₀) is the Fisher information of θ₀. It follows that the Bayes estimator δ_n under MSE is asymptotically efficient.
Gaussian samples (described in the "Generalized Bayes estimator" section) is known to be inadmissible for p > 2; this is known as Stein's phenomenon.

Let x₁, x₂, … be a sequence of iid random variables with density f(x_i | θ), and let δ_n = δ_n(x₁, …, x_n) be a sequence of Bayes estimators of θ based on an increasing number of measurements.
Assume the parameters θ_i are drawn from a common prior π with mean μ_π and variance σ_π², and that the conditional moments of an observation are μ_f(θ) = θ and σ_f²(θ) = K. The moments of the marginal distribution of the x_i then satisfy

μ_m = E_π[μ_f(θ)],
σ_m² = E_π[σ_f²(θ)] + E_π[(μ_f(θ) − μ_m)²],

which here give μ_π = μ_m and σ_π² = σ_m² − K. The marginal moments can be estimated from the observations,

μ̂_m = (1/n) Σ_i x_i,   σ̂_m² = (1/n) Σ_i (x_i − μ̂_m)²,

yielding the estimated prior moments

μ̂_π = μ̂_m,   σ̂_π² = σ̂_m² − K.

For example, if x_i | θ_i ~ N(θ_i, 1) and θ_{n+1} ~ N(μ̂_π, σ̂_π²), then a Bayes estimator of θ_{n+1} based on x_{n+1} can be calculated from the resulting normal model.
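The moment-matching steps above (estimate marginal moments, subtract the known conditional variance K, then shrink a new observation toward the estimated prior mean) can be sketched as follows; the data values are invented, and the final shrinkage step assumes the normal/normal case.

```python
# Minimal empirical-Bayes sketch: x_i | theta_i has mean theta_i and known
# variance K; the common prior's moments are estimated from the data.
K = 1.0
xs = [2.1, -0.3, 1.4, 0.8, 3.0, -1.2, 0.5, 1.9]
n = len(xs)

mu_m = sum(xs) / n                           # estimated marginal mean
s2_m = sum((x - mu_m) ** 2 for x in xs) / n  # estimated marginal variance

mu_pi = mu_m              # estimated prior mean
s2_pi = max(s2_m - K, 0)  # estimated prior variance (truncated at 0)

# Normal/normal shrinkage estimate for a new observation x_new:
x_new = 2.5
theta_hat = (s2_pi * x_new + K * mu_pi) / (s2_pi + K)
print(theta_hat)
```

The estimate lands between the estimated prior mean and the new observation, with weights set by the estimated prior variance.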
The following is a simple example of parametric empirical Bayes estimation: given past observations x₁, …, x_n having conditional distribution f(x_i | θ_i), one is interested in estimating θ_{n+1} based on x_{n+1}.
4337:.
4334:x
4331:+
4326:0
4322:a
4318:=
4315:)
4312:x
4309:(
4306:a
4281:0
4277:a
4273:=
4268:1
4264:x
4257:a
4235:1
4231:x
4224:a
4204:a
4180:.
4169:d
4166:)
4152:(
4149:f
4146:)
4130:1
4126:x
4119:a
4116:(
4113:L
4107:=
4101:d
4098:)
4087:1
4083:x
4079:(
4076:f
4073:)
4064:a
4061:(
4058:L
4030:1
4026:x
4005:0
4002:=
3999:x
3977:0
3973:a
3950:0
3946:a
3923:0
3919:a
3915:+
3912:x
3888:.
3885:x
3862:d
3859:)
3850:x
3847:(
3844:f
3841:)
3832:a
3829:(
3826:L
3800:x
3780:)
3777:x
3774:(
3771:a
3748:.
3742:d
3739:)
3730:x
3727:(
3724:f
3721:)
3712:a
3709:(
3706:L
3697:)
3694:x
3691:(
3688:p
3684:1
3679:=
3672:d
3669:)
3666:x
3662:|
3655:(
3652:p
3649:)
3640:a
3637:(
3634:L
3627:=
3624:]
3621:x
3617:|
3613:)
3604:a
3601:(
3598:L
3595:[
3592:E
3563:)
3560:x
3557:(
3554:p
3549:)
3540:x
3537:(
3534:f
3528:=
3522:)
3519:x
3516:(
3513:p
3508:)
3502:(
3499:p
3496:)
3489:|
3485:x
3482:(
3479:p
3473:=
3470:)
3467:x
3463:|
3456:(
3453:p
3430:1
3427:=
3424:)
3418:(
3415:p
3392:)
3383:x
3380:(
3377:f
3374:=
3371:)
3364:|
3360:x
3357:(
3354:p
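The shift argument reduces the whole problem to finding one constant a₀, the minimizer of the x = 0 objective ∫ L(a − t) f(−t) dt. A crude numeric sketch, with an illustrative shifted-normal density f and squared-error loss (both my choices, not the article's):

```python
from math import exp, pi, sqrt

def f(u):
    # density of x - theta, here N(0.5, 1) for illustration
    return exp(-(u - 0.5) ** 2 / 2) / sqrt(2 * pi)

def L(d):
    return d * d  # squared-error loss

ts = [i * 0.02 for i in range(-300, 301)]  # integration grid

def objective(a):
    # Riemann sum for the x = 0 objective  integral of L(a - t) f(-t) dt
    return sum(L(a - t) * f(-t) * 0.02 for t in ts)

# crude grid search for a0; for squared loss a0 is the mean of t under
# f(-t), which is -0.5 here, and then a(x) = x + a0.
a0 = min((i / 100 for i in range(-150, 51)), key=objective)
print(a0)
```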
If the prior is improper, i.e. p is a measure on the parameter space with infinite mass,

∫ p(θ) dθ = ∞,

rather than a probability distribution with ∫ p(θ) dθ = 1, then p(θ) no longer defines a probability distribution for θ. Nevertheless, the posterior

p(θ | x) = p(x | θ) p(θ) / ∫ p(x | θ) p(θ) dθ

frequently remains a well-defined probability distribution, and the generalized Bayes estimator is defined as the action a(x) that, for each x, minimizes the posterior expected loss

∫ L(θ, a) p(θ | x) dθ.
A "linear" loss function, with a > 0, which yields the posterior median as the Bayes estimator:

L(θ, θ̂) = a|θ − θ̂|,
F(θ̂(x) | X) = 1/2.

Another "linear" loss function, which assigns different weights a, b > 0 to overestimation and underestimation; it yields a quantile of the posterior distribution, and generalizes the previous loss function:

L(θ, θ̂) = a|θ − θ̂| for θ − θ̂ ≥ 0;  b|θ − θ̂| for θ − θ̂ < 0,
F(θ̂(x) | X) = a/(a + b).

A "0–1" loss function, with K > 0 and L > 0:

L(θ, θ̂) = 0 for |θ − θ̂| < K;  L for |θ − θ̂| ≥ K.

(Here F denotes the cumulative distribution function of the posterior.)
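The quantile characterization F(θ̂(x) | X) = a/(a + b) can be checked by brute force on a small discrete posterior; the grid, probabilities, and penalties below are made up for illustration.

```python
# Under L = a*|theta - d| when theta - d >= 0 and b*|theta - d| otherwise,
# the optimal d is the a/(a+b) quantile of the posterior.
thetas = [0.0, 1.0, 2.0, 3.0, 4.0]
post   = [0.1, 0.2, 0.4, 0.2, 0.1]   # toy posterior probabilities
a, b   = 3.0, 1.0                    # asymmetric penalties

def expected_loss(d):
    return sum((a if t - d >= 0 else b) * abs(t - d) * p
               for t, p in zip(thetas, post))

# grid search over candidate estimates d in [0, 4]
best = min((i / 100 for i in range(0, 401)), key=expected_loss)
# a/(a+b) = 0.75, and the 0.75 quantile of this posterior is theta = 3.0
print(best)
```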
If x₁, …, x_n are iid Poisson, x_i | θ ~ P(θ), and θ is given a Gamma prior, θ ~ G(a, b), then the Bayes estimator under MSE is

θ̂(X) = (n X̄ + a) / (n + b).

If x₁, …, x_n are iid uniform, x_i | θ ~ U(0, θ), and θ has a Pareto prior, θ ~ Pa(θ₀, a), then

θ̂(X) = (a + n) max(θ₀, x₁, …, x_n) / (a + n − 1).
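The Poisson/Gamma estimate (n X̄ + a)/(n + b) is just the mean of the updated Gamma posterior, which can be verified directly; the counts and prior parameters below are illustrative.

```python
# Poisson likelihood with Gamma(a, b) prior: the posterior is
# Gamma(a + sum(xs), b + n), whose mean matches (n*xbar + a) / (n + b).
a, b = 2.0, 1.0
xs = [3, 5, 4, 2, 6]
n = len(xs)
xbar = sum(xs) / n

theta_hat = (n * xbar + a) / (n + b)
post_mean = (a + sum(xs)) / (b + n)
assert abs(theta_hat - post_mean) < 1e-12
print(theta_hat)
```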
Example: if x | θ is Normal, x | θ ~ N(θ, σ²), and the prior on θ is Normal, θ ~ N(μ, τ²), then the posterior is also Normal and the Bayes estimator under MSE is

θ̂(x) = (σ²/(σ² + τ²)) μ + (τ²/(σ² + τ²)) x.
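The normal/normal estimate is a convex combination of the prior mean and the observation, with weights set by the two variances; the numbers below are illustrative.

```python
# Conjugate normal update: x | theta ~ N(theta, s2), theta ~ N(mu, t2).
def normal_posterior_mean(x, mu, s2, t2):
    return s2 / (s2 + t2) * mu + t2 / (s2 + t2) * x

# Strong prior (small t2): the estimate stays near the prior mean mu.
strong = normal_posterior_mean(x=4.0, mu=0.0, s2=1.0, t2=0.1)
# Weak prior (large t2): the estimate moves toward the observation x.
weak = normal_posterior_mean(x=4.0, mu=0.0, s2=1.0, t2=10.0)
print(strong, weak)
```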
The most common risk function used for Bayesian estimation is the mean square error (MSE),

MSE = E[(θ̂(x) − θ)²],

where the expectation is taken over the joint distribution of θ and x. Using the MSE as risk, the Bayes estimate of the unknown parameter is simply the mean of the posterior distribution,

θ̂(x) = E[θ | x] = ∫ θ p(θ | x) dθ.

This is known as the minimum mean square error (MMSE) estimator.
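On a discrete grid the posterior mean is just a weighted average, which makes the MMSE estimate easy to compute; the flat prior and unit-variance normal likelihood below are toy choices of mine.

```python
from math import exp

# MMSE estimate E[theta | x] on a discrete theta grid.
thetas = [i / 10 for i in range(-50, 51)]   # grid over theta
prior  = [1.0 for _ in thetas]              # flat prior on the grid

def likelihood(x, t):
    return exp(-(x - t) ** 2 / 2)           # x | theta ~ N(theta, 1)

x = 1.3
unnorm = [likelihood(x, t) * p for t, p in zip(thetas, prior)]
Z = sum(unnorm)                             # normalizing constant
theta_hat = sum(t * w for t, w in zip(thetas, unnorm)) / Z
print(theta_hat)   # close to x itself, since the prior is flat
```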
Suppose an unknown parameter θ is known to have a prior distribution π. Let θ̂ = θ̂(x) be an estimator of θ (based on some measurements x), and let L(θ, θ̂) be a loss function. The Bayes risk of θ̂ is defined as E_π(L(θ, θ̂)), where the expectation is taken over the probability distribution of θ. An estimator θ̂ is said to be a Bayes estimator if it minimizes the Bayes risk among all estimators. Equivalently, the estimator which minimizes the posterior expected loss E(L(θ, θ̂) | x) for each x also minimizes the Bayes risk and therefore is a Bayes estimator.