The work of James and Stein has been extended to the case of a general measurement covariance matrix, i.e., where measurements may be statistically dependent and may have differing variances. A similar dominating estimator can be constructed, with a suitably generalized dominance condition. This can
A consequence of the above discussion is the following counterintuitive result: When three or more unrelated parameters are measured, their total MSE can be reduced by using a combined estimator such as the James–Stein estimator; whereas when each parameter is estimated separately, the least squares
MSE, i.e., the sum of the expected squared errors of each component. Therefore, the total MSE in measuring light speed, tea consumption, and hog weight would improve by using the James–Stein estimator. However, any particular component (such as the speed of light) would improve for some parameter
Stein's result has been extended to a wide class of distributions and loss functions. However, this theory provides only an existence result, in that explicit dominating estimators were not actually exhibited. It is quite difficult to obtain explicit estimators improving upon the usual estimator
estimator in a hypothetical regression of the population means on the sample means gives an estimator of the form of either the James–Stein estimator (when we force the OLS intercept to equal 0) or of the Efron–Morris estimator (when we allow the intercept to vary).
The James–Stein estimator may seem at first sight to be a result of some peculiarity of the problem setting. In fact, the estimator exemplifies a very wide-ranging effect; namely, the fact that the "ordinary" or least squares estimator is often
But we may have some guess as to what the mean vector is. This can be considered a disadvantage of the estimator: the choice is not objective, as it may depend on the beliefs of the researcher. Nonetheless, James and Stein's result is that
values, and deteriorate for others. Thus, although the James–Stein estimator dominates the LS estimator when three or more parameters are estimated, any single component does not dominate the respective component of the LS estimator.
{\displaystyle {\widehat {\boldsymbol {\theta }}}_{JS}=\left(1-{\frac {(m-2)\sigma ^{2}}{\|{\mathbf {y} }-{\boldsymbol {\nu }}\|^{2}}}\right)({\mathbf {y} }-{\boldsymbol {\nu }})+{\boldsymbol {\nu }},\qquad m\geq 3.}
{\displaystyle {\widehat {\boldsymbol {\theta }}}_{JS+}=\left(1-{\frac {(m-3)\sigma ^{2}}{\|{\mathbf {y} }-{\boldsymbol {\nu }}\|^{2}}}\right)^{+}({\mathbf {y} }-{\boldsymbol {\nu }})+{\boldsymbol {\nu }},\qquad m\geq 4.}
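The clipping step can be sketched as follows; this is an illustrative sketch, not code from the article, and the function name and test vectors are assumptions:

```python
import numpy as np

def js_plus(y, sigma2, nu):
    # Positive-part James–Stein: shrink y toward nu, clipping a
    # negative shrinkage multiplier at zero.
    r = y - nu
    factor = 1.0 - (len(y) - 3) * sigma2 / float(r @ r)
    return max(factor, 0.0) * r + nu

nu = np.zeros(5)
near = np.array([0.1, -0.1, 0.05, 0.0, -0.05])  # ||y - nu||^2 is tiny
far = np.array([10.0, -9.0, 8.0, 11.0, -10.0])  # ||y - nu||^2 is large

print(js_plus(near, 1.0, nu))  # multiplier is negative, so the estimate collapses to nu
print(js_plus(far, 1.0, nu))   # multiplier is close to 1, so y is barely shrunk
```

When the observation lies very close to the shrinkage target, the raw multiplier goes negative and the positive-part estimator simply returns the target itself.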
MSE (R) of least squares estimator (ML) vs. James–Stein estimator (JS). The James–Stein estimator gives its best estimate when the norm of the actual parameter vector θ is near zero.
A quirky example would be estimating the speed of light, tea consumption in Taiwan, and hog weight in Montana, all together. The James–Stein estimator always improves upon the
{\displaystyle {\widehat {\boldsymbol {\theta }}}_{JS}=\left(1-{\frac {(m-2){\frac {\sigma ^{2}}{n}}}{\|{\overline {\mathbf {y} }}\|^{2}}}\right){\overline {\mathbf {y} }},}
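With n observations of each of the m means, the estimator is applied to the vector of sample means, whose components have variance σ²/n. A minimal sketch (the dimensions, variance, and seed are illustrative assumptions, not from the article):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, sigma = 8, 25, 2.0
theta = rng.normal(0.0, 1.0, size=m)  # true means (arbitrary choice)

# n independent observations of each of the m means
samples = theta[:, None] + rng.normal(0.0, sigma, size=(m, n))
ybar = samples.mean(axis=1)

# James–Stein applied to the sample means, whose variance is sigma^2 / n
shrink = 1.0 - (m - 2) * (sigma ** 2 / n) / np.sum(ybar ** 2)
theta_js = shrink * ybar
print(shrink)  # below 1: the sample means are shrunk toward the origin
```

Averaging first reduces the noise variance, so the shrinkage applied to the mean vector is correspondingly milder.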
It turns out, however, that the positive-part estimator is also inadmissible. This follows from a general result which requires admissible estimators to be smooth.
The conclusion from this hypothetical example is that measurements should be combined if one is interested in minimizing their total MSE. For example, in a
The James–Stein estimator has also found use in fundamental quantum theory, where the estimator has been used to improve the theoretical bounds of the
or commonly as the "average of averages" of the sample means, given all samples share the same size). This observation is commonly referred to as
is then negative. This can be easily remedied by replacing this multiplier by zero when it is negative. The resulting estimator is called the
Since this noise has a mean of zero, it may be reasonable to use the samples themselves as an estimate of the parameters. This approach is the
In real-world application, this is a common situation in which a set of parameters is sampled, and the samples are corrupted by independent
{\displaystyle {\widehat {\boldsymbol {\theta }}}_{JS}=\left(1-{\frac {(m-2)\sigma ^{2}}{\|{\mathbf {y} }\|^{2}}}\right){\mathbf {y} }.}
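As a quick sanity check of the dominance claim, the following sketch (illustrative only; the dimension, variance, trial count, and true mean vector are arbitrary assumptions) simulates the total squared error of the least squares estimate against the James–Stein estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
m, sigma, trials = 10, 1.0, 20000
theta = rng.normal(0.0, 0.5, size=m)  # true mean vector (arbitrary choice)

def james_stein(y, sigma2):
    # Shrink the single observation vector toward the origin.
    return (1.0 - (len(y) - 2) * sigma2 / np.sum(y ** 2)) * y

ls_err = js_err = 0.0
for _ in range(trials):
    y = theta + rng.normal(0.0, sigma, size=m)  # one noisy observation per mean
    ls_err += np.sum((y - theta) ** 2)
    js_err += np.sum((james_stein(y, sigma ** 2) - theta) ** 2)

print(ls_err / trials, js_err / trials)  # the James–Stein total MSE is smaller
```

With the true mean vector close to the origin, as here, the reduction in total MSE is substantial, in line with the figure caption's remark that the estimator does best when the norm of θ is near zero.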
This estimator has a smaller risk than the basic James–Stein estimator. It follows that the basic James–Stein estimator is itself
The James–Stein estimator is a member of a class of Bayesian estimators that dominate the maximum-likelihood estimator.
It arose sequentially in two main published papers. The earlier version of the estimator was developed in 1956, when
Using Stein's estimator to correct the bound on the entropic uncertainty principle for more than two measurements
A natural question to ask is whether the improvement over the usual estimator is independent of the choice of
2170:, and has been demonstrated for several different problem settings, some of which are briefly outlined below.
{\displaystyle \{{\boldsymbol {\theta }}_{1},{\boldsymbol {\theta }}_{2},...,{\boldsymbol {\theta }}_{m}\}}
improves the expected MSE over the maximum-likelihood estimator, which is tantamount to using an infinite
Efron, B.; Morris, C. (1973). "Stein's Estimation Rule and Its Competitors—An Empirical Bayes Approach".
is necessary. Of course, this is the quantity we are trying to estimate, so we do not have this knowledge
James and Stein demonstrated that the estimator presented above can still be used when the variance
Biased estimator for
Gaussian random vectors, better than ordinary least-squares error minimization
2624:(1956), "Inadmissibility of the usual estimator for the mean of a multivariate distribution",
Despite the intuition that the James–Stein estimator shrinks the maximum-likelihood estimate
reached a relatively shocking conclusion that while the then-usual estimate of the mean, the
Bock, M. E. (1975), "Minimax estimators of the mean of a multivariate normal distribution",
968:. The paradoxical result, that there is a (possibly) better and never any worse estimate of
2893:(1966), "On the admissibility of invariant estimators of one or more location parameters",
perspective. Under this interpretation, we aim to predict the population means using the
2282:{\displaystyle {\widehat {\sigma }}^{2}={\frac {1}{m}}\sum (y_{i}-{\overline {y}})^{2}}
2787:"The 1988 Neyman Memorial Lecture: A Galtonian Perspective on Shrinkage Estimators"
The results in this article are for the case when only a single observation vector
The Statistical Implications of Pre-Test and Stein-Rule Estimators in Econometrics
is large. Thus to get a very great improvement some knowledge of the location of
Beran, R. (1995). The Role of Hájek's Convolution Theorem in Statistical Theory.
for simultaneous estimation of several parameters. This effect has been called
1394:. Then there exists an estimator of the James–Stein type that shrinks toward
technique which outperforms the standard application of the LS estimator.
is unknown, by replacing it with the standard estimator of the variance,
scenario, as the goal is to minimize the total channel estimation error.
in mean squared error as compared to the sample mean, became known as
{\displaystyle {\widehat {\boldsymbol {\theta }}}_{LS}={\mathbf {y} }}
2289:. The dominance result still holds under the same condition, namely,
1610:{\displaystyle \|{{\boldsymbol {\theta }}-{\boldsymbol {\nu }}}\|}
estimator. By definition, this makes the least squares estimator
approach, meaning the James–Stein estimator has a lower or equal
The James–Stein estimator dominates the usual estimator for any
1232:, meaning that the James–Stein estimator always achieves lower
without specific restrictions on the underlying distributions.
345:. Stein proposed a possible improvement to the estimator that
, is sub-optimal to shrinkage-based estimators, such as the
An intuitive derivation and interpretation is given by the
1331:{\displaystyle (m-2)\sigma ^{2}<\|{\mathbf {y} }\|^{2}}
1927:{\displaystyle \|{\mathbf {y} }-{\boldsymbol {\nu }}\|,}
then this estimator simply takes the natural estimator
1199:{\displaystyle {\widehat {\boldsymbol {\theta }}}_{LS}}
961:{\displaystyle {\widehat {\boldsymbol {\theta }}}_{JS}}
918:{\displaystyle {\widehat {\boldsymbol {\theta }}}_{LS}}
gives some intuition to this result: One assumes that
1959:{\displaystyle {\mathbf {y} }-{\boldsymbol {\nu }}}
416:and Charles Stein simplified the original process.
2726:(341). American Statistical Association: 117–130.
, the James–Stein estimator is superefficient and
The answer is no. The improvement is small if
2668:Proc. Fourth Berkeley Symp. Math. Statist. Prob.
1164:James and Stein showed that the above estimator
1037:is known, the James–Stein estimator is given by
2830:(2nd ed.), New York: John Wiley & Sons
2720:Journal of the American Statistical Association
2627:Proc. Third Berkeley Symp. Math. Statist. Prob.
2342:vectors are available, the results are similar:
419:It can be shown that the James–Stein estimator
2322:is available. For the more general case when
is estimated from the data itself. Estimating
2945:. New York: North Holland. pp. 229–257.
1364:. In fact this is not the only direction of
is large enough; hence it does not work for
653:We are interested in obtaining an estimate,
431:than the "ordinary" least square estimator.
{\displaystyle Y=\{Y_{1},Y_{2},...,Y_{m}\}}
2665:(1961), "Estimation with quadratic loss",
1374:be an arbitrary fixed vector of dimension
1710:only gives an advantage compared to the
1651:Seeing the James–Stein estimator as an
1599:
1591:
1537:
1526:
1490:
1414:
1178:
1051:
983:{\displaystyle {\boldsymbol {\theta }}}
976:
940:
897:
2941:Judge, George G.; Bock, M. E. (1978).
1780:setting, it is reasonable to combine
2703:Lehmann, E. L.; Casella, G. (1998),
815:Stein demonstrated that in terms of
positive-part James–Stein estimator
380:towards a more central mean vector
2785:Stigler, Stephen M. (1990-02-01).
2707:(2nd ed.), New York: Springer
1795:for more than three measurements.
1360:and shrinks it towards the origin
2895:Annals of Mathematical Statistics
2671:, vol. 1, pp. 361–379,
2630:, vol. 1, pp. 197–206,
1804:imperfectly measured sample means
1661:itself is a random variable with
704:, based on a single observation,
886:, the least squares estimator,
1793:entropic uncertainty principle
615:-variate normally distributed
1712:maximum-likelihood estimator
1695:{\displaystyle \sim N(0,A)}
2705:Theory of Point Estimation
409:Stein's example or paradox
The James–Stein estimator
461:{\displaystyle \theta =0}
2826:Anderson, T. W. (1984),
2584:Admissible decision rule
2908:10.1214/aoms/1177699259
2565:be used to construct a
2539:-length average of the
1753:{\displaystyle m\leq 2}
1643:, surely a poor guess.
1263:{\displaystyle m\geq 3}
1225:{\displaystyle m\geq 3}
338:{\displaystyle m\geq 3}
308:{\displaystyle m\leq 2}
2861:10.1214/aos/1176343009
2308:{\displaystyle m>2}
The equation of the
1784:tap measurements in a
1653:empirical Bayes method
2804:10.1214/ss/1177012274
1934:as the multiplier on
927:James–Stein estimator
402:(which can be chosen
James–Stein estimator
2847:Annals of Statistics
2753:Stander, M. (2017),
1893:for small values of
763:estimator, which is
116:Gaussian distributed
2791:Statistical Science
2773:2017arXiv170202440S
2594:Shrinkage estimator
1714:when the dimension
197:with unknown means
2168:Stein's phenomenon
1786:channel estimation
1764:(LS) estimator is
1663:prior distribution
1238:maximum likelihood
1234:mean squared error
817:mean squared error
429:mean squared error
2599:Regular estimator
2589:Hodges' estimator
2567:linear regression
1778:telecommunication
1368:that works. Let
619:covariance matrix
544:where the vector
436:Hodges' estimator
349:the sample means
2901:(5): 1087–1136,
1970:and is given by
119:random variables
111:, of (possibly)
2935:Further reading
2732:10.2307/2284155
1273:Notice that if
1236:(MSE) than the
Stein's example
617:and with known
566:is the unknown
434:Similar to the
423:the "ordinary"
2854:(1): 209–218,
Interpretation
757:Gaussian noise
2952:0-7204-0729-X
2559:observations.
1631:finite guess
761:least squares
425:least squares
414:Willard James
279:Charles Stein
2891:Brown, L. D.
2164:inadmissible
2148:inadmissible
Improvements
1242:inadmissible
317:inadmissible
2661:James, W.;
594:, which is
440:non-regular
412:. In 1961,
283:sample mean
2925:0156.39401
2878:0314.62005
2764:1702.02440
2644:0073.35602
References
Extensions
1766:admissible
287:admissible
113:correlated
2813:0883-4237
2663:Stein, C.
2622:Stein, C.
See also
Setting