
Invariant estimator

In some cases, statistical analyses are undertaken without a fully defined statistical model, or the classical theory of statistical inference cannot readily be applied because the family of models being considered is not amenable to such treatment. In addition to these cases, where general theory does not prescribe an estimator, the concept of invariance of an estimator can be applied when seeking estimators of alternative forms, either for the sake of simplicity of application of the estimator or so that the estimator has some other desirable property.
One use of the concept of invariance is where a class or family of estimators is proposed and a particular formulation must be selected amongst these. One procedure is to impose relevant invariance properties and then to find the formulation within this class that has the best properties, leading to what is called the optimal invariant estimator.
It is a way of formalising the idea that an estimator should have certain intuitively appealing qualities. Strictly speaking, "invariant" would mean that the estimates themselves are unchanged when both the measurements and the parameters are transformed in a compatible way.
Thus a Bayesian analysis might be undertaken, leading to a posterior distribution for relevant parameters, but the use of a specific utility or loss function may be unclear. Ideas of invariance can then be applied to the task of summarising the posterior distribution.
Parameter-transformation invariance: Here, the transformation applies to the parameters alone. The concept is that essentially the same inference should be made from data and a model involving a parameter θ as would be made from the same data if the model used a parameter φ, where φ is a one-to-one transformation of θ.
For data modelled as outcomes of independent and identically distributed random variables, it is reasonable to impose the requirement that any estimator of any property of the common distribution should be permutation-invariant: specifically, the estimator, considered as a function of the set of data-values, should not change if items of data are swapped within the dataset.

When considering estimation of a location parameter using a weighted average, the shift-invariance requirement immediately implies that the weights should sum to one. While the same result is often derived from a requirement for unbiasedness, the use of "invariance" does not require that a mean value exists and makes no use of any probability distribution at all.
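As a minimal numerical illustration (a sketch, not part of the original article): the sample mean and sample median, viewed as functions of the set of data-values, are unchanged when items of data are swapped within the dataset.

```python
import statistics

# Permutation invariance: an estimator, considered as a function of the
# set of data-values, should not change if items of data are swapped.
data = [3.1, 0.5, 2.2, 4.8, 1.9]
swapped = [1.9, 0.5, 2.2, 4.8, 3.1]  # first and last items exchanged

# statistics.mean sums with exact rational arithmetic, so the two
# orderings give identical results, not merely close ones.
assert statistics.mean(data) == statistics.mean(swapped)
assert statistics.median(data) == statistics.median(swapped)
```

Any estimator defined through a symmetric function of the data (sums, order statistics) is permutation-invariant by construction.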
The concept of invariance is sometimes used on its own as a way of choosing between estimators, but this is not necessarily definitive. For example, a requirement of invariance may be incompatible with the requirement that the estimator be mean-unbiased.
For a given problem, the invariant estimator with the lowest risk is termed the "best invariant estimator". A best invariant estimator cannot always be achieved; a special case in which it can be achieved is the case when the group of transformations is transitive.
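A numerical sketch of risk minimisation within an invariant class (an illustrative example, not from the article, assuming squared-error loss and a single observation X ~ N(0, 1)): among the shift-invariant estimators δ(x) = x + K, the empirical risk is minimised near K = 0, matching the fact that the optimal offset is determined by the distribution, not the parameter.

```python
import random

# Draws from the model with theta = 0; the risk of delta(x) = x + K is
# R(K) = E[(X + K - theta)^2], estimated here by Monte Carlo.
random.seed(0)
draws = [random.gauss(0.0, 1.0) for _ in range(20_000)]

def empirical_risk(K):
    return sum((x + K) ** 2 for x in draws) / len(draws)

# Scan a grid of offsets K; the minimiser is the best invariant estimator
# in this class (here, near K = 0 since E[X] = 0).
grid = [k / 50 for k in range(-50, 51)]
best_K = min(grid, key=empirical_risk)
assert abs(best_K) <= 0.1
```

For squared error the empirical minimiser is exactly minus the sample mean of the draws, so it converges to 0 as the number of draws grows.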
There are several types of transformations that are usefully considered when dealing with invariant estimators. Each gives rise to a class of estimators which are invariant to those particular types of transformation.
Similarly, the theory of classical statistical inference can sometimes lead to strong conclusions about what estimator should be used. However, the usefulness of these theories depends on having a fully prescribed statistical model.
The term "equivariant estimator" is used in formal mathematical contexts that include a precise description of the way the estimator changes in response to changes to the dataset and parameterisation: this corresponds to the use of "equivariance" in more general mathematics.
Pitman estimator: for the location-parameter problem in which X = (X_1, …, X_n) has density f(x_1 − θ, …, x_n − θ) and the loss function is L(|a − θ|), the best invariant estimator δ(x) is the one that minimizes

    ∫_{−∞}^{∞} L(δ(x) − θ) f(x_1 − θ, …, x_n − θ) dθ / ∫_{−∞}^{∞} f(x_1 − θ, …, x_n − θ) dθ.
Shift invariance: Notionally, estimates of a location parameter should be invariant to simple shifts of the data values: if all data values are increased by a given amount, the estimate should change by the same amount.
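To make the shift requirement concrete (an illustrative sketch, not from the article): adding a constant c to every data value shifts common location estimates by exactly c.

```python
import statistics

data = [3.1, 0.5, 2.2, 4.8, 1.9]
c = 10.0
shifted = [x + c for x in data]

# Shift equivariance: estimate(x + c) = estimate(x) + c.
# The median of five values is a single data item, so equality is exact;
# the mean is compared with a small floating-point tolerance.
assert statistics.median(shifted) == statistics.median(data) + c
assert abs(statistics.mean(shifted) - (statistics.mean(data) + c)) < 1e-12
```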
For the squared error loss case, the Pitman estimator of a location parameter θ from observations x_1, …, x_n with joint density f(x_1 − θ, …, x_n − θ) is

    δ(x) = ∫_{−∞}^{∞} θ f(x_1 − θ, …, x_n − θ) dθ / ∫_{−∞}^{∞} f(x_1 − θ, …, x_n − θ) dθ,

and this is Pitman's estimator (1939).

In classification problems, the rule which assigns a class to a new data-item can be considered to be a special type of estimator, and a number of invariance-type considerations can be brought to bear in formulating prior knowledge for pattern recognition.

Though the asymptotic properties of an estimator might be invariant, its small-sample properties can be different, and a specific distribution needs to be derived.
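The squared-error Pitman estimator can be evaluated directly by numerical quadrature of its defining ratio of integrals. The sketch below (helper names are illustrative, not from the article) does this on a grid: for normal data it reproduces the sample mean, and for a sample placed symmetrically about a point the estimator returns that point by symmetry.

```python
import math

def pitman_location(x, density, lo, hi, n=20001):
    """Squared-error Pitman estimator of a location parameter:
    Riemann-sum approximation of
    ∫ θ Π density(x_i − θ) dθ / ∫ Π density(x_i − θ) dθ."""
    step = (hi - lo) / (n - 1)
    num = den = 0.0
    for i in range(n):
        theta = lo + i * step
        lik = 1.0
        for xi in x:
            lik *= density(xi - theta)
        num += theta * lik
        den += lik
    return num / den

std_normal = lambda u: math.exp(-u * u / 2.0) / math.sqrt(2.0 * math.pi)
cauchy = lambda u: 1.0 / (math.pi * (1.0 + u * u))

# Normal model: the Pitman estimator coincides with the sample mean.
x = [1.2, 0.7, 2.1, 1.5]
est = pitman_location(x, std_normal, -10.0, 12.0)
assert abs(est - sum(x) / len(x)) < 1e-6

# Cauchy sample symmetric about 1.0, with a grid symmetric about 1.0:
# the estimator is 1.0 by symmetry.
est_c = pitman_location([0.0, 1.0, 2.0], cauchy, -9.0, 11.0)
assert abs(est_c - 1.0) < 1e-6
```

The quadrature window only needs to cover the region where the likelihood has appreciable mass; heavier-tailed densities such as the Cauchy need a wider grid than the normal for the same accuracy.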
Using a weighted average for such a dataset, these invariance requirements imply that the weights should be identical and sum to one. Of course, estimators other than a weighted average may be preferable.
Scale invariance: Note that this topic, the invariance of an estimator with respect to a scale parameter, is not to be confused with the more general notion of scale invariance.
To define an invariant or equivariant estimator formally, some definitions related to groups of transformations are needed first. Let X denote the set of possible data-samples.
see section 5.2.1 in Gourieroux, C. and Monfort, A. (1995). Statistics and econometric models, volume 1. Cambridge University Press.
Principle of Rational Invariance: The action taken in a decision problem should not depend on transformations applied to the measurement used.
Within statistical inference, there are several approaches to estimation theory that can be used to decide immediately what estimators should be used; for example, ideas from Bayesian inference would lead directly to Bayesian estimators.
Pitman, E.J.G. (1939). "The estimation of the location and scale parameters of a continuous population of any given form". Biometrika. 30 (3/4): 391–421. doi:10.1093/biomet/30.3-4.391.
However, the meaning has been extended to allow the estimates to change in appropriate ways with such transformations.
Permutation invariance: Where a set of data values can be represented by a statistical model in which they are outcomes of independent and identically distributed random variables, permutation invariance of estimators is a natural requirement.
According to this type of invariance, results from transformation-invariant estimators should also be related by the same one-to-one transformation of the parameter.
The combination of permutation invariance and location invariance applies when estimating a location parameter from an independent and identically distributed dataset.
On the other hand, the criterion of median-unbiasedness is defined in terms of the estimator's sampling distribution and so is invariant under many transformations.
Invariance Principle: If two decision problems have the same formal structure (in terms of X, Θ, f(x|θ) and L), then the same decision rule should be used in each problem.
Freue, Gabriela V. Cohen (2007). "The Pitman estimator of the Cauchy location parameter". Journal of Statistical Planning and Inference. 137 (6): 1900–1913. doi:10.1016/j.jspi.2006.05.002.
The usefulness of these theories may also depend on having a relevant loss function to determine the estimator.
Pitman, E.J.G. (1939). "Tests of Hypotheses Concerning Location and Scale Parameters". Biometrika. 31 (1/2): 200–215. doi:10.1093/biomet/31.1-2.200.
An invariant estimator is an estimator which obeys the following two rules:
In statistics, the concept of being an invariant estimator is a criterion that can be used to compare the properties of different estimators for the same quantity.
The more general notion of scale invariance concerns the behavior of systems under aggregate properties (in physics).
The risk function of an invariant estimator δ is constant on orbits of Θ; equivalently, R(θ, δ) = R(ḡ(θ), δ) for every θ ∈ Θ and ḡ ∈ Ḡ. If Ḡ is transitive, the risk is constant over all of Θ.

A group of transformations of X, to be denoted by G, is a set of (measurable) 1:1 and onto transformations of X into itself which satisfies the following conditions:

1. If g_1 ∈ G and g_2 ∈ G then g_1 g_2 ∈ G.
2. If g ∈ G then g^{−1} ∈ G. (That is, each transformation has an inverse within the group.)
3. There is an identity transformation e ∈ G, with e(x) = x.
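A quick Monte Carlo check of the constancy of the risk (an illustrative sketch, not from the article): for the shift group acting on normal samples, the squared-error risk of the sample mean is the same at different parameter values.

```python
import random

def mc_risk(theta, n=4, reps=20_000, seed=1):
    """Monte Carlo estimate of R(theta, mean) = E[(mean(X) - theta)^2]
    for X_i ~ N(theta, 1), i = 1..n."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        xbar = sum(rng.gauss(theta, 1.0) for _ in range(n)) / n
        total += (xbar - theta) ** 2
    return total / reps

r0 = mc_risk(0.0)
r5 = mc_risk(5.0)
# Both estimates approximate 1/n = 0.25, independently of theta.
assert abs(r0 - 0.25) < 0.03
assert abs(r0 - r5) < 1e-6  # same seed, shifted draws: risks agree
```

The exact agreement between r0 and r5 here reflects the invariance directly: with a common seed, the draws at θ = 5 are the θ = 0 draws shifted by 5, and the sample mean shifts with them.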
Under this setting, we are given a set of measurements x.
An estimation problem is invariant (equivariant) under G if there exist three corresponding groups G, Ḡ and G̃ of transformations of X, Θ and A, respectively.
which contains information about an unknown parameter
5175: 5009: 4963: 4892: 4816: 4756: 4559: 4347: 4315: 4206: 4099: 3963: 3915: 3894:{\displaystyle f(x_{1}-\theta ,\dots ,x_{n}-\theta )} 3840: 3782: 3705: 3667: 3564: 3544: 3524: 3495: 3467: 3426: 3345: 3225: 3184: 3143: 3108: 3088: 3068: 3031: 2997: 2950: 2924: 2850: 2830: 2810: 2727: 2698: 2672: 2652: 2623: 2603: 2550: 2530: 2507: 2445: 2425: 2363: 2322: 2295: 2269: 2188: 2155: 2129: 2103: 2083: 2048: 2028: 2008: 1967: 1940: 1896: 1861: 1828: 1802: 1776: 1756: 1736: 1713: 1693: 1617: 1581: 1554: 1534: 1500: 1451: 1431: 1404: 1377: 1337: 1311: 1251: 1215: 1189: 1143: 1110: 1077: 1053: 1033: 1013: 989: 965: 928: 908: 888: 832: 812: 792: 772: 752: 732: 653: 609: 586: 566: 546: 526: 503: 466: 439: 419: 399: 3336:. The invariant estimator in this case must satisfy 2494:{\displaystyle {\tilde {G}}=\{{\tilde {g}}:g\in G\}} 149:. Unsourced material may be challenged and removed. 5341: 5155: 4992: 4941: 4875: 4795: 4739: 4536: 4330: 4298: 4191: 4084: 3946: 3893: 3826: 3761: 3688: 3653: 3550: 3530: 3510: 3481: 3453: 3409: 3328: 3211: 3170: 3129: 3094: 3074: 3046: 3012: 2980: 2936: 2910: 2836: 2816: 2787: 2710: 2684: 2658: 2638: 2609: 2597:For an estimation problem that is invariant under 2586: 2536: 2513: 2493: 2431: 2411: 2346: 2308: 2281: 2255: 2174: 2141: 2115: 2089: 2069: 2034: 2014: 1991: 1953: 1926: 1882: 1847: 1814: 1788: 1762: 1742: 1719: 1699: 1679: 1603: 1567: 1540: 1512: 1486: 1437: 1417: 1390: 1359: 1323: 1295: 1237: 1201: 1173: 1129: 1096: 1059: 1039: 1019: 995: 971: 951: 914: 894: 838: 818: 798: 778: 758: 738: 718: 636: 592: 572: 552: 532: 509: 489: 445: 425: 405: 5388:Statistical decision theory and Bayesian Analysis 5342:{\displaystyle w_{k}=\prod _{j\neq k}\left\left.} 4938: 4807:with independent, unit-variance components) then 4792: 5497: 3762:{\displaystyle \delta (x)=x-\operatorname {E} .} 2412:{\displaystyle {\bar {G}}=\{{\bar {g}}:g\in G\}} 319:what is called the optimal invariant estimator. 
4550:For the squared error loss case, the result is 388: 3905:is a parameter to be estimated, and where the 3057: 363:have this property when the transformation is 5415:Journal of Statistical Planning and Inference 2804:The risk function of an invariant estimator, 332:Shift invariance: Notionally, estimates of a 5125: 5112: 5080: 5067: 4290: 4222: 4183: 4115: 4076: 3970: 3323: 3262: 2488: 2461: 2406: 2379: 1680:{\displaystyle X(x_{0})=\{g(x_{0}):g\in G\}} 1674: 1640: 600:. The quality of the result is defined by a 5390:(2nd ed.). New York: Springer-Verlag. 4796:{\displaystyle x\sim N(\theta 1_{n},I)\,\!} 1003:denote the set of possible data-samples. A 50:Learn how and when to remove these messages 3171:{\displaystyle \Theta =A=\mathbb {R} ^{1}} 3082:is a location parameter if the density of 1331:(i.e. there is an identity transformation 4937: 4791: 4280: 4173: 4066: 3475: 3400: 3319: 3158: 2587:{\displaystyle G,{\bar {G}},{\tilde {G}}} 1524:. Such an equivalence class is called an 1356: 1289: 1170: 227:Learn how and when to remove this message 209:Learn how and when to remove this message 107:Learn how and when to remove this message 3699:In the case that L is the squared error 2981:{\displaystyle {\bar {g}}\in {\bar {G}}} 1750:is said to be invariant under the group 70:This article includes a list of general 4547:and this is Pitman's estimator (1939). 865: 860:prior knowledge for pattern recognition 383:independent and identically distributed 372:independent and identically distributed 5498: 5464: 5433: 5382: 3827:{\displaystyle X=(X_{1},\dots ,X_{n})} 1848:{\displaystyle \theta ^{*}\in \Theta } 311:; on the other hand, the criterion of 5412: 497:which depends on a parameter vector 1520:. 
All the equivalent points form an 849: 323:Some classes of invariant estimators 147:adding citations to reliable sources 118: 56: 15: 3771: 2501:is a group of transformations from 2419:is a group of transformations from 1992:{\displaystyle {\bar {g}}(\theta )} 13: 4670: 4665: 4591: 4586: 4467: 4462: 4364: 4359: 3727: 3689:{\displaystyle R(\theta ,\delta )} 3607: 3525: 3144: 2937:{\displaystyle \theta \in \Theta } 2931: 2831: 2426: 2282:{\displaystyle \theta \in \Theta } 2276: 1842: 1815:{\displaystyle \theta \in \Theta } 1809: 909: 813: 351:one-to-one transformation of θ, φ= 278:, there are several approaches to 264: 76:it lacks sufficient corresponding 14: 5522: 4949:(independent components having a 3482:{\displaystyle K\in \mathbb {R} } 3219:, the problem is invariant under 1296:{\displaystyle g^{-1}(g(x))=x\,.} 1174:{\displaystyle g_{1}g_{2}\in G\,} 726:. The sets of possible values of 31:This article has multiple issues. 4805:multivariate normal distribution 2646:is an invariant estimator under 1927:{\displaystyle f(y|\theta ^{*})} 1707:consists of a single orbit then 719:{\displaystyle R=R(a,\theta )=E} 123: 61: 20: 5140: 3776:The estimation problem is that 3538:so the risk does not vary with 2347:{\displaystyle {\tilde {g}}(a)} 261:" in more general mathematics. 
134:needs additional citations for 39:or discuss these issues on the 5368: 5359: 5322: 5296: 5242: 5215: 4934: 4902: 4788: 4766: 4722: 4678: 4646: 4602: 4569: 4563: 4519: 4475: 4443: 4399: 4393: 4384: 4378: 4372: 4325: 4319: 4254: 4248: 4213: 4147: 4141: 4106: 4052: 4008: 4002: 3996: 3947:{\displaystyle L(|a-\theta |)} 3941: 3937: 3923: 3919: 3888: 3844: 3821: 3789: 3753: 3740: 3733: 3715: 3709: 3683: 3671: 3648: 3635: 3631: 3619: 3613: 3601: 3589: 3580: 3568: 3502: 3454:{\displaystyle \delta (x)=x+K} 3436: 3430: 3376: 3370: 3361: 3349: 3294: 3288: 3253: 3238: 3212:{\displaystyle L=L(a-\theta )} 3206: 3194: 3124: 3112: 3038: 3004: 2972: 2957: 2905: 2896: 2890: 2884: 2875: 2866: 2854: 2779: 2776: 2770: 2764: 2758: 2746: 2743: 2737: 2731: 2633: 2627: 2578: 2563: 2470: 2452: 2388: 2370: 2341: 2335: 2329: 2250: 2234: 2228: 2222: 2213: 2204: 2192: 2077:is said to be invariant under 2064: 2052: 1986: 1980: 1974: 1921: 1907: 1900: 1877: 1871: 1659: 1646: 1634: 1621: 1598: 1585: 1487:{\displaystyle x_{1}=g(x_{2})} 1481: 1468: 1347: 1341: 1280: 1277: 1271: 1265: 946: 939: 932: 713: 706: 702: 690: 684: 675: 663: 637:{\displaystyle L=L(a,\theta )} 631: 619: 484: 477: 470: 1: 5374:Gouriéroux and Monfort (1995) 5352: 4309:The best invariant estimator 2798: 2022:is invariant under the group 870: 361:Maximum likelihood estimators 269: 3130:{\displaystyle f(x-\theta )} 2544:if there exist three groups 2070:{\displaystyle L(\theta ,a)} 952:{\displaystyle f(x|\theta )} 490:{\displaystyle f(x|\theta )} 459:probability density function 389:Optimal invariant estimators 7: 3058:Example: Location parameter 2824:, is constant on orbits of 1954:{\displaystyle \theta ^{*}} 1238:{\displaystyle g^{-1}\in G} 560:. 
The estimate, denoted by 520:The problem is to estimate 10: 5527: 5427:10.1016/j.jspi.2006.05.002 4338:is the one that minimizes 4331:{\displaystyle \delta (x)} 3511:{\displaystyle {\bar {g}}} 3047:{\displaystyle {\bar {g}}} 3013:{\displaystyle {\bar {g}}} 2639:{\displaystyle \delta (x)} 2175:{\displaystyle a^{*}\in A} 1727:is said to be transitive. 1130:{\displaystyle g_{2}\in G} 1097:{\displaystyle g_{1}\in G} 856:statistical classification 309:estimator be mean-unbiased 243:, the concept of being an 5481:10.1093/biomet/31.1-2.200 5450:10.1093/biomet/30.3-4.391 5000:,. However the result is 2289:. The transformed value 1604:{\displaystyle X(x_{0})} 1360:{\displaystyle e(x)=x\,} 1005:group of transformations 3551:{\displaystyle \theta } 3531:{\displaystyle \Theta } 3420:thus it is of the form 3075:{\displaystyle \theta } 2837:{\displaystyle \Theta } 2817:{\displaystyle \delta } 2432:{\displaystyle \Theta } 2042:then the loss function 915:{\displaystyle \Theta } 819:{\displaystyle \Theta } 759:{\displaystyle \theta } 533:{\displaystyle \theta } 510:{\displaystyle \theta } 426:{\displaystyle \theta } 286:would lead directly to 91:more precise citations. 
5343: 5157: 5105: 5043: 4994: 4943: 4877: 4797: 4741: 4538: 4332: 4300: 4193: 4086: 3948: 3895: 3828: 3763: 3690: 3655: 3552: 3532: 3512: 3483: 3455: 3411: 3330: 3213: 3172: 3131: 3096: 3076: 3048: 3014: 2982: 2938: 2912: 2838: 2818: 2789: 2712: 2711:{\displaystyle g\in G} 2686: 2685:{\displaystyle x\in X} 2660: 2640: 2611: 2588: 2538: 2515: 2495: 2433: 2413: 2348: 2310: 2283: 2257: 2176: 2143: 2142:{\displaystyle a\in A} 2117: 2116:{\displaystyle g\in G} 2091: 2071: 2036: 2016: 1993: 1955: 1928: 1884: 1883:{\displaystyle Y=g(x)} 1849: 1822:there exists a unique 1816: 1790: 1789:{\displaystyle g\in G} 1764: 1744: 1730:A family of densities 1721: 1701: 1681: 1605: 1569: 1542: 1514: 1513:{\displaystyle g\in G} 1488: 1439: 1419: 1392: 1361: 1325: 1324:{\displaystyle e\in G} 1297: 1239: 1203: 1202:{\displaystyle g\in G} 1175: 1131: 1098: 1061: 1041: 1021: 997: 973: 953: 916: 896: 840: 820: 800: 780: 760: 740: 720: 638: 594: 574: 554: 534: 511: 491: 455:vector random variable 447: 427: 407: 5344: 5158: 5085: 5023: 4995: 4953:with scale parameter 4944: 4878: 4798: 4742: 4539: 4333: 4301: 4194: 4087: 3949: 3896: 3829: 3764: 3691: 3656: 3553: 3533: 3513: 3484: 3456: 3412: 3331: 3214: 3173: 3132: 3097: 3077: 3049: 3015: 2983: 2939: 2913: 2839: 2819: 2790: 2713: 2687: 2661: 2641: 2612: 2589: 2539: 2516: 2496: 2434: 2414: 2349: 2311: 2309:{\displaystyle a^{*}} 2284: 2258: 2177: 2144: 2118: 2092: 2072: 2037: 2017: 1994: 1956: 1929: 1885: 1850: 1817: 1791: 1765: 1745: 1722: 1702: 1682: 1606: 1570: 1568:{\displaystyle x_{0}} 1543: 1515: 1489: 1440: 1420: 1418:{\displaystyle x_{2}} 1393: 1391:{\displaystyle x_{1}} 1362: 1326: 1298: 1240: 1204: 1176: 1132: 1099: 1062: 1042: 1022: 998: 974: 954: 917: 897: 841: 821: 801: 781: 761: 741: 721: 639: 595: 575: 555: 535: 512: 492: 448: 428: 408: 276:statistical inference 254:equivariant estimator 158:"Invariant estimator" 5173: 5007: 4961: 4890: 4814: 4754: 4557: 4345: 4313: 4204: 4097: 3961: 3913: 3838: 3780: 3703: 3665: 3562: 3542: 
3522: 3493: 3465: 3424: 3343: 3223: 3182: 3141: 3106: 3086: 3066: 3029: 2995: 2948: 2922: 2848: 2828: 2808: 2725: 2696: 2670: 2650: 2621: 2601: 2548: 2528: 2505: 2443: 2423: 2361: 2320: 2293: 2267: 2186: 2153: 2127: 2101: 2081: 2046: 2026: 2006: 1965: 1938: 1894: 1859: 1826: 1800: 1774: 1754: 1734: 1711: 1691: 1615: 1579: 1552: 1532: 1498: 1449: 1429: 1402: 1375: 1335: 1309: 1249: 1213: 1187: 1141: 1108: 1075: 1051: 1031: 1011: 987: 963: 926: 906: 886: 866:Mathematical setting 830: 810: 790: 770: 750: 730: 651: 607: 584: 564: 544: 524: 501: 464: 437: 417: 397: 143:improve this article 4951:Cauchy distribution 4674: 4595: 4471: 4368: 3390: for all  2316:will be denoted by 1027:, to be denoted by 644:which determines a 433:. The measurements 313:median-unbiasedness 288:Bayesian estimators 245:invariant estimator 5339: 5204: 5153: 4990: 4939: 4873: 4793: 4737: 4657: 4578: 4534: 4454: 4351: 4328: 4296: 4189: 4082: 3944: 3891: 3824: 3759: 3686: 3651: 3548: 3528: 3508: 3479: 3451: 3407: 3326: 3209: 3168: 3127: 3092: 3072: 3044: 3010: 2978: 2934: 2908: 2834: 2814: 2785: 2708: 2682: 2656: 2636: 2607: 2594:as defined above. 
2584: 2534: 2511: 2491: 2429: 2409: 2344: 2306: 2279: 2253: 2172: 2139: 2113: 2087: 2067: 2032: 2012: 1989: 1951: 1924: 1880: 1845: 1812: 1786: 1760: 1740: 1717: 1697: 1677: 1601: 1565: 1538: 1510: 1484: 1445:are equivalent if 1435: 1415: 1388: 1357: 1321: 1293: 1235: 1199: 1171: 1127: 1094: 1057: 1037: 1017: 993: 969: 949: 912: 892: 836: 816: 796: 776: 756: 736: 716: 634: 590: 570: 550: 530: 507: 487: 453:are modelled as a 443: 423: 403: 334:location parameter 284:Bayesian inference 5326: 5268: 5189: 5130: 5110: 5065: 5017: 4971: 4868: 4824: 4732: 4529: 4216: 4109: 3518:is transitive on 3505: 3391: 3256: 3241: 3095:{\displaystyle X} 3041: 3007: 2975: 2960: 2887: 2761: 2659:{\displaystyle G} 2610:{\displaystyle G} 2581: 2566: 2537:{\displaystyle G} 2514:{\displaystyle A} 2473: 2455: 2391: 2373: 2332: 2225: 2090:{\displaystyle G} 2035:{\displaystyle G} 2015:{\displaystyle F} 1977: 1763:{\displaystyle G} 1743:{\displaystyle F} 1720:{\displaystyle g} 1700:{\displaystyle X} 1541:{\displaystyle X} 1522:equivalence class 1438:{\displaystyle X} 1060:{\displaystyle X} 1040:{\displaystyle G} 1020:{\displaystyle X} 996:{\displaystyle X} 972:{\displaystyle L} 895:{\displaystyle X} 850:In classification 839:{\displaystyle A} 799:{\displaystyle X} 779:{\displaystyle a} 739:{\displaystyle x} 593:{\displaystyle A} 573:{\displaystyle a} 553:{\displaystyle x} 446:{\displaystyle x} 406:{\displaystyle x} 296:Bayesian analysis 292:statistical model 280:estimation theory 237: 236: 229: 219: 218: 211: 193: 117: 116: 109: 54: 5518: 5511:Invariant theory 5492: 5475:(1/2): 200–215. 5461: 5444:(3/4): 391–421. 5430: 5421:(6): 1900–1913. 5409: 5384:Berger, James O. 
5375: 5372: 5366: 5363: 5348: 5346: 5345: 5340: 5335: 5331: 5327: 5325: 5321: 5320: 5308: 5307: 5294: 5286: 5273: 5269: 5267: 5266: 5265: 5250: 5249: 5240: 5239: 5227: 5226: 5210: 5203: 5185: 5184: 5162: 5160: 5159: 5154: 5136: 5135: 5131: 5129: 5128: 5124: 5123: 5111: 5108: 5104: 5099: 5083: 5079: 5078: 5066: 5063: 5060: 5054: 5053: 5042: 5037: 5019: 5018: 5015: 4999: 4997: 4996: 4991: 4989: 4988: 4973: 4972: 4969: 4948: 4946: 4945: 4940: 4933: 4932: 4917: 4916: 4882: 4880: 4879: 4874: 4869: 4864: 4863: 4862: 4861: 4847: 4842: 4841: 4826: 4825: 4822: 4802: 4800: 4799: 4794: 4781: 4780: 4746: 4744: 4743: 4738: 4733: 4731: 4715: 4714: 4690: 4689: 4673: 4668: 4655: 4639: 4638: 4614: 4613: 4594: 4589: 4576: 4543: 4541: 4540: 4535: 4530: 4528: 4512: 4511: 4487: 4486: 4470: 4465: 4452: 4436: 4435: 4411: 4410: 4367: 4362: 4349: 4337: 4335: 4334: 4329: 4305: 4303: 4302: 4297: 4289: 4288: 4283: 4247: 4246: 4234: 4233: 4218: 4217: 4209: 4198: 4196: 4195: 4190: 4182: 4181: 4176: 4140: 4139: 4127: 4126: 4111: 4110: 4102: 4091: 4089: 4088: 4083: 4075: 4074: 4069: 4045: 4044: 4020: 4019: 3995: 3994: 3982: 3981: 3953: 3951: 3950: 3945: 3940: 3926: 3900: 3898: 3897: 3892: 3881: 3880: 3856: 3855: 3833: 3831: 3830: 3825: 3820: 3819: 3801: 3800: 3772:Pitman estimator 3768: 3766: 3765: 3760: 3743: 3695: 3693: 3692: 3687: 3660: 3658: 3657: 3652: 3638: 3557: 3555: 3554: 3549: 3537: 3535: 3534: 3529: 3517: 3515: 3514: 3509: 3507: 3506: 3498: 3488: 3486: 3485: 3480: 3478: 3460: 3458: 3457: 3452: 3416: 3414: 3413: 3408: 3403: 3392: 3389: 3335: 3333: 3332: 3327: 3322: 3287: 3286: 3274: 3273: 3258: 3257: 3249: 3243: 3242: 3234: 3218: 3216: 3215: 3210: 3177: 3175: 3174: 3169: 3167: 3166: 3161: 3136: 3134: 3133: 3128: 3101: 3099: 3098: 3093: 3081: 3079: 3078: 3073: 3053: 3051: 3050: 3045: 3043: 3042: 3034: 3019: 3017: 3016: 3011: 3009: 3008: 3000: 2987: 2985: 2984: 2979: 2977: 2976: 2968: 2962: 2961: 2953: 2943: 2941: 2940: 2935: 2917: 2915: 2914: 2909: 2889: 2888: 2880: 2843: 2841: 2840: 
2835: 2823: 2821: 2820: 2815: 2794: 2792: 2791: 2786: 2763: 2762: 2754: 2717: 2715: 2714: 2709: 2691: 2689: 2688: 2683: 2665: 2663: 2662: 2657: 2645: 2643: 2642: 2637: 2616: 2614: 2613: 2608: 2593: 2591: 2590: 2585: 2583: 2582: 2574: 2568: 2567: 2559: 2543: 2541: 2540: 2535: 2520: 2518: 2517: 2512: 2500: 2498: 2497: 2492: 2475: 2474: 2466: 2457: 2456: 2448: 2438: 2436: 2435: 2430: 2418: 2416: 2415: 2410: 2393: 2392: 2384: 2375: 2374: 2366: 2353: 2351: 2350: 2345: 2334: 2333: 2325: 2315: 2313: 2312: 2307: 2305: 2304: 2288: 2286: 2285: 2280: 2262: 2260: 2259: 2254: 2249: 2248: 2227: 2226: 2218: 2181: 2179: 2178: 2173: 2165: 2164: 2149:there exists an 2148: 2146: 2145: 2140: 2122: 2120: 2119: 2114: 2096: 2094: 2093: 2088: 2076: 2074: 2073: 2068: 2041: 2039: 2038: 2033: 2021: 2019: 2018: 2013: 1998: 1996: 1995: 1990: 1979: 1978: 1970: 1961:will be denoted 1960: 1958: 1957: 1952: 1950: 1949: 1933: 1931: 1930: 1925: 1920: 1919: 1910: 1889: 1887: 1886: 1881: 1854: 1852: 1851: 1846: 1838: 1837: 1821: 1819: 1818: 1813: 1795: 1793: 1792: 1787: 1769: 1767: 1766: 1761: 1749: 1747: 1746: 1741: 1726: 1724: 1723: 1718: 1706: 1704: 1703: 1698: 1686: 1684: 1683: 1678: 1658: 1657: 1633: 1632: 1610: 1608: 1607: 1602: 1597: 1596: 1574: 1572: 1571: 1566: 1564: 1563: 1547: 1545: 1544: 1539: 1519: 1517: 1516: 1511: 1493: 1491: 1490: 1485: 1480: 1479: 1461: 1460: 1444: 1442: 1441: 1436: 1424: 1422: 1421: 1416: 1414: 1413: 1397: 1395: 1394: 1389: 1387: 1386: 1366: 1364: 1363: 1358: 1330: 1328: 1327: 1322: 1302: 1300: 1299: 1294: 1264: 1263: 1244: 1242: 1241: 1236: 1228: 1227: 1208: 1206: 1205: 1200: 1180: 1178: 1177: 1172: 1163: 1162: 1153: 1152: 1136: 1134: 1133: 1128: 1120: 1119: 1103: 1101: 1100: 1095: 1087: 1086: 1066: 1064: 1063: 1058: 1046: 1044: 1043: 1038: 1026: 1024: 1023: 1018: 1002: 1000: 999: 994: 978: 976: 975: 970: 958: 956: 955: 950: 942: 921: 919: 918: 913: 901: 899: 898: 893: 846:, respectively. 
845: 843: 842: 837: 825: 823: 822: 817: 805: 803: 802: 797: 785: 783: 782: 777: 765: 763: 762: 757: 745: 743: 742: 737: 725: 723: 722: 717: 709: 643: 641: 640: 635: 599: 597: 596: 591: 579: 577: 576: 571: 559: 557: 556: 551: 539: 537: 536: 531: 516: 514: 513: 508: 496: 494: 493: 488: 480: 452: 450: 449: 444: 432: 430: 429: 424: 412: 410: 409: 404: 375:random variables 345:scale invariance 338:weighted average 232: 225: 214: 207: 203: 200: 194: 192: 151: 127: 119: 112: 105: 101: 98: 92: 87:this article by 78:inline citations 65: 64: 57: 46: 24: 23: 16: 5526: 5525: 5521: 5520: 5519: 5517: 5516: 5515: 5496: 5495: 5398: 5379: 5378: 5373: 5369: 5364: 5360: 5355: 5316: 5312: 5303: 5299: 5295: 5287: 5285: 5278: 5274: 5261: 5257: 5245: 5241: 5235: 5231: 5222: 5218: 5214: 5209: 5205: 5193: 5180: 5176: 5174: 5171: 5170: 5119: 5115: 5107: 5106: 5100: 5089: 5084: 5074: 5070: 5062: 5061: 5059: 5055: 5049: 5045: 5044: 5038: 5027: 5014: 5010: 5008: 5005: 5004: 4981: 4977: 4968: 4964: 4962: 4959: 4958: 4928: 4924: 4912: 4908: 4891: 4888: 4887: 4857: 4853: 4852: 4848: 4846: 4834: 4830: 4821: 4817: 4815: 4812: 4811: 4776: 4772: 4755: 4752: 4751: 4710: 4706: 4685: 4681: 4669: 4661: 4656: 4634: 4630: 4609: 4605: 4590: 4582: 4577: 4575: 4558: 4555: 4554: 4507: 4503: 4482: 4478: 4466: 4458: 4453: 4431: 4427: 4406: 4402: 4363: 4355: 4350: 4348: 4346: 4343: 4342: 4314: 4311: 4310: 4284: 4279: 4278: 4242: 4238: 4229: 4225: 4208: 4207: 4205: 4202: 4201: 4177: 4172: 4171: 4135: 4131: 4122: 4118: 4101: 4100: 4098: 4095: 4094: 4070: 4065: 4064: 4040: 4036: 4015: 4011: 3990: 3986: 3977: 3973: 3962: 3959: 3958: 3936: 3922: 3914: 3911: 3910: 3876: 3872: 3851: 3847: 3839: 3836: 3835: 3815: 3811: 3796: 3792: 3781: 3778: 3777: 3774: 3739: 3704: 3701: 3700: 3666: 3663: 3662: 3634: 3563: 3560: 3559: 3543: 3540: 3539: 3523: 3520: 3519: 3497: 3496: 3494: 3491: 3490: 3474: 3466: 3463: 3462: 3425: 3422: 3421: 3399: 3388: 3344: 3341: 3340: 3318: 3282: 3278: 3269: 3265: 3248: 3247: 3233: 3232: 3224: 3221: 
3220: 3183: 3180: 3179: 3162: 3157: 3156: 3142: 3139: 3138: 3107: 3104: 3103: 3102:is of the form 3087: 3084: 3083: 3067: 3064: 3063: 3060: 3054:is transitive. 3033: 3032: 3030: 3027: 3026: 2999: 2998: 2996: 2993: 2992: 2967: 2966: 2952: 2951: 2949: 2946: 2945: 2923: 2920: 2919: 2879: 2878: 2849: 2846: 2845: 2844:. Equivalently 2829: 2826: 2825: 2809: 2806: 2805: 2801: 2753: 2752: 2726: 2723: 2722: 2697: 2694: 2693: 2671: 2668: 2667: 2651: 2648: 2647: 2622: 2619: 2618: 2602: 2599: 2598: 2573: 2572: 2558: 2557: 2549: 2546: 2545: 2529: 2526: 2525: 2506: 2503: 2502: 2465: 2464: 2447: 2446: 2444: 2441: 2440: 2424: 2421: 2420: 2383: 2382: 2365: 2364: 2362: 2359: 2358: 2324: 2323: 2321: 2318: 2317: 2300: 2296: 2294: 2291: 2290: 2268: 2265: 2264: 2244: 2240: 2217: 2216: 2187: 2184: 2183: 2160: 2156: 2154: 2151: 2150: 2128: 2125: 2124: 2102: 2099: 2098: 2082: 2079: 2078: 2047: 2044: 2043: 2027: 2024: 2023: 2007: 2004: 2003: 1969: 1968: 1966: 1963: 1962: 1945: 1941: 1939: 1936: 1935: 1915: 1911: 1906: 1895: 1892: 1891: 1860: 1857: 1856: 1833: 1829: 1827: 1824: 1823: 1801: 1798: 1797: 1775: 1772: 1771: 1755: 1752: 1751: 1735: 1732: 1731: 1712: 1709: 1708: 1692: 1689: 1688: 1653: 1649: 1628: 1624: 1616: 1613: 1612: 1592: 1588: 1580: 1577: 1576: 1559: 1555: 1553: 1550: 1549: 1533: 1530: 1529: 1499: 1496: 1495: 1475: 1471: 1456: 1452: 1450: 1447: 1446: 1430: 1427: 1426: 1409: 1405: 1403: 1400: 1399: 1382: 1378: 1376: 1373: 1372: 1336: 1333: 1332: 1310: 1307: 1306: 1256: 1252: 1250: 1247: 1246: 1220: 1216: 1214: 1211: 1210: 1188: 1185: 1184: 1158: 1154: 1148: 1144: 1142: 1139: 1138: 1115: 1111: 1109: 1106: 1105: 1082: 1078: 1076: 1073: 1072: 1052: 1049: 1048: 1032: 1029: 1028: 1012: 1009: 1008: 988: 985: 984: 964: 961: 960: 938: 927: 924: 923: 907: 904: 903: 887: 884: 883: 873: 868: 852: 831: 828: 827: 811: 808: 807: 791: 788: 787: 786:are denoted by 771: 768: 767: 751: 748: 747: 731: 728: 727: 705: 652: 649: 648: 608: 605: 604: 585: 582: 581: 565: 562: 561: 545: 542: 541: 525: 
522: 521: 502: 499: 498: 476: 465: 462: 461: 438: 435: 434: 418: 415: 414: 398: 395: 394: 391: 325: 272: 267: 265:General setting 233: 222: 221: 220: 215: 204: 198: 195: 152: 150: 140: 128: 113: 102: 96: 93: 83:Please help to 82: 66: 62: 25: 21: 12: 11: 5: 5524: 5514: 5513: 5508: 5494: 5493: 5462: 5431: 5410: 5396: 5377: 5376: 5367: 5357: 5356: 5354: 5351: 5350: 5349: 5338: 5334: 5330: 5324: 5319: 5315: 5311: 5306: 5302: 5298: 5293: 5290: 5284: 5281: 5277: 5272: 5264: 5260: 5256: 5253: 5248: 5244: 5238: 5234: 5230: 5225: 5221: 5217: 5213: 5208: 5202: 5199: 5196: 5192: 5188: 5183: 5179: 5164: 5163: 5152: 5149: 5146: 5143: 5139: 5134: 5127: 5122: 5118: 5114: 5103: 5098: 5095: 5092: 5088: 5082: 5077: 5073: 5069: 5058: 5052: 5048: 5041: 5036: 5033: 5030: 5026: 5022: 5013: 4987: 4984: 4980: 4976: 4967: 4936: 4931: 4927: 4923: 4920: 4915: 4911: 4907: 4904: 4901: 4898: 4895: 4884: 4883: 4872: 4867: 4860: 4856: 4851: 4845: 4840: 4837: 4833: 4829: 4820: 4790: 4787: 4784: 4779: 4775: 4771: 4768: 4765: 4762: 4759: 4748: 4747: 4736: 4730: 4727: 4724: 4721: 4718: 4713: 4709: 4705: 4702: 4699: 4696: 4693: 4688: 4684: 4680: 4677: 4672: 4667: 4664: 4660: 4654: 4651: 4648: 4645: 4642: 4637: 4633: 4629: 4626: 4623: 4620: 4617: 4612: 4608: 4604: 4601: 4598: 4593: 4588: 4585: 4581: 4574: 4571: 4568: 4565: 4562: 4545: 4544: 4533: 4527: 4524: 4521: 4518: 4515: 4510: 4506: 4502: 4499: 4496: 4493: 4490: 4485: 4481: 4477: 4474: 4469: 4464: 4461: 4457: 4451: 4448: 4445: 4442: 4439: 4434: 4430: 4426: 4423: 4420: 4417: 4414: 4409: 4405: 4401: 4398: 4395: 4392: 4389: 4386: 4383: 4380: 4377: 4374: 4371: 4366: 4361: 4358: 4354: 4327: 4324: 4321: 4318: 4307: 4306: 4295: 4292: 4287: 4282: 4277: 4274: 4271: 4268: 4265: 4262: 4259: 4256: 4253: 4250: 4245: 4241: 4237: 4232: 4228: 4224: 4221: 4215: 4212: 4199: 4188: 4185: 4180: 4175: 4170: 4167: 4164: 4161: 4158: 4155: 4152: 4149: 4146: 4143: 4138: 4134: 4130: 4125: 4121: 4117: 4114: 4108: 4105: 4092: 4081: 4078: 4073: 4068: 4063: 4060: 4057: 4054: 
Properties

1. The risk function of an invariant estimator δ is constant on orbits of Θ; equivalently, R(θ, δ) = R(ḡ(θ), δ) for every θ ∈ Θ and every ḡ ∈ Ḡ.
2. In particular, the risk function of an invariant estimator is constant if Ḡ is transitive.

For a given problem, the invariant estimator with the lowest risk is termed the best invariant estimator. A best invariant estimator cannot always be found, but one special case in which it can is when Ḡ is transitive.

Example: location parameter

Suppose θ is a location parameter, meaning that the density of X has the form f(x − θ). For Θ = A = ℝ and a loss of the form L = L(a − θ), the problem is invariant under the (additive) group G = Ḡ = G̃ = {g_c : g_c(x) = x + c, c ∈ ℝ}. An invariant estimator must then satisfy δ(x + c) = δ(x) + c for all c ∈ ℝ, and hence must be of the form δ(x) = x + K for some constant K ∈ ℝ. Since Ḡ is transitive on Θ, the risk does not depend on θ: R(θ, δ) = R(0, δ) = E[L(X + K) | θ = 0]. The best invariant estimator is the one that brings this risk to a minimum; in the case of squared-error loss it is δ(x) = x − E[X | θ = 0].
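Under squared-error loss, the best invariant estimator among the family δ_K(x) = x + K has K = −E[X | θ = 0]. A minimal numerical sketch, assuming for illustration a unit-exponential density f(t) = e^(−t) for t ≥ 0 (so E[X | θ = 0] = 1, the optimal K is −1, and the minimal risk is Var(X) = 1):

```python
import math

def risk(K, grid_n=100000, tmax=40.0):
    """Risk R(0, delta_K) = E[(X + K)^2 | theta = 0] for delta_K(x) = x + K,
    with X ~ f(t) = exp(-t) on t >= 0 (unit exponential), via the midpoint rule."""
    h = tmax / grid_n
    total = 0.0
    for i in range(grid_n):
        t = (i + 0.5) * h
        total += (t + K) ** 2 * math.exp(-t) * h
    return total

# E[X | theta = 0] = 1 for the unit exponential, so the best invariant
# estimator is delta(x) = x - 1, with constant risk Var(X) = 1.
risks = {K: risk(K) for K in (-1.5, -1.0, -0.5, 0.0)}
best = min(risks, key=risks.get)
print(best)                   # -1.0
print(round(risks[best], 3))  # 1.0
```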
Pitman estimators

Now suppose that X = (X1, …, Xn) has joint density f(x1 − θ, …, xn − θ), where θ is a location parameter to be estimated, and that the loss function has the form L(|a − θ|). The problem is invariant under the (additive) transformation groups

  G = {g_c : g_c(x) = (x1 + c, …, xn + c), c ∈ ℝ},
  Ḡ = {g_c : g_c(θ) = θ + c, c ∈ ℝ},
  G̃ = {g_c : g_c(a) = a + c, c ∈ ℝ}.

The best invariant estimator δ(x) is the one that minimises

  ∫ L(δ(x) − θ) f(x1 − θ, …, xn − θ) dθ / ∫ f(x1 − θ, …, xn − θ) dθ,

with the integrals taken over the whole real line; this is the estimator introduced by Pitman (1939). For squared-error loss the result is

  δ(x) = ∫ θ f(x1 − θ, …, xn − θ) dθ / ∫ f(x1 − θ, …, xn − θ) dθ.

If x ∼ N(θ1_n, I) (an independent normal sample with unit variances) then the Pitman estimator coincides with the maximum-likelihood estimator: δ_Pitman = δ_ML = Σ_k x_k / n, the sample mean. If x ∼ C(θ1_n, σ²I) (an independent Cauchy sample with known scale σ) then δ_Pitman ≠ δ_ML; instead, for n > 1,

  δ_Pitman = Σ_{k=1..n} x_k [Re{w_k} / Σ_{m=1..n} Re{w_m}],

where the complex weights are

  w_k = ∏_{j≠k} [1 / ((x_k − x_j)² + 4σ²)] [1 − 2(x_k − x_j)i / σ].
2780:) 2777:) 2774:x 2771:( 2765:( 2756:g 2750:= 2747:) 2744:) 2741:x 2738:( 2735:g 2732:( 2706:G 2700:g 2680:X 2674:x 2654:G 2634:) 2631:x 2628:( 2605:G 2576:G 2570:, 2561:G 2555:, 2552:G 2532:G 2509:A 2489:} 2486:G 2480:g 2477:: 2468:g 2462:{ 2459:= 2450:G 2407:} 2404:G 2398:g 2395:: 2386:g 2380:{ 2377:= 2368:G 2342:) 2339:a 2336:( 2327:g 2298:a 2251:) 2242:a 2238:, 2235:) 2229:( 2220:g 2214:( 2211:L 2208:= 2205:) 2202:a 2199:, 2193:( 2190:L 2170:A 2158:a 2137:A 2131:a 2111:G 2105:g 2085:G 2065:) 2062:a 2059:, 2053:( 2050:L 2030:G 2010:F 1987:) 1981:( 1972:g 1922:) 1908:| 1904:y 1901:( 1898:f 1878:) 1875:x 1872:( 1869:g 1866:= 1863:Y 1784:G 1778:g 1758:G 1738:F 1715:g 1695:X 1675:} 1672:G 1666:g 1663:: 1660:) 1655:0 1651:x 1647:( 1644:g 1641:{ 1638:= 1635:) 1630:0 1626:x 1622:( 1619:X 1599:) 1594:0 1590:x 1586:( 1583:X 1561:0 1557:x 1536:X 1508:G 1502:g 1482:) 1477:2 1473:x 1469:( 1466:g 1463:= 1458:1 1454:x 1433:X 1411:2 1407:x 1384:1 1380:x 1367:) 1354:x 1351:= 1348:) 1345:x 1342:( 1339:e 1319:G 1313:e 1291:. 1287:x 1284:= 1281:) 1278:) 1275:x 1272:( 1269:g 1266:( 1261:1 1254:g 1233:G 1225:1 1218:g 1197:G 1191:g 1168:G 1160:2 1156:g 1150:1 1146:g 1125:G 1117:2 1113:g 1092:G 1084:1 1080:g 1055:X 1035:G 1015:X 991:X 967:L 947:) 940:| 936:x 933:( 930:f 890:X 834:A 794:X 774:a 734:x 714:] 707:| 703:) 697:, 694:a 691:( 688:L 685:[ 682:E 679:= 676:) 670:, 667:a 664:( 661:R 658:= 655:R 632:) 626:, 623:a 620:( 617:L 614:= 611:L 588:A 568:a 548:x 485:) 478:| 474:x 471:( 468:f 441:x 401:x 357:h 353:h 257:" 230:) 224:( 212:) 206:( 201:) 197:( 187:· 180:· 173:· 166:· 139:. 110:) 104:( 99:) 95:( 81:. 52:) 48:(
