
Weighted arithmetic mean


Weighted means are typically used to find the weighted mean of historical data, rather than theoretically generated data. In this case, there will be some error in the variance of each data point. Typically experimental errors may be underestimated because the experimenter does not take into account all sources of error in calculating the variance of each data point.
Gatz et al. mention that the above formulation was published by Endlich et al. (1988), who treated the weighted mean as a combination of a weighted total estimator divided by an estimator of the population size, based on the formulation published by Cochran (1977), as an approximation to the ratio mean.
We have (at least) two versions of the variance of the weighted mean: one with a known and one with an unknown population size estimation. There is no uniformly better approach, but the literature presents several arguments to prefer the population-estimation version (even when the population size is known).
These are primarily Taylor series first-order linearization, asymptotics, and the bootstrap/jackknife. The Taylor linearization method can lead to underestimation of the variance for small sample sizes in general, but that depends on the complexity of the statistic. For the weighted mean, the approximate variance is supposed to be relatively accurate even for medium sample sizes.
For the trivial case in which all the weights are equal to 1, the above formula reduces to the regular formula for the variance of the mean (note, however, that it uses the maximum likelihood estimator of the variance rather than the unbiased one, i.e., it divides by n instead of n − 1).
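This reduction is easy to check numerically. Below is a minimal sketch (the function name and test values are mine, not from the source) of the estimator $\widehat{V}(\bar{y}_w)=\sum_i w_i^2 (y_i-\bar{y}_w)^2 / (\sum_i w_i)^2$ used in this section, compared with the maximum-likelihood variance of the mean when all weights equal 1:

```python
import numpy as np

def weighted_mean_variance(y, w):
    """sum(w_i^2 (y_i - ybar_w)^2) / (sum w_i)^2."""
    y, w = np.asarray(y, float), np.asarray(w, float)
    ybar_w = np.sum(w * y) / np.sum(w)
    return np.sum(w ** 2 * (y - ybar_w) ** 2) / np.sum(w) ** 2

y = np.array([2.0, 4.0, 5.0, 7.0])
print(weighted_mean_variance(y, np.ones_like(y)))  # 0.8125
print(np.var(y, ddof=0) / len(y))                  # 0.8125 as well
```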
The mean for the morning class is 80 and the mean of the afternoon class is 90. The unweighted mean of the two means is 85. However, this does not account for the difference in the number of students in each class (20 versus 30); hence the value of 85 does not reflect the average student grade independent of class.
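A short Python check of this example, using only the numbers given in the text:

```python
# Two class means and the corresponding class sizes from the example.
morning_mean, afternoon_mean = 80, 90
morning_size, afternoon_size = 20, 30

unweighted = (morning_mean + afternoon_mean) / 2
weighted = (morning_size * morning_mean + afternoon_size * afternoon_mean) / (
    morning_size + afternoon_size
)
print(unweighted, weighted)  # 85.0 86.0
```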
Lastly, if the proportion of sampling is negatively correlated with the values (i.e., a smaller chance to sample an observation that is large), then the unknown-population-size version slightly compensates for that.
This approximation holds when N is very large and each one-draw probability is very small. For the following derivation we assume that the probability of selecting each element is fully represented by these probabilities, i.e., selecting one element does not influence the probability of drawing another element (this does not apply to designs such as cluster sampling).
However, Endlich et al. do not seem to have published this derivation in their paper (even though they mention that they used it), and Cochran's book includes a slightly different formulation. Still, it is almost identical to the formulations described in the previous sections.
In the scenario described in the previous section, most frequently the decrease in interaction strength obeys a negative exponential law. If the observations are sampled at equidistant times, then exponential decrease is equivalent to decrease by a constant fraction 0 < Δ < 1 at each time step.
The above generalizes easily to the case of taking the mean of vector-valued estimates. For example, estimates of position on a plane may have less certainty in one direction than another. As in the scalar case, the weighted mean of multiple estimates can provide a maximum likelihood estimate.
Therefore, data elements with a high weight contribute more to the weighted mean than do elements with a low weight. The weights must not be negative for the equation to work. Some may be zero, but not all of them (since division by zero is not allowed).
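As an illustration, here is a minimal weighted-mean function (the function name is mine) that enforces the constraints just described — non-negative weights, not all of them zero:

```python
def weighted_mean(values, weights):
    """Weighted arithmetic mean of `values` with non-negative `weights`."""
    if len(values) != len(weights):
        raise ValueError("values and weights must have the same length")
    if any(w < 0 for w in weights):
        raise ValueError("weights may not be negative")
    total = sum(weights)
    if total == 0:
        raise ValueError("at least one weight must be non-zero")
    return sum(w * x for w, x in zip(weights, values)) / total

print(weighted_mean([80, 90], [20, 30]))  # 86.0
```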
The variability of this estimator depends on the variability of the random variables in both the numerator and the denominator, as well as on their correlation. Since there is no closed analytical form for this variance, various methods are used for approximate estimation.
For example, if all y values are constant, the estimator with unknown population size will give the correct result, while the one with known population size will have some variability. Also, when the sample size itself is random (e.g., in Poisson sampling), the version with unknown population size is considered more stable.
17045: 722: 11541: 18807:
The damping constant w must correspond to the actual decrease of interaction strength. If this cannot be determined from theoretical considerations, the following properties of exponentially decreasing weights are useful in making a suitable choice: at step (1 − w)^−1 the weight approximately equals e^−1(1 − w) = 0.39(1 − w), the tail area approximately equals e^−1, and the head area approximately 1 − e^−1 = 0.61; the tail area at step n is at most e^−n(1−w).
The weighted sample mean, x̄, is itself a random variable. Its expected value and standard deviation are related to the expected values and standard deviations of the observations, as follows. For simplicity, we assume normalized weights (weights summing to one).
The above formula was taken from Sarndal et al. (1992) (also presented in Cochran 1977), but was written differently. The left side is how the variance was originally written and the right side is how we developed the weighted version:

$$\operatorname{Var}(\hat{Y}_{\text{pwr}}) = \frac{1}{n}\,\frac{1}{n-1}\sum_{i=1}^{n}\left(\frac{y_i}{p_i}-\hat{Y}_{\text{pwr}}\right)^{2} = \frac{n}{n-1}\sum_{i=1}^{n}\left(w_i y_i-\overline{wy}\right)^{2}.$$
The value of y at time t_i often depends not only on x_i but also on its past values. Commonly, the strength of this dependence decreases as the separation of observations in time increases. To model this situation, one may replace the independent variable by its sliding mean z for a window size m:

$$z_k = \sum_{i=1}^{m} w_i\, x_{k+1-i}.$$
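A small sketch of such a sliding mean with exponentially decreasing, normalized weights w_i ∝ w^(i−1) (the function name and test data are mine, not from the source):

```python
import numpy as np

def exp_sliding_mean(x, w=0.9, m=10):
    """z_k = sum_{i=1..m} w_i * x_{k+1-i}, with w_i = w**(i-1) normalized to sum to 1."""
    x = np.asarray(x, float)
    weights = w ** np.arange(m)
    weights /= weights.sum()               # divide by V1 = (1 - w**m) / (1 - w)
    z = np.full(len(x), np.nan)            # undefined until a full window is available
    for k in range(m - 1, len(x)):
        window = x[k - m + 1:k + 1][::-1]  # x_k, x_{k-1}, ..., x_{k-m+1}
        z[k] = np.dot(weights, window)
    return z

print(exp_sliding_mean(np.arange(20.0), w=0.8, m=5))
```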
Thus, the weighted mean makes it possible to find the average student grade without knowing each student's score. Only the class means and the number of students in each class are needed.
10127: 171: 6752: 2221:{\displaystyle \sigma _{\bar {x}}^{2}=\sum _{i=1}^{n}{w_{i}'^{2}\sigma _{i}^{2}}={\frac {\sum _{i=1}^{n}{\sigma _{i}^{-4}\sigma _{i}^{2}}}{\left(\sum _{i=1}^{n}\sigma _{i}^{-2}\right)^{2}}}.} 18538: 18074: 12024: 2670: 14689: 14448: 3190:
In this perspective each value of y_i is considered constant, and the variability comes from the selection procedure. This is in contrast to "model-based" approaches, in which the randomness is often described in the y values themselves.
125:(independent of class). The average student grade can be obtained by averaging all the grades, without regard to classes (add all the grades up and divide by the total number of students): 15728: 14553: 16953: 12989: 19471: 19288: 13667: 9215: 7660: 14382: 14622: 20214: 9262: 57:), except that instead of each of the data points contributing equally to the final average, some data points contribute more than others. The notion of weighted mean plays a role in 19518: 18916: 15464: 1999: 374: 330: 19557: 12864: 10331: 10278: 9168: 4803: 3618: 3524: 10795: 6950: 19110:
The concept of weighted average can be extended to functions. Weighted averages of functions play an important role in the systems of weighted differential and integral calculus.
13541: 12540: 4459:{\displaystyle \operatorname {Var} \left({\hat {\bar {Y}}}_{{\text{known }}N}\right)={\frac {1}{N^{2}}}{\frac {n}{n-1}}\sum _{i=1}^{n}\left(w_{i}y_{i}-{\overline {wy}}\right)^{2}} 2449: 15877: 15143: 2619: 19060: 11967: 1260: 18420: 17668: 19276: 15151: 15070: 733: 18990: 16695: 13195: 12245: 11827: 1438: 18762: 4042:
From this perspective, the weights used in the numerator of the weighted mean are obtained by taking the inverse of the selection probability (i.e., the inflation factor): w_i = 1/π_i ≈ 1/(n p_i).
2572: 17772: 17690: 16760: 15987: 612: 18452: 16735: 15107: 12938: 1222: 18848: 17750: 7593: 7524: 3184: 2392: 19150: 13307: 13271: 13233: 11167: 6489: 12447: 2237: 16473: 12313:
Because one can always transform non-normalized weights to normalized weights, all formulas in this section can be adapted to non-normalized weights by replacing all w_i by w_i' = w_i / Σ_{j=1}^n w_j.
10367: 18946: 8767: 18595: 18568: 18243: 18216: 18169: 17721: 15758: 13588: 9289: 7854: 3471: 3444: 3409: 3223: 3100: 1399: 17824: 13635: 12941: 4828: 3767: 3103: 2786: 2653: 1194: 17945: 19125:
In this event, the variance in the weighted mean must be corrected to account for the fact that χ² is too large. The correction that must be made is σ̂²_x̄ = σ²_x̄ χ²_ν, where χ²_ν is the reduced chi-squared.
19100: 19080: 19010: 18805: 18782: 18472: 18284: 18264: 18189: 18142: 18122: 18102: 17844: 17585:
This makes sense: the [1, 0]ᵀ estimate is "compliant" in the second component and the [0, 1]ᵀ estimate is compliant in the first component, so the weighted mean is nearly [1, 1]ᵀ.
13561: 9295:, the variance calculation would look the same. When all weights are equal to one another, this formula is reduced to the standard unbiased variance estimator. 7613: 7564: 7544: 4853: 3854: 3155: 11781:
Because there is no closed analytical form for the variance of the weighted mean, it was proposed in the literature to rely on replication methods such as the jackknife and bootstrapping.
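For illustration, a minimal bootstrap sketch for the standard error of the weighted mean, resampling (x_i, w_i) pairs (all names and test data here are mine, not from the source):

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_se_weighted_mean(x, w, n_boot=2000):
    """Bootstrap standard error of xbar_w = sum(w*x)/sum(w)."""
    x, w = np.asarray(x, float), np.asarray(w, float)
    n = len(x)
    means = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)        # resample observations with replacement
        means[b] = np.sum(w[idx] * x[idx]) / np.sum(w[idx])
    return means.std(ddof=1)

x = rng.normal(10.0, 2.0, size=50)
w = rng.uniform(0.5, 2.0, size=50)
print(bootstrap_se_weighted_mean(x, w))
```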
6581: 68:. While weighted means generally behave in a similar fashion to arithmetic means, they do have a few counterintuitive properties, as captured for instance in 8644: 17856: 19158: 3929: 3302: 11161:
The following variance estimation of the ratio-mean, based on Taylor series linearization, is a reasonable estimation for the square of the standard error of the mean (when used in the context of measuring chemical constituents):

$$\widehat{\sigma^2_{\bar{x}_w}} = \frac{n}{(n-1)(n\bar{w})^2}\sum_{i=1}^{n}\left[(w_i x_i - \bar{w}\bar{x}_w)^2 - 2\bar{x}_w(w_i-\bar{w})(w_i x_i - \bar{w}\bar{x}_w) + \bar{x}_w^2(w_i-\bar{w})^2\right],$$

where w̄ = Σ w_i / n.
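A direct transcription of this estimate into Python (a sketch; the function name and example data are mine):

```python
import numpy as np

def linearized_var_weighted_mean(x, w):
    """Taylor-linearization estimate of Var(xbar_w), as in the formula above."""
    x, w = np.asarray(x, float), np.asarray(w, float)
    n = len(x)
    wbar = w.mean()
    xbar_w = np.sum(w * x) / np.sum(w)
    d = w * x - wbar * xbar_w
    term = d ** 2 - 2 * xbar_w * (w - wbar) * d + xbar_w ** 2 * (w - wbar) ** 2
    return n / ((n - 1) * (n * wbar) ** 2) * np.sum(term)

x = np.array([1.2, 0.7, 1.9, 2.4, 1.1])
w = np.array([2.0, 1.0, 3.0, 1.0, 2.0])
print(linearized_var_weighted_mean(x, w))
```

Algebraically the bracketed term collapses to w_i²(x_i − x̄_w)², which is the simplified form of the same estimate.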
17282:{\displaystyle \mathbf {x} _{2}:={\begin{bmatrix}0&1\end{bmatrix}}^{\top },\qquad \mathbf {C} _{2}:={\begin{bmatrix}100&0\\0&1\end{bmatrix}}} 17163:{\displaystyle \mathbf {x} _{1}:={\begin{bmatrix}1&0\end{bmatrix}}^{\top },\qquad \mathbf {C} _{1}:={\begin{bmatrix}1&0\\0&100\end{bmatrix}}} 12032: 19633: 11835: 1272: 20007:"Statistical Analysis of Precipitation Chemistry Measurements over the Eastern United States. Part I: Seasonal and Regional Patterns and Correlations" 17049:
For example, consider the weighted mean of the point x₁ = [1, 0]ᵀ with high variance in the second component and x₂ = [0, 1]ᵀ with high variance in the first component, with covariance matrices C₁ = [[1, 0], [0, 100]] and C₂ = [[100, 0], [0, 1]]. Then the weighted mean is x̄ = (C₁⁻¹ + C₂⁻¹)⁻¹ (C₁⁻¹x₁ + C₂⁻¹x₂) = [0.9901, 0.9901]ᵀ,
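The same computation in NumPy (a sketch reproducing the numbers above):

```python
import numpy as np

x1, C1 = np.array([1.0, 0.0]), np.diag([1.0, 100.0])
x2, C2 = np.array([0.0, 1.0]), np.diag([100.0, 1.0])

W1, W2 = np.linalg.inv(C1), np.linalg.inv(C2)   # inverse-covariance weights
C_mean = np.linalg.inv(W1 + W2)                  # covariance of the weighted mean
x_mean = C_mean @ (W1 @ x1 + W2 @ x2)
print(x_mean)                                    # [0.990099 0.990099]
```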
13617:), we can determine a correction factor to yield an unbiased estimator. Assuming each random variable is sampled from the same distribution with mean 10133:, it would include many combinations of covariances that will depend on the indicator variables. If the selection probability are uncorrelated (i.e.: 9291:
Scaling all of the weights by some factor would lead to the same estimator. It also means that if we scale the sum of weights to be equal to a known-from-before population size N, the variance calculation would look the same.
6893:{\displaystyle \operatorname {Var} ({\hat {\bar {Y}}}_{{\text{pwr (known }}N{\text{)}}})={\frac {1}{N^{2}}}\sum _{i=1}^{n}\left(w_{i}y_{i}\right)^{2}} 16631:{\displaystyle \mathbf {C} ={\frac {\sum _{i=1}^{N}w_{i}\left(\mathbf {x} _{i}-\mu ^{*}\right)^{T}\left(\mathbf {x} _{i}-\mu ^{*}\right)}{1-V_{2}}}.} 15622:{\displaystyle \mathbf {C} ={\frac {\sum _{i=1}^{N}w_{i}\left(\mathbf {x} _{i}-\mu ^{*}\right)^{T}\left(\mathbf {x} _{i}-\mu ^{*}\right)}{V_{1}-1}}.} 6494: 3473:), we often talk about the multiplication of the two, which is a random variable. To avoid confusion in the following section, let's call this term: 2454: 15416:{\displaystyle \mathbf {C} ={\frac {\sum _{i=1}^{N}w_{i}\left(\mathbf {x} _{i}-\mu ^{*}\right)^{T}\left(\mathbf {x} _{i}-\mu ^{*}\right)}{V_{1}}}.} 4045: 181: 15885: 2978: 16775: 12247:, when all weights except one are zero. Its minimum value is found when all weights are equal (i.e., unweighted mean), in which case we have 5273: 18603: 6360: 3236: 1044: 175:
Or, this can be accomplished by weighting the class means by the number of students in each class. The larger class is given more "weight": x̄ = (20 × 80 + 30 × 90) / (20 + 30) = 86.
10333:. This helps illustrate that this formula incorporates the effect of correlation between y and z on the variance of the ratio estimators. 120:
Afternoon class = {81, 82, 83, 84, 85, 86, 87, 87, 88, 88, 89, 89, 89, 90, 90, 90, 90, 91, 91, 91, 92, 92, 93, 93, 94, 95, 96, 97, 98, 99}
19844:
Technically, negative weights could be used if all the values are either zero or negative, but this serves no purpose, as the weights would then simply act as absolute values.
9116:{\displaystyle {\widehat {V({\bar {y}}_{w})}}={\frac {1}{(\sum _{i=1}^{n}w_{i})^{2}}}\sum _{i=1}^{n}w_{i}^{2}(y_{i}-{\bar {y}}_{w})^{2}} 128: 19966:
Gatz, Donald F.; Smith, Luther (June 1995). "The standard error of a weighted mean concentration—I. Bootstrapping vs other methods". Atmospheric Environment. 29 (11): 1185–1193. doi:10.1016/1352-2310(94)00210-C.
16935:{\displaystyle {\bar {\mathbf {x} }}=\mathbf {C} _{\bar {\mathbf {x} }}\left(\sum _{i=1}^{n}\mathbf {W} _{i}\mathbf {x} _{i}\right),} 4284: 959: 14961: 13197:
For example, if the values {2, 2, 4, 5, 5, 5} are drawn from the same distribution, then we can treat this set as an unweighted sample, or we can treat it as the weighted sample {2, 4, 5} with corresponding weights {2, 1, 3}, and we get the same result either way.
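This equivalence is easy to verify. The sketch below compares the expanded, unweighted sample with the frequency-weighted one, using the Bessel-corrected weighted variance (Σw − 1 in the denominator):

```python
import numpy as np

expanded = np.array([2, 2, 4, 5, 5, 5], float)
values = np.array([2, 4, 5], float)
freq = np.array([2, 1, 3], float)

mean_w = np.sum(freq * values) / np.sum(freq)
var_w = np.sum(freq * (values - mean_w) ** 2) / (np.sum(freq) - 1)

print(expanded.mean(), mean_w)        # 3.8333... for both
print(expanded.var(ddof=1), var_w)    # 2.1666... for both
```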
12316: 10669: 10136: 281:
Since only the relative weights are relevant, any weighted mean can be expressed using coefficients that sum to one. Such a linear combination is called a convex combination.
15766: 5263:{\displaystyle \operatorname {Var} ({\hat {Y}}_{pwr})={\frac {n}{n-1}}\sum _{i=1}^{n}\left(w_{i}y_{i}-{\overline {wy}}\right)^{2}} 2906: 1886:{\displaystyle \sigma _{\bar {x}}={\sqrt {\frac {1}{\sum _{i=1}^{n}\sigma _{i}^{-2}}}}={\sqrt {\frac {1}{\sum _{i=1}^{n}w_{i}}}},} 385: 13485:{\displaystyle s^{2}\ ={\frac {\sum _{i=1}^{N}w_{i}}{\sum _{i=1}^{N}w_{i}-1}}\sum _{i=1}^{N}w_{i}\left(x_{i}-\mu ^{*}\right)^{2}} 4265:{\displaystyle {\hat {\bar {Y}}}_{{\text{known }}N}={\frac {{\hat {Y}}_{pwr}}{N}}\approx {\frac {\sum _{i=1}^{n}w_{i}y'_{i}}{N}}} 6297: 3776: 3623: 14453: 11762:{\displaystyle {\widehat {\sigma _{\bar {x}}^{2}}}={\frac {n}{(n-1)(n{\bar {w}})^{2}}}\sum w_{i}^{2}(x_{i}-{\bar {x}}_{w})^{2}} 2656:
For uncorrelated random variables with the same variance and expectation (as is the case for i.i.d. random variables), the variance of the weighted mean can be estimated as the multiplication of the unweighted variance by Kish's design effect: $\operatorname{Var}(\bar{y}_w) \approx \hat{\sigma}^2_y\,\overline{w^2}/\bar{w}^2$.
20103: 7822:{\displaystyle {\hat {N}}=\sum _{i=1}^{n}w_{i}I_{i}=\sum _{i=1}^{n}{\frac {I_{i}}{\pi _{i}}}=\sum _{i=1}^{n}{\check {1}}'_{i}} 3863: 1899: 1133: 20267: 20201: 19934: 19880: 18384: 9305: 2638: 12250: 8914: 3229:
The survey sampling procedure yields a series of Bernoulli indicator values (I_i) that equal 1 if observation i is in the sample and 0 if it was not selected. This can occur with fixed sample size or with varied-sample-size sampling (e.g., Poisson sampling).
13563:). In any case, the information on total number of samples is necessary in order to obtain an unbiased correction, even if 12138: 540: 468: 19566: 18292: 8641:. As we moved from using N to using n, we actually know that all the indicator variables get 1, so we could simply write: 2400: 15426:
Similarly to weighted sample variance, there are two different unbiased estimators depending on the type of the weights.
12873: 12462: 3111: 2577: 1446: 11549: 2773:{\displaystyle \operatorname {Var} ({\bar {y}}_{w})={\hat {\sigma }}_{y}^{2}{\frac {\overline {w^{2}}}{{\bar {w}}^{2}}}} 20072:"Weighted Standard Error and its Impact on Significance Testing (WinCross vs. Quantum & SPSS), Dr. Albert Madansky" 10070: 14691:, ensuring that the expected value of the estimated variance equals the actual variance of the sampling distribution. 6687: 20224: 20177: 19993: 18480: 18036: 13128:{\displaystyle s^{2}\ ={\frac {\sum \limits _{i=1}^{N}w_{i}\left(x_{i}-\mu ^{*}\right)^{2}}{\sum _{i=1}^{N}w_{i}-1}}} 11979: 11132:
A similar re-creation of the proof (up to some mistakes at the end) was provided by Thomas Lumley on Cross Validated.
19420:{\displaystyle \chi _{\nu }^{2}={\frac {1}{(n-1)}}\sum _{i=1}^{n}{\frac {(x_{i}-{\bar {x}})^{2}}{\sigma _{i}^{2}}};} 14631: 14387: 4855:-expanded with replacement estimator, or "probability with replacement" estimator). With the above notation, it is: 19779: 15666: 14515: 897: 19433: 13640: 9173: 7618: 20031: 20006: 14326: 20320: 14558: 4806: 14625: 9220: 19483: 18853: 15445: 2657: 1964: 338: 294: 17: 19523: 12815: 9130: 7497:
The previous section dealt with estimating the population mean as a ratio of an estimated population total (
4754: 3529: 10755: 8764:
for specific values of y and w, but the statistical properties comes when including the indicator variable
7546:), and the variance was estimated in that context. Another common case is that the population size itself ( 6910: 2345: 20156: 20046: 13498: 12509: 3749:
When each element of the sample is inflated by the inverse of its selection probability, it is termed the
20325: 15853: 15258:{\displaystyle \mathbf {\mu ^{*}} ={\frac {\sum _{i=1}^{N}w_{i}\mathbf {x} _{i}}{\sum _{i=1}^{N}w_{i}}}.} 15119: 10130: 880:{\displaystyle {\bar {x}}={\frac {w_{1}x_{1}+w_{2}x_{2}+\cdots +w_{n}x_{n}}{w_{1}+w_{2}+\cdots +w_{n}}}.} 20151:
Mark Galassi, Jim Davies, James Theiler, Brian Gough, Gerard Jungman, Michael Booth, and Fabrice Rossi.
19015: 11938: 8923:
For when the sampling has a random sample size (as in Poisson sampling), the estimator becomes

$$\widehat{V(\bar{y}_w)} = \frac{1}{\left(\sum_{i=1}^{n} w_i\right)^2}\sum_{i=1}^{n} w_i^2 (y_i-\bar{y}_w)^2.$$
1440:, all having the same mean, one possible choice for the weights is given by the reciprocal of variance: 1231: 19560: 18393: 17606: 15637: 13598: 12307: 11786: 11154: 10283: 10230: 3476: 1363: 19249: 15046: 12963:). In the weighted setting, there are actually two different unbiased estimators, one for the case of 12449:
When a weighted mean μ* is used, the variance of the weighted sample is different from the variance of the unweighted sample.
19624: 18951: 17594: 17040:{\displaystyle \mathbf {C} _{\bar {\mathbf {x} }}=\left(\sum _{i=1}^{n}\mathbf {W} _{i}\right)^{-1},} 16943: 16644: 13144: 12218: 11800: 10129:
Here Ĉ(Ŷ, Ẑ) is the estimated covariance between the estimated sum of Y and the estimated sum of Z. Since this is the covariance of two sums of random variables, it includes many combinations of covariances that depend on the indicator variables.
1411: 1038:
One can always normalize the weights by making the following transformation on the original weights: w_i' = w_i / Σ_{j=1}^n w_j.
717:{\displaystyle {\bar {x}}={\frac {\sum \limits _{i=1}^{n}w_{i}x_{i}}{\sum \limits _{i=1}^{n}w_{i}}},} 19082:
Where primarily the closest n observations matter and the effect of the remaining observations can be ignored safely, choose w such that the tail area is sufficiently small.
2661: 39: 18714: 17847: 16697:, then the weighted mean and covariance reduce to the unweighted sample mean and covariance above. 2544: 1402: 17755: 17673: 16743: 15970: 11536:{\displaystyle {\widehat {\sigma _{{\bar {x}}_{w}}^{2}}}={\frac {n}{(n-1)(n{\bar {w}})^{2}}}\left} 3075:
perspective, we are interested in estimating the variance of the weighted mean when the different
19804: 19784: 18425: 16713: 15079: 12916: 2334:{\displaystyle {\bar {x}}=\sigma _{\bar {x}}^{2}\sum _{i=1}^{n}{\frac {x_{i}}{\sigma _{i}^{2}}}.} 1263: 1200: 35: 19953:), How to estimate the (approximate) variance of the weighted mean?, URL (version: 2021-06-08): 18811: 17726: 7569: 7500: 3160: 2368: 2348:
of the mean of the probability distributions under the assumption that they are independent and
113:
Morning class = {62, 67, 71, 74, 76, 77, 78, 79, 79, 80, 80, 81, 81, 82, 83, 84, 86, 89, 93, 98}
19794: 19789: 19128: 17598: 13279: 13238: 13200: 12960: 8892:{\displaystyle {\bar {y}}_{w}={\frac {\sum _{i=1}^{n}w_{i}y'_{i}}{\sum _{i=1}^{n}w_{i}1'_{i}}}} 6464: 3195: 1369: 58: 15035:
As a side note, other approaches have been described to compute the weighted sample variance.
12425: 894:
The formulas are simplified when the weights are normalized such that they sum up to 1, i.e., Σ_{i=1}^n w_i' = 1.
19950: 19809: 16445: 10339: 2896:{\displaystyle {\hat {\sigma }}_{y}^{2}={\frac {\sum _{i=1}^{n}(y_{i}-{\bar {y}})^{2}}{n-1}}} 20300: 20260:
Data Fitting and Uncertainty (A practical introduction to weighted least squares and beyond)
20071: 19875:
Cochran, W. G. (1977). Sampling Techniques (3rd ed.). Nashville, TN: John Wiley & Sons.
18921: 18020:{\displaystyle {\bar {x}}=\sigma _{\bar {x}}^{2}(\mathbf {J} ^{T}\mathbf {W} \mathbf {X} ),} 20018: 19975: 18573: 18546: 18221: 18194: 18147: 17699: 15736: 15657: 13566: 11782: 9267: 7832: 3449: 3422: 3387: 3201: 3078: 2349: 1377: 69: 17784: 13620: 4813: 3752: 8: 19279: 6674:{\displaystyle {\check {\Delta }}_{ii}=1-{\frac {\pi _{i}\pi _{i}}{\pi _{i}}}=1-\pi _{i}} 20022: 19979: 8753:{\displaystyle {\bar {y}}_{w}={\frac {\sum _{i=1}^{n}w_{i}y_{i}}{\sum _{i=1}^{n}w_{i}}}} 3106:
random variables. An alternative perspective for this problem is that of some arbitrary
3063:
observations. This has led to the development of alternative, more general, estimators.
20134: 20122: 19769: 19764: 19085: 19065: 18995: 18790: 18767: 18457: 18269: 18249: 18174: 18127: 18107: 18087: 17929:{\displaystyle \sigma _{\bar {x}}^{2}=(\mathbf {J} ^{T}\mathbf {W} \mathbf {J} )^{-1},} 17829: 16763: 16707: 15025:
The degrees of freedom of the weighted, unbiased sample variance vary accordingly from
13546: 12948: 12419: 7598: 7549: 7529: 4838: 3839: 3140: 3118: 282: 19236:{\displaystyle {\hat {\sigma }}_{\bar {x}}^{2}=\sigma _{\bar {x}}^{2}\chi _{\nu }^{2}} 4029:{\displaystyle {\check {y}}'_{i}=I_{i}{\check {y}}_{i}={\frac {I_{i}y_{i}}{\pi _{i}}}} 3377:{\displaystyle P(I_{i}=1|{\text{one sample draw}})=p_{i}\approx {\frac {\pi _{i}}{n}}} 20263: 20246: 20220: 20197: 20173: 20126: 20005:
Endlich, R. M.; Eymon, B. P.; Ferek, R. J.; Valdes, A. D.; Maxwell, C. (1988-12-01).
19987: 19930: 19876: 17693: 16738: 15641: 13602: 20138: 15733:(If they are not, divide the weights by their sum to normalize prior to calculating 20118: 20026: 19983: 19754: 13309:
are normalized to 1, then the correct expression after Bessel's correction becomes
12983:(where a weight equals the number of occurrences), then the unbiased estimator is: 11138: 8924: 7829:. With the above notation, the parameter we care about is the ratio of the sums of 6090: 3926:. As above, we can add a tick mark if multiplying by the indicator function. I.e.: 3413: 3230: 2641: 20286: 12125:{\displaystyle \sigma _{\bar {x}}^{2}=\sigma _{0}^{2}\sum _{i=1}^{n}{w_{i}'^{2}},} 3446:) is fixed, and the randomness comes from it being included in the sample or not ( 3059:
However, this estimation is rather limited due to the strong assumption about the
1347:{\textstyle \sigma _{\bar {x}}=\sigma {\sqrt {\sum \limits _{i=1}^{n}w_{i}'^{2}}}} 19845: 19824: 19799: 19774: 19732:{\displaystyle \sigma ^{2}={\frac {\sum _{i=1}^{n}(x_{i}-{\bar {x}})^{2}}{n-1}}.} 12457: 11925:{\displaystyle \sigma _{\bar {x}}^{2}=\sum _{i=1}^{n}{w_{i}'^{2}\sigma _{i}^{2}}} 8903: 4276: 3191: 3107: 535: 65: 50: 31: 16770:(both denoted in the same way, via superscripts); the weight matrix then reads: 9302:
The Taylor linearization states that for a general ratio estimator of two sums (
20315: 17779: 16767: 15633: 13594: 6571:{\displaystyle {\check {\Delta }}_{ij}=1-{\frac {\pi _{i}\pi _{j}}{\pi _{ij}}}} 3233:). The probability of some element to be chosen, given a sample, is denoted as 2649: 15640:, these processes changing the data's mean and variance and thus leading to a 13601:, these processes changing the data's mean and variance and thus leading to a 61:
The notion of weighted mean plays a role in descriptive statistics and also occurs in a more general form in several other areas of mathematics.
20309: 20250: 17775: 15847: 15113: 11158: 8919: 7488: 4110:{\displaystyle w_{i}={\frac {1}{\pi _{i}}}\approx {\frac {1}{n\times p_{i}}}} 1128: 19954: 6089:
An alternative term, for when the sampling has a random sample size (as in
259:{\displaystyle {\bar {x}}={\frac {(20\times 80)+(30\times 90)}{20+30}}=86.} 20152: 20130: 19924: 17850:
states that the estimate of the mean having minimum variance is given by:
10752:), and when assuming the probability of each element is very small (i.e.: 6754:), and when assuming the probability of each element is very small, then: 4120: 1190:
The ordinary mean (1/n) Σ_{i=1}^n x_i is a special case of the weighted mean where all data have equal weights.
16947: 15953:{\displaystyle \mathbf {\mu ^{*}} =\sum _{i=1}^{N}w_{i}\mathbf {x} _{i}.} 3050:{\displaystyle {\overline {w^{2}}}={\frac {\sum _{i=1}^{n}w_{i}^{2}}{n}}} 15644:(the population count, which is a requirement for Bessel's correction). 15032:
The standard deviation is simply the square root of the variance above.
13605:(the population count, which is a requirement for Bessel's correction). 64:
If all the weights are equal, then the weighted mean is the same as the arithmetic mean.
19475:
standard error of the weighted mean (variance weights, scale corrected)
5348:{\displaystyle {\overline {wy}}=\sum _{i=1}^{n}{\frac {w_{i}y_{i}}{n}}} 2574:, then the expectation of the weighted sample mean will be that value, 18701:{\displaystyle V_{1}=\sum _{i=1}^{m}{w^{i-1}}={\frac {1-w^{m}}{1-w}},} 6454:{\displaystyle C(I_{i},I_{j})=\pi _{ij}-\pi _{i}\pi _{j}=\Delta _{ij}} 3292:{\displaystyle P(I_{i}=1\mid {\text{Some sample of size }}n)=\pi _{i}} 1116:{\displaystyle w_{i}'={\frac {w_{i}}{\sum \limits _{j=1}^{n}{w_{j}}}}} 20291: 19814: 14628:). This means that to unbias our estimator we need to pre-divide by 13138:
This effectively applies Bessel's correction for frequency weights.
12955:
in the denominator (corresponding to the sample size) is changed to
12415: 8761: 2645: 2001:. It is a special case of the general formula in previous section, 1753:
standard error of the weighted mean (with inverse-variance weights)
1406: 19820:
Standard error of a proportion estimation when using weighted data
2344:
The significance of this choice is that this weighted mean is the
953:. For such normalized weights, the weighted mean is equivalently: 19749: 54: 20216:
The First Systems of Weighted Differential and Integral Calculus
12414:
Typically when a mean is calculated it is important to know the variance and standard deviation about that mean.
11157:
methods, the following (variance estimation of ratio-mean using
11153:
It has been shown, by Gatz et al. (1995), that in comparison to
288:
Using the previous example, we would get the following weights: 20/(20 + 30) = 0.4 and 30/(20 + 30) = 0.6. Then, applying the weights gives x̄ = 0.4 × 80 + 0.6 × 90 = 86.
12951:
for the population variance. In normal unweighted samples, the
2633:
When treating the weights as constants, and having a sample of
1374:
For the weighted mean of a list of data for which each element x_i potentially comes from a different probability distribution with known variance σ_i², all having the same mean, one possible choice for the weights is given by the reciprocal of variance: w_i = 1/σ_i².
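Combining this choice of weights with the weighted-mean formula gives inverse-variance weighting, with standard error σ_x̄ = sqrt(1/Σ w_i). A short sketch (names and example data are mine, not from the source):

```python
import numpy as np

def inverse_variance_mean(x, sigma):
    """Weighted mean with w_i = 1 / sigma_i**2 and its standard error."""
    x, sigma = np.asarray(x, float), np.asarray(sigma, float)
    w = 1.0 / sigma ** 2
    mean = np.sum(w * x) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))    # sigma_xbar = sqrt(1 / sum(1/sigma_i^2))
    return mean, se

measurements = np.array([10.2, 9.8, 10.5])
uncertainties = np.array([0.3, 0.2, 0.5])
print(inverse_variance_mean(measurements, uncertainties))
```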
1027:{\displaystyle {\bar {x}}=\sum \limits _{i=1}^{n}{w_{i}'x_{i}}} 15015:{\displaystyle \operatorname {E} =\sigma _{\text{actual}}^{2}} 12388:{\displaystyle w_{i}'={\frac {w_{i}}{\sum _{i=1}^{n}{w_{i}}}}} 2534:{\displaystyle E({\bar {x}})=\sum _{i=1}^{n}{w_{i}'\mu _{i}}.} 20047:"GNU Scientific Library – Reference Manual: Weighted Samples" 12409: 10745:{\displaystyle \forall i\neq j:\Delta _{ij}=C(I_{i},I_{j})=0} 10212:{\displaystyle \forall i\neq j:\Delta _{ij}=C(I_{i},I_{j})=0} 4831:-estimator. This estimator can be itself estimated using the 462: 15836:{\displaystyle w_{i}'={\frac {w_{i}}{\sum _{i=1}^{N}w_{i}}}} 2968:{\displaystyle {\bar {w}}={\frac {\sum _{i=1}^{n}w_{i}}{n}}} 446:{\displaystyle {\bar {x}}=(0.4\times 80)+(0.6\times 90)=86.} 20243:
Data Reduction and Error Analysis for the Physical Sciences
20032:
10.1175/1520-0450(1988)027<1322:SAOPCM>2.0.CO;2
19759: 19113: 16439:
The reasoning here is the same as in the previous section.
15632:
This estimator can be unbiased only if the weights are not
11976:
Consequently, if all the observations have equal variance, σ_i² = σ_0², the weighted sample mean will have variance σ_x̄² = σ_0² Σ_{i=1}^n w_i'².
9358:), they can be expanded around the true value R, and give: 30:"Weighted average" redirects here. Not to be confused with 20172:(2nd ed.). Singapore: World Scientific. p. 324. 19951:
https://stats.stackexchange.com/users/249135/thomas-lumley
13593:
The estimator can be unbiased only if the weights are not
6350:{\displaystyle {\check {y}}_{i}={\frac {y_{i}}{\pi _{i}}}} 3829:{\displaystyle {\check {y}}_{i}={\frac {y_{i}}{\pi _{i}}}} 3739:{\displaystyle V=y_{i}^{2}V=y_{i}^{2}\pi _{i}(1-\pi _{i})} 14505:{\displaystyle \left(1-{\frac {V_{2}}{V_{1}^{2}}}\right)} 3137:) and dividing it by the population size – either known ( 1183:{\textstyle {\frac {1}{n}}\sum \limits _{i=1}^{n}{x_{i}}} 3919:{\displaystyle {\frac {y_{i}}{p_{i}}}=n{\check {y}}_{i}} 3125:, is calculated by taking an estimation of the total of 1954:{\displaystyle \sigma _{\bar {x}}^{2}=\sigma _{0}^{2}/n} 1195:
independent and identically distributed random variables
20153:
GNU Scientific Library - Reference manual, Version 1.15
20004: 19925:
Carl-Erik Sarndal; Bengt Swensson; Jan Wretman (1992).
16820:{\displaystyle \mathbf {W} _{i}=\mathbf {C} _{i}^{-1}.} 16442:
Since we are assuming the weights are normalized, then
12870:(and thus are random variables), it can be shown that 12299:{\textstyle \sigma _{\bar {x}}=\sigma _{0}/{\sqrt {n}}} 9351:{\displaystyle {\hat {R}}={\frac {\hat {Y}}{\hat {Z}}}} 20283: 18079: 17547: 17518: 17482: 17248: 17199: 17129: 17080: 15449: 12253: 12208:{\textstyle 1/n\leq \sum _{i=1}^{n}{w_{i}'^{2}}\leq 1} 12141: 7566:) is unknown and is estimated using the sample (i.e.: 1275: 1136: 900: 19636: 19569: 19526: 19486: 19436: 19291: 19252: 19161: 19131: 19088: 19068: 19018: 18998: 18954: 18924: 18856: 18814: 18793: 18770: 18717: 18606: 18576: 18570:
is the sum of the unnormalized weights. In this case
18549: 18483: 18460: 18428: 18396: 18295: 18272: 18252: 18224: 18197: 18177: 18150: 18130: 18110: 18090: 18039: 17948: 17859: 17832: 17787: 17758: 17729: 17702: 17676: 17609: 17301: 17177: 17058: 16956: 16835: 16778: 16746: 16716: 16647: 16484: 16448: 15998: 15973: 15888: 15856: 15769: 15739: 15669: 15475: 15448: 15277: 15154: 15122: 15082: 15049: 14964: 14703: 14634: 14561: 14518: 14456: 14390: 14329: 13678: 13643: 13623: 13590:
has a different meaning other than frequency weight.
13569: 13549: 13501: 13318: 13282: 13241: 13203: 13147: 12992: 12919: 12876: 12818: 12551: 12512: 12465: 12428: 12319: 12221: 12035: 11982: 11941: 11838: 11803: 11609: 11552: 11170: 10803: 10758: 10672: 10666:
If the selection probability are uncorrelated (i.e.:
10377: 10342: 10286: 10233: 10139: 10073: 9635: 9366: 9308: 9270: 9223: 9176: 9133: 8936: 8770: 8647: 8165: 7862: 7835: 7668: 7621: 7601: 7572: 7552: 7532: 7503: 6958: 6913: 6763: 6690: 6684:
If the selection probability are uncorrelated (i.e.:
6584: 6497: 6467: 6363: 6300: 6101: 5367: 5276: 5125: 4861: 4841: 4816: 4757: 4482: 4296: 4139: 4048: 3932: 3866: 3842: 3779: 3755: 3626: 3532: 3479: 3452: 3425: 3390: 3305: 3239: 3204: 3163: 3143: 3081: 2981: 2909: 2789: 2673: 2580: 2547: 2457: 2403: 2371: 2240: 2010: 1967: 1902: 1764: 1613: 1555: 1510: 1449: 1414: 1380: 1234: 1203: 1047: 962: 736: 615: 599:{\displaystyle \left(w_{1},w_{2},\dots ,w_{n}\right)} 543: 527:{\displaystyle \left(x_{1},x_{2},\dots ,x_{n}\right)} 471: 388: 341: 297: 184: 131: 19616:{\displaystyle \sigma _{\bar {x}}^{2}=\sigma ^{2}/n} 18368:{\displaystyle z_{k}=\sum _{i=1}^{m}w_{i}x_{k+1-i}.} 18084:
Consider the time series of an independent variable
16950:), in terms of the covariance of the weighted mean: 3121:, the population mean, of some quantity of interest 18378: 14694:The final unbiased estimate of sample variance is: 14555:bias in the unweighted estimator (also notice that 12906:{\displaystyle {\hat {\sigma }}_{\mathrm {w} }^{2}} 12495:{\displaystyle {\hat {\sigma }}_{\mathrm {w} }^{2}} 4133:is known we can estimate the population mean using 1491:{\displaystyle w_{i}={\frac {1}{\sigma _{i}^{2}}}.} 19731: 19615: 19551: 19512: 19465: 19419: 19270: 19235: 19152:is too large. The correction that must be made is 19144: 19094: 19074: 19054: 19004: 18984: 18940: 18910: 18842: 18799: 18776: 18756: 18700: 18589: 18562: 18532: 18466: 18446: 18414: 18367: 18278: 18258: 18237: 18210: 18183: 18163: 18136: 18116: 18096: 18068: 18019: 17928: 17838: 17818: 17766: 17744: 17715: 17684: 17662: 17574: 17281: 17162: 17039: 16934: 16819: 16754: 16729: 16689: 16630: 16467: 16428: 15981: 15952: 15871: 15835: 15752: 15722: 15621: 15458: 15415: 15257: 15137: 15101: 15064: 15014: 14947: 14683: 14616: 14547: 14504: 14442: 14376: 14312: 13661: 13629: 13582: 13555: 13535: 13484: 13301: 13265: 13227: 13189: 13127: 12932: 12905: 12858: 12801: 12534: 12494: 12441: 12387: 12298: 12239: 12207: 12124: 12018: 11971:standard error of the weighted mean (general case) 11961: 11924: 11821: 11761: 11593:{\displaystyle {\bar {w}}={\frac {\sum w_{i}}{n}}} 11592: 11535: 11122: 10789: 10744: 10656: 10361: 10325: 10272: 10211: 10121: 10056: 9617: 9350: 9283: 9256: 9209: 9162: 9115: 8891: 8752: 8633: 8151: 7848: 7821: 7654: 7607: 7587: 7558: 7538: 7518: 7477: 6944: 6892: 6746: 6673: 6570: 6491:is the probability of selecting both i and j. And 6483: 6453: 6349: 6284: 6075: 5347: 5262: 5104: 4847: 4822: 4797: 4740: 4458: 4264: 4109: 4028: 3918: 3848: 3828: 3761: 3738: 3612: 3518: 3465: 3438: 3403: 3376: 3291: 3217: 3178: 3149: 3094: 3049: 2967: 2895: 2772: 2613: 2566: 2533: 2443: 2386: 2333: 2220: 1993: 1953: 1885: 1740: 1490: 1432: 1393: 1346: 1254: 1216: 1182: 1115: 1026: 945: 879: 716: 598: 526: 461:Formally, the weighted mean of a non-empty finite 445: 368: 324: 258: 165: 19520:, they cancel out in the weighted mean variance, 19105: 10122:{\displaystyle {\hat {C}}({\hat {Y}},{\hat {Z}})} 9264:would give the same estimator, since multiplying 166:{\displaystyle {\bar {x}}={\frac {4300}{50}}=86.} 20307: 20145: 15268:And the weighted covariance matrix is given by: 15072:(each set of single observations on each of the 10219:), this term would still include a summation of 7615:can be described as the sum of weights. So when 6747:{\displaystyle \forall i\neq j:C(I_{i},I_{j})=0} 4473:The general formula can be developed like this: 20104:"Extension of covariance selection mathematics" 19102:such that the tail area is sufficiently small. 18533:{\displaystyle w_{i}={\frac {w^{i-1}}{V_{1}}},} 18069:{\displaystyle \mathbf {W} =\mathbf {C} ^{-1}.} 12410:§ Correcting for over- or under-dispersion 12019:{\displaystyle \sigma _{i}^{2}=\sigma _{0}^{2}} 11776: 3299:, and the one-draw probability of selection is 2231:The equations above can be combined to obtain: 20213:Jane Grossman, Michael Grossman, Robert Katz. 
20011:Journal of Applied Meteorology and Climatology 19920: 19918: 19916: 19914: 19912: 19910: 19908: 17588: 14684:{\displaystyle 1-\left(V_{2}/V_{1}^{2}\right)} 14443:{\displaystyle V_{2}=\sum _{i=1}^{N}w_{i}^{2}} 12026:, the weighted sample mean will have variance 11829:, the variance of the weighted sample mean is 3066: 2451:then the weighted sample mean has expectation 946:{\textstyle \sum \limits _{i=1}^{n}{w_{i}'}=1} 20192:G. H. Hardy, J. E. Littlewood, and G. Pólya. 19906: 19904: 19902: 19900: 19898: 19896: 19894: 19892: 19890: 19888: 19871: 19869: 19867: 19865: 19119: 15723:{\displaystyle V_{1}=\sum _{i=1}^{N}w_{i}=1.} 15038: 14548:{\displaystyle \left({\frac {N-1}{N}}\right)} 12947:For small samples, it is customary to use an 11797:For uncorrelated observations with variances 6093:), is presented in Sarndal et al. (1992) as: 272: 20097: 20095: 13296: 13283: 13260: 13242: 13222: 13204: 13184: 13148: 10797:), then the above reduced to the following: 8159:. We can estimate it using our sample with: 20170:Statistical Methods in Experimental Physics 19466:{\displaystyle {\hat {\sigma }}_{\bar {x}}} 17599:Variance § Sum of correlated variables 15967:weighted estimate of the covariance matrix 15442:weighted estimate of the covariance matrix 13662:{\displaystyle \sigma _{\text{actual}}^{2}} 12866:for normalized weights. If the weights are 12403: 11148: 9210:{\displaystyle w_{i}={\frac {1}{\pi _{i}}}} 7655:{\displaystyle w_{i}={\frac {1}{\pi _{i}}}} 4287:), then the variance of this estimator is: 4279:is one that results in a fixed sample size 1357: 20257: 19885: 19862: 18171:. In many common situations, the value of 16700: 14450:. Therefore, the bias in our estimator is 14377:{\displaystyle V_{1}=\sum _{i=1}^{N}w_{i}} 12215:. The variance attains its maximum value, 10131:covariance of two sums of random variables 4805:and it may be estimated by the (unbiased) 456: 20240: 20092: 20030: 19965: 16710:estimate. We simply replace the variance 15466:, with Bessel's correction, is given by: 14617:{\displaystyle \ V_{1}^{2}/V_{2}=N_{eff}} 13273:, and we get the same result either way. 9627:And the variance can be approximated by: 2397:If the observations have expected values 2355: 19955:https://stats.stackexchange.com/q/525770 19114:Correcting for over- or under-dispersion 17752:is the common mean to be estimated, and 15647: 9257:{\displaystyle w_{i}={\frac {1}{p_{i}}}} 20196:(2nd ed.), Cambridge University Press, 19513:{\displaystyle \sigma _{i}=\sigma _{0}} 18911:{\displaystyle {e^{-1}}(1-w)=0.39(1-w)} 18144:observations sampled at discrete times 15459:{\displaystyle \textstyle \mathbf {C} } 15076:random variables) is assigned a weight 13613:If the weights are instead non-random ( 12913:is the maximum likelihood estimator of 2541:In particular, if the means are equal, 1994:{\displaystyle \sigma _{i}=\sigma _{0}} 369:{\displaystyle {\frac {30}{20+30}}=0.6} 325:{\displaystyle {\frac {20}{20+30}}=0.4} 14: 20308: 19552:{\displaystyle \sigma _{\bar {x}}^{2}} 15429: 15043:In a weighted sample, each row vector 13608: 12859:{\displaystyle \sum _{i=1}^{N}w_{i}=1} 12422:about that mean. When a weighted mean 9163:{\displaystyle \pi _{i}\approx p_{i}n} 6085:And we got to the formula from above. 
4798:{\displaystyle Y=\sum _{i=1}^{N}y_{i}} 3613:{\displaystyle E=y_{i}E=y_{i}\pi _{i}} 109:test grades in each class as follows: 20284: 20167: 20101: 18385:Exponentially weighted moving average 13495:where the total number of samples is 10790:{\displaystyle (1-\pi _{i})\approx 1} 8913:In this case, the variability of the 8906:and it is approximately unbiased for 6945:{\displaystyle (1-\pi _{i})\approx 1} 3129:over all elements in the population ( 2628: 13536:{\displaystyle \sum _{i=1}^{N}w_{i}} 12974: 12535:{\displaystyle {\hat {\sigma }}^{2}} 2444:{\displaystyle E(x_{i})={\mu _{i}},} 19480:When all data variances are equal, 18080:Decreasing strength of interactions 16829:The weighted mean in this case is: 15872:{\displaystyle \mathbf {\mu ^{*}} } 15138:{\displaystyle \mathbf {\mu ^{*}} } 14823: 14011: 13725: 13013: 12689: 12589: 12502:is defined similarly to the normal 12398: 4751:The population total is denoted as 3112:selected with unequal probabilities 2614:{\displaystyle E({\bar {x}})=\mu .} 1501:The weighted mean in this case is: 1401:potentially comes from a different 1302: 1226:standard error of the weighted mean 1148: 1078: 979: 902: 678: 635: 379:Then, apply the weights like this: 24: 20301:Tool to calculate Weighted Average 20234: 20123:10.1111/j.1469-1809.1957.tb01874.x 19055:{\displaystyle \leq {e^{-n(1-w)}}} 18850:, the weight approximately equals 18441: 18403: 17603:In the general case, suppose that 17219: 17100: 16641:If all weights are the same, i.e. 14980: 14965: 14752: 14714: 14208: 14190: 14130: 14112: 14041: 13986: 13962: 13879: 13861: 13820: 13802: 13745: 13683: 12892: 12667: 12481: 11962:{\displaystyle \sigma _{\bar {x}}} 11600:. Further simplification leads to 10689: 10673: 10532: 10156: 10140: 9748: 7197: 7079: 6691: 6589: 6502: 6439: 6221: 3186:). In this context, each value of 1255:{\displaystyle \sigma _{\bar {x}}} 534:, with corresponding non-negative 25: 20337: 20277: 18415:{\displaystyle 0<\Delta <1} 17663:{\displaystyle \mathbf {X} =^{T}} 10326:{\displaystyle z'_{i}=I_{i}z_{i}} 10273:{\displaystyle y'_{i}=I_{i}y_{i}} 3526:. With the following expectancy: 3519:{\displaystyle y'_{i}=y_{i}I_{i}} 3225:) that get 1 if some observation 19780:Weighted average cost of capital 19271:{\displaystyle \chi _{\nu }^{2}} 18379:Exponentially decreasing weights 18050: 18041: 18007: 18002: 17991: 17906: 17901: 17890: 17760: 17678: 17611: 17452: 17432: 17417: 17397: 17358: 17335: 17310: 17230: 17180: 17111: 17061: 17010: 16967: 16959: 16914: 16902: 16863: 16855: 16840: 16796: 16781: 16748: 16576: 16535: 16486: 16342: 16301: 16221: 16180: 16004: 15975: 15937: 15895: 15863: 15567: 15526: 15477: 15451: 15369: 15328: 15279: 15206: 15161: 15129: 15065:{\displaystyle \mathbf {x} _{i}} 15052: 12306:, i.e., it degenerates into the 7526:) with a known population size ( 80: 20245:. New York, N.Y.: McGraw-Hill. 20207: 20186: 20161: 20102:Price, George R. (April 1972). 19120:§ Weighted sample variance 18985:{\displaystyle {1-e^{-1}}=0.61} 17227: 17108: 16690:{\displaystyle w_{i}/V_{1}=1/N} 15029: − 1 down to 0. 
13669:, taking expectations we have, 13190:{\displaystyle \{2,2,4,5,5,5\}} 12240:{\displaystyle \sigma _{0}^{2}} 11822:{\displaystyle \sigma _{i}^{2}} 7489:Variance of the weighted mean ( 3110:of the data in which units are 1433:{\displaystyle \sigma _{i}^{2}} 20064: 20039: 19998: 19959: 19943: 19927:Model Assisted Survey Sampling 19838: 19703: 19696: 19674: 19580: 19537: 19456: 19444: 19387: 19380: 19358: 19328: 19316: 19206: 19181: 19169: 19106:Weighted averages of functions 19062:. Where primarily the closest 19046: 19034: 18905: 18893: 18884: 18872: 18828: 18815: 18751: 18739: 18011: 17986: 17974: 17955: 17911: 17885: 17870: 17807: 17788: 17736: 17651: 17618: 17314: 16971: 16867: 16844: 16413: 16385: 14991: 14971: 14932: 14904: 14880: 14853: 14803: 14770: 14744: 14233: 14224: 14220: 14214: 14199: 14196: 14155: 14146: 14142: 14136: 14121: 14118: 14086: 14077: 14050: 14047: 13997: 13978: 13968: 13904: 13895: 13891: 13885: 13870: 13867: 13845: 13836: 13832: 13826: 13811: 13808: 13783: 13774: 13754: 13751: 13711: 13699: 13689: 12884: 12659: 12563: 12520: 12473: 12264: 12046: 11952: 11849: 11792: 11750: 11737: 11714: 11684: 11677: 11665: 11662: 11650: 11623: 11559: 11519: 11512: 11490: 11470: 11457: 11445: 11432: 11400: 11397: 11391: 11369: 11354: 11332: 11319: 11306: 11274: 11254: 11247: 11235: 11232: 11220: 11186: 11108: 11095: 11072: 11024: 10989: 10944: 10918: 10899: 10858: 10833: 10821: 10811: 10778: 10759: 10733: 10707: 10621: 10573: 10535: 10467: 10442: 10430: 10420: 10400: 10394: 10385: 10200: 10174: 10116: 10110: 10095: 10086: 10080: 10046: 10040: 10025: 10016: 10010: 9998: 9976: 9970: 9961: 9948: 9929: 9923: 9914: 9888: 9839: 9788: 9751: 9683: 9658: 9652: 9643: 9399: 9389: 9373: 9341: 9331: 9315: 9104: 9091: 9068: 9020: 8985: 8966: 8954: 8944: 8778: 8655: 8619: 8393: 8348: 8194: 8189: 8172: 8051: 8009: 7875: 7804: 7675: 7579: 7510: 7330: 7311: 7241: 7222: 7200: 7123: 7104: 7082: 7001: 6979: 6969: 6933: 6914: 6809: 6787: 6782: 6770: 6735: 6709: 6592: 6505: 6393: 6367: 6308: 6265: 6246: 6224: 6147: 6125: 6120: 6108: 5499: 5400: 5388: 5378: 5160: 5142: 5132: 5115:The estimated variance of the 4869: 4526: 4497: 4492: 4321: 4316: 4183: 4154: 4149: 4121:Variance of the weighted sum ( 3975: 3940: 3904: 3787: 3733: 3714: 3683: 3670: 3646: 3630: 3584: 3571: 3552: 3536: 3338: 3329: 3309: 3273: 3243: 3170: 2916: 2870: 2863: 2841: 2797: 2756: 2715: 2702: 2690: 2680: 2599: 2593: 2584: 2476: 2470: 2461: 2420: 2407: 2378: 2266: 2247: 2021: 1913: 1775: 1517: 1286: 1245: 969: 743: 622: 434: 422: 416: 404: 395: 233: 221: 215: 203: 191: 138: 97:with 20 students, one with 30 13: 1: 19855: 19623:, formulated in terms of the 19559:, which again reduces to the 18757:{\displaystyle V_{1}=1/(1-w)} 10223:covariances for each element 3384:(If N is very large and each 3194:procedure yields a series of 2567:{\displaystyle \mu _{i}=\mu } 2360: 20241:Bevington, Philip R (1969). 19988:10.1016/1352-2310(94)00210-C 17767:{\displaystyle \mathbf {J} } 17685:{\displaystyle \mathbf {C} } 16755:{\displaystyle \mathbf {C} } 15982:{\displaystyle \mathbf {C} } 12967:and another for the case of 11777:Replication-based estimators 6053: 5949: 5287: 5244: 4440: 2994: 2745: 2346:maximum likelihood estimator 7: 19742: 18447:{\displaystyle w=1-\Delta } 18422:at each time step. 
Setting 17589:Accounting for correlations 17292:then the weighted mean is: 16730:{\displaystyle \sigma ^{2}} 15102:{\displaystyle w_{i}\geq 0} 13235:with corresponding weights 12933:{\displaystyle \sigma ^{2}} 3067:Survey sampling perspective 2623: 1217:{\displaystyle \sigma ^{2}} 75: 10: 20342: 20157:Sec. 21.7 Weighted Samples 19561:standard error of the mean 19117: 18918:, the tail area the value 18843:{\displaystyle (1-w)^{-1}} 18382: 17745:{\displaystyle {\bar {x}}} 17592: 15039:Weighted sample covariance 12407: 12308:standard error of the mean 7588:{\displaystyle {\hat {N}}} 7519:{\displaystyle {\hat {Y}}} 7493:-estimator for ratio-mean) 4807:Horvitz–Thompson estimator 3179:{\displaystyle {\hat {N}}} 2387:{\displaystyle {\bar {x}}} 2365:The weighted sample mean, 1367: 1364:Inverse-variance weighting 1361: 273:Convex combination example 49:is similar to an ordinary 29: 20168:James, Frederick (2006). 19810:Weighted sum of variables 19625:sample standard deviation 19145:{\displaystyle \chi ^{2}} 18104:and a dependent variable 17595:Generalized least squares 13302:{\displaystyle \{w_{i}\}} 13276:If the frequency weights 13266:{\displaystyle \{2,1,3\}} 13228:{\displaystyle \{2,4,5\}} 6484:{\displaystyle \pi _{ij}} 3267:Some sample of size  1193:If the data elements are 53:(the most common type of 20111:Annals of Human Genetics 19831: 18992:. The tail area at step 17696:relating the quantities 16942:(where the order of the 12442:{\displaystyle \mu ^{*}} 12404:Weighted sample variance 11149:Bootstrapping validation 5119:-estimator is given by: 3836:. A related quantity is 1403:probability distribution 1358:Variance-defined weights 47:weighted arithmetic mean 19968:Atmospheric Environment 19805:Weighted moving average 19785:Weighted geometric mean 16701:Vector-valued estimates 16468:{\displaystyle V_{1}=1} 13141:For example, if values 12944:Gaussian observations. 
10362:{\displaystyle z_{i}=1} 4129:If the population size 1264:uncertainty propagation 457:Mathematical definition 36:Weighted geometric mean 19795:Weighted least squares 19790:Weighted harmonic mean 19733: 19673: 19617: 19553: 19514: 19467: 19421: 19354: 19272: 19237: 19146: 19096: 19076: 19056: 19006: 18986: 18942: 18941:{\displaystyle e^{-1}} 18912: 18844: 18801: 18778: 18758: 18702: 18640: 18591: 18564: 18534: 18474:normalized weights by 18468: 18448: 18416: 18369: 18329: 18280: 18260: 18239: 18212: 18185: 18165: 18138: 18118: 18098: 18070: 18021: 17930: 17840: 17820: 17768: 17746: 17717: 17686: 17664: 17576: 17283: 17164: 17041: 17007: 16936: 16899: 16821: 16756: 16731: 16691: 16632: 16516: 16469: 16430: 16282: 16161: 16122: 16077: 16038: 15983: 15954: 15924: 15873: 15837: 15819: 15754: 15724: 15703: 15623: 15507: 15460: 15417: 15309: 15259: 15238: 15193: 15139: 15103: 15066: 15016: 14949: 14842: 14685: 14618: 14549: 14506: 14444: 14424: 14378: 14363: 14314: 14030: 13744: 13663: 13631: 13584: 13557: 13537: 13522: 13486: 13431: 13391: 13358: 13303: 13267: 13229: 13191: 13129: 13105: 13032: 12934: 12907: 12860: 12839: 12803: 12781: 12708: 12608: 12536: 12496: 12443: 12389: 12369: 12300: 12241: 12209: 12176: 12126: 12096: 12020: 11963: 11926: 11884: 11823: 11763: 11594: 11537: 11124: 11056: 11012: 10892: 10791: 10746: 10658: 10522: 10501: 10363: 10327: 10274: 10213: 10123: 10058: 9738: 9717: 9619: 9544: 9478: 9432: 9352: 9285: 9258: 9211: 9164: 9117: 9052: 9008: 8893: 8862: 8816: 8754: 8736: 8693: 8635: 8582: 8536: 8483: 8437: 8385: 8340: 8283: 8226: 8153: 8135: 8092: 8043: 8001: 7954: 7907: 7850: 7823: 7796: 7748: 7704: 7656: 7609: 7589: 7560: 7540: 7520: 7479: 7433: 7305: 7187: 7069: 7048: 6946: 6894: 6852: 6748: 6675: 6572: 6485: 6455: 6351: 6286: 6211: 6190: 6077: 6013: 5909: 5806: 5743: 5660: 5586: 5458: 5349: 5315: 5264: 5204: 5106: 5078: 5027: 4971: 4920: 4849: 4824: 4799: 4784: 4742: 4705: 4645: 4585: 4460: 4400: 4266: 4232: 4125:-estimator for totals) 4111: 4030: 3920: 3850: 3830: 3763: 3740: 3614: 3520: 3467: 3440: 3405: 3378: 3293: 3219: 3180: 3151: 3096: 3051: 3025: 2969: 2948: 2897: 2840: 2774: 2615: 2568: 2535: 2502: 2445: 2388: 2356:Statistical properties 2335: 2298: 2222: 2183: 2120: 2056: 1995: 1955: 1887: 1865: 1812: 1742: 1721: 1665: 1611: 1549: 1492: 1434: 1395: 1370:Weighted least squares 1348: 1321: 1256: 1218: 1184: 1167: 1117: 1097: 1028: 998: 947: 921: 881: 718: 697: 654: 600: 528: 447: 370: 326: 260: 167: 59:descriptive statistics 40:Weighted harmonic mean 20321:Mathematical analysis 19734: 19653: 19618: 19554: 19515: 19468: 19422: 19334: 19273: 19238: 19147: 19097: 19077: 19057: 19007: 18987: 18943: 18913: 18845: 18802: 18787:The damping constant 18779: 18759: 18703: 18620: 18592: 18590:{\displaystyle V_{1}} 18565: 18563:{\displaystyle V_{1}} 18535: 18469: 18449: 18417: 18370: 18309: 18281: 18261: 18240: 18238:{\displaystyle x_{i}} 18213: 18211:{\displaystyle t_{i}} 18186: 18166: 18164:{\displaystyle t_{i}} 18139: 18119: 18099: 18071: 18022: 17931: 17841: 17821: 17769: 17747: 17718: 17716:{\displaystyle x_{i}} 17687: 17665: 17577: 17284: 17165: 17042: 16987: 16944:matrix–vector product 16937: 16879: 16822: 16757: 16732: 16692: 16633: 16496: 16475:and this reduces to: 16470: 16431: 16262: 16141: 16102: 16057: 16018: 15984: 15955: 15904: 15879:can be simplified to 15874: 15838: 15799: 15755: 15753:{\displaystyle V_{1}} 15725: 15683: 15642:loss of the base rate 15624: 15487: 15461: 15418: 15289: 15260: 15218: 15173: 15140: 15104: 15067: 15017: 14950: 14822: 14686: 
4681: 4676: 4668: 4664: 4659: 4655: 4651: 4643: 4638: 4635: 4632: 4628: 4621: 4616: 4608: 4604: 4599: 4595: 4591: 4583: 4578: 4575: 4572: 4568: 4562: 4559: 4551: 4546: 4541: 4538: 4535: 4528: 4525: 4516: 4511: 4499: 4494: 4491: 4468: 4467: 4466: 4453: 4448: 4442: 4438: 4435: 4429: 4424: 4420: 4414: 4410: 4405: 4398: 4393: 4390: 4387: 4383: 4376: 4373: 4370: 4366: 4357: 4353: 4349: 4344: 4340: 4335: 4323: 4318: 4315: 4306: 4302: 4299: 4259: 4254: 4250: 4246: 4240: 4236: 4230: 4225: 4222: 4219: 4215: 4208: 4203: 4198: 4195: 4192: 4185: 4182: 4173: 4168: 4156: 4151: 4148: 4126: 4119: 4101: 4097: 4093: 4090: 4086: 4081: 4074: 4070: 4066: 4061: 4056: 4052: 4021: 4017: 4010: 4006: 4000: 3996: 3989: 3984: 3977: 3974: 3965: 3961: 3957: 3953: 3949: 3942: 3939: 3913: 3906: 3903: 3896: 3893: 3886: 3882: 3876: 3872: 3845: 3821: 3817: 3811: 3807: 3801: 3796: 3789: 3786: 3773:values, i.e.: 3758: 3735: 3730: 3726: 3722: 3719: 3716: 3711: 3707: 3701: 3696: 3692: 3688: 3685: 3680: 3676: 3672: 3669: 3664: 3659: 3655: 3651: 3648: 3644: 3640: 3636: 3632: 3629: 3607: 3603: 3597: 3593: 3589: 3586: 3581: 3577: 3573: 3570: 3565: 3561: 3557: 3554: 3550: 3546: 3542: 3538: 3535: 3513: 3509: 3503: 3499: 3495: 3491: 3487: 3483: 3460: 3456: 3433: 3429: 3398: 3394: 3371: 3366: 3362: 3356: 3351: 3347: 3343: 3340: 3331: 3327: 3324: 3319: 3315: 3311: 3308: 3286: 3282: 3278: 3275: 3272: 3264: 3261: 3258: 3253: 3249: 3245: 3242: 3212: 3208: 3172: 3169: 3146: 3089: 3085: 3068: 3065: 3044: 3038: 3033: 3029: 3023: 3018: 3015: 3012: 3008: 3001: 2996: 2991: 2987: 2962: 2956: 2952: 2946: 2941: 2938: 2935: 2931: 2924: 2918: 2915: 2889: 2886: 2883: 2876: 2872: 2865: 2862: 2856: 2851: 2847: 2843: 2838: 2833: 2830: 2827: 2823: 2816: 2811: 2806: 2799: 2796: 2781: 2780: 2765: 2758: 2755: 2747: 2742: 2738: 2729: 2724: 2717: 2714: 2707: 2704: 2699: 2692: 2689: 2682: 2679: 2676: 2630: 2627: 2625: 2622: 2610: 2607: 2604: 2601: 2595: 2592: 2586: 2583: 2563: 2560: 2555: 2551: 2530: 2524: 2520: 2515: 2511: 2507: 2500: 2495: 2492: 2489: 2485: 2481: 2478: 2472: 2469: 2463: 2460: 2440: 2434: 2430: 2425: 2422: 2417: 2413: 2409: 2406: 2380: 2377: 2362: 2359: 2357: 2354: 2342: 2341: 2330: 2323: 2318: 2314: 2308: 2304: 2296: 2291: 2288: 2285: 2281: 2275: 2268: 2265: 2259: 2255: 2249: 2246: 2229: 2228: 2217: 2210: 2205: 2199: 2196: 2191: 2187: 2181: 2176: 2173: 2170: 2166: 2161: 2152: 2147: 2143: 2137: 2134: 2129: 2125: 2118: 2113: 2110: 2107: 2103: 2096: 2090: 2085: 2081: 2074: 2070: 2065: 2061: 2054: 2049: 2046: 2043: 2039: 2035: 2030: 2023: 2020: 2014: 1988: 1984: 1980: 1975: 1971: 1950: 1946: 1940: 1935: 1931: 1927: 1922: 1915: 1912: 1906: 1894: 1893: 1882: 1873: 1869: 1863: 1858: 1855: 1852: 1848: 1843: 1837: 1828: 1825: 1820: 1816: 1810: 1805: 1802: 1799: 1795: 1790: 1784: 1777: 1774: 1768: 1749: 1748: 1737: 1729: 1725: 1719: 1714: 1711: 1708: 1704: 1697: 1691: 1687: 1683: 1678: 1674: 1669: 1663: 1658: 1655: 1652: 1648: 1641: 1630: 1625: 1621: 1617: 1609: 1604: 1601: 1598: 1594: 1587: 1579: 1574: 1570: 1564: 1560: 1553: 1547: 1542: 1539: 1536: 1532: 1525: 1519: 1516: 1499: 1498: 1487: 1480: 1475: 1471: 1467: 1462: 1457: 1453: 1427: 1422: 1418: 1388: 1384: 1362:Main article: 1359: 1356: 1355: 1354: 1338: 1334: 1329: 1325: 1319: 1314: 1311: 1308: 1304: 1298: 1295: 1288: 1285: 1279: 1247: 1244: 1238: 1211: 1207: 1197:with variance 1176: 1172: 1165: 1160: 1157: 1154: 1150: 1144: 1141: 1125: 1124: 1106: 1102: 1095: 1090: 1087: 1084: 1080: 1073: 1069: 1063: 1059: 1055: 1051: 1036: 1035: 1020: 1016: 1011: 1007: 1003: 996: 991: 988: 985: 981: 977: 971: 
968: 942: 939: 934: 930: 926: 919: 914: 911: 908: 904: 888: 887: 876: 868: 864: 860: 857: 854: 849: 845: 841: 836: 832: 824: 820: 814: 810: 806: 803: 800: 795: 791: 785: 781: 777: 772: 768: 762: 758: 751: 745: 742: 725: 724: 713: 705: 701: 695: 690: 687: 684: 680: 672: 668: 662: 658: 652: 647: 644: 641: 637: 630: 624: 621: 594: 588: 584: 580: 577: 574: 569: 565: 561: 556: 552: 547: 522: 516: 512: 508: 505: 502: 497: 493: 489: 484: 480: 475: 458: 455: 454: 453: 442: 439: 436: 433: 430: 427: 424: 421: 418: 415: 412: 409: 406: 403: 397: 394: 377: 376: 365: 362: 356: 353: 350: 346: 333: 332: 321: 318: 312: 309: 306: 302: 274: 271: 267: 266: 255: 252: 246: 243: 240: 235: 232: 229: 226: 223: 220: 217: 214: 211: 208: 205: 199: 193: 190: 162: 159: 154: 151: 146: 140: 137: 122: 121: 115: 114: 82: 79: 77: 74: 26: 9: 6: 4: 3: 2: 20338: 20327: 20324: 20322: 20319: 20317: 20314: 20313: 20311: 20302: 20299: 20294: 20293: 20288: 20282: 20281: 20271: 20265: 20261: 20256: 20252: 20248: 20244: 20239: 20238: 20226: 20225:0-9771170-1-4 20222: 20218: 20217: 20210: 20203: 20199: 20195: 20189: 20181: 20179:981-270-527-9 20175: 20171: 20164: 20158: 20154: 20148: 20140: 20136: 20132: 20128: 20124: 20120: 20116: 20112: 20105: 20098: 20096: 20080: 20073: 20067: 20052: 20048: 20042: 20033: 20028: 20024: 20020: 20016: 20012: 20008: 20001: 19995: 19989: 19985: 19981: 19977: 19973: 19969: 19962: 19956: 19952: 19946: 19938: 19932: 19928: 19921: 19919: 19917: 19915: 19913: 19911: 19909: 19907: 19905: 19903: 19901: 19899: 19897: 19895: 19893: 19891: 19889: 19882: 19878: 19872: 19870: 19868: 19866: 19861: 19847: 19841: 19837: 19826: 19823: 19821: 19818: 19816: 19813: 19811: 19808: 19806: 19803: 19801: 19798: 19796: 19793: 19791: 19788: 19786: 19783: 19781: 19778: 19776: 19773: 19771: 19768: 19766: 19763: 19761: 19758: 19756: 19753: 19751: 19748: 19747: 19726: 19720: 19717: 19714: 19707: 19693: 19687: 19682: 19678: 19669: 19664: 19661: 19658: 19654: 19647: 19642: 19638: 19630: 19629: 19628: 19626: 19610: 19606: 19600: 19596: 19592: 19587: 19577: 19571: 19562: 19544: 19534: 19528: 19505: 19501: 19497: 19492: 19488: 19478: 19476: 19453: 19441: 19414: 19407: 19402: 19398: 19391: 19377: 19371: 19366: 19362: 19350: 19345: 19342: 19339: 19335: 19325: 19322: 19319: 19312: 19307: 19302: 19297: 19293: 19285: 19284: 19283: 19281: 19263: 19258: 19254: 19228: 19223: 19219: 19213: 19203: 19197: 19193: 19188: 19178: 19166: 19155: 19154: 19153: 19137: 19133: 19121: 19111: 19103: 19089: 19069: 19043: 19040: 19037: 19031: 19028: 19024: 19019: 18999: 18979: 18976: 18970: 18967: 18963: 18959: 18956: 18933: 18930: 18926: 18902: 18899: 18896: 18890: 18887: 18881: 18878: 18875: 18866: 18863: 18859: 18835: 18832: 18824: 18821: 18818: 18794: 18785: 18771: 18748: 18745: 18742: 18735: 18731: 18728: 18723: 18719: 18695: 18689: 18686: 18683: 18676: 18672: 18668: 18665: 18659: 18653: 18650: 18647: 18643: 18636: 18631: 18628: 18625: 18621: 18617: 18612: 18608: 18600: 18599: 18598: 18582: 18578: 18555: 18551: 18527: 18520: 18516: 18510: 18507: 18504: 18500: 18494: 18489: 18485: 18477: 18476: 18475: 18461: 18438: 18435: 18432: 18429: 18409: 18406: 18400: 18397: 18386: 18362: 18357: 18354: 18351: 18348: 18345: 18341: 18335: 18331: 18325: 18320: 18317: 18314: 18310: 18306: 18301: 18297: 18289: 18288: 18287: 18273: 18253: 18230: 18226: 18203: 18199: 18178: 18156: 18152: 18131: 18111: 18091: 18063: 18058: 18055: 18045: 18033: 18032: 18031: 18014: 17996: 17981: 17971: 17965: 17961: 17952: 17942: 17941: 17940: 17923: 17918: 17915: 17895: 17882: 17877: 17867: 
17861: 17853: 17852: 17851: 17849: 17833: 17811: 17803: 17800: 17797: 17794: 17791: 17781: 17777: 17776:design matrix 17733: 17708: 17704: 17695: 17655: 17645: 17641: 17637: 17634: 17631: 17626: 17622: 17615: 17600: 17596: 17586: 17563: 17557: 17550: 17544: 17539: 17534: 17528: 17521: 17515: 17508: 17502: 17497: 17490: 17485: 17479: 17474: 17472: 17463: 17457: 17445: 17442: 17437: 17427: 17422: 17410: 17407: 17402: 17391: 17385: 17382: 17377: 17371: 17368: 17363: 17353: 17348: 17345: 17340: 17329: 17324: 17322: 17295: 17294: 17293: 17274: 17268: 17263: 17256: 17251: 17245: 17240: 17235: 17224: 17213: 17207: 17202: 17196: 17190: 17185: 17171: 17155: 17149: 17144: 17137: 17132: 17126: 17121: 17116: 17105: 17094: 17088: 17083: 17077: 17071: 17066: 17052: 17051: 17050: 17047: 17034: 17029: 17026: 17021: 17015: 17003: 16998: 16995: 16992: 16988: 16983: 16978: 16949: 16945: 16929: 16925: 16919: 16907: 16895: 16890: 16887: 16884: 16880: 16875: 16850: 16827: 16814: 16809: 16806: 16801: 16791: 16786: 16771: 16769: 16765: 16740: 16722: 16718: 16709: 16698: 16684: 16680: 16676: 16673: 16668: 16664: 16659: 16653: 16649: 16625: 16617: 16613: 16609: 16606: 16600: 16594: 16590: 16586: 16581: 16570: 16564: 16559: 16553: 16549: 16545: 16540: 16529: 16522: 16518: 16512: 16507: 16504: 16501: 16497: 16490: 16478: 16477: 16476: 16462: 16459: 16454: 16450: 16440: 16419: 16408: 16404: 16399: 16393: 16389: 16382: 16377: 16373: 16366: 16360: 16356: 16352: 16347: 16336: 16330: 16325: 16319: 16315: 16311: 16306: 16295: 16288: 16284: 16278: 16273: 16270: 16267: 16263: 16256: 16254: 16245: 16239: 16235: 16231: 16226: 16215: 16209: 16204: 16198: 16194: 16190: 16185: 16174: 16167: 16163: 16157: 16152: 16149: 16146: 16142: 16133: 16128: 16124: 16118: 16113: 16110: 16107: 16103: 16099: 16094: 16089: 16083: 16079: 16073: 16068: 16065: 16062: 16058: 16053: 16044: 16040: 16034: 16029: 16026: 16023: 16019: 16012: 16010: 15992: 15991: 15990: 15966: 15947: 15942: 15930: 15926: 15920: 15915: 15912: 15909: 15905: 15901: 15891: 15882: 15881: 15880: 15859: 15849: 15848:weighted mean 15825: 15821: 15815: 15810: 15807: 15804: 15800: 15793: 15789: 15783: 15779: 15775: 15771: 15763: 15762: 15761: 15745: 15741: 15717: 15714: 15709: 15705: 15699: 15694: 15691: 15688: 15684: 15680: 15675: 15671: 15663: 15662: 15661: 15659: 15655: 15645: 15643: 15639: 15635: 15616: 15610: 15607: 15602: 15598: 15591: 15585: 15581: 15577: 15572: 15561: 15555: 15550: 15544: 15540: 15536: 15531: 15520: 15513: 15509: 15503: 15498: 15495: 15492: 15488: 15481: 15469: 15468: 15467: 15441: 15437: 15427: 15410: 15403: 15399: 15393: 15387: 15383: 15379: 15374: 15363: 15357: 15352: 15346: 15342: 15338: 15333: 15322: 15315: 15311: 15305: 15300: 15297: 15294: 15290: 15283: 15271: 15270: 15269: 15252: 15244: 15240: 15234: 15229: 15226: 15223: 15219: 15211: 15199: 15195: 15189: 15184: 15181: 15178: 15174: 15167: 15157: 15148: 15147: 15146: 15125: 15115: 15114:weighted mean 15110: 15096: 15093: 15088: 15084: 15075: 15057: 15036: 15033: 15030: 15028: 15023: 15007: 14998: 14994: 14986: 14975: 14968: 14938: 14927: 14923: 14918: 14912: 14908: 14901: 14896: 14892: 14884: 14874: 14870: 14866: 14861: 14857: 14848: 14844: 14838: 14833: 14830: 14827: 14816: 14814: 14798: 14793: 14789: 14784: 14778: 14774: 14767: 14764: 14758: 14741: 14732: 14730: 14720: 14709: 14697: 14696: 14695: 14692: 14677: 14671: 14666: 14662: 14657: 14651: 14647: 14642: 14638: 14635: 14627: 14609: 14606: 14603: 14599: 14595: 14590: 14586: 14581: 14575: 14570: 14566: 14541: 14536: 14532: 14529: 14526: 14520: 
14498: 14490: 14485: 14481: 14475: 14471: 14465: 14462: 14458: 14435: 14430: 14426: 14420: 14415: 14412: 14409: 14405: 14401: 14396: 14392: 14369: 14365: 14359: 14354: 14351: 14348: 14344: 14340: 14335: 14331: 14301: 14292: 14287: 14279: 14274: 14270: 14264: 14260: 14254: 14251: 14247: 14243: 14241: 14228: 14217: 14211: 14205: 14202: 14193: 14183: 14178: 14174: 14168: 14164: 14158: 14150: 14139: 14133: 14127: 14124: 14115: 14109: 14107: 14095: 14091: 14081: 14071: 14067: 14063: 14058: 14054: 14044: 14036: 14032: 14026: 14021: 14018: 14015: 14004: 14002: 13992: 13975: 13965: 13953: 13944: 13939: 13934: 13930: 13927: 13924: 13918: 13914: 13912: 13899: 13888: 13882: 13876: 13873: 13864: 13856: 13853: 13848: 13840: 13829: 13823: 13817: 13814: 13805: 13799: 13797: 13787: 13778: 13770: 13767: 13762: 13758: 13748: 13740: 13735: 13732: 13729: 13718: 13716: 13706: 13696: 13686: 13672: 13671: 13670: 13654: 13645: 13624: 13616: 13606: 13604: 13600: 13596: 13591: 13575: 13571: 13550: 13528: 13524: 13518: 13513: 13510: 13507: 13503: 13477: 13472: 13466: 13462: 13458: 13453: 13449: 13444: 13437: 13433: 13427: 13422: 13419: 13416: 13412: 13405: 13402: 13397: 13393: 13387: 13382: 13379: 13376: 13372: 13364: 13360: 13354: 13349: 13346: 13343: 13339: 13332: 13324: 13320: 13312: 13311: 13310: 13291: 13287: 13274: 13257: 13254: 13251: 13248: 13245: 13219: 13216: 13213: 13210: 13207: 13181: 13178: 13175: 13172: 13169: 13166: 13163: 13160: 13157: 13154: 13151: 13139: 13119: 13116: 13111: 13107: 13101: 13096: 13093: 13090: 13086: 13078: 13073: 13067: 13063: 13059: 13054: 13050: 13045: 13038: 13034: 13028: 13023: 13020: 13017: 13006: 12998: 12994: 12986: 12985: 12984: 12982: 12972: 12970: 12966: 12962: 12958: 12954: 12950: 12945: 12943: 12925: 12921: 12898: 12881: 12869: 12853: 12850: 12845: 12841: 12835: 12830: 12827: 12824: 12820: 12787: 12783: 12777: 12772: 12769: 12766: 12762: 12754: 12749: 12743: 12739: 12735: 12730: 12726: 12721: 12714: 12710: 12704: 12699: 12696: 12693: 12682: 12680: 12673: 12656: 12643: 12637: 12632: 12628: 12625: 12620: 12616: 12611: 12604: 12599: 12596: 12593: 12582: 12580: 12570: 12560: 12545: 12544: 12543: 12527: 12517: 12505: 12487: 12470: 12459: 12455: 12450: 12434: 12430: 12421: 12417: 12411: 12396: 12376: 12372: 12365: 12360: 12357: 12354: 12350: 12343: 12339: 12333: 12329: 12325: 12321: 12311: 12309: 12291: 12285: 12279: 12275: 12271: 12261: 12255: 12232: 12227: 12223: 12202: 12199: 12192: 12188: 12183: 12179: 12172: 12167: 12164: 12161: 12157: 12153: 12150: 12146: 12142: 12119: 12112: 12108: 12103: 12099: 12092: 12087: 12084: 12081: 12077: 12071: 12066: 12062: 12058: 12053: 12043: 12037: 12029: 12028: 12027: 12011: 12006: 12002: 11998: 11993: 11988: 11984: 11974: 11972: 11949: 11943: 11916: 11911: 11907: 11900: 11896: 11891: 11887: 11880: 11875: 11872: 11869: 11865: 11861: 11856: 11846: 11840: 11832: 11831: 11830: 11814: 11809: 11805: 11790: 11788: 11787:Bootstrapping 11784: 11774: 11754: 11744: 11734: 11727: 11722: 11718: 11709: 11704: 11700: 11696: 11688: 11674: 11668: 11659: 11656: 11653: 11646: 11641: 11635: 11630: 11620: 11614: 11603: 11602: 11601: 11585: 11579: 11575: 11571: 11565: 11556: 11529: 11523: 11509: 11503: 11498: 11494: 11487: 11482: 11477: 11467: 11460: 11452: 11442: 11429: 11423: 11418: 11414: 11408: 11404: 11388: 11382: 11377: 11373: 11366: 11361: 11351: 11344: 11341: 11336: 11326: 11316: 11303: 11297: 11292: 11288: 11282: 11278: 11271: 11267: 11258: 11244: 11238: 11229: 11226: 11223: 11216: 11211: 11205: 11200: 11193: 11183: 11175: 11164: 11163: 11162: 11160: 
11159:Taylor series 11156: 11155:bootstrapping 11146: 11142: 11140: 11133: 11130: 11117: 11112: 11102: 11092: 11085: 11080: 11076: 11067: 11062: 11058: 11052: 11047: 11044: 11041: 11037: 11028: 11018: 11014: 11008: 11003: 11000: 10997: 10993: 10985: 10980: 10975: 10970: 10962: 10958: 10951: 10941: 10934: 10929: 10925: 10913: 10909: 10905: 10902: 10895: 10888: 10883: 10880: 10877: 10873: 10865: 10855: 10848: 10843: 10837: 10828: 10818: 10808: 10784: 10781: 10773: 10769: 10765: 10762: 10739: 10736: 10728: 10724: 10720: 10715: 10711: 10704: 10701: 10696: 10693: 10685: 10682: 10679: 10676: 10664: 10651: 10647: 10639: 10635: 10628: 10618: 10611: 10606: 10602: 10591: 10587: 10580: 10570: 10563: 10558: 10554: 10545: 10542: 10524: 10518: 10513: 10510: 10507: 10503: 10497: 10492: 10489: 10486: 10482: 10474: 10464: 10457: 10452: 10446: 10437: 10427: 10417: 10410: 10404: 10391: 10382: 10370: 10356: 10353: 10348: 10344: 10334: 10318: 10314: 10308: 10304: 10300: 10296: 10292: 10288: 10265: 10261: 10255: 10251: 10247: 10243: 10239: 10235: 10226: 10222: 10206: 10203: 10195: 10191: 10187: 10182: 10178: 10171: 10168: 10163: 10160: 10152: 10149: 10146: 10143: 10132: 10107: 10101: 10092: 10077: 10065: 10050: 10037: 10031: 10022: 10007: 9995: 9989: 9986: 9980: 9967: 9958: 9945: 9939: 9933: 9920: 9911: 9903: 9895: 9885: 9878: 9873: 9869: 9861: 9857: 9850: 9846: 9836: 9830: 9825: 9821: 9810: 9806: 9799: 9795: 9785: 9779: 9774: 9770: 9761: 9758: 9740: 9734: 9729: 9726: 9723: 9719: 9713: 9708: 9705: 9702: 9698: 9690: 9680: 9673: 9668: 9662: 9649: 9640: 9628: 9625: 9611: 9603: 9599: 9594: 9590: 9586: 9580: 9577: 9570: 9566: 9561: 9557: 9553: 9546: 9540: 9535: 9532: 9529: 9525: 9519: 9516: 9511: 9508: 9505: 9498: 9494: 9490: 9484: 9480: 9474: 9469: 9466: 9463: 9459: 9452: 9448: 9444: 9438: 9434: 9428: 9423: 9420: 9417: 9413: 9406: 9396: 9386: 9379: 9370: 9359: 9338: 9328: 9321: 9312: 9296: 9294: 9276: 9272: 9247: 9243: 9239: 9234: 9229: 9225: 9200: 9196: 9192: 9187: 9182: 9178: 9157: 9152: 9148: 9144: 9139: 9135: 9108: 9098: 9088: 9081: 9076: 9072: 9063: 9058: 9054: 9048: 9043: 9040: 9037: 9033: 9024: 9014: 9010: 9004: 8999: 8996: 8993: 8989: 8981: 8976: 8970: 8961: 8951: 8941: 8930: 8929: 8928: 8926: 8921: 8920:Taylor series 8916: 8911: 8909: 8905: 8900: 8882: 8878: 8874: 8868: 8864: 8858: 8853: 8850: 8847: 8843: 8836: 8832: 8828: 8822: 8818: 8812: 8807: 8804: 8801: 8797: 8790: 8785: 8775: 8763: 8742: 8738: 8732: 8727: 8724: 8721: 8717: 8709: 8705: 8699: 8695: 8689: 8684: 8681: 8678: 8674: 8667: 8662: 8652: 8626: 8616: 8609: 8602: 8598: 8594: 8588: 8584: 8578: 8573: 8570: 8567: 8563: 8556: 8552: 8548: 8542: 8538: 8532: 8527: 8524: 8521: 8517: 8510: 8503: 8499: 8495: 8489: 8485: 8479: 8474: 8471: 8468: 8464: 8457: 8453: 8449: 8443: 8439: 8433: 8428: 8425: 8422: 8418: 8411: 8404: 8400: 8390: 8381: 8376: 8373: 8370: 8366: 8359: 8355: 8345: 8336: 8331: 8328: 8325: 8321: 8314: 8304: 8300: 8296: 8289: 8285: 8279: 8274: 8271: 8268: 8264: 8254: 8250: 8244: 8240: 8232: 8228: 8222: 8217: 8214: 8211: 8207: 8200: 8186: 8178: 8169: 8141: 8137: 8131: 8126: 8123: 8120: 8116: 8108: 8104: 8098: 8094: 8088: 8083: 8080: 8077: 8073: 8066: 8058: 8048: 8039: 8034: 8031: 8028: 8024: 8016: 8006: 7997: 7992: 7989: 7986: 7982: 7975: 7965: 7961: 7957: 7950: 7945: 7942: 7939: 7935: 7925: 7921: 7915: 7911: 7903: 7898: 7895: 7892: 7888: 7881: 7872: 7866: 7863: 7841: 7837: 7815: 7811: 7801: 7792: 7787: 7784: 7781: 7777: 7773: 7766: 7762: 7756: 7752: 7744: 7739: 7736: 7733: 7729: 7725: 7720: 7716: 7710: 7706: 7700: 7695: 7692: 7689: 7685: 7681: 
7672: 7645: 7641: 7637: 7632: 7627: 7623: 7602: 7576: 7553: 7533: 7507: 7485: 7466: 7461: 7455: 7451: 7445: 7441: 7436: 7429: 7424: 7421: 7418: 7414: 7406: 7402: 7398: 7393: 7391: 7382: 7374: 7370: 7364: 7360: 7350: 7346: 7340: 7336: 7325: 7321: 7317: 7314: 7307: 7301: 7296: 7293: 7290: 7286: 7278: 7274: 7270: 7265: 7263: 7254: 7248: 7238: 7229: 7219: 7210: 7207: 7189: 7183: 7178: 7175: 7172: 7168: 7160: 7156: 7152: 7147: 7145: 7136: 7130: 7120: 7111: 7101: 7092: 7089: 7071: 7065: 7060: 7057: 7054: 7050: 7044: 7039: 7036: 7033: 7029: 7021: 7017: 7013: 7008: 7006: 6991: 6976: 6966: 6963: 6939: 6936: 6928: 6924: 6920: 6917: 6885: 6880: 6874: 6870: 6864: 6860: 6855: 6848: 6843: 6840: 6837: 6833: 6825: 6821: 6817: 6812: 6799: 6779: 6767: 6764: 6757: 6756: 6755: 6741: 6738: 6730: 6726: 6722: 6717: 6713: 6706: 6703: 6700: 6697: 6694: 6682: 6666: 6662: 6658: 6655: 6652: 6645: 6641: 6634: 6630: 6624: 6620: 6613: 6610: 6607: 6602: 6599: 6561: 6558: 6554: 6547: 6543: 6537: 6533: 6526: 6523: 6520: 6515: 6512: 6476: 6473: 6469: 6446: 6443: 6435: 6430: 6426: 6420: 6416: 6412: 6407: 6404: 6400: 6396: 6388: 6384: 6380: 6375: 6371: 6364: 6340: 6336: 6330: 6326: 6320: 6315: 6305: 6292: 6278: 6272: 6262: 6253: 6243: 6234: 6231: 6213: 6207: 6202: 6199: 6196: 6192: 6186: 6181: 6178: 6175: 6171: 6163: 6159: 6155: 6150: 6137: 6117: 6105: 6102: 6094: 6092: 6086: 6083: 6064: 6059: 6049: 6046: 6040: 6035: 6031: 6025: 6021: 6016: 6009: 6004: 6001: 5998: 5994: 5987: 5984: 5981: 5977: 5972: 5970: 5960: 5955: 5945: 5942: 5936: 5931: 5927: 5921: 5917: 5912: 5905: 5900: 5897: 5894: 5890: 5883: 5880: 5877: 5873: 5866: 5861: 5857: 5851: 5849: 5839: 5834: 5828: 5822: 5818: 5812: 5808: 5802: 5797: 5794: 5791: 5787: 5780: 5777: 5770: 5766: 5760: 5756: 5750: 5746: 5739: 5734: 5731: 5728: 5724: 5717: 5714: 5711: 5707: 5700: 5697: 5692: 5687: 5682: 5676: 5672: 5666: 5662: 5656: 5651: 5648: 5645: 5641: 5635: 5632: 5627: 5620: 5616: 5610: 5606: 5598: 5595: 5589: 5582: 5577: 5574: 5571: 5567: 5560: 5557: 5554: 5550: 5543: 5540: 5535: 5533: 5523: 5518: 5512: 5509: 5506: 5496: 5489: 5482: 5478: 5472: 5468: 5461: 5454: 5449: 5446: 5443: 5439: 5432: 5429: 5426: 5422: 5415: 5412: 5407: 5405: 5385: 5375: 5372: 5360: 5356: 5340: 5334: 5330: 5324: 5320: 5311: 5306: 5303: 5300: 5296: 5292: 5283: 5280: 5255: 5250: 5240: 5237: 5231: 5226: 5222: 5216: 5212: 5207: 5200: 5195: 5192: 5189: 5185: 5178: 5175: 5172: 5168: 5163: 5155: 5152: 5149: 5139: 5129: 5126: 5118: 5113: 5098: 5094: 5090: 5084: 5080: 5074: 5069: 5066: 5063: 5059: 5055: 5048: 5044: 5039: 5035: 5031: 5023: 5018: 5015: 5012: 5008: 5004: 4996: 4992: 4988: 4983: 4979: 4975: 4967: 4962: 4959: 4956: 4952: 4948: 4941: 4937: 4932: 4928: 4924: 4916: 4911: 4908: 4905: 4901: 4895: 4892: 4887: 4882: 4879: 4876: 4866: 4842: 4834: 4830: 4817: 4808: 4790: 4786: 4780: 4775: 4772: 4769: 4765: 4761: 4758: 4735: 4730: 4725: 4721: 4717: 4711: 4707: 4701: 4696: 4693: 4690: 4686: 4679: 4674: 4666: 4662: 4657: 4653: 4649: 4641: 4636: 4633: 4630: 4626: 4619: 4614: 4606: 4602: 4597: 4593: 4589: 4581: 4576: 4573: 4570: 4566: 4560: 4557: 4549: 4544: 4539: 4536: 4533: 4523: 4514: 4509: 4489: 4476: 4475: 4474: 4451: 4446: 4436: 4433: 4427: 4422: 4418: 4412: 4408: 4403: 4396: 4391: 4388: 4385: 4381: 4374: 4371: 4368: 4364: 4355: 4351: 4347: 4342: 4338: 4333: 4313: 4304: 4300: 4297: 4290: 4289: 4288: 4286: 4282: 4278: 4273: 4257: 4252: 4248: 4244: 4238: 4234: 4228: 4223: 4220: 4217: 4213: 4206: 4201: 4196: 4193: 4190: 4180: 4171: 4166: 4146: 4132: 4124: 4118: 4099: 4095: 4091: 4088: 4084: 4079: 4072: 4068: 
4064: 4059: 4054: 4050: 4041: 4036: 4019: 4015: 4008: 4004: 3998: 3994: 3987: 3982: 3972: 3963: 3959: 3955: 3951: 3947: 3937: 3911: 3901: 3894: 3891: 3884: 3880: 3874: 3870: 3859: 3843: 3819: 3815: 3809: 3805: 3799: 3794: 3784: 3772: 3756: 3747: 3728: 3724: 3720: 3717: 3709: 3705: 3699: 3694: 3690: 3686: 3678: 3674: 3667: 3662: 3657: 3653: 3649: 3642: 3638: 3634: 3627: 3605: 3601: 3595: 3591: 3587: 3579: 3575: 3568: 3563: 3559: 3555: 3548: 3544: 3540: 3533: 3511: 3507: 3501: 3497: 3493: 3489: 3485: 3481: 3458: 3454: 3431: 3427: 3417: 3415: 3396: 3392: 3369: 3364: 3360: 3354: 3349: 3345: 3341: 3325: 3322: 3317: 3313: 3306: 3284: 3280: 3276: 3270: 3262: 3259: 3256: 3251: 3247: 3240: 3232: 3228: 3210: 3206: 3197: 3193: 3189: 3167: 3144: 3136: 3133:or sometimes 3132: 3128: 3124: 3120: 3115: 3113: 3109: 3105: 3087: 3083: 3074: 3064: 3062: 3057: 3042: 3036: 3031: 3027: 3021: 3016: 3013: 3010: 3006: 2999: 2989: 2985: 2960: 2954: 2950: 2944: 2939: 2936: 2933: 2929: 2922: 2913: 2887: 2884: 2881: 2874: 2860: 2854: 2849: 2845: 2836: 2831: 2828: 2825: 2821: 2814: 2809: 2804: 2794: 2763: 2753: 2740: 2736: 2727: 2722: 2712: 2705: 2697: 2687: 2677: 2674: 2667: 2666: 2665: 2663: 2659: 2655: 2651: 2647: 2643: 2640: 2636: 2621: 2608: 2605: 2602: 2590: 2581: 2561: 2558: 2553: 2549: 2528: 2522: 2518: 2513: 2509: 2505: 2498: 2493: 2490: 2487: 2483: 2479: 2467: 2458: 2438: 2432: 2428: 2423: 2415: 2411: 2404: 2395: 2375: 2353: 2351: 2347: 2328: 2321: 2316: 2312: 2306: 2302: 2294: 2289: 2286: 2283: 2279: 2273: 2263: 2257: 2253: 2244: 2234: 2233: 2232: 2215: 2208: 2203: 2197: 2194: 2189: 2185: 2179: 2174: 2171: 2168: 2164: 2159: 2150: 2145: 2141: 2135: 2132: 2127: 2123: 2116: 2111: 2108: 2105: 2101: 2094: 2088: 2083: 2079: 2072: 2068: 2063: 2059: 2052: 2047: 2044: 2041: 2037: 2033: 2028: 2018: 2012: 2004: 2003: 2002: 1986: 1982: 1978: 1973: 1969: 1948: 1944: 1938: 1933: 1929: 1925: 1920: 1910: 1904: 1880: 1871: 1867: 1861: 1856: 1853: 1850: 1846: 1841: 1835: 1826: 1823: 1818: 1814: 1808: 1803: 1800: 1797: 1793: 1788: 1782: 1772: 1766: 1758: 1757: 1756: 1754: 1735: 1727: 1723: 1717: 1712: 1709: 1706: 1702: 1695: 1689: 1685: 1681: 1676: 1672: 1667: 1661: 1656: 1653: 1650: 1646: 1639: 1628: 1623: 1619: 1615: 1607: 1602: 1599: 1596: 1592: 1585: 1577: 1572: 1568: 1562: 1558: 1551: 1545: 1540: 1537: 1534: 1530: 1523: 1514: 1504: 1503: 1502: 1485: 1478: 1473: 1469: 1465: 1460: 1455: 1451: 1443: 1442: 1441: 1425: 1420: 1416: 1408: 1404: 1386: 1382: 1371: 1365: 1336: 1332: 1327: 1323: 1317: 1312: 1309: 1306: 1296: 1293: 1283: 1277: 1269: 1268: 1267: 1265: 1242: 1236: 1227: 1209: 1205: 1196: 1191: 1174: 1170: 1163: 1158: 1155: 1152: 1142: 1139: 1130: 1129:ordinary mean 1104: 1100: 1093: 1088: 1085: 1082: 1071: 1067: 1061: 1057: 1053: 1049: 1041: 1040: 1039: 1018: 1014: 1009: 1005: 1001: 994: 989: 986: 983: 975: 966: 956: 955: 954: 940: 937: 932: 928: 924: 917: 912: 909: 906: 892: 874: 866: 862: 858: 855: 852: 847: 843: 839: 834: 830: 822: 818: 812: 808: 804: 801: 798: 793: 789: 783: 779: 775: 770: 766: 760: 756: 749: 740: 730: 729: 728: 711: 703: 699: 693: 688: 685: 682: 670: 666: 660: 656: 650: 645: 642: 639: 628: 619: 609: 608: 607: 592: 586: 582: 578: 575: 572: 567: 563: 559: 554: 550: 545: 537: 520: 514: 510: 506: 503: 500: 495: 491: 487: 482: 478: 473: 464: 440: 437: 431: 428: 425: 419: 413: 410: 407: 401: 392: 382: 381: 380: 363: 360: 354: 351: 348: 344: 335: 334: 319: 316: 310: 307: 304: 300: 291: 290: 289: 286: 284: 280: 270: 253: 250: 244: 241: 238: 230: 227: 224: 218: 212: 209: 206: 197: 188: 178: 177: 176: 173: 
160: 157: 152: 149: 144: 135: 119: 118: 117: 112: 111: 110: 81:Basic example 73: 71: 67: 62: 60: 56: 52: 48: 41: 37: 33: 19: 18:Weighted mean 20290: 20285:David Terr. 20259: 20242: 20215: 20209: 20194:Inequalities 20193: 20188: 20169: 20163: 20147: 20114: 20110: 20082:. Retrieved 20078: 20066: 20054:. Retrieved 20050: 20041: 20014: 20010: 20000: 19971: 19967: 19961: 19945: 19926: 19840: 19479: 19474: 19429: 19245: 19123: 19109: 18786: 18711:approaching 18710: 18542: 18388: 18083: 18029: 17938: 17602: 17584: 17291: 17048: 16828: 16772: 16704: 16640: 16441: 16438: 15964: 15962: 15845: 15732: 15653: 15651: 15634:standardized 15631: 15439: 15435: 15433: 15425: 15267: 15145:is given by 15111: 15073: 15042: 15034: 15031: 15026: 15024: 14957: 14693: 14322: 13614: 13612: 13595:standardized 13592: 13494: 13275: 13140: 13137: 12980: 12978: 12968: 12964: 12956: 12952: 12946: 12867: 12811: 12503: 12453: 12451: 12413: 12312: 12134: 11975: 11970: 11934: 11796: 11780: 11771: 11545: 11152: 11143: 11135: 11131: 10665: 10371: 10335: 10224: 10220: 10066: 9629: 9626: 9360: 9301: 9292: 9126: 8912: 8907: 8901: 7496: 6906: 6683: 6293: 6095: 6088: 6084: 5361: 5357: 5116: 5114: 4832: 4810: 4750: 4472: 4285:pps sampling 4283:(such as in 4280: 4274: 4130: 4128: 4122: 4040:design based 4039: 4037: 3857: 3770: 3748: 3418: 3226: 3187: 3134: 3130: 3126: 3122: 3116: 3072: 3070: 3060: 3058: 2782: 2639:uncorrelated 2634: 2632: 2396: 2364: 2343: 2230: 1895: 1752: 1750: 1500: 1373: 1225: 1192: 1126: 1037: 893: 889: 726: 460: 378: 287: 278: 276: 268: 174: 123: 116: 84: 63: 46: 44: 20084:22 December 20056:22 December 19627:(squared), 19563:(squared), 17826:(of length 17778:equal to a 16948:commutative 12310:, squared. 11793:Other notes 4506:known  4330:known  4163:known  3073:model based 2650:expectation 1405:with known 20310:Categories 19856:References 19118:See also: 18597:is simply 18383:See also: 17593:See also: 15658:normalized 15638:normalized 13599:normalized 12408:See also: 3856:-expanded 3769:-expanded 2361:Expectancy 1368:See also: 20292:MathWorld 20251:300283069 19815:Weighting 19718:− 19697:¯ 19688:− 19655:∑ 19639:σ 19597:σ 19581:¯ 19572:σ 19538:¯ 19529:σ 19502:σ 19489:σ 19457:¯ 19445:^ 19442:σ 19399:σ 19381:¯ 19372:− 19336:∑ 19323:− 19298:ν 19294:χ 19259:ν 19255:χ 19224:ν 19220:χ 19207:¯ 19198:σ 19182:¯ 19170:^ 19167:σ 19134:χ 19041:− 19029:− 19020:≤ 18968:− 18960:− 18931:− 18900:− 18879:− 18864:− 18833:− 18822:− 18746:− 18687:− 18669:− 18651:− 18622:∑ 18508:− 18442:Δ 18439:− 18404:Δ 18355:− 18311:∑ 18056:− 17975:¯ 17966:σ 17956:¯ 17916:− 17871:¯ 17862:σ 17798:… 17737:¯ 17635:… 17443:− 17408:− 17383:− 17369:− 17346:− 17315:¯ 17220:⊤ 17101:⊤ 17027:− 16989:∑ 16972:¯ 16881:∑ 16868:¯ 16845:¯ 16807:− 16719:σ 16610:− 16595:∗ 16591:μ 16587:− 16554:∗ 16550:μ 16546:− 16498:∑ 16383:− 16361:∗ 16357:μ 16353:− 16320:∗ 16316:μ 16312:− 16264:∑ 16240:∗ 16236:μ 16232:− 16199:∗ 16195:μ 16191:− 16143:∑ 16104:∑ 16100:− 16059:∑ 16020:∑ 15906:∑ 15896:∗ 15892:μ 15864:∗ 15860:μ 15846:Then the 15801:∑ 15685:∑ 15608:− 15586:∗ 15582:μ 15578:− 15545:∗ 15541:μ 15537:− 15489:∑ 15388:∗ 15384:μ 15380:− 15347:∗ 15343:μ 15339:− 15291:∑ 15220:∑ 15175:∑ 15162:∗ 15158:μ 15130:∗ 15126:μ 15112:Then the 15094:≥ 14999:σ 14969:⁡ 14902:− 14875:∗ 14871:μ 14867:− 14824:∑ 14768:− 14745:^ 14742:σ 14639:− 14530:− 14466:− 14406:∑ 14345:∑ 14293:σ 14255:− 14212:⁡ 14206:− 14194:⁡ 14159:− 14134:⁡ 14128:− 14116:⁡ 14072:∗ 14068:μ 14064:− 14045:⁡ 14012:∑ 13979:^ 13976:σ 13966:⁡ 13945:σ 13928:− 13883:⁡ 13877:− 13865:⁡ 13849:− 13824:⁡ 13818:− 13806:⁡ 13771:μ 13768:− 
13749:⁡ 13726:∑ 13700:^ 13697:σ 13687:⁡ 13646:σ 13625:μ 13504:∑ 13467:∗ 13463:μ 13459:− 13413:∑ 13403:− 13373:∑ 13340:∑ 13117:− 13087:∑ 13068:∗ 13064:μ 13060:− 13014:∑ 12922:σ 12885:^ 12882:σ 12821:∑ 12763:∑ 12744:∗ 12740:μ 12736:− 12690:∑ 12660:^ 12657:σ 12629:μ 12626:− 12590:∑ 12564:^ 12561:σ 12521:^ 12518:σ 12474:^ 12471:σ 12456:weighted 12435:∗ 12431:μ 12351:∑ 12276:σ 12265:¯ 12256:σ 12224:σ 12200:≤ 12158:∑ 12154:≤ 12078:∑ 12063:σ 12047:¯ 12038:σ 12003:σ 11985:σ 11953:¯ 11944:σ 11908:σ 11866:∑ 11850:¯ 11841:σ 11806:σ 11783:Jackknife 11738:¯ 11728:− 11697:∑ 11678:¯ 11657:− 11636:^ 11624:¯ 11615:σ 11572:∑ 11560:¯ 11513:¯ 11504:− 11488:∑ 11471:¯ 11446:¯ 11433:¯ 11424:− 11392:¯ 11383:− 11367:∑ 11355:¯ 11342:− 11320:¯ 11307:¯ 11298:− 11272:∑ 11248:¯ 11227:− 11206:^ 11187:¯ 11176:σ 11096:¯ 11086:− 11038:∑ 10994:∑ 10959:π 10945:¯ 10935:− 10910:π 10906:− 10874:∑ 10859:^ 10838:^ 10822:¯ 10782:≈ 10770:π 10766:− 10690:Δ 10680:≠ 10674:∀ 10636:π 10622:¯ 10612:− 10588:π 10574:¯ 10564:− 10536:ˇ 10533:Δ 10504:∑ 10483:∑ 10468:^ 10447:^ 10431:¯ 10405:^ 10395:^ 10157:Δ 10147:≠ 10141:∀ 10111:^ 10096:^ 10081:^ 10067:The term 10041:^ 10026:^ 10011:^ 9999:^ 9987:− 9981:^ 9971:^ 9949:^ 9934:^ 9924:^ 9889:^ 9858:π 9840:^ 9831:− 9807:π 9789:^ 9780:− 9752:ˇ 9749:Δ 9720:∑ 9699:∑ 9684:^ 9663:^ 9653:^ 9600:π 9578:− 9567:π 9526:∑ 9506:≈ 9460:∑ 9414:∑ 9400:^ 9390:^ 9374:^ 9342:^ 9332:^ 9316:^ 9197:π 9145:≈ 9136:π 9092:¯ 9082:− 9034:∑ 8990:∑ 8971:^ 8955:¯ 8844:∑ 8798:∑ 8779:¯ 8718:∑ 8675:∑ 8656:¯ 8620:¯ 8564:∑ 8518:∑ 8465:∑ 8419:∑ 8394:ˇ 8367:∑ 8349:ˇ 8322:∑ 8301:π 8265:∑ 8251:π 8208:∑ 8195:^ 8190:¯ 8173:^ 8117:∑ 8074:∑ 8052:ˇ 8025:∑ 8010:ˇ 7983:∑ 7962:π 7936:∑ 7922:π 7889:∑ 7876:¯ 7805:ˇ 7778:∑ 7763:π 7730:∑ 7686:∑ 7676:^ 7642:π 7580:^ 7511:^ 7415:∑ 7371:π 7347:π 7322:π 7318:− 7287:∑ 7242:ˇ 7223:ˇ 7201:ˇ 7198:Δ 7169:∑ 7124:ˇ 7105:ˇ 7083:ˇ 7080:Δ 7051:∑ 7030:∑ 6980:^ 6967:⁡ 6952:and that 6937:≈ 6925:π 6921:− 6834:∑ 6788:^ 6783:¯ 6768:⁡ 6698:≠ 6692:∀ 6663:π 6659:− 6642:π 6631:π 6621:π 6614:− 6593:ˇ 6590:Δ 6555:π 6544:π 6534:π 6527:− 6506:ˇ 6503:Δ 6470:π 6440:Δ 6427:π 6417:π 6413:− 6401:π 6337:π 6309:ˇ 6266:ˇ 6247:ˇ 6225:ˇ 6222:Δ 6193:∑ 6172:∑ 6126:^ 6121:¯ 6106:⁡ 6054:¯ 6041:− 5995:∑ 5985:− 5950:¯ 5937:− 5891:∑ 5881:− 5788:∑ 5778:− 5767:π 5725:∑ 5715:− 5642:∑ 5628:− 5568:∑ 5558:− 5500:^ 5490:− 5440:∑ 5430:− 5389:^ 5376:⁡ 5297:∑ 5288:¯ 5245:¯ 5232:− 5186:∑ 5176:− 5143:^ 5130:⁡ 5060:∑ 5045:π 5009:∑ 5005:≈ 4953:∑ 4902:∑ 4870:^ 4818:π 4766:∑ 4687:∑ 4663:π 4627:∑ 4620:≈ 4567:∑ 4527:^ 4498:^ 4493:¯ 4441:¯ 4428:− 4382:∑ 4372:− 4322:^ 4317:¯ 4301:⁡ 4214:∑ 4207:≈ 4184:^ 4155:^ 4150:¯ 4092:× 4080:≈ 4069:π 4016:π 3976:ˇ 3941:ˇ 3905:ˇ 3816:π 3788:ˇ 3757:π 3725:π 3721:− 3706:π 3602:π 3416:design). 3361:π 3355:≈ 3281:π 3263:∣ 3196:Bernoulli 3171:^ 3007:∑ 2995:¯ 2930:∑ 2917:¯ 2885:− 2864:¯ 2855:− 2822:∑ 2798:^ 2795:σ 2757:¯ 2746:¯ 2716:^ 2713:σ 2691:¯ 2678:⁡ 2606:μ 2594:¯ 2562:μ 2550:μ 2519:μ 2484:∑ 2471:¯ 2429:μ 2379:¯ 2313:σ 2280:∑ 2267:¯ 2258:σ 2248:¯ 2195:− 2186:σ 2165:∑ 2142:σ 2133:− 2124:σ 2102:∑ 2080:σ 2038:∑ 2022:¯ 2013:σ 1983:σ 1970:σ 1961:when all 1930:σ 1914:¯ 1905:σ 1847:∑ 1824:− 1815:σ 1794:∑ 1776:¯ 1767:σ 1703:∑ 1682:⋅ 1647:∑ 1620:σ 1593:∑ 1569:σ 1531:∑ 1518:¯ 1470:σ 1417:σ 1303:∑ 1297:σ 1287:¯ 1278:σ 1246:¯ 1237:σ 1206:σ 1149:∑ 1079:∑ 980:∑ 970:¯ 903:∑ 856:⋯ 802:⋯ 744:¯ 679:∑ 636:∑ 623:¯ 576:… 504:… 429:× 411:× 396:¯ 228:× 210:× 192:¯ 139:¯ 20155:, 2011. 
20139:37828617 19994:pdf link 19743:See also 18191:at time 16762:and the 15965:unbiased 15963:and the 15780:′ 15440:unbiased 12416:variance 12330:′ 12189:′ 12109:′ 11897:′ 10297:′ 10244:′ 10227:between 9595:′ 9562:′ 9499:′ 9453:′ 8883:′ 8837:′ 8762:estimand 8603:′ 8557:′ 8504:′ 8458:′ 8405:′ 8360:′ 7816:′ 6357:. Also, 5099:′ 5040:′ 4984:′ 4933:′ 4726:′ 4658:′ 4598:′ 4253:′ 4038:In this 3952:′ 3860:values: 3643:′ 3549:′ 3490:′ 3102:are not 2646:variance 2624:Variance 2514:′ 2069:′ 1751:and the 1407:variance 1333:′ 1058:′ 1010:′ 933:′ 465:of data 279:relative 99:students 76:Examples 20227:, 1980. 20204:, 1988. 20131:5073694 20051:Gnu.org 20019:Bibcode 19976:Bibcode 19750:Average 19278:is the 18124:, with 18030:where: 17846:). The 17692:is the 16946:is not 16766:by the 16737:by the 15850:vector 15116:vector 14624:is the 7662:we get 4275:If the 3071:From a 1266:to be: 536:weights 105:  101:  93:  89:  87:classes 55:average 20266:  20249:  20223:  20200:  20176:  20137:  20129:  19933:  19879:  19246:where 18543:where 17558:0.9901 17551:0.9901 17503:0.9901 17486:0.9901 15438:, the 15003:actual 14958:where 14726:  14563:  14323:where 14297:actual 13949:actual 13650:actual 13330:  13004:  12812:where 12576:  12504:biased 12454:biased 12135:where 11546:where 6461:where 5270:where 2975:, and 1224:, the 20316:Means 20135:S2CID 20107:(PDF) 20075:(PDF) 19832:Notes 17774:is a 13543:(not 9299:Proof 8915:ratio 6904:Proof 6294:With 4470:Proof 3104:i.i.d 2783:With 2662:proof 2660:(see 2654:i.i.d 463:tuple 38:, or 20264:ISBN 20247:OCLC 20221:ISBN 20198:ISBN 20174:ISBN 20127:PMID 20086:2017 20058:2017 19931:ISBN 19877:ISBN 19760:Mean 18980:0.61 18891:0.39 18407:< 18401:< 17939:and 17597:and 15989:is: 15636:nor 14384:and 13597:nor 12940:for 12452:The 12418:and 11785:and 10280:and 2648:and 1755:is: 1127:The 150:4300 45:The 20119:doi 20027:doi 19984:doi 19012:is 17252:100 17150:100 12942:iid 9217:or 9127:If 6964:Var 6765:Var 6103:Var 5396:pwr 5373:Var 5127:Var 5117:pwr 4833:pwr 4298:Var 4123:pwr 3117:In 2675:Var 2664:): 606:is 441:86. 426:0.6 408:0.4 364:0.6 320:0.4 254:86. 161:86. 107:and 95:one 20312:: 20289:. 20219:, 20133:. 20125:. 20115:35 20113:. 20109:. 20094:^ 20077:. 20049:. 20025:. 20015:27 20013:. 20009:. 19992:- 19982:. 19972:29 19970:. 19929:. 19887:^ 19864:^ 19477:. 19282:: 18784:. 18286:. 17723:, 17670:, 17241::= 17191::= 17122::= 17072::= 15760:: 15718:1. 15660:: 15109:. 15022:. 12971:. 12542:: 12395:. 11973:. 11789:. 10064:. 8910:. 8899:. 6681:. 5355:. 5112:. 4272:. 4117:. 3746:. 2903:, 1228:, 432:90 414:80 355:30 349:20 345:30 311:30 305:20 301:20 285:. 245:30 239:20 231:90 225:30 213:80 207:20 153:50 72:. 34:, 20295:. 20272:. 20253:. 20182:. 20141:. 20121:: 20088:. 20060:. 20035:. 20029:: 20021:: 19990:. 19986:: 19978:: 19939:. 19848:. 19727:. 
19721:1 19715:n 19708:2 19704:) 19694:x 19683:i 19679:x 19675:( 19670:n 19665:1 19662:= 19659:i 19648:= 19643:2 19611:n 19607:/ 19601:2 19593:= 19588:2 19578:x 19545:2 19535:x 19506:0 19498:= 19493:i 19454:x 19415:; 19408:2 19403:i 19392:2 19388:) 19378:x 19367:i 19363:x 19359:( 19351:n 19346:1 19343:= 19340:i 19329:) 19326:1 19320:n 19317:( 19313:1 19308:= 19303:2 19264:2 19229:2 19214:2 19204:x 19194:= 19189:2 19179:x 19138:2 19090:w 19070:n 19047:) 19044:w 19038:1 19035:( 19032:n 19025:e 19000:n 18977:= 18971:1 18964:e 18957:1 18934:1 18927:e 18906:) 18903:w 18897:1 18894:( 18888:= 18885:) 18882:w 18876:1 18873:( 18867:1 18860:e 18836:1 18829:) 18825:w 18819:1 18816:( 18795:w 18772:m 18752:) 18749:w 18743:1 18740:( 18736:/ 18732:1 18729:= 18724:1 18720:V 18696:, 18690:w 18684:1 18677:m 18673:w 18666:1 18660:= 18654:1 18648:i 18644:w 18637:m 18632:1 18629:= 18626:i 18618:= 18613:1 18609:V 18583:1 18579:V 18556:1 18552:V 18528:, 18521:1 18517:V 18511:1 18505:i 18501:w 18495:= 18490:i 18486:w 18462:m 18436:1 18433:= 18430:w 18410:1 18398:0 18363:. 18358:i 18352:1 18349:+ 18346:k 18342:x 18336:i 18332:w 18326:m 18321:1 18318:= 18315:i 18307:= 18302:k 18298:z 18274:m 18254:z 18231:i 18227:x 18204:i 18200:t 18179:y 18157:i 18153:t 18132:n 18112:y 18092:x 18064:. 18059:1 18051:C 18046:= 18042:W 18015:, 18012:) 18008:X 18003:W 17997:T 17992:J 17987:( 17982:2 17972:x 17962:= 17953:x 17924:, 17919:1 17912:) 17907:J 17902:W 17896:T 17891:J 17886:( 17883:= 17878:2 17868:x 17834:n 17812:T 17808:] 17804:1 17801:, 17795:, 17792:1 17789:[ 17761:J 17734:x 17709:i 17705:x 17679:C 17656:T 17652:] 17646:n 17642:x 17638:, 17632:, 17627:1 17623:x 17619:[ 17616:= 17612:X 17564:] 17545:[ 17540:= 17535:] 17529:1 17522:1 17516:[ 17509:] 17498:0 17491:0 17480:[ 17475:= 17464:) 17458:2 17453:x 17446:1 17438:2 17433:C 17428:+ 17423:1 17418:x 17411:1 17403:1 17398:C 17392:( 17386:1 17378:) 17372:1 17364:2 17359:C 17354:+ 17349:1 17341:1 17336:C 17330:( 17325:= 17311:x 17275:] 17269:1 17264:0 17257:0 17246:[ 17236:2 17231:C 17225:, 17214:] 17208:1 17203:0 17197:[ 17186:2 17181:x 17156:] 17145:0 17138:0 17133:1 17127:[ 17117:1 17112:C 17106:, 17095:] 17089:0 17084:1 17078:[ 17067:1 17062:x 17035:, 17030:1 17022:) 17016:i 17011:W 17004:n 16999:1 16996:= 16993:i 16984:( 16979:= 16968:x 16960:C 16930:, 16926:) 16920:i 16915:x 16908:i 16903:W 16896:n 16891:1 16888:= 16885:i 16876:( 16864:x 16856:C 16851:= 16841:x 16815:. 16810:1 16802:i 16797:C 16792:= 16787:i 16782:W 16749:C 16723:2 16685:N 16681:/ 16677:1 16674:= 16669:1 16665:V 16660:/ 16654:i 16650:w 16626:. 16618:2 16614:V 16607:1 16601:) 16582:i 16577:x 16571:( 16565:T 16560:) 16541:i 16536:x 16530:( 16523:i 16519:w 16513:N 16508:1 16505:= 16502:i 16491:= 16487:C 16463:1 16460:= 16455:1 16451:V 16420:. 16414:) 16409:1 16405:V 16400:/ 16394:2 16390:V 16386:( 16378:1 16374:V 16367:) 16348:i 16343:x 16337:( 16331:T 16326:) 16307:i 16302:x 16296:( 16289:i 16285:w 16279:N 16274:1 16271:= 16268:i 16257:= 16246:) 16227:i 16222:x 16216:( 16210:T 16205:) 16186:i 16181:x 16175:( 16168:i 16164:w 16158:N 16153:1 16150:= 16147:i 16134:2 16129:i 16125:w 16119:N 16114:1 16111:= 16108:i 16095:2 16090:) 16084:i 16080:w 16074:N 16069:1 16066:= 16063:i 16054:( 16045:i 16041:w 16035:N 16030:1 16027:= 16024:i 16013:= 16005:C 15976:C 15948:. 
15943:i 15938:x 15931:i 15927:w 15921:N 15916:1 15913:= 15910:i 15902:= 15826:i 15822:w 15816:N 15811:1 15808:= 15805:i 15794:i 15790:w 15784:= 15776:i 15772:w 15746:1 15742:V 15715:= 15710:i 15706:w 15700:N 15695:1 15692:= 15689:i 15681:= 15676:1 15672:V 15617:. 15611:1 15603:1 15599:V 15592:) 15573:i 15568:x 15562:( 15556:T 15551:) 15532:i 15527:x 15521:( 15514:i 15510:w 15504:N 15499:1 15496:= 15493:i 15482:= 15478:C 15452:C 15411:. 15404:1 15400:V 15394:) 15375:i 15370:x 15364:( 15358:T 15353:) 15334:i 15329:x 15323:( 15316:i 15312:w 15306:N 15301:1 15298:= 15295:i 15284:= 15280:C 15253:. 15245:i 15241:w 15235:N 15230:1 15227:= 15224:i 15212:i 15207:x 15200:i 15196:w 15190:N 15185:1 15182:= 15179:i 15168:= 15097:0 15089:i 15085:w 15074:K 15058:i 15053:x 15027:N 15008:2 14995:= 14992:] 14987:2 14981:w 14976:s 14972:[ 14966:E 14939:, 14933:) 14928:1 14924:V 14919:/ 14913:2 14909:V 14905:( 14897:1 14893:V 14885:2 14881:) 14862:i 14858:x 14854:( 14849:i 14845:w 14839:N 14834:1 14831:= 14828:i 14817:= 14804:) 14799:2 14794:1 14790:V 14785:/ 14779:2 14775:V 14771:( 14765:1 14759:2 14753:w 14733:= 14721:2 14715:w 14710:s 14678:) 14672:2 14667:1 14663:V 14658:/ 14652:2 14648:V 14643:( 14636:1 14610:f 14607:f 14604:e 14600:N 14596:= 14591:2 14587:V 14582:/ 14576:2 14571:1 14567:V 14542:) 14537:N 14533:1 14527:N 14521:( 14499:) 14491:2 14486:1 14482:V 14476:2 14472:V 14463:1 14459:( 14436:2 14431:i 14427:w 14421:N 14416:1 14413:= 14410:i 14402:= 14397:2 14393:V 14370:i 14366:w 14360:N 14355:1 14352:= 14349:i 14341:= 14336:1 14332:V 14302:2 14288:) 14280:2 14275:1 14271:V 14265:2 14261:V 14252:1 14248:( 14244:= 14234:] 14229:2 14225:) 14221:] 14218:X 14215:[ 14209:E 14203:X 14200:( 14197:[ 14191:E 14184:2 14179:1 14175:V 14169:2 14165:V 14156:] 14151:2 14147:) 14143:] 14140:X 14137:[ 14131:E 14125:X 14122:( 14119:[ 14113:E 14110:= 14096:1 14092:V 14087:] 14082:2 14078:) 14059:i 14055:x 14051:( 14048:[ 14042:E 14037:i 14033:w 14027:N 14022:1 14019:= 14016:i 14005:= 13998:] 13993:2 13987:w 13969:[ 13963:E 13954:2 13940:) 13935:N 13931:1 13925:N 13919:( 13915:= 13905:] 13900:2 13896:) 13892:] 13889:X 13886:[ 13880:E 13874:X 13871:( 13868:[ 13862:E 13857:N 13854:1 13846:] 13841:2 13837:) 13833:] 13830:X 13827:[ 13821:E 13815:X 13812:( 13809:[ 13803:E 13800:= 13788:N 13784:] 13779:2 13775:) 13763:i 13759:x 13755:( 13752:[ 13746:E 13741:N 13736:1 13733:= 13730:i 13719:= 13712:] 13707:2 13690:[ 13684:E 13655:2 13576:i 13572:w 13551:N 13529:i 13525:w 13519:N 13514:1 13511:= 13508:i 13478:2 13473:) 13454:i 13450:x 13445:( 13438:i 13434:w 13428:N 13423:1 13420:= 13417:i 13406:1 13398:i 13394:w 13388:N 13383:1 13380:= 13377:i 13365:i 13361:w 13355:N 13350:1 13347:= 13344:i 13333:= 13325:2 13321:s 13297:} 13292:i 13288:w 13284:{ 13261:} 13258:3 13255:, 13252:1 13249:, 13246:2 13243:{ 13223:} 13220:5 13217:, 13214:4 13211:, 13208:2 13205:{ 13185:} 13182:5 13179:, 13176:5 13173:, 13170:5 13167:, 13164:4 13161:, 13158:2 13155:, 13152:2 13149:{ 13120:1 13112:i 13108:w 13102:N 13097:1 13094:= 13091:i 13079:2 13074:) 13055:i 13051:x 13046:( 13039:i 13035:w 13029:N 13024:1 13021:= 13018:i 13007:= 12999:2 12995:s 12957:N 12953:N 12926:2 12899:2 12893:w 12854:1 12851:= 12846:i 12842:w 12836:N 12831:1 12828:= 12825:i 12788:i 12784:w 12778:N 12773:1 12770:= 12767:i 12755:2 12750:) 12731:i 12727:x 12722:( 12715:i 12711:w 12705:N 12700:1 12697:= 12694:i 12683:= 12674:2 12668:w 12644:N 12638:2 12633:) 12621:i 12617:x 12612:( 12605:N 12600:1 12597:= 12594:i 12583:= 12571:2 12528:2 12488:2 12482:w 12377:i 12373:w 12366:n 
12361:1 12358:= 12355:i 12344:i 12340:w 12334:= 12326:i 12322:w 12292:n 12286:/ 12280:0 12272:= 12262:x 12233:2 12228:0 12203:1 12193:2 12184:i 12180:w 12173:n 12168:1 12165:= 12162:i 12151:n 12147:/ 12143:1 12120:, 12113:2 12104:i 12100:w 12093:n 12088:1 12085:= 12082:i 12072:2 12067:0 12059:= 12054:2 12044:x 12012:2 12007:0 11999:= 11994:2 11989:i 11950:x 11917:2 11912:i 11901:2 11892:i 11888:w 11881:n 11876:1 11873:= 11870:i 11862:= 11857:2 11847:x 11815:2 11810:i 11755:2 11751:) 11745:w 11735:x 11723:i 11719:x 11715:( 11710:2 11705:i 11701:w 11689:2 11685:) 11675:w 11669:n 11666:( 11663:) 11660:1 11654:n 11651:( 11647:n 11642:= 11631:2 11621:x 11586:n 11580:i 11576:w 11566:= 11557:w 11530:] 11524:2 11520:) 11510:w 11499:i 11495:w 11491:( 11483:2 11478:w 11468:x 11461:+ 11458:) 11453:w 11443:x 11430:w 11419:i 11415:x 11409:i 11405:w 11401:( 11398:) 11389:w 11378:i 11374:w 11370:( 11362:w 11352:x 11345:2 11337:2 11333:) 11327:w 11317:x 11304:w 11293:i 11289:x 11283:i 11279:w 11275:( 11268:[ 11259:2 11255:) 11245:w 11239:n 11236:( 11233:) 11230:1 11224:n 11221:( 11217:n 11212:= 11201:2 11194:w 11184:x 11118:. 11113:2 11109:) 11103:w 11093:y 11081:i 11077:y 11073:( 11068:2 11063:i 11059:w 11053:n 11048:1 11045:= 11042:i 11029:2 11025:) 11019:i 11015:w 11009:n 11004:1 11001:= 10998:i 10990:( 10986:1 10981:= 10976:2 10971:) 10963:i 10952:w 10942:y 10930:i 10926:y 10919:) 10914:i 10903:1 10900:( 10896:( 10889:n 10884:1 10881:= 10878:i 10866:2 10856:N 10849:1 10844:= 10834:) 10829:w 10819:y 10812:( 10809:V 10785:1 10779:) 10774:i 10763:1 10760:( 10740:0 10737:= 10734:) 10729:j 10725:I 10721:, 10716:i 10712:I 10708:( 10705:C 10702:= 10697:j 10694:i 10686:: 10683:j 10677:i 10652:. 10648:) 10640:j 10629:w 10619:y 10607:j 10603:y 10592:i 10581:w 10571:y 10559:i 10555:y 10546:j 10543:i 10525:( 10519:n 10514:1 10511:= 10508:j 10498:n 10493:1 10490:= 10487:i 10475:2 10465:N 10458:1 10453:= 10443:) 10438:w 10428:y 10421:( 10418:V 10411:= 10401:) 10392:R 10386:( 10383:V 10357:1 10354:= 10349:i 10345:z 10319:i 10315:z 10309:i 10305:I 10301:= 10293:i 10289:z 10266:i 10262:y 10256:i 10252:I 10248:= 10240:i 10236:y 10225:i 10221:n 10207:0 10204:= 10201:) 10196:j 10192:I 10188:, 10183:i 10179:I 10175:( 10172:C 10169:= 10164:j 10161:i 10153:: 10150:j 10144:i 10117:) 10108:Z 10102:, 10093:Y 10087:( 10078:C 10051:] 10047:) 10038:Z 10032:, 10023:Y 10017:( 10008:C 9996:R 9990:2 9977:) 9968:Z 9962:( 9959:V 9946:R 9940:+ 9930:) 9921:Y 9915:( 9912:V 9904:[ 9896:2 9886:Z 9879:1 9874:= 9870:) 9862:j 9851:j 9847:z 9837:R 9826:j 9822:y 9811:i 9800:i 9796:z 9786:R 9775:i 9771:y 9762:j 9759:i 9741:( 9735:n 9730:1 9727:= 9724:j 9714:n 9709:1 9706:= 9703:i 9691:2 9681:Z 9674:1 9669:= 9659:) 9650:R 9644:( 9641:V 9612:) 9604:i 9591:i 9587:z 9581:R 9571:i 9558:i 9554:y 9547:( 9541:n 9536:1 9533:= 9530:i 9520:Z 9517:1 9512:+ 9509:R 9495:i 9491:z 9485:i 9481:w 9475:n 9470:1 9467:= 9464:i 9449:i 9445:y 9439:i 9435:w 9429:n 9424:1 9421:= 9418:i 9407:= 9397:Z 9387:Y 9380:= 9371:R 9339:Z 9329:Y 9322:= 9313:R 9293:N 9277:i 9273:w 9248:i 9244:p 9240:1 9235:= 9230:i 9226:w 9201:i 9193:1 9188:= 9183:i 9179:w 9158:n 9153:i 9149:p 9140:i 9123:. 
9109:2 9105:) 9099:w 9089:y 9077:i 9073:y 9069:( 9064:2 9059:i 9055:w 9049:n 9044:1 9041:= 9038:i 9025:2 9021:) 9015:i 9011:w 9005:n 9000:1 8997:= 8994:i 8986:( 8982:1 8977:= 8967:) 8962:w 8952:y 8945:( 8942:V 8908:R 8879:i 8875:1 8869:i 8865:w 8859:n 8854:1 8851:= 8848:i 8833:i 8829:y 8823:i 8819:w 8813:n 8808:1 8805:= 8802:i 8791:= 8786:w 8776:y 8743:i 8739:w 8733:n 8728:1 8725:= 8722:i 8710:i 8706:y 8700:i 8696:w 8690:n 8685:1 8682:= 8679:i 8668:= 8663:w 8653:y 8627:w 8617:y 8610:= 8599:i 8595:1 8589:i 8585:w 8579:n 8574:1 8571:= 8568:i 8553:i 8549:y 8543:i 8539:w 8533:n 8528:1 8525:= 8522:i 8511:= 8500:i 8496:1 8490:i 8486:w 8480:N 8475:1 8472:= 8469:i 8454:i 8450:y 8444:i 8440:w 8434:N 8429:1 8426:= 8423:i 8412:= 8401:i 8391:1 8382:N 8377:1 8374:= 8371:i 8356:i 8346:y 8337:N 8332:1 8329:= 8326:i 8315:= 8305:i 8297:1 8290:i 8286:I 8280:N 8275:1 8272:= 8269:i 8255:i 8245:i 8241:y 8233:i 8229:I 8223:N 8218:1 8215:= 8212:i 8201:= 8187:Y 8179:= 8170:R 8142:i 8138:w 8132:N 8127:1 8124:= 8121:i 8109:i 8105:y 8099:i 8095:w 8089:N 8084:1 8081:= 8078:i 8067:= 8059:i 8049:1 8040:N 8035:1 8032:= 8029:i 8017:i 8007:y 7998:N 7993:1 7990:= 7987:i 7976:= 7966:i 7958:1 7951:N 7946:1 7943:= 7940:i 7926:i 7916:i 7912:y 7904:N 7899:1 7896:= 7893:i 7882:= 7873:Y 7867:= 7864:R 7842:i 7838:y 7812:i 7802:1 7793:n 7788:1 7785:= 7782:i 7774:= 7767:i 7757:i 7753:I 7745:n 7740:1 7737:= 7734:i 7726:= 7721:i 7717:I 7711:i 7707:w 7701:n 7696:1 7693:= 7690:i 7682:= 7673:N 7646:i 7638:1 7633:= 7628:i 7624:w 7603:N 7577:N 7554:N 7534:N 7508:Y 7491:π 7467:2 7462:) 7456:i 7452:y 7446:i 7442:w 7437:( 7430:n 7425:1 7422:= 7419:i 7407:2 7403:N 7399:1 7394:= 7383:) 7375:i 7365:i 7361:y 7351:i 7341:i 7337:y 7331:) 7326:i 7315:1 7312:( 7308:( 7302:n 7297:1 7294:= 7291:i 7279:2 7275:N 7271:1 7266:= 7255:) 7249:i 7239:y 7230:i 7220:y 7211:i 7208:i 7190:( 7184:n 7179:1 7176:= 7173:i 7161:2 7157:N 7153:1 7148:= 7137:) 7131:j 7121:y 7112:i 7102:y 7093:j 7090:i 7072:( 7066:n 7061:1 7058:= 7055:j 7045:n 7040:1 7037:= 7034:i 7022:2 7018:N 7014:1 7009:= 7002:) 6996:) 6992:N 6977:Y 6970:( 6940:1 6934:) 6929:i 6918:1 6915:( 6886:2 6881:) 6875:i 6871:y 6865:i 6861:w 6856:( 6849:n 6844:1 6841:= 6838:i 6826:2 6822:N 6818:1 6813:= 6810:) 6804:) 6800:N 6780:Y 6771:( 6742:0 6739:= 6736:) 6731:j 6727:I 6723:, 6718:i 6714:I 6710:( 6707:C 6704:: 6701:j 6695:i 6667:i 6656:1 6653:= 6646:i 6635:i 6625:i 6611:1 6608:= 6603:i 6600:i 6562:j 6559:i 6548:j 6538:i 6524:1 6521:= 6516:j 6513:i 6477:j 6474:i 6447:j 6444:i 6436:= 6431:j 6421:i 6408:j 6405:i 6397:= 6394:) 6389:j 6385:I 6381:, 6376:i 6372:I 6368:( 6365:C 6341:i 6331:i 6327:y 6321:= 6316:i 6306:y 6279:) 6273:j 6263:y 6254:i 6244:y 6235:j 6232:i 6214:( 6208:n 6203:1 6200:= 6197:j 6187:n 6182:1 6179:= 6176:i 6164:2 6160:N 6156:1 6151:= 6148:) 6142:) 6138:N 6118:Y 6109:( 6065:2 6060:) 6050:y 6047:w 6036:i 6032:y 6026:i 6022:w 6017:( 6010:n 6005:1 6002:= 5999:i 5988:1 5982:n 5978:n 5973:= 5961:2 5956:) 5946:y 5943:w 5932:i 5928:y 5922:i 5918:w 5913:( 5906:n 5901:1 5898:= 5895:i 5884:1 5878:n 5874:1 5867:n 5862:2 5858:n 5852:= 5840:2 5835:) 5829:n 5823:i 5819:y 5813:i 5809:w 5803:n 5798:1 5795:= 5792:i 5781:n 5771:i 5761:i 5757:y 5751:n 5747:( 5740:n 5735:1 5732:= 5729:i 5718:1 5712:n 5708:1 5701:n 5698:1 5693:= 5688:2 5683:) 5677:i 5673:y 5667:i 5663:w 5657:n 5652:1 5649:= 5646:i 5636:n 5633:n 5621:i 5617:p 5611:i 5607:y 5599:n 5596:n 5590:( 5583:n 5578:1 5575:= 5572:i 5561:1 5555:n 5551:1 5544:n 5541:1 5536:= 5524:2 5519:) 5513:r 5510:w 5507:p 5497:Y 5483:i 5479:p 5473:i 5469:y 5462:( 5455:n 
5450:1 5447:= 5444:i 5433:1 5427:n 5423:1 5416:n 5413:1 5408:= 5401:) 5386:Y 5379:( 5341:n 5335:i 5331:y 5325:i 5321:w 5312:n 5307:1 5304:= 5301:i 5293:= 5284:y 5281:w 5256:2 5251:) 5241:y 5238:w 5227:i 5223:y 5217:i 5213:w 5208:( 5201:n 5196:1 5193:= 5190:i 5179:1 5173:n 5169:n 5164:= 5161:) 5156:r 5153:w 5150:p 5140:Y 5133:( 5095:i 5091:y 5085:i 5081:w 5075:n 5070:1 5067:= 5064:i 5056:= 5049:i 5036:i 5032:y 5024:n 5019:1 5016:= 5013:i 4997:i 4993:p 4989:n 4980:i 4976:y 4968:n 4963:1 4960:= 4957:i 4949:= 4942:i 4938:p 4929:i 4925:y 4917:n 4912:1 4909:= 4906:i 4896:n 4893:1 4888:= 4883:r 4880:w 4877:p 4867:Y 4843:p 4791:i 4787:y 4781:N 4776:1 4773:= 4770:i 4762:= 4759:Y 4736:. 4731:N 4722:i 4718:y 4712:i 4708:w 4702:n 4697:1 4694:= 4691:i 4680:= 4675:N 4667:i 4654:i 4650:y 4642:n 4637:1 4634:= 4631:i 4615:N 4607:i 4603:p 4594:i 4590:y 4582:n 4577:1 4574:= 4571:i 4561:n 4558:1 4550:= 4545:N 4540:r 4537:w 4534:p 4524:Y 4515:= 4510:N 4490:Y 4452:2 4447:) 4437:y 4434:w 4423:i 4419:y 4413:i 4409:w 4404:( 4397:n 4392:1 4389:= 4386:i 4375:1 4369:n 4365:n 4356:2 4352:N 4348:1 4343:= 4339:) 4334:N 4314:Y 4305:( 4281:n 4258:N 4249:i 4245:y 4239:i 4235:w 4229:n 4224:1 4221:= 4218:i 4202:N 4197:r 4194:w 4191:p 4181:Y 4172:= 4167:N 4147:Y 4131:N 4100:i 4096:p 4089:n 4085:1 4073:i 4065:1 4060:= 4055:i 4051:w 4020:i 4009:i 4005:y 3999:i 3995:I 3988:= 3983:i 3973:y 3964:i 3960:I 3956:= 3948:i 3938:y 3912:i 3902:y 3895:n 3892:= 3885:i 3881:p 3875:i 3871:y 3858:y 3844:p 3820:i 3810:i 3806:y 3800:= 3795:i 3785:y 3771:y 3734:) 3729:i 3718:1 3715:( 3710:i 3700:2 3695:i 3691:y 3687:= 3684:] 3679:i 3675:I 3671:[ 3668:V 3663:2 3658:i 3654:y 3650:= 3647:] 3639:i 3635:y 3631:[ 3628:V 3606:i 3596:i 3592:y 3588:= 3585:] 3580:i 3576:I 3572:[ 3569:E 3564:i 3560:y 3556:= 3553:] 3545:i 3541:y 3537:[ 3534:E 3512:i 3508:I 3502:i 3498:y 3494:= 3486:i 3482:y 3459:i 3455:I 3432:i 3428:y 3397:i 3393:p 3370:n 3365:i 3350:i 3346:p 3342:= 3339:) 3330:| 3326:1 3323:= 3318:i 3314:I 3310:( 3307:P 3285:i 3277:= 3274:) 3271:n 3260:1 3257:= 3252:i 3248:I 3244:( 3241:P 3227:i 3211:i 3207:I 3188:y 3168:N 3145:N 3135:T 3131:Y 3127:y 3123:y 3088:i 3084:y 3061:y 3043:n 3037:2 3032:i 3028:w 3022:n 3017:1 3014:= 3011:i 3000:= 2990:2 2986:w 2961:n 2955:i 2951:w 2945:n 2940:1 2937:= 2934:i 2923:= 2914:w 2888:1 2882:n 2875:2 2871:) 2861:y 2850:i 2846:y 2842:( 2837:n 2832:1 2829:= 2826:i 2815:= 2810:2 2805:y 2764:2 2754:w 2741:2 2737:w 2728:2 2723:y 2706:= 2703:) 2698:w 2688:y 2681:( 2635:n 2609:. 2603:= 2600:) 2591:x 2585:( 2582:E 2559:= 2554:i 2529:. 2523:i 2510:i 2506:w 2499:n 2494:1 2491:= 2488:i 2480:= 2477:) 2468:x 2462:( 2459:E 2439:, 2433:i 2424:= 2421:) 2416:i 2412:x 2408:( 2405:E 2376:x 2329:. 2322:2 2317:i 2307:i 2303:x 2295:n 2290:1 2287:= 2284:i 2274:2 2264:x 2254:= 2245:x 2216:. 2209:2 2204:) 2198:2 2190:i 2180:n 2175:1 2172:= 2169:i 2160:( 2151:2 2146:i 2136:4 2128:i 2117:n 2112:1 2109:= 2106:i 2095:= 2089:2 2084:i 2073:2 2064:i 2060:w 2053:n 2048:1 2045:= 2042:i 2034:= 2029:2 2019:x 1987:0 1979:= 1974:i 1949:n 1945:/ 1939:2 1934:0 1926:= 1921:2 1911:x 1881:, 1872:i 1868:w 1862:n 1857:1 1854:= 1851:i 1842:1 1836:= 1827:2 1819:i 1809:n 1804:1 1801:= 1798:i 1789:1 1783:= 1773:x 1736:, 1728:i 1724:w 1718:n 1713:1 1710:= 1707:i 1696:) 1690:i 1686:w 1677:i 1673:x 1668:( 1662:n 1657:1 1654:= 1651:i 1640:= 1629:2 1624:i 1616:1 1608:n 1603:1 1600:= 1597:i 1586:) 1578:2 1573:i 1563:i 1559:x 1552:( 1546:n 1541:1 1538:= 1535:i 1524:= 1515:x 1486:. 
1479:2 1474:i 1466:1 1461:= 1456:i 1452:w 1426:2 1421:i 1387:i 1383:x 1337:2 1328:i 1324:w 1318:n 1313:1 1310:= 1307:i 1294:= 1284:x 1243:x 1210:2 1175:i 1171:x 1164:n 1159:1 1156:= 1153:i 1143:n 1140:1 1123:. 1105:j 1101:w 1094:n 1089:1 1086:= 1083:j 1072:i 1068:w 1062:= 1054:i 1050:w 1034:. 1019:i 1015:x 1006:i 1002:w 995:n 990:1 987:= 984:i 976:= 967:x 941:1 938:= 929:i 925:w 918:n 913:1 910:= 907:i 875:. 867:n 863:w 859:+ 853:+ 848:2 844:w 840:+ 835:1 831:w 823:n 819:x 813:n 809:w 805:+ 799:+ 794:2 790:x 784:2 780:w 776:+ 771:1 767:x 761:1 757:w 750:= 741:x 712:, 704:i 700:w 694:n 689:1 686:= 683:i 671:i 667:x 661:i 657:w 651:n 646:1 643:= 640:i 629:= 620:x 593:) 587:n 583:w 579:, 573:, 568:2 564:w 560:, 555:1 551:w 546:( 521:) 515:n 511:x 507:, 501:, 496:2 492:x 488:, 483:1 479:x 474:( 438:= 435:) 423:( 420:+ 417:) 405:( 402:= 393:x 361:= 352:+ 317:= 308:+ 251:= 242:+ 234:) 222:( 219:+ 216:) 204:( 198:= 189:x 158:= 145:= 136:x 103:— 91:— 42:. 20:)
