{\displaystyle {\begin{aligned}\Pr(Y_{i}=1\mid \mathbf {X} _{i})={}&\Pr \left(Y_{i}^{1\ast }>Y_{i}^{0\ast }\mid \mathbf {X} _{i}\right)&\\={}&\Pr \left(Y_{i}^{1\ast }-Y_{i}^{0\ast }>0\mid \mathbf {X} _{i}\right)&\\={}&\Pr \left({\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}+\varepsilon _{1}-\left({\boldsymbol {\beta }}_{0}\cdot \mathbf {X} _{i}+\varepsilon _{0}\right)>0\right)&\\={}&\Pr \left(({\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}-{\boldsymbol {\beta }}_{0}\cdot \mathbf {X} _{i})+(\varepsilon _{1}-\varepsilon _{0})>0\right)&\\={}&\Pr(({\boldsymbol {\beta }}_{1}-{\boldsymbol {\beta }}_{0})\cdot \mathbf {X} _{i}+(\varepsilon _{1}-\varepsilon _{0})>0)&\\={}&\Pr(({\boldsymbol {\beta }}_{1}-{\boldsymbol {\beta }}_{0})\cdot \mathbf {X} _{i}+\varepsilon >0)&&{\text{(substitute }}\varepsilon {\text{ as above)}}\\={}&\Pr({\boldsymbol {\beta }}\cdot \mathbf {X} _{i}+\varepsilon >0)&&{\text{(substitute }}{\boldsymbol {\beta }}{\text{ as above)}}\\={}&\Pr(\varepsilon >-{\boldsymbol {\beta }}\cdot \mathbf {X} _{i})&&{\text{(now, same as above model)}}\\={}&\Pr(\varepsilon <{\boldsymbol {\beta }}\cdot \mathbf {X} _{i})&\\={}&\operatorname {logit} ^{-1}({\boldsymbol {\beta }}\cdot \mathbf {X} _{i})\\={}&p_{i}\end{aligned}}}
changes the utility of a given choice. A voter might expect that the right-of-center party would lower taxes, especially on rich people. This would give low-income people no benefit, i.e. no change in utility (since they usually don't pay taxes); would cause moderate benefit (i.e. somewhat more money, or a moderate utility increase) for middle-income people; and would cause significant benefits for high-income people. On the other hand, the left-of-center party might be expected to raise taxes and offset them with increased welfare and other assistance for the lower and middle classes. This would cause significant positive benefit to low-income people, perhaps a weak benefit to middle-income people, and significant negative benefit to high-income people. Finally, the secessionist party would take no direct actions on the economy, but simply secede. A low-income or middle-income voter might expect basically no clear utility gain or loss from this, but a high-income voter might expect negative utility, since they are likely to own companies, which will have a harder time doing business in such an environment and will probably lose money.
{\displaystyle {\begin{aligned}\Pr(Y_{i}=1)&={\frac {e^{({\boldsymbol {\beta }}_{1}+\mathbf {C} )\cdot \mathbf {X} _{i}}}{e^{({\boldsymbol {\beta }}_{0}+\mathbf {C} )\cdot \mathbf {X} _{i}}+e^{({\boldsymbol {\beta }}_{1}+\mathbf {C} )\cdot \mathbf {X} _{i}}}}\\&={\frac {e^{{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}}e^{\mathbf {C} \cdot \mathbf {X} _{i}}}{e^{{\boldsymbol {\beta }}_{0}\cdot \mathbf {X} _{i}}e^{\mathbf {C} \cdot \mathbf {X} _{i}}+e^{{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}}e^{\mathbf {C} \cdot \mathbf {X} _{i}}}}\\&={\frac {e^{\mathbf {C} \cdot \mathbf {X} _{i}}e^{{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}}}{e^{\mathbf {C} \cdot \mathbf {X} _{i}}(e^{{\boldsymbol {\beta }}_{0}\cdot \mathbf {X} _{i}}+e^{{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}})}}\\&={\frac {e^{{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}}}{e^{{\boldsymbol {\beta }}_{0}\cdot \mathbf {X} _{i}}+e^{{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}}}}.\end{aligned}}}
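The cancellation of the shared factor e^{C·X_i} in this derivation is easy to verify numerically. The sketch below uses made-up scores for β₀·X_i, β₁·X_i, and C·X_i; the function name is illustrative:

```python
import math

def two_class_softmax(s0: float, s1: float) -> float:
    """Pr(Y=1) for a two-class softmax over linear scores s0 and s1."""
    return math.exp(s1) / (math.exp(s0) + math.exp(s1))

# Illustrative scores beta_0·X_i and beta_1·X_i for one observation.
s0, s1 = 0.7, 2.1
shift = 5.0  # a common term C·X_i added to both scores

p = two_class_softmax(s0, s1)
p_shifted = two_class_softmax(s0 + shift, s1 + shift)
assert abs(p - p_shifted) < 1e-12  # the shared factor e^{C·X_i} cancels
```

This invariance is why one of the two coefficient vectors can be fixed to zero without changing the model.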
a single degree of freedom. If the predictor model has significantly smaller deviance (cf. the chi-square distribution, using the difference in degrees of freedom of the two models), then one can conclude that there is a significant association between the "predictor" and the outcome. Although some common statistical packages (e.g. SPSS) do provide likelihood ratio test statistics, without this computationally intensive test it would be more difficult to assess the contribution of individual predictors in the multiple logistic regression case. To assess the contribution of individual predictors one can enter the predictors hierarchically, comparing each new model with the previous one to determine the contribution of each predictor. There is some debate among statisticians about the appropriateness of so-called "stepwise" procedures. The fear is that they may not preserve nominal statistical properties and may become misleading.
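A sketch of this likelihood-ratio comparison (the deviance values below are invented for illustration; for one degree of freedom the chi-square tail probability equals erfc(√(x/2))):

```python
import math

def chi2_sf_df1(x: float) -> float:
    """Tail probability P(X > x) for a chi-square variable with 1 degree
    of freedom, via the identity P(X > x) = erfc(sqrt(x/2))."""
    return math.erfc(math.sqrt(x / 2.0))

# Invented deviances: a null model and a model with one extra predictor.
d_null, d_fitted = 42.3, 36.1
lr_stat = d_null - d_fitted     # drop in deviance from adding the predictor
p_value = chi2_sf_df1(lr_stat)  # assessed on chi-square with 1 df
```

Here the drop of about 6.2 yields a p-value below 0.05, so under these invented numbers the predictor's contribution would be judged significant.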
probability value ranging between 0 and 1. This probability indicates the likelihood that a given input corresponds to one of two predefined categories. The essential mechanism of logistic regression is grounded in the logistic function's ability to model the probability of binary outcomes accurately. With its distinctive S-shaped curve, the logistic function effectively maps any real-valued number to a value within the 0 to 1 interval. This feature renders it particularly suitable for binary classification tasks, such as sorting emails into "spam" or "not spam". By calculating the probability that the dependent variable will be categorized into a specific group, logistic regression provides a probabilistic framework that supports informed decision-making.
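A minimal sketch of the S-shaped mapping described above (the function name and sample inputs are illustrative):

```python
import math

def logistic(t: float) -> float:
    """Standard logistic (sigmoid) function: maps any real t into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-t))

# Strongly negative scores give probabilities near 0, strongly positive
# scores give probabilities near 1, and a score of 0 maps to exactly 0.5.
assert logistic(-6.0) < 0.01
assert logistic(0.0) == 0.5
assert logistic(6.0) > 0.99
```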
{\displaystyle {\begin{aligned}D_{\text{null}}-D_{\text{fitted}}&=-2\left(\ln {\frac {\text{likelihood of null model}}{\text{likelihood of the saturated model}}}-\ln {\frac {\text{likelihood of fitted model}}{\text{likelihood of the saturated model}}}\right)\\&=-2\ln {\frac {\left({\dfrac {\text{likelihood of null model}}{\text{likelihood of the saturated model}}}\right)}{\left({\dfrac {\text{likelihood of fitted model}}{\text{likelihood of the saturated model}}}\right)}}\\&=-2\ln {\frac {\text{likelihood of the null model}}{\text{likelihood of fitted model}}}.\end{aligned}}}
{\displaystyle {\begin{aligned}\Pr(Y_{i}=1\mid \mathbf {X} _{i})&=\Pr(Y_{i}^{\ast }>0\mid \mathbf {X} _{i})\\&=\Pr({\boldsymbol {\beta }}\cdot \mathbf {X} _{i}+\varepsilon _{i}>0)\\&=\Pr(\varepsilon _{i}>-{\boldsymbol {\beta }}\cdot \mathbf {X} _{i})\\&=\Pr(\varepsilon _{i}<{\boldsymbol {\beta }}\cdot \mathbf {X} _{i})&&{\text{(because the logistic distribution is symmetric)}}\\&=\operatorname {logit} ^{-1}({\boldsymbol {\beta }}\cdot \mathbf {X} _{i})&\\&=p_{i}&&{\text{(see above)}}\end{aligned}}}
Thus, although the observed dependent variable in binary logistic regression is a 0-or-1 variable, the logistic regression estimates the odds, as a continuous variable, that the dependent variable is a 'success'. In some applications, the odds are all that is needed. In others, a specific yes-or-no prediction is needed for whether the dependent variable is or is not a 'success'; this categorical prediction can be based on the computed odds of success, with predicted odds above some chosen cutoff value being translated into a prediction of success.
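The cutoff rule can be sketched as follows (the 0.5 probability cutoff, equivalent to odds of 1, and the scores are illustrative choices, not prescribed by the text):

```python
import math

def predict_success(score: float, cutoff: float = 0.5) -> int:
    """Translate computed odds of success into a yes/no (1/0) prediction."""
    odds = math.exp(score)      # odds implied by a log-odds score
    prob = odds / (1.0 + odds)  # equivalent probability of success
    return 1 if prob > cutoff else 0

assert predict_success(1.5) == 1   # high odds -> predicted success
assert predict_success(-1.5) == 0  # low odds -> predicted failure
```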
{\displaystyle {\begin{aligned}&\lim \limits _{N\rightarrow +\infty }N^{-1}\sum _{i=1}^{N}\log \Pr(y_{i}\mid x_{i};\theta )=\sum _{x\in {\mathcal {X}}}\sum _{y\in {\mathcal {Y}}}\Pr(X=x,Y=y)\log \Pr(Y=y\mid X=x;\theta )\\={}&\sum _{x\in {\mathcal {X}}}\sum _{y\in {\mathcal {Y}}}\Pr(X=x,Y=y)\left(-\log {\frac {\Pr(Y=y\mid X=x)}{\Pr(Y=y\mid X=x;\theta )}}+\log \Pr(Y=y\mid X=x)\right)\\={}&-D_{\text{KL}}(Y\parallel Y_{\theta })-H(Y\mid X)\end{aligned}}}
) rather than a continuous outcome. Given this difference, the assumptions of linear regression are violated. In particular, the residuals cannot be normally distributed. In addition, linear regression may make nonsensical predictions for a binary dependent variable. What is needed is a way to convert a binary variable into a continuous one that can take on any real value (negative or positive). To do that, binomial logistic regression first calculates the
The Lagrangian is equal to the entropy plus the sum of the products of Lagrange multipliers times various constraint expressions. The general multinomial case will be considered, since the proof is not made that much simpler by considering simpler cases. Equating the derivative of the Lagrangian with respect to the various probabilities to zero yields a functional form for those probabilities which corresponds to those used in logistic regression.
{\displaystyle \Pr(Y_{i}=y\mid \mathbf {X} _{i})={p_{i}}^{y}(1-p_{i})^{1-y}=\left({\frac {e^{{\boldsymbol {\beta }}\cdot \mathbf {X} _{i}}}{1+e^{{\boldsymbol {\beta }}\cdot \mathbf {X} _{i}}}}\right)^{y}\left(1-{\frac {e^{{\boldsymbol {\beta }}\cdot \mathbf {X} _{i}}}{1+e^{{\boldsymbol {\beta }}\cdot \mathbf {X} _{i}}}}\right)^{1-y}={\frac {e^{{\boldsymbol {\beta }}\cdot \mathbf {X} _{i}\cdot y}}{1+e^{{\boldsymbol {\beta }}\cdot \mathbf {X} _{i}}}}}
a model with at least one predictor and the saturated model. In this respect, the null model provides a baseline upon which to compare predictor models. Given that deviance is a measure of the difference between a given model and the saturated model, smaller values indicate better fit. Thus, to assess the contribution of a predictor or set of predictors, one can subtract the model deviance from the null deviance and assess the difference on a
), meaning the actual outcome is "more surprising". Since the value of the logistic function is always strictly between zero and one, the log loss is always greater than zero and less than infinity. Unlike in a linear regression, where the model can have zero loss at a point by passing through a data point (and zero loss overall if all points are on a line), in a logistic regression it is not possible to have zero loss at any points, since
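The claim that the log loss is strictly positive can be checked directly, since the logistic output never reaches 0 or 1 exactly (the score value below is illustrative):

```python
import math

def log_loss(y: int, p: float) -> float:
    """Logistic (cross-entropy) loss for one observation with label y in {0, 1}."""
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

# A logistic model's prediction is strictly inside (0, 1) ...
p = 1.0 / (1.0 + math.exp(-2.5))
# ... so the loss is positive whichever label is observed.
assert log_loss(1, p) > 0.0
assert log_loss(0, p) > log_loss(1, p)  # the "wrong" label is more surprising
```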
{\displaystyle {\begin{aligned}\Pr(Y_{i}=0)&={\frac {e^{{\boldsymbol {\beta }}_{0}\cdot \mathbf {X} _{i}}}{e^{{\boldsymbol {\beta }}_{0}\cdot \mathbf {X} _{i}}+e^{{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}}}}\\\Pr(Y_{i}=1)&={\frac {e^{{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}}}{e^{{\boldsymbol {\beta }}_{0}\cdot \mathbf {X} _{i}}+e^{{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}}}}.\end{aligned}}}
subjectively) regard confidence interval coverage less than 93 percent, type I error greater than 7 percent, or relative bias greater than 15 percent as problematic, our results indicate that problems are fairly frequent with 2–4 EPV, uncommon with 5–9 EPV, and still observed with 10–16 EPV. The worst instances of each problem were not severe with 5–9 EPV and usually comparable to those with 10–16 EPV".
) increases, becoming exactly chi-square distributed in the limit of an infinite number of data points. As in the case of linear regression, we may use this fact to estimate the probability that a random set of data points will give a better fit than the fit obtained by the proposed model, and so have an estimate of how significantly the model is improved by including the
{\displaystyle {\begin{aligned}Y_{i}\mid x_{1,i},\ldots ,x_{m,i}\ &\sim \operatorname {Bernoulli} (p_{i})\\\operatorname {\mathbb {E} } [Y_{i}\mid x_{1,i},\ldots ,x_{m,i}]&=p_{i}\\\Pr(Y_{i}=y\mid x_{1,i},\ldots ,x_{m,i})&={\begin{cases}p_{i}&{\text{if }}y=1\\1-p_{i}&{\text{if }}y=0\end{cases}}\\\Pr(Y_{i}=y\mid x_{1,i},\ldots ,x_{m,i})&=p_{i}^{y}(1-p_{i})^{(1-y)}\end{aligned}}}
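The compact form p_i^y (1 − p_i)^{1−y} reproduces both branches of the case split above; a quick check (the probability value is arbitrary):

```python
def bernoulli_pmf(y: int, p: float) -> float:
    """Compact Bernoulli pmf p**y * (1-p)**(1-y), for y in {0, 1}."""
    return p ** y * (1 - p) ** (1 - y)

p = 0.3
assert bernoulli_pmf(1, p) == p      # recovers the y = 1 branch
assert bernoulli_pmf(0, p) == 1 - p  # recovers the y = 0 branch
```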
healthy people in order to obtain data for only a few diseased individuals. Thus, we may evaluate more diseased individuals, perhaps all of the rare outcomes. This is also called retrospective sampling or, equivalently, unbalanced data. As a rule of thumb, sampling controls at a rate of five times the number of cases will produce sufficient control data.
Smaller values indicate better fit as the fitted model deviates less from the saturated model. When assessed upon a chi-square distribution, nonsignificant chi-square values indicate very little unexplained variance and thus, good model fit. Conversely, a significant chi-square value indicates that a significant amount of the variance is unexplained.
The model of logistic regression, however, is based on quite different assumptions (about the relationship between the dependent and independent variables) from those of linear regression. In particular, the key differences between these two models can be seen in the following two features of logistic regression. First, the conditional distribution
surpassed it. This relative popularity was due to the adoption of the logit outside of bioassay, rather than displacing the probit within bioassay, and its informal use in practice; the logit's popularity is credited to the logit model's computational simplicity, mathematical properties, and generality, allowing its use in varied fields.
= 0. This was convenient, but not necessary. Again, the optimum beta coefficients may be found by maximizing the log-likelihood function, generally using numerical methods. A possible method of solution is to set the derivatives of the log-likelihood with respect to each beta coefficient equal to zero and solve for the beta coefficients:
predictors. The model will not converge with zero cell counts for categorical predictors because the natural logarithm of zero is an undefined value so that the final solution to the model cannot be reached. To remedy this problem, researchers may collapse categories in a theoretically meaningful way or add a constant to all cells.
predictor. In logistic regression, however, the regression coefficients represent the change in the logit for each unit change in the predictor. Given that the logit is not intuitive, researchers are likely to focus on a predictor's effect on the exponential function of the regression coefficient – the odds ratio (see
Suppose cases are rare. Then we might wish to sample them more frequently than their prevalence in the population. For example, suppose there is a disease that affects 1 person in 10,000 and to collect our data we need to do a complete physical. It may be too expensive to do thousands of physicals of
Two measures of deviance are particularly important in logistic regression: null deviance and model deviance. The null deviance represents the difference between a model with only the intercept (which means "no predictors") and the saturated model. The model deviance represents the difference between
Others have found results that are not consistent with the above, using different criteria. A useful criterion is whether the fitted model will be expected to achieve the same predictive discrimination in a new sample as it appeared to achieve in the model development sample. For that criterion, 20
Although several statistical packages (e.g., SPSS, SAS) report the Wald statistic to assess the contribution of individual predictors, the Wald statistic has limitations. When the regression coefficient is large, the standard error of the regression coefficient also tends to be larger, increasing the
discussed above to assess model fit is also the recommended procedure to assess the contribution of individual "predictors" to a given model. In the case of a single predictor model, one simply compares the deviance of the predictor model with that of the null model on a chi-square distribution with
In some instances, the model may not reach convergence. Non-convergence of a model indicates that the coefficients are not meaningful because the iterative process was unable to find appropriate solutions. A failure to converge may occur for a number of reasons: having a large ratio of predictors to
After fitting the model, it is likely that researchers will want to examine the contribution of individual predictors. To do so, they will want to examine the regression coefficients. In linear regression, the regression coefficients represent the change in the criterion for each unit change in the
is used in lieu of a sum of squares calculations. Deviance is analogous to the sum of squares calculations in linear regression and is a measure of the lack of fit to the data in a logistic regression model. When a "saturated" model is available (a model with a theoretically perfect fit), deviance
In any fitting procedure, the addition of another fitting parameter to a model (e.g. the beta parameters in a logistic regression model) will almost always improve the ability of the model to predict the measured outcomes. This will be true even if the additional term has no predictive value, since
Sparseness in the data refers to having a large proportion of empty cells (cells with zero counts). Zero cell counts are particularly problematic with categorical predictors. With continuous predictors, the model can infer values for the zero cell counts, but this is not the case with categorical
model start out either by extending the "log-linear" formulation presented here or the two-way latent variable formulation presented above, since both clearly show the way that the model could be extended to multi-way outcomes. In general, the presentation with latent variables is more common in
Separate sets of regression coefficients need to exist for each choice. When phrased in terms of utility, this can be seen very easily. Different choices have different effects on net utility; furthermore, the effects vary in complex ways that depend on the characteristics of each individual, so
It turns out that this model is equivalent to the previous model, although this seems non-obvious, since there are now two sets of regression coefficients and error variables, and the error variables have a different distribution. In fact, this model reduces directly to the previous one with the
illustrates that the probability of the dependent variable equaling a case is equal to the value of the logistic function of the linear regression expression. This is important in that it shows that the value of the linear regression expression can vary from negative to positive infinity and yet,
and following years. The logit model was initially dismissed as inferior to the probit model, but "gradually achieved an equal footing with the probit", particularly between 1960 and 1970. By 1970, the logit model achieved parity with the probit model in use in statistics journals and thereafter
Another numerical problem that may lead to a lack of convergence is complete separation, which refers to the instance in which the predictors perfectly predict the criterion – all cases are accurately classified and the likelihood maximized with infinite coefficients. In such instances, one
Multicollinearity refers to unacceptably high correlations between predictors. As multicollinearity increases, coefficients remain unbiased but standard errors increase and the likelihood of model convergence decreases. To detect multicollinearity amongst the predictors, one can conduct a linear
Logistic regression is unique in that it may be estimated on unbalanced data, rather than randomly sampled data, and still yield correct coefficient estimates of the effects of each independent variable on the outcome. That is to say, if we form a logistic model from such data, if the model is
that results from making each of the choices. We can also interpret the regression coefficients as indicating the strength that the associated factor (i.e. explanatory variable) has in contributing to the utility — or more correctly, the amount by which a unit change in an explanatory variable
The image represents an outline of what an odds ratio looks like in writing, through a template in addition to the test score example in the "Example" section of the contents. In simple terms, if we hypothetically get an odds ratio of 2 to 1, we can say... "For every one-unit increase in hours
participants. However, there is considerable debate about the reliability of this rule, which is based on simulation studies and lacks a secure theoretical underpinning. According to some authors the rule is overly conservative in some circumstances, with the authors stating, "If we (somewhat
The choice of modeling the error variable specifically with a standard logistic distribution, rather than a general logistic distribution with the location and scale set to arbitrary values, seems restrictive, but in fact, it is not. It must be kept in mind that we can choose the regression
and will be minimized by equating the derivatives of the Lagrangian with respect to these probabilities to zero. An important point is that the probabilities are treated equally and the fact that they sum to 1 is part of the Lagrangian formulation, rather than being assumed from the beginning.
tasks, such as identifying whether an email is spam or not and diagnosing diseases by assessing the presence or absence of specific conditions based on patient test results. This approach utilizes the logistic (or sigmoid) function to transform a linear combination of input features into a
21745:" to the noise in the data. The question arises as to whether the improvement gained by the addition of another fitting parameter is significant enough to recommend the inclusion of the additional term, or whether the improvement is simply that which may be expected from overfitting.
{\displaystyle \Pr(Y_{i}=y\mid \mathbf {X} _{i})={n_{i} \choose y}p_{i}^{y}(1-p_{i})^{n_{i}-y}={n_{i} \choose y}\left({\frac {1}{1+e^{-{\boldsymbol {\beta }}\cdot \mathbf {X} _{i}}}}\right)^{y}\left(1-{\frac {1}{1+e^{-{\boldsymbol {\beta }}\cdot \mathbf {X} _{i}}}}\right)^{n_{i}-y}\,.}
{\displaystyle p={\frac {b^{{\boldsymbol {\beta }}\cdot {\boldsymbol {x}}}}{1+b^{{\boldsymbol {\beta }}\cdot {\boldsymbol {x}}}}}={\frac {b^{\beta _{0}+\beta _{1}x_{1}+\beta _{2}x_{2}}}{1+b^{\beta _{0}+\beta _{1}x_{1}+\beta _{2}x_{2}}}}={\frac {1}{1+b^{-(\beta _{0}+\beta _{1}x_{1}+\beta _{2}x_{2})}}}}
serves as a link function between the probability and the linear regression expression. Given that the logit ranges between negative and positive infinity, it provides an adequate criterion upon which to conduct linear regression and the logit is easily converted back into the odds.
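The roundtrip described here, probability → log-odds → back to odds or probability, can be sketched as follows (the probability 0.8 is an arbitrary example):

```python
import math

def logit(p: float) -> float:
    """Log-odds of p: maps (0, 1) onto the whole real line."""
    return math.log(p / (1.0 - p))

def inv_logit(t: float) -> float:
    """Inverse of the logit, i.e. the logistic function."""
    return 1.0 / (1.0 + math.exp(-t))

p = 0.8
t = logit(p)                                    # a real-valued log-odds
assert abs(math.exp(t) - p / (1.0 - p)) < 1e-9  # exponentiating recovers the odds
assert abs(inv_logit(t) - p) < 1e-12            # inverting recovers the probability
```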
, response variable, output variable, or class), i.e. it can assume only the two possible values 0 (often meaning "no" or "failure") or 1 (often meaning "yes" or "success"). The goal of logistic regression is to use the dataset to create a predictive model of the outcome variable.
for details. In his earliest paper (1838), Verhulst did not specify how he fit the curves to the data. In his more detailed paper (1845), Verhulst determined the three parameters of the model by making the curve pass through three observed points, which yielded poor predictions.
This model has a separate latent variable and a separate set of regression coefficients for each possible outcome of the dependent variable. The reason for this separation is that it makes it easy to extend logistic regression to multi-outcome categorical variables, as in the
t-test in linear regression, is used to assess the significance of coefficients. The Wald statistic is the ratio of the square of the regression coefficient to the square of the standard error of the coefficient and is asymptotically distributed as a chi-square distribution.
to assess whether or not the observed event rates match expected event rates in subgroups of the model population. This test is considered to be obsolete by some statisticians because of its dependence on arbitrary binning of predicted probabilities and its relatively low power.
{\displaystyle {\begin{aligned}D_{\text{null}}&=-2\ln {\frac {\text{likelihood of null model}}{\text{likelihood of the saturated model}}}\\D_{\text{fitted}}&=-2\ln {\frac {\text{likelihood of fitted model}}{\text{likelihood of the saturated model}}}.\end{aligned}}}
Disaster planners and engineers rely on these models to predict decisions taken by householders or building occupants in small-scale and large-scale evacuations, such as building fires, wildfires, and hurricanes. These models help in the development of reliable
is used to assess goodness of fit as it represents the proportion of variance in the criterion that is explained by the predictors. In logistic regression analysis, there is no agreed upon analogous measure, but there are several competing measures each with limitations.
One can also take semi-parametric or non-parametric approaches, e.g., via local-likelihood or nonparametric quasi-likelihood methods, which avoid assumptions of a parametric form for the index function and are robust to the choice of the link function (e.g., probit or
If the assumptions of linear discriminant analysis hold, the conditioning can be reversed to produce logistic regression. The converse is not true, however, because logistic regression does not require the multivariate normal assumption of discriminant analysis.
, etc.). Another example might be to predict whether a Nepalese voter will vote for the Nepali Congress, the Communist Party of Nepal, or another party, based on age, income, sex, race, state of residence, votes in previous elections, etc. The technique can also be used in
{\displaystyle \mathrm {OR} ={\frac {\operatorname {odds} (x+1)}{\operatorname {odds} (x)}}={\frac {\left({\frac {p(x+1)}{1-p(x+1)}}\right)}{\left({\frac {p(x)}{1-p(x)}}\right)}}={\frac {e^{\beta _{0}+\beta _{1}(x+1)}}{e^{\beta _{0}+\beta _{1}x}}}=e^{\beta _{1}}}
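That the odds ratio equals e^{β₁} regardless of the starting x can be confirmed numerically (the intercept and slope values below are made up):

```python
import math

b0, b1 = -1.2, 0.8  # made-up intercept and slope

def odds(x: float) -> float:
    """Odds of the outcome at predictor value x under the logistic model."""
    p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
    return p / (1.0 - p)

# The odds ratio for a one-unit increase is e^{beta_1}, whatever x is.
for x in (-2.0, 0.0, 3.5):
    assert abs(odds(x + 1.0) / odds(x) - math.exp(b1)) < 1e-9
```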
This leads to the intuition that by maximizing the log-likelihood of a model, you are minimizing the KL divergence of your model from the maximal-entropy distribution; intuitively, you are searching for the model that makes the fewest assumptions in its parameters.
allows these posteriors to be computed using simulation, so lack of conjugacy is not a concern. However, when the sample size or the number of parameters is large, full
Bayesian simulation can be slow, and people often use approximate methods such as
which will always be positive or zero. The reason for this choice is that not only is the deviance a good measure of the goodness of fit, it is also approximately chi-squared distributed, with the approximation improving as the number of data points
(it is not a classifier), though it can be used to make a classifier, for instance by choosing a cutoff value and classifying inputs with probability greater than the cutoff as one class, below the cutoff as the other; this is a common way to make a
events per candidate variable may be required. Also, one can argue that 96 observations are needed only to estimate the model's intercept precisely enough that the margin of error in predicted probabilities is ±0.1 with a 0.95 confidence level.
in 1925 and has been followed since. Pearl and Reed first applied the model to the population of the United States, and also initially fitted the curve by making it pass through three points; as with
Verhulst, this again yielded poor results.
The intuition for transforming using the logit function (the natural log of the odds) was explained above. It also has the practical effect of converting the probability (which is bounded to be between 0 and 1) to a variable that ranges over
When the saturated model is not available (a common case), deviance is calculated simply as −2·(log likelihood of the fitted model), and the reference to the saturated model's log likelihood can be removed from all that follows without harm.
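In that reduced form, the deviance is computed directly from the fitted probabilities; a sketch with invented labels and predictions:

```python
import math

def deviance(y_true, p_pred):
    """Deviance as -2 * log-likelihood of a fitted binary model
    (the saturated-model term is omitted, as described above)."""
    log_lik = sum(y * math.log(p) + (1 - y) * math.log(1.0 - p)
                  for y, p in zip(y_true, p_pred))
    return -2.0 * log_lik

# Invented 0/1 labels and fitted probabilities.
d = deviance([1, 0, 1, 1], [0.9, 0.2, 0.7, 0.6])
assert d > 0.0  # the log-likelihood of a binary model is negative
```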
Unlike linear regression with normally distributed residuals, it is not possible to find a closed-form expression for the coefficient values that maximize the likelihood function, so an iterative process must be used instead; for example
), that is, separate explanatory variables taking the value 0 or 1 are created for each possible value of the discrete variable, with a 1 meaning "variable does have the given value" and a 0 meaning "variable does not have that value".)
Even though income is a continuous variable, its effect on utility is too complex for it to be treated as a single variable. Either it needs to be directly split up into ranges, or higher powers of income need to be added so that
, logistic regression makes use of one or more predictor variables that may be either continuous or categorical. Unlike ordinary linear regression, however, logistic regression is used for predicting dependent variables that take
, and in this sense is the "simplest" way to convert a real number to a probability. In particular, it maximizes entropy (minimizes added information), and in this sense makes the fewest assumptions of the data being modeled; see
(any real value). The corresponding probability of the value labeled "1" can vary between 0 (certainly the value "0") and 1 (certainly the value "1"), hence the labeling; the function that converts log-odds to probability is the
and the data. Rather than being specific to the assumed multinomial logistic case, it is taken to be a general statement of the condition at which the log-likelihood is maximized and makes no reference to the functional form of
-th measurement. Once the beta coefficients have been estimated from the data, we will be able to estimate the probability that any subsequent set of explanatory variables will result in any of the possible outcome categories.
using logistic regression. Many other medical scales used to assess severity of a patient have been developed using logistic regression. Logistic regression may be used to predict the risk of developing a given disease (e.g.
on the coefficients, but other regularizers are also possible.) Whether or not regularization is used, it is usually not possible to find a closed-form solution; instead, an iterative numerical method must be used, such as
{\displaystyle p({\boldsymbol {x}})={\frac {b^{{\boldsymbol {\beta }}\cdot {\boldsymbol {x}}}}{1+b^{{\boldsymbol {\beta }}\cdot {\boldsymbol {x}}}}}={\frac {1}{1+b^{-{\boldsymbol {\beta }}\cdot {\boldsymbol {x}}}}}=S_{b}(t)}
Yet another formulation combines the two-way latent variable formulation above with the original formulation higher up without latent variables, and in the process provides a link to one of the standard formulations of the
{\displaystyle \Pr(Y_{i}=1)={\frac {e^{{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}}}{1+e^{{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}}}}={\frac {1}{1+e^{-{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}}}}=p_{i}}
{\displaystyle {\begin{aligned}Y_{i}^{0\ast }&={\boldsymbol {\beta }}_{0}\cdot \mathbf {X} _{i}+\varepsilon _{0}\,\\Y_{i}^{1\ast }&={\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}+\varepsilon _{1}\,\end{aligned}}}
Remark: This model is actually an oversimplification, since it assumes everybody will pass if they study long enough (limit = 1). To make the model more realistic, the limit value should itself be a variable parameter.
represents the deviance and ln represents the natural logarithm. The log of this likelihood ratio (the ratio of the fitted model to the saturated model) will produce a negative value, hence the need for a negative sign.
for the same reaction, while the supply of one of the reactants is fixed. This naturally gives rise to the logistic equation for the same reason as population growth: the reaction is self-reinforcing but constrained.
{\displaystyle \mathbf {w} _{k+1}=\left(\mathbf {X} ^{T}\mathbf {S} _{k}\mathbf {X} \right)^{-1}\mathbf {X} ^{T}\left(\mathbf {S} _{k}\mathbf {X} \mathbf {w} _{k}+\mathbf {y} -\mathbf {\boldsymbol {\mu }} _{k}\right)}
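The iteratively reweighted least squares update above can be sketched in Python. This is a minimal illustration (the function name and fixed iteration count are my own choices, not from the article); a real implementation would add a convergence test and numerical safeguards.

```python
import numpy as np

def irls_logistic(X, y, iterations=25):
    """Fit a binary logistic regression by iteratively reweighted least squares.

    X : (K, M+1) design matrix whose first column is all ones.
    y : (K,) vector of 0/1 outcomes.
    """
    w = np.zeros(X.shape[1])
    for _ in range(iterations):
        mu = 1.0 / (1.0 + np.exp(-X @ w))   # current fitted probabilities mu_k
        S = np.diag(mu * (1.0 - mu))        # weight matrix S_k
        # w_{k+1} = (X^T S_k X)^{-1} X^T (S_k X w_k + y - mu_k)
        w = np.linalg.solve(X.T @ S @ X, X.T @ (S @ (X @ w) + y - mu))
    return w
```

Each pass is a weighted least-squares solve, which is why the method converges quickly on non-separable data.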
{\displaystyle {\begin{aligned}\Pr(Y_{i}=0)&={\frac {1}{Z}}e^{{\boldsymbol {\beta }}_{0}\cdot \mathbf {X} _{i}}\\\Pr(Y_{i}=1)&={\frac {1}{Z}}e^{{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}}\end{aligned}}}
{\displaystyle {\begin{aligned}L(\theta \mid y;x)&=\Pr(Y\mid X;\theta )\\&=\prod _{i}\Pr(y_{i}\mid x_{i};\theta )\\&=\prod _{i}h_{\theta }(x_{i})^{y_{i}}(1-h_{\theta }(x_{i}))^{(1-y_{i})}\end{aligned}}}
. This process begins with a tentative solution, revises it slightly to see if it can be improved, and repeats this revision until no more improvement is made, at which point the process is said to have converged.
instead of a standard logistic distribution. Both the logistic and normal distributions are symmetric with a basic unimodal, "bell curve" shape. The only difference is that the logistic distribution has somewhat
coefficients ourselves, and very often can use them to offset changes in the parameters of the error variable's distribution. For example, a logistic error-variable distribution with a non-zero location parameter
model. In such a model, it is natural to model each possible outcome using a different set of regression coefficients. It is also possible to motivate each of the separate latent variables as the theoretical
In the above cases of two categories (binomial logistic regression), the categories were indexed by "0" and "1", and we had two probabilities: The probability that the outcome was in category 1 was given by
Example graph of a logistic regression curve fitted to data. The curve shows the estimated probability of passing an exam (binary dependent variable) versus hours studying (scalar independent variable). See
{\displaystyle \operatorname {logit} \left(\operatorname {\mathbb {E} } \left[Y_{i}\mid \mathbf {X} _{i}\right]\right)=\operatorname {logit} (p_{i})=\ln \left({\frac {p_{i}}{1-p_{i}}}\right)={\boldsymbol {\beta }}\cdot \mathbf {X} _{i}\,,}
models, because it both provides a theoretically strong foundation and facilitates intuitions about the model, which in turn makes it easy to consider various sorts of extensions. (See the example below.)
The fourth line is another way of writing the probability mass function, which avoids having to write separate cases and is more convenient for certain types of calculations. This relies on the fact that
This is an example of an SPSS output for a logistic regression model using three explanatory variables (coffee use per week, energy drink use per week, and soda use per week) and two categories (male and
{\displaystyle {\begin{aligned}\ln \Pr(Y_{i}=0)&={\boldsymbol {\beta }}_{0}\cdot \mathbf {X} _{i}-\ln Z\\\ln \Pr(Y_{i}=1)&={\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}-\ln Z\end{aligned}}}
outcomes. It can be shown that the optimized error of any of these fits will never be less than the optimum error of the null model, and that the difference between these minimum error will follow a
can take only the value 0 or 1. In each case, one of the exponents will be 1, "choosing" the value under it, while the other is 0, "canceling out" the value under it. Hence, the outcome is either
", and the logistic function is the canonical link function), while other sigmoid functions are non-canonical link functions; this underlies its mathematical elegance and ease of optimization. See
If the model deviance is significantly smaller than the null deviance then one can conclude that the predictor or set of predictors significantly improve the model's fit. This is analogous to the
{\displaystyle Y_{i}={\begin{cases}1&{\text{if }}Y_{i}^{\ast }>0\ {\text{ i.e. }}{-\varepsilon _{i}}<{\boldsymbol {\beta }}\cdot \mathbf {X} _{i},\\0&{\text{otherwise.}}\end{cases}}}
{\displaystyle \operatorname {logit} (\operatorname {\mathbb {E} } [Y_{i}\mid x_{1,i},\ldots ,x_{m,i}])=\operatorname {logit} (p_{i})=\ln \left({\frac {p_{i}}{1-p_{i}}}\right)=\beta _{0}+\beta _{1}x_{1,i}+\cdots +\beta _{m}x_{m,i}}
Binary variables are widely used in statistics to model the probability of a certain class or event taking place, such as the probability of a team winning, of a patient being healthy, etc. (see
Biondo, S.; Ramos, E.; Deiros, M.; Ragué, J. M.; De Oca, J.; Moreno, P.; Farran, L.; Jaurrieta, E. (2000). "Prognostic factors for mortality in left colonic peritonitis: A new scoring system".
test. In logistic regression, there are several different tests designed to assess the significance of an individual predictor, most notably the likelihood ratio test and the Wald statistic.
and the regression coefficients are unobserved, and the means of determining them is not part of the model itself. They are typically determined by some sort of optimization procedure, e.g.
Marshall, J. C.; Cook, D. J.; Christou, N. V.; Bernard, G. R.; Sprung, C. L.; Sibbald, W. J. (1995). "Multiple organ dysfunction score: A reliable descriptor of a complex clinical outcome".
{\displaystyle \operatorname {\mathbb {E} } [Y_{i}\mid \mathbf {X} _{i}]=p_{i}=\operatorname {logit} ^{-1}({\boldsymbol {\beta }}\cdot \mathbf {X} _{i})={\frac {1}{1+e^{-{\boldsymbol {\beta }}\cdot \mathbf {X} _{i}}}}}
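Numerically, the inverse logit (the logistic sigmoid) and the logit itself are one-liners. A minimal sketch with hypothetical helper names:

```python
import math

def inverse_logit(t):
    """logit^{-1}(t) = 1 / (1 + e^{-t}), the logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-t))

def logit(p):
    """logit(p) = ln(p / (1 - p)), the log-odds, for 0 < p < 1."""
    return math.log(p / (1.0 - p))

# The two functions are inverses of each other.
```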
An intuition for this comes from the fact that, since we choose based on the maximum of two values, only their difference matters, not the exact values — and this effectively removes one
will produce the same probabilities for all possible explanatory variables. In fact, it can be seen that adding any constant vector to both of them will produce the same probabilities:
{\displaystyle \operatorname {logit} (\operatorname {\mathbb {E} } [Y_{i}\mid \mathbf {X} _{i}])=\operatorname {logit} (p_{i})=\ln \left({\frac {p_{i}}{1-p_{i}}}\right)={\boldsymbol {\beta }}\cdot \mathbf {X} _{i}}
, p. 8, "As far as I can see the introduction of the logistics as an alternative to the normal probability function is the work of a single person, Joseph Berkson (1899–1982), ..."
There are various equivalent specifications and interpretations of logistic regression, which fit into different types of more general models, and allow different generalizations.
conditions that seek to exclude unlikely values, e.g. extremely large values for any of the regression coefficients. The use of a regularization condition is equivalent to doing
This simple model is an example of binary logistic regression, and has one explanatory variable and a binary categorical variable which can assume one of two categorical values.
, it can be used to predict the likelihood of a person ending up in the labor force, and a business application would be to predict the likelihood of a homeowner defaulting on a
{\displaystyle p_{n}({\boldsymbol {x}})={\frac {e^{{\boldsymbol {\beta }}_{n}\cdot {\boldsymbol {x}}}}{1+\sum _{u=1}^{N}e^{{\boldsymbol {\beta }}_{u}\cdot {\boldsymbol {x}}}}}}
regression analysis with the predictors of interest for the sole purpose of examining the tolerance statistic used to assess whether multicollinearity is unacceptably high.
A group of 20 students spends between 0 and 6 hours studying for an exam. How does the number of hours spent studying affect the probability of the student passing the exam?
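One way to answer such a question numerically is to maximize the log-likelihood of a one-variable logistic model by simple gradient ascent. The sketch below uses made-up hours/pass values purely for illustration (the article's actual table of 20 students is not reproduced here), and the learning rate and iteration count are arbitrary choices:

```python
import numpy as np

# Hypothetical hours-studied / passed data, illustrative only.
hours  = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0])
passed = np.array([0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 1.0, 1.0])

# Maximize the log-likelihood of p = 1/(1+exp(-(b0 + b1*hours))).
# The gradient is sum(y - p) for b0 and sum((y - p)*x) for b1.
b0 = b1 = 0.0
rate = 0.01
for _ in range(20000):
    p = 1.0 / (1.0 + np.exp(-(b0 + b1 * hours)))
    b0 += rate * np.sum(passed - p)
    b1 += rate * np.sum((passed - p) * hours)
```

With data of this shape, b1 comes out positive: each extra hour of study multiplies the estimated odds of passing by e^{b1}.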
{\displaystyle \mathbf {X} ={\begin{bmatrix}1&x_{1}(1)&x_{2}(1)&\ldots \\1&x_{1}(2)&x_{2}(2)&\ldots \\\vdots &\vdots &\vdots \end{bmatrix}}}
. With this choice, the single-layer neural network is identical to the logistic regression model. This function has a continuous derivative, which allows it to be used in
models and makes it easier to extend to certain more complicated models with multiple, correlated choices, as well as to compare logistic regression to the closely related
. (In terms of utility theory, a rational actor always chooses the choice with the greatest associated utility.) This is the approach taken by economists when formulating
", states that logistic regression models give stable values for the explanatory variables if based on a minimum of about 10 events per explanatory variable (EPV); where
Of all the functional forms used for estimating the probabilities of a particular categorical outcome which optimize the fit by maximizing the likelihood function (e.g.
{\displaystyle p_{nk}={\frac {e^{{\boldsymbol {\lambda }}_{n}\cdot {\boldsymbol {x}}_{k}}}{\sum _{u=0}^{N}e^{{\boldsymbol {\lambda }}_{u}\cdot {\boldsymbol {x}}_{k}}}}}
Logistic regression is used in various fields, including machine learning, most medical fields, and social sciences. For example, the Trauma and Injury
Severity Score (TRISS)
calculations – variance in the criterion is essentially divided into variance accounted for by the predictors and residual variance. In logistic regression analysis,
{\displaystyle p_{0}({\boldsymbol {x}})=1-\sum _{n=1}^{N}p_{n}({\boldsymbol {x}})={\frac {1}{1+\sum _{u=1}^{N}e^{{\boldsymbol {\beta }}_{u}\cdot {\boldsymbol {x}}}}}}
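The pivot-based multinomial probabilities can be computed directly from the coefficient vectors. In this sketch (hypothetical function name), category 0 is the pivot with an implicitly zero coefficient vector:

```python
import numpy as np

def multinomial_probs(betas, x):
    """Probabilities for categories 0..N with category 0 as the pivot.

    betas : list of N coefficient vectors beta_1 .. beta_N (beta_0 is
            implicitly the zero vector).
    Returns an array [p_0, p_1, ..., p_N] that sums to 1.
    """
    scores = np.array([b @ x for b in betas])
    denom = 1.0 + np.sum(np.exp(scores))
    p = np.exp(scores) / denom                 # p_n = e^{beta_n . x} / denom
    return np.concatenate([[1.0 / denom], p])  # p_0 = 1 / denom
```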
Two separate sets of regression coefficients have been introduced, just as in the two-way latent variable model, and the two equations appear in a form that writes the
The above example of binary logistic regression on one explanatory variable can be generalized to binary logistic regression on any number of explanatory variables
. This is because doing an average this way simply computes the proportion of successes seen, which we expect to converge to the underlying probability of success.
. The defining characteristic of the logistic model is that increasing one of the independent variables multiplicatively scales the odds of the given outcome at a
A very important point here is that this expression is (remarkably) not an explicit function of the beta coefficients. It is only a function of the probabilities
{\displaystyle \Pr(Y_{i}=c)=\operatorname {softmax} (c,{\boldsymbol {\beta }}_{0}\cdot \mathbf {X} _{i},{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i},\dots ).}
As an example, consider a province-level election where the choice is between a right-of-center party, a left-of-center party, and a secessionist party (e.g. the
, which is a general property of the Bernoulli distribution. In other words, if we run a large number of Bernoulli trials using the same probability of success
The reason for using logistic regression for this problem is that the values of the dependent variable, pass and fail, while represented by "1" and "0", are not
is defined which is a measure of the error between the logistic model fit and the outcome data. In the limit of a large number of data points, the deviance is
dependent variable (with unordered values, also called "classification"). The general case of having dependent variables with more than two values is termed
Le Gall, J. R.; Lemeshow, S.; Saulnier, F. (1993). "A new
Simplified Acute Physiology Score (SAPS II) based on a European/North American multicenter study".
{\displaystyle {\begin{aligned}\varepsilon _{0}&\sim \operatorname {EV} _{1}(0,1)\\\varepsilon _{1}&\sim \operatorname {EV} _{1}(0,1)\end{aligned}}}
. As noted above, each separate trial has its own probability of success, just as each trial has its own explanatory variables. The probability of success
in the linear regression case, except that the likelihood is maximized rather than minimized. Denote the maximized log-likelihood of the proposed model by
to be defined in terms of the other probabilities is artificial. Any of the probabilities could have been selected to be so defined. This special value of
{\displaystyle \Pr(Y_{i}=c)={\frac {e^{{\boldsymbol {\beta }}_{c}\cdot \mathbf {X} _{i}}}{\sum _{h}e^{{\boldsymbol {\beta }}_{h}\cdot \mathbf {X} _{i}}}}}
than in the former case, for all sets of explanatory variables — but critically, it will always remain on the same side of 0, and hence lead to the same
{\displaystyle {\frac {\partial \ell }{\partial \beta _{nm}}}=0=\sum _{k=1}^{K}\Delta (n,y_{k})x_{mk}-\sum _{k=1}^{K}p_{n}({\boldsymbol {x}}_{k})x_{mk}}
Palei, S. K.; Das, S. K. (2009). "Logistic regression model for prediction of roof fall risks in bord and pillar workings in coal mines: An approach".
which shows that this formulation is indeed equivalent to the previous formulation. (As in the two-way latent variable formulation, any settings where
, that finds values that best fit the observed data (i.e. that give the most accurate predictions for the data already observed), usually subject to
{\displaystyle t_{n}=\ln \left({\frac {p_{n}({\boldsymbol {x}})}{p_{0}({\boldsymbol {x}})}}\right)={\boldsymbol {\beta }}_{n}\cdot {\boldsymbol {x}}}
when there are more than two possible values (e.g. whether an image is of a cat, dog, lion, etc.), and the binary logistic regression generalized to
Kologlu, M.; Elker, D.; Altun, H.; Sayek, I. (2001). "Validation of MPI and PIA II in two different groups of patients with secondary peritonitis".
that is specific to the outcome at hand, but related to the explanatory variables. This can be expressed in any of the following equivalent forms:
(This predicts that the irrelevancy of the scale parameter may not carry over into more complex models where more than two choices are available.)
{\displaystyle \ell =\sum _{k:y_{k}=1}\ln(p_{k})+\sum _{k:y_{k}=0}\ln(1-p_{k})=\sum _{k=1}^{K}\left(\,y_{k}\ln(p_{k})+(1-y_{k})\ln(1-p_{k})\right)}
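Computed literally, the log-likelihood above is a single sum over the data points. A minimal transcription (hypothetical helper name):

```python
import math

def log_likelihood(p, y):
    """ell = sum_k [ y_k ln(p_k) + (1 - y_k) ln(1 - p_k) ]."""
    return sum(yk * math.log(pk) + (1 - yk) * math.log(1 - pk)
               for pk, yk in zip(p, y))
```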
{\displaystyle {\widehat {\beta }}_{0}^{*}={\widehat {\beta }}_{0}+\log {\frac {\pi }{1-\pi }}-\log {{\tilde {\pi }} \over {1-{\tilde {\pi }}}}}
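Applied in code, the intercept correction above is a one-line shift on the logit scale. A sketch with hypothetical function and argument names, assuming pi_true is the population proportion of cases and pi_sample the proportion of cases in the case-enriched training sample:

```python
import math

def corrected_intercept(beta0_hat, pi_true, pi_sample):
    """Shift an intercept fit on a case-enriched sample so that predicted
    probabilities are calibrated to the true prevalence pi_true."""
    logit = lambda p: math.log(p / (1.0 - p))
    return beta0_hat + logit(pi_true) - logit(pi_sample)
```

Only the intercept changes; the slope coefficients are left as estimated.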
the parameters of a logistic model (the coefficients in the linear or non-linear combinations). In binary logistic regression there is a single
Having a large ratio of variables to cases results in an overly conservative Wald statistic (discussed below) and can lead to non-convergence.
As a result, we can simplify matters, and restore identifiability, by picking an arbitrary value for one of the two vectors. We choose to set
parameters will require numerical methods. One useful technique is to equate the derivatives of the log likelihood with respect to each of the
, but they gave him little credit and did not adopt his terminology. Verhulst's priority was acknowledged and the term "logistic" revived by
, because the dependent variable is binary. Second, the predicted values are probabilities and are therefore restricted to (0,1) through the
. Then when this is used in the equation relating the log odds of a success to the values of the predictors, the linear regression will be a
which give the "best fit" to the data. In the case of linear regression, the sum of the squared deviations of the fit from the data points (
{\displaystyle {\frac {\partial {\mathcal {L}}}{\partial p_{n'k'}}}=0=-\ln(p_{n'k'})-1+\sum _{m=0}^{M}(\lambda _{n'm}x_{mk'})-\alpha _{k'}}
{\displaystyle \ell =\sum _{k=1}^{K}y_{k}\log _{b}(p({\boldsymbol {x_{k}}}))+\sum _{k=1}^{K}(1-y_{k})\log _{b}(1-p({\boldsymbol {x_{k}}}))}
by fitting a linear predictor function of the above form to some sort of arbitrary transformation of the expected value of the variable.
As a simple example, we can use a logistic regression with one explanatory variable and two categories to answer the following question:
{\displaystyle {\frac {\partial \ell }{\partial \beta _{m}}}=0=\sum _{k=1}^{K}y_{k}x_{mk}-\sum _{k=1}^{K}p({\boldsymbol {x}}_{k})x_{mk}}
for further extensions. The logistic regression model itself simply models probability of output in terms of input and does not perform
Gourieroux, Christian; Monfort, Alain (1981). "Asymptotic
Properties of the Maximum Likelihood Estimator in Dichotomous Logit Models".
Although the dependent variable in logistic regression is
Bernoulli, the logit is on an unrestricted scale. The logit function is the
Assuming the multinomial logistic function, the derivative of the log-likelihood with respect the beta coefficients was found to be:
, and then fitted using the proposed model. Specifically, we can consider the fits of the proposed model to every permutation of the
{\displaystyle g(p(x))=\sigma ^{-1}(p(x))=\operatorname {logit} p(x)=\ln \left({\frac {p(x)}{1-p(x)}}\right)=\beta _{0}+\beta _{1}x,}
{\displaystyle {\mathcal {L}}_{fit}=\sum _{n=0}^{N}\sum _{m=0}^{M}\lambda _{nm}\sum _{k=1}^{K}(p_{nk}x_{mk}-\Delta (n,y_{k})x_{mk})}
Alternatively, when assessing the contribution of individual predictors in a given model, one may examine the significance of the
{\displaystyle D=\ln \left({\frac {{\hat {L}}^{2}}{{\hat {L}}_{\varphi }^{2}}}\right)=2({\hat {\ell }}-{\hat {\ell }}_{\varphi })}
Estimated strength of regression coefficient for different outcomes (party choices) and different values of explanatory variables
. Another critical fact is that the difference of two type-1 extreme-value-distributed variables is a logistic distribution, i.e.
. However, the development of the logistic model as a general alternative to the probit model was principally due to the work of
, which led to its use in modern statistics. They were initially unaware of Verhulst's work and presumably learned about it from
and it can be shown that the maximum log-likelihood of these permutation fits will never be smaller than that of the null model:
. The sum of these probabilities equals 1, which must be true, since "0" and "1" are the only possible categories in this setup.
is the generalization of binary logistic regression to include any number of explanatory variables and any number of categories.
{\displaystyle \ell _{k}={\begin{cases}-\ln p_{k}&{\text{ if }}y_{k}=1,\\-\ln(1-p_{k})&{\text{ if }}y_{k}=0.\end{cases}}}
12031:) are expressed in terms of the pivot probability and are again expressed as a linear combination of the explanatory variables:
and interpreting odds of alternatives as relative preferences; this gave a theoretical foundation for the logistic regression.
is the probability that the dependent variable equals a case, given some linear combination of the predictors. The formula for
{\displaystyle Y_{i}={\begin{cases}1&{\text{if }}Y_{i}^{1\ast }>Y_{i}^{0\ast },\\0&{\text{otherwise.}}\end{cases}}}
of distributions maximizes entropy, given an expected value. In the case of the logistic model, the logistic function is the
variable and data in the proposed model is a very significant improvement over the null model. In other words, we reject the
measurements or data points will be generated by the above probabilities can now be calculated. Indexing each measurement by
for discussion. The logistic regression as a general statistical model was originally developed and popularized primarily by
Studies in
History and Philosophy of Science Part C: Studies in History and Philosophy of Biological and Biomedical Sciences
{\displaystyle {\hat {\ell }}_{\varphi }=K(\,{\overline {y}}\ln({\overline {y}})+(1-{\overline {y}})\ln(1-{\overline {y}}))}
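The null model's maximized log-likelihood above depends only on the observed proportion of successes, since the best constant prediction is p = ȳ for every observation. A direct transcription (hypothetical helper name):

```python
import math

def null_log_likelihood(y):
    """ell_phi = K ( ybar ln(ybar) + (1 - ybar) ln(1 - ybar) ),
    the maximized log-likelihood of the constant-probability null model."""
    K = len(y)
    ybar = sum(y) / K
    return K * (ybar * math.log(ybar) + (1 - ybar) * math.log(1 - ybar))
```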
is the number of successes observed (the sum of the individual
Bernoulli-distributed random variables), and hence follows a
there need to be separate sets of coefficients for each characteristic, not simply a single extra per-choice characteristic.
{\displaystyle \beta _{0}+\beta _{1}x_{1}+\beta _{2}x_{2}+\cdots +\beta _{m}x_{m}=\beta _{0}+\sum _{i=1}^{m}\beta _{i}x_{i}}
Walker, SH; Duncan, DB (1967). "Estimation of the probability of an event as a function of several independent variables".
{\displaystyle Z=e^{{\boldsymbol {\beta }}_{0}\cdot \mathbf {X} _{i}}+e^{{\boldsymbol {\beta }}_{1}\cdot \mathbf {X} _{i}}}
of the predictors) is equivalent to the exponential function of the linear regression expression. This illustrates how the
M. Strano; B.M. Colosimo (2006). "Logistic regression analysis for experimental determination of forming limit diagrams".
Truett, J; Cornfield, J; Kannel, W (1967). "A multivariate analysis of the risk of coronary heart disease in
Framingham".
For the simple model of student test scores described above, the maximum value of the log-likelihood of the null model is
instead of the logistic function (to convert the linear combination to a probability) can also be used, most notably the
Van Smeden, M.; De Groot, J. A.; Moons, K. G.; Collins, G. S.; Altman, D. G.; Eijkemans, M. J.; Reitsma, J. B. (2016).
. The probit model influenced the subsequent development of the logit model and these models competed with each other.
. Since this has no direct analog in logistic regression, various methods including the following can be used instead.
, with degrees of freedom equal to those of the proposed model minus those of the null model which, in this case, will be
as the categorical outcome of that measurement, the log likelihood may be written in a form very similar to the simple
to create a continuous criterion as a transformed version of the dependent variable. The logarithm of the odds is the
. Logistic regression by MLE plays a similarly basic role for binary or categorical responses as linear regression by
Linear regression and logistic regression have many similarities. For example, in simple linear regression, a set of
In machine learning applications where logistic regression is used for binary classification, the MLE minimises the
rate, with each independent variable having its own parameter; for a binary dependent variable this generalizes the
{\displaystyle {\frac {\partial \ell }{\partial \beta _{nm}}}=\sum _{k=1}^{K}(p_{nk}x_{mk}-\Delta (n,y_{k})x_{mk})}
coefficients may be entered into the logistic regression equation to estimate the probability of passing the exam.
{\displaystyle D=-2\ln {\frac {\text{likelihood of the fitted model}}{\text{likelihood of the saturated model}}}.}
denotes the cases belonging to the less frequent category in the dependent variable. Thus a study designed to use
i.e. the latent variable can be written directly in terms of the linear predictor function and an additive random
seems fairly arbitrary, but it makes the mathematics work out, and it may be possible to justify its use through
, the output indicates that hours studying is significantly associated with the probability of passing the exam (
. If the problem was changed so that pass/fail was replaced with the grade 0–100 (cardinal numbers), then simple
applications such as prediction of a customer's propensity to purchase a product or halt a subscription, etc. In
713:, especially for predicting the probability of failure of a given process, system or product. It is also used in
{\displaystyle \ell _{\varphi }=\sum _{k=1}^{K}\left(y_{k}\ln(p_{\varphi })+(1-y_{k})\ln(1-p_{\varphi })\right)}
parameters which minimize the sum of the squares of the residuals (the squared error term) for each data point:
Wibbenmeyer, Matthew J.; Hand, Michael S.; Calkin, David E.; Venn, Tyron J.; Thompson, Matthew P. (June 2013).
Tolles, Juliana; Meurer, William J (2016). "Logistic Regression Relating Patient Characteristics to Outcomes".
" consisting of two categories: "pass" or "fail" corresponding to the categorical values 1 and 0 respectively.
Regression Modeling Strategies: With Applications to Linear Models, Logistic Regression, and Survival Analysis
is an extension of multinomial logit that allows for correlations among the choices of the dependent variable.
are parameters of the model. An additional generalization has been introduced in which the base of the model (
{\displaystyle t=\log _{b}{\frac {p}{1-p}}=\beta _{0}+\beta _{1}x_{1}+\beta _{2}x_{2}+\cdots +\beta _{M}x_{M}}
31680:"On the Rate of Growth of the Population of the United States since 1790 and Its Mathematical Representation"
{\displaystyle e^{{\boldsymbol {\beta }}_{0}\cdot \mathbf {X} _{i}}=e^{\mathbf {0} \cdot \mathbf {X} _{i}}=1}
— thereby matching the potential range of the linear prediction function on the right side of the equation.
. The sum of these probabilities over all categories must equal 1. Using the mathematically convenient base
The table shows the number of hours each student spent studying, and whether they passed (1) or failed (0).
). Equivalently, in the latent variable interpretations of these two methods, the first assumes a standard
In order to prove that this is equivalent to the previous model, the above model is overspecified, in that
. Log loss is always greater than or equal to 0, equals 0 only in case of a perfect prediction (i.e., when
{\displaystyle \sum _{m=0}^{M}\lambda _{nm}x_{mk}={\boldsymbol {\lambda }}_{n}\cdot {\boldsymbol {x}}_{k}}
{\displaystyle p={\frac {1}{1+b^{-(\beta _{0}+\beta _{1}x_{1}+\beta _{2}x_{2}+\cdots +\beta _{m}x_{m})}}}}
30239:"Risk Preferences in Strategic Wildfire Decision Making: A Choice Experiment with U.S. Wildfire Managers"
outcomes, is the way the probability of a particular outcome is linked to the linear predictor function:
(1966). "Some procedures connected with the logistic qualitative response curve". In F. N. David (ed.).
{\displaystyle \log {\frac {p}{1-p}}=\beta _{0}+\beta _{1}x_{1}+\beta _{2}x_{2}+\cdots +\beta _{m}x_{m}}
itself, which is the probability that the given data set is produced by a particular logistic function:
It turns out that this formulation is exactly equivalent to the preceding one, phrased in terms of the
{\displaystyle \ell =\sum _{k=1}^{K}\sum _{n=0}^{N}\Delta (n,y_{k})\,\ln(p_{n}({\boldsymbol {x}}_{k}))}
30830:"Modern modelling techniques are data hungry: a simulation study for predicting dichotomous endpoints"
values. Again, we can conceptually consider the fit of the proposed model to every permutation of the
{\displaystyle {\frac {\mathrm {d} y}{\mathrm {d} X}}=y(1-y){\frac {\mathrm {d} f}{\mathrm {d} X}}.\,}
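The derivative identity above is easy to verify numerically. This sketch uses a hypothetical linear f(X) = 2X + 1 (my own choice, purely for illustration) and compares a central difference against y(1 - y) df/dX:

```python
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

f = lambda X: 2.0 * X + 1.0          # hypothetical f with df/dX = 2
y = lambda X: sigmoid(f(X))

X0, h = 0.3, 1e-6
numeric = (y(X0 + h) - y(X0 - h)) / (2 * h)   # central-difference dy/dX
analytic = y(X0) * (1 - y(X0)) * 2.0          # y(1-y) * df/dX
```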
of the Bernoulli distribution, specifying the probability of seeing each of the two possible outcomes.
{\displaystyle p_{i}={\frac {1}{1+e^{-(\beta _{0}+\beta _{1}x_{1,i}+\cdots +\beta _{k}x_{k,i})}}}.\,}
and zero otherwise. In the case of two explanatory variables, this indicator function was defined as
parameters to zero yielding a set of equations which will hold at the maximum of the log likelihood:
from the linear regression equation (the value of the criterion when the predictor is equal to zero).
This table shows the estimated probability of passing the exam for several values of hours studying.
Similarly, for a student who studies 4 hours, the estimated probability of passing the exam is 0.87:
{\displaystyle N^{-1}\log L(\theta \mid y;x)=N^{-1}\sum _{i=1}^{N}\log \Pr(y_{i}\mid x_{i};\theta )}
{\displaystyle {\boldsymbol {\beta }}_{n}={\boldsymbol {\lambda }}_{n}-{\boldsymbol {\lambda }}_{0}}
{\displaystyle {\mathcal {L}}_{norm}=\sum _{k=1}^{K}\alpha _{k}\left(1-\sum _{n=1}^{N}p_{nk}\right)}
as a linear predictor, we separate the linear predictor into two, one for each of the two outcomes:
(i.e., log-odds or natural logarithm of the odds) is equivalent to the linear regression expression.
are the appropriate Lagrange multipliers. The Lagrangian is then the sum of the above three terms:
24575:
24571:
24101:
23938:
22441:
21753:
17268:
14428:
14395:
is equivalent to setting the scale parameter to 1 and then dividing all regression coefficients by
13654:
13449:
the explanatory variable. In the case of a dichotomous explanatory variable, for instance, gender
13405:(MAP) estimation, an extension of maximum likelihood. (Regularization is most commonly done using
13336:
13332:
10798:
10670:
10624:
10555:
10016:
8042:
that are specific to the model at hand but the same for all trials. The linear predictor function
7889:
1824:, is taken as a measure of the goodness of fit, and the best fit is obtained when that function is
726:
686:), which is widely used to predict mortality in injured patients, was originally developed by Boyd
615:
486:
409:
302:
281:
30635:
30200:"Household-Level Model for Hurricane Evacuation Destination Type Choice Using Hurricane Ivan Data"
29508:, which greatly increased the scope of application and the popularity of the logit model. In 1973
27969:
26736:
Setting the derivative of the Lagrangian with respect to one of the probabilities to zero yields:
20895:
18030:
17988:
15677:{\displaystyle \varepsilon =\varepsilon _{1}-\varepsilon _{0}\sim \operatorname {Logistic} (0,1).}
11339:
11091:
10899:
10085:
10052:
8781:. However, in some cases it can be easier to communicate results by working in base 2 or base 10.
6197:
studied, the odds of passing (group 1) or failing (group 0) are (expectedly) 2 to 1 (Denis, 2019).
34199:
33812:
33752:
33689:
33327:
33311:
33049:
32911:
32901:
32751:
32665:
31634:
Reports on Biological Standards: Methods of biological assay depending on a quantal response. III
24618:). In linear regression, the significance of a regression coefficient is assessed by computing a
24515:
Four of the most commonly used indices and one less commonly used one are examined on this page:
21622:
20525:
16672:
12776:
6008:
3569:{\displaystyle 0={\frac {\partial \ell }{\partial \beta _{1}}}=\sum _{k=1}^{K}(y_{k}-p_{k})x_{k}}
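The two score equations can be checked numerically: fitting by plain gradient ascent on the log-likelihood drives both residual sums to zero. The data below are hypothetical, assumed only for illustration:

```python
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

# Hypothetical pass/fail data (hours studied, outcome), for illustration only.
xs = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
ys = [0, 0, 0, 1, 0, 1, 1, 1]

# Plain gradient ascent on the log-likelihood.
b0, b1 = 0.0, 0.0
for _ in range(20000):
    g0 = sum(y - sigmoid(b0 + b1 * x) for x, y in zip(xs, ys))
    g1 = sum((y - sigmoid(b0 + b1 * x)) * x for x, y in zip(xs, ys))
    b0 += 0.05 * g0
    b1 += 0.05 * g1

# At the maximum both score equations hold:
# sum(y_k - p_k) ≈ 0 and sum((y_k - p_k) x_k) ≈ 0.
print(abs(g0) < 1e-6, abs(g1) < 1e-6)
```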
Hosmer, D.W. (1997). "A comparison of goodness-of-fit tests for the logistic regression model".

Mesa-Arango, Rodrigo; Hasan, Samiul; Ukkusuri, Satish V.; Murray-Tuite, Pamela (February 2013).

The logistic function was independently rediscovered as a model of population growth in 1920 by

associated with making the associated choice, and thus motivate logistic regression in terms of

is calculated by comparing a given model with the saturated model. This computation gives the

{\displaystyle {\hat {\varepsilon }}_{\varphi }^{2}=\sum _{k=1}^{K}({\overline {y}}-y_{k})^{2}}

difficult to calculate except in very low dimensions. Now, though, automatic software such as

, and outputs a value between zero and one. For the logit, this is interpreted as taking input

The logistic model was likely first used as an alternative to the probit model in bioassay by

{\displaystyle {\mathcal {L}}={\mathcal {L}}_{ent}+{\mathcal {L}}_{fit}+{\mathcal {L}}_{norm}}

is not as much as 10 times greater, it's only the effect on the odds that is 10 times greater.

"No rationale for 1 variable per 10 events criterion for binary logistic regression analysis"

{\displaystyle \operatorname {logit} \operatorname {\mathcal {E}} (Y)=\beta _{0}+\beta _{1}x}

{\displaystyle \operatorname {logit} p=\ln {\frac {p}{1-p}}\quad {\text{for }}0<p<1\,.}

of the event happening for different levels of each independent variable, and then takes its

{\displaystyle \ell =\sum _{k=1}^{K}\left(y_{k}\ln(p(x_{k}))+(1-y_{k})\ln(1-p(x_{k}))\right)}

). We would then use three latent variables, one for each choice. Then, in accordance with

{\displaystyle Y_{i}^{\ast }={\boldsymbol {\beta }}\cdot \mathbf {X} _{i}+\varepsilon _{i}\,}

for a given observation. The main use-case of a logistic model is to be given an observation

{\displaystyle {\boldsymbol {\beta }}=\{\beta _{0},\beta _{1},\beta _{2},\dots ,\beta _{M}\}}

, determining their optimum values will require numerical methods. One method of maximizing

"A discrete choice model based on random utilities for exit choice in emergency evacuations"

pairs are drawn uniformly from the underlying distribution, then in the limit of large

{\displaystyle {\boldsymbol {\beta }}={\boldsymbol {\beta }}_{1}-{\boldsymbol {\beta }}_{0}}

ensuring that the result is a distribution. This can be seen by exponentiating both sides:

{\displaystyle {\boldsymbol {\beta }}={\boldsymbol {\beta }}_{1}-{\boldsymbol {\beta }}_{0}}

models—makes clear the relationship between logistic regression (the "logit model") and the

(which sets the mean) is equivalent to a distribution with a zero location parameter, where

{\displaystyle 0={\frac {\partial \ell }{\partial \beta _{0}}}=\sum _{k=1}^{K}(y_{k}-p_{k})}

of the probability of success is then fitted to the predictors. The predicted value of the

in logistic regression. When Bayesian inference was performed analytically, this made the

So we define odds of the dependent variable equaling a case (given some linear combination

The assumption of linear predictor effects can easily be relaxed using techniques such as

assuming that all the observations in the sample are independently Bernoulli distributed,

which is proportional to the square of the (uncorrected) sample standard deviation of the

is simply the sum of all un-normalized probabilities, and by dividing each probability by

regardless of settings of explanatory variables. Similarly, an arbitrary scale parameter

is not observed, only the outcome of an individual Bernoulli trial using that probability.

(Discrete variables referring to more than two possible choices are typically coded using

and the maximization procedure can be accomplished by solving the above two equations for

"A simulation study of the number of events per variable in logistic regression analysis"

, and so we can estimate how significant an improvement is given by the inclusion of the

has been added to the intercept coefficient. Both situations produce the same value for

is the estimate of the odds of having the outcome for, say, males compared with females.

. The optimum beta coefficients may again be found by maximizing the log-likelihood. For

"Comparison of Logistic Regression and Linear Discriminant Analysis: A Simulation Study"
27156:{\displaystyle p_{nk}=e^{{\boldsymbol {\lambda }}_{n}\cdot {\boldsymbol {x}}_{k}}/Z_{k}}
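Dividing each un-normalized score by the normalization constant Z_k (the sum of all the scores) guarantees the probabilities sum to one. A minimal sketch with made-up coefficient vectors, assumed only for illustration:

```python
import math

# Hypothetical coefficient vectors lambda_n (one per outcome n) and one x_k;
# x_0k = 1 plays the role of the intercept.
lambdas = [[0.0, 0.0], [1.0, -0.5], [-0.3, 0.8]]
x_k = [1.0, 2.0]

scores = [math.exp(sum(l * x for l, x in zip(lam, x_k))) for lam in lambdas]
Z_k = sum(scores)                  # normalization constant
p = [s / Z_k for s in scores]      # p_nk = e^{lambda_n . x_k} / Z_k

print(round(sum(p), 12))  # → 1.0
```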
will yield a minimum error less than or equal to the minimum error using the original

{\displaystyle Y_{i}\,\sim \operatorname {Bin} (n_{i},p_{i}),{\text{ for }}i=1,\dots ,n}

{\displaystyle p={\frac {1}{1+e^{-t}}}\approx 0.87={\text{Probability of passing exam}}}
4148:{\displaystyle p={\frac {1}{1+e^{-t}}}\approx 0.25={\text{Probability of passing exam}}}
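The two worked probabilities can be reproduced directly from the coefficients quoted in the example (β0 ≈ -4.1, β1 ≈ 1.5 per hour studied):

```python
import math

beta0, beta1 = -4.1, 1.5   # coefficients from the hours-studied example

def p_pass(hours):
    t = beta0 + beta1 * hours
    return 1.0 / (1.0 + math.exp(-t))

print(round(p_pass(2), 2))  # t = -1.1 → ≈ 0.25
print(round(p_pass(4), 2))  # t =  1.9 → ≈ 0.87
```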
to be implemented in order to determine the significance of the explanatory variables.

This shows clearly how to generalize this formulation to more than two outcomes, as in

The particular model used by logistic regression, which distinguishes it from standard

, then take the average of all the 1 and 0 outcomes, then the result would be close to

https://class.stanford.edu/c4x/HumanitiesScience/StatLearning/asset/classification.pdf

"Evaluating trauma care: The TRISS method. Trauma Score and the Injury Severity Score"

{\displaystyle h_{\theta }(X)={\frac {1}{1+e^{-\theta ^{T}X}}}=\Pr(Y=1\mid X;\theta )}

is converted back into predicted odds, via the inverse of the natural logarithm – the

{\displaystyle \beta _{0}=\ln \left({\frac {\overline {y}}{1-{\overline {y}}}}\right)}

{\displaystyle {\mathcal {L}}_{ent}=-\sum _{k=1}^{K}\sum _{n=0}^{N}p_{nk}\ln(p_{nk})}

This model can be fit using the same sorts of methods as the above more basic model.

indicating the relative effect of a particular explanatory variable on the outcome.

The basic idea of logistic regression is to use the mechanism already developed for

An explanation of logistic regression can begin with an explanation of the standard

, 1883). An autocatalytic reaction is one in which one of the products is itself a

+1) fitting constraints and the fitting constraint term in the Lagrangian is then:

-test used in linear regression analysis to assess the significance of prediction.

In linear regression analysis, one is concerned with partitioning variance via the

For logistic regression, the measure of goodness-of-fit is the likelihood function

has also increased, but it has not increased by as much as the odds have increased.

Nouveaux Mémoires de l'Académie Royale des Sciences et Belles-Lettres de Bruxelles

{\displaystyle \Pr(y\mid X;\theta )=h_{\theta }(X)^{y}(1-h_{\theta }(X))^{(1-y)}.}

, which means that it is less sensitive to outlying data (and hence somewhat more

The odds of the dependent variable equaling a case (given some linear combination

"Nonparametric estimation of dynamic discrete choice models for time series data"
with one degree of freedom from 11.6661... to infinity is equal to 0.00063649...
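For one degree of freedom, the chi-squared tail probability has a closed form in terms of the complementary error function, P(χ² > D) = erfc(√(D/2)), so the quoted p-value can be checked without a statistics library:

```python
import math

D = 11.6661  # deviance value quoted in the text
# For one degree of freedom: P(chi-squared > D) = erfc(sqrt(D / 2)).
p_value = math.erfc(math.sqrt(D / 2))
print(round(p_value, 5))  # ≈ 0.00064
```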
Also, as an analog to the error of the linear regression case, we may define the

subscript denotes the null model. It is seen that the null model is optimized by

{\displaystyle \Pr(\varepsilon _{0}=x)=\Pr(\varepsilon _{1}=x)=e^{-x}e^{-e^{-x}}}

) data. The curve shows the probability of passing an exam versus hours studying.

can each be a binary variable (two classes, coded by an indicator variable) or a

We wish to fit a logistic function to the data consisting of the hours studied (

Proceedings of the National Academy of Sciences of the United States of America

Peduzzi, P; Concato, J; Kemper, E; Holford, TR; Feinstein, AR (December 1996).

Lovreglio, Ruggiero; Borri, Dino; dell’Olio, Luigi; Ibeas, Angel (2014-02-01).

{\displaystyle \ell =\sum _{k=1}^{K}\sum _{n=0}^{N}\Delta (n,y_{k})\ln(p_{nk})}

are normally placed on the regression coefficients, for example in the form of

the vector of response variables. More details can be found in the literature.

. This function is also preferred because its derivative is easily calculated:

Multinomial logistic regression: Many explanatory variables and many categories

For a more compact notation, we will specify the explanatory variables and the

into the equation gives the estimated probability of passing the exam of 0.25:

[Mathematical Researches into the Law of Population Growth Increase].

An extension of the logistic model to sets of interdependent variables is the

so knowing one automatically determines the other. As a result, the model is

which have been determined by the above method. To be concrete, the model is:

{\displaystyle t=\sum _{m=0}^{M}\beta _{m}x_{m}={\boldsymbol {\beta }}\cdot x}

, explanatory variables, predictor variables, features, or attributes), and a

Berkson, Joseph (1944). "Application of the Logistic Function to Bio-Assay".

"Relaxing the Rule of Ten Events per Variable in Logistic and Cox Regression"

Proceedings of the Sixth Conference on Natural Language Learning (CoNLL-2002)

The logistic function was independently developed in chemistry as a model of

{\displaystyle \varepsilon ^{2}=\sum _{k=1}^{K}(b_{0}+b_{1}x_{k}-y_{k})^{2}.}

equaling a success/case rather than a failure/non-case. It is clear that the

=1 for pass, 0 for fail). The data points are indexed by the subscript

10.1002/(sici)1097-0258(19970515)16:9<965::aid-sim509>3.3.co;2-f

Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences

, where he coined "logit", by analogy with "probit", and continuing through

{\displaystyle D=2({\hat {\ell }}-{\hat {\ell }}_{\varphi })=11.6661\ldots }

can be viewed as an indicator for whether this latent variable is positive:

{\displaystyle f(i)=\beta _{0}+\beta _{1}x_{1,i}+\cdots +\beta _{m}x_{m,i},}

{\displaystyle p(x)=\sigma (t)={\frac {1}{1+e^{-(\beta _{0}+\beta _{1}x)}}}}

, as an addendum to Bliss's work. The probit model was principally used in

data points are fitted in a probabilistic sense to a function of the form:

{\displaystyle \mu (i)={\frac {1}{1+e^{-\mathbf {w} ^{T}\mathbf {x} (i)}}}}

An equivalent formula uses the inverse of the logit function, which is the

Theil, Henri (1969). "A Multinomial Extension of the Linear Logit Model".

(Technical report). Vol. 119. Tinbergen Institute. pp. 167–178.

(treating the dependent variable in the binomial case as the outcome of a

outcomes: The data points are fitted to a null model function of the form

logistic regression is specifically intended to be used in this situation.

. A single-layer neural network computes a continuous output instead of a

= 0 and 1). For the simple binary logistic regression model, we assumed a

This makes it possible to write the linear predictor function as follows:

Peduzzi, P.; J. Concato; E. Kemper; T.R. Holford; A.R. Feinstein (1996).

Research Papers in Probability and Statistics (Festschrift for J. Neyman)

(1958). "The regression analysis of binary sequences (with discussion)".

{\displaystyle \Pr(\varepsilon _{i}<x)=\operatorname {logit} ^{-1}(x)}
4938:{\displaystyle \sigma (t)={\frac {e^{t}}{e^{t}+1}}={\frac {1}{1+e^{-t}}}}
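The equality of the two forms of σ(t), and the symmetry σ(t) + σ(-t) = 1 around one half, can be verified directly:

```python
import math

def sigma(t):
    return 1.0 / (1.0 + math.exp(-t))

for t in (-3.0, -0.5, 0.0, 0.5, 3.0):
    alt = math.exp(t) / (math.exp(t) + 1.0)         # the e^t / (e^t + 1) form
    assert abs(sigma(t) - alt) < 1e-12              # the two expressions agree
    assert abs(sigma(t) + sigma(-t) - 1.0) < 1e-12  # symmetry about 1/2
print("ok")
```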
The sum of these, the total loss, is the overall negative log-likelihood

, as probability distributions on the two-element space of (pass, fail).

of the associated probability as a linear predictor, with an extra term

, but the effect on the odds is 10 times greater. But the effect on the

For a continuous independent variable the odds ratio can be defined as:

is the regression coefficient multiplied by some value of the predictor.

of the standard logistic function. It is easy to see that it satisfies:

). Rather than the Wald method, the recommended method to calculate the

Gareth James; Daniela Witten; Trevor Hastie; Robert Tibshirani (2013).

can be expected to have a better fit (smaller deviance) than the given

The parameters of a logistic regression are most commonly estimated by

"Recherches mathématiques sur la loi d'accroissement de la population"

van der Ploeg, Tjeerd; Austin, Peter C.; Steyerberg, Ewout W. (2014).

"On the problem of the most efficient tests of statistical hypotheses"
21147:{\displaystyle \mathbf {S} =\operatorname {diag} (\mu (i)(1-\mu (i)))}
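The weight matrix S above is what the Newton-Raphson (equivalently, iteratively reweighted least squares) update uses: w ← w + (XᵀSX)⁻¹ Xᵀ(y − μ). A minimal NumPy sketch on made-up data, assumed only for illustration:

```python
import numpy as np

# Made-up data: an intercept column plus one noisy feature.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
y = (0.5 * X[:, 1] + rng.normal(size=50) > 0).astype(float)

# Newton-Raphson / IRLS iterations using S = diag(mu(i)(1 - mu(i))).
w = np.zeros(2)
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-X @ w))   # mu(i)
    S = np.diag(mu * (1.0 - mu))        # the weight matrix from the text
    w += np.linalg.solve(X.T @ S @ X, X.T @ (y - mu))

mu = 1.0 / (1.0 + np.exp(-X @ w))
score = X.T @ (y - mu)                  # gradient of the log-likelihood
print(np.max(np.abs(score)) < 1e-8)     # score ≈ 0 at convergence
```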
{\displaystyle L=\prod _{k:y_{k}=1}p_{k}\,\prod _{k:y_{k}=0}(1-p_{k})}

, an extension of logistic regression to sequential data, are used in

"Notice sur la loi que la population poursuit dans son accroissement"

"A comparison of algorithms for maximum entropy parameter estimation"

For example, the indicator function in this case could be defined as

data when the strata are small. It is mostly used in the analysis of

, showing that the multinomial logit followed from the assumption of

algorithm. The goal is to model the probability of a random variable

{\displaystyle {\boldsymbol {x}}_{k}=\{x_{0k},x_{1k},\dots ,x_{Mk}\}}

{\displaystyle \varepsilon _{i}\sim \operatorname {Logistic} (0,1)\,}

, which, again, will generally require the use of numerical methods.

. Springer Series in Statistics (2nd ed.). New York; Springer.

The Lagrangian will be expressed as a function of the probabilities

, etc.), the logistic regression solution is unique in that it is a

and the probability that the outcome was in category 0 was given by

{\displaystyle {\boldsymbol {x}}=\{x_{0},x_{1},x_{2},\dots ,x_{M}\}}

data, where each outcome is determined by an unobserved probability

and equivalently, after exponentiating both sides we have the odds:

{\displaystyle t=\beta _{0}+2\beta _{1}\approx -4.1+2\cdot 1.5=-1.1}

), and approaches infinity as the prediction gets worse (i.e., when

), and the logistic model has been the most commonly used model for

"The Equivalence of Logistic Regression and Maximum Entropy models"

. The Wald statistic also tends to be biased when data are sparse.
24723:{\displaystyle W_{j}={\frac {\beta _{j}^{2}}{SE_{\beta _{j}}^{2}}}}
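Computing the Wald statistic needs only the coefficient estimate and its standard error; since W_j is chi-squared with one degree of freedom, its p-value is erfc(√(W/2)). The numbers below are hypothetical, for illustration only:

```python
import math

# Hypothetical coefficient estimate and standard error.
beta_j, se_j = 1.5, 0.6
W = beta_j**2 / se_j**2                # Wald statistic, chi-squared with 1 df
p_value = math.erfc(math.sqrt(W / 2))  # tail probability for 1 df
print(W, round(p_value, 4))
```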
This effectively means that about 6 out of 10,000 fits to random

. If the problem is written in vector matrix form, with parameters

, which uses an error variable distributed according to a standard

which can be equal to any integer in . The log-likelihood is then:

{\displaystyle p_{0}({\boldsymbol {x}})=1-p_{1}({\boldsymbol {x}})}

after transformation, the resulting expression for the probability

{\displaystyle t=\beta _{0}+4\beta _{1}\approx -4.1+4\cdot 1.5=1.9}

For example, for a student who studies 2 hours, entering the value

{\displaystyle \varepsilon ^{2}=\sum _{k=1}^{K}(b_{0}-y_{k})^{2}.}

Written using the more compact notation described above, this is:

{\displaystyle f(i)={\boldsymbol {\beta }}\cdot \mathbf {X} _{i},}

If there are multiple explanatory variables, the above expression

{\displaystyle {\frac {p(x)}{1-p(x)}}=e^{\beta _{0}+\beta _{1}x}.}

"The Determination of L.D.50 and Its Sampling Error in Bio-Assay"

, which makes the slopes the same at the origin. This shows the

reign, while the "log-linear" formulation here is more common in

{\displaystyle p(x)={\frac {1}{1+e^{-(\beta _{0}+\beta _{1}x)}}}}

Data Mining Techniques For Marketing, Sales and Customer Support

A common alternative to the logistic model (logit model) is the

The maximum value of the log-likelihood for the simple model is

{\displaystyle p_{\varphi }(x)={\frac {1}{1+e^{-t_{\varphi }}}}}

parameter estimates is as the additive effect on the log of the
2542:{\displaystyle \ell _{k}=-y_{k}\ln p_{k}-(1-y_{k})\ln(1-p_{k}).}
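This per-point loss is the cross-entropy of the observed outcome against the predicted probability; it is small for confident correct predictions and large for confident wrong ones. A direct transcription:

```python
import math

def point_loss(y, p):
    """Negative log-likelihood of one outcome y (0 or 1) under predicted probability p."""
    return -y * math.log(p) - (1 - y) * math.log(1 - p)

print(round(point_loss(1, 0.9), 4))  # confident correct prediction → 0.1054
print(round(point_loss(1, 0.1), 4))  # confident wrong prediction → 2.3026
```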
(for example the proportional odds ordinal logistic model). See

Logistic regression is an alternative to Fisher's 1936 method,

equal to the difference in the number of parameters estimated.

The minimum value which constitutes the fit will be denoted by

should re-examine the data, as there may be some kind of error.

, we can then interpret the latent variables as expressing the

: conditioned on the explanatory variables, it follows a

For a binary independent variable the odds ratio is defined as

), based on observed characteristics of the patient (age, sex,

Cramer, J. S. (2004). "The early origins of the logit model".

. Statistical Horizons LLC and the University of Pennsylvania.

, which describe the probability that the categorical outcome

The model is usually put into a more compact form as follows:

Analogous linear models for binary variables with a different

These arbitrary probability units have been called 'probits'.

(inverse logistic function), while the probit model uses the

In the case of simple binary logistic regression, the set of

{\displaystyle p({\boldsymbol {x}})=p_{1}({\boldsymbol {x}})}

between the predictor variable and the log-odds (also called

This exponential relationship provides an interpretation for

The logistic regression analysis gives the following output.

responses: it is a simple, well-analyzed baseline model; see

. New York: Academic Press. pp. 105–142. Archived from

Park, Byeong U.; Simar, Léopold; Zelenyuk, Valentin (2017).

The multinomial logit model was introduced independently in

, and had been preceded by earlier work dating to 1860; see

of particular outcomes rather than the outcomes themselves.

Imposing the normalization constraint, we can solve for the

{\displaystyle {\hat {\ell }}\geq {\hat {\ell }}_{\varphi }}

In short, for logistic regression, a statistic known as the

Yet another formulation uses two separate latent variables:

This formulation expresses logistic regression as a type of
10203:{\displaystyle t=\log _{10}{\frac {p}{1-p}}=-3+x_{1}+2x_{2}}
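Because this example uses base-10 log-odds, inverting it gives p = 10^t / (1 + 10^t). A sketch of evaluating the model t = -3 + x1 + 2 x2 from the text:

```python
def probability(x1, x2):
    t = -3 + x1 + 2 * x2        # base-10 log-odds from the model in the text
    return 10**t / (1 + 10**t)  # invert: p = 10^t / (1 + 10^t)

print(round(probability(0, 0), 6))  # t = -3 → p = 1/1001 ≈ 0.000999
print(probability(3, 0))            # t = 0 → p = 0.5
```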
is interpreted as the probability of the dependent variable

"Conditional Logit Analysis of Qualitative Choice Behavior"

Philosophical Transactions of the Royal Society of London A
{\displaystyle p_{i}=\operatorname {\mathbb {E} } \left[\left.{\frac {Y_{i}}{n_{i}}}\,\right|\,\mathbf {X} _{i}\right]\,,}
separate probabilities, one for each category, indexed by

(MLE). This does not have a closed-form expression, unlike

Berkson, Joseph (1951). "Why I Prefer Logits to Probits".

Various refinements occurred during that time, notably by

A detailed history of the logistic regression is given in

, we may then estimate how many of these permuted sets of

An example of this distribution is the fraction of seeds (

are fixed, we can easily compute either the log-odds that

. This linear relationship may be extended to the case of

, where the two values are labeled "0" and "1", while the

Logistic regression can be seen as a special case of the

which is maximized using optimization techniques such as

{\displaystyle {\hat {\ell }}_{\varphi }=-13.8629\ldots }

is associated not with a single Bernoulli trial but with

. This can be shown as follows, using the fact that the

{\displaystyle {\text{odds}}=e^{\beta _{0}+\beta _{1}x}.}

since about 1970. Binary variables can be generalized to

, as the related names suggest. From the perspective of

in linear regression models is generally measured using

The regression coefficients are usually estimated using

{\displaystyle {\boldsymbol {\beta }}_{0}=\mathbf {0} .}

-th set of measured explanatory variables be denoted by

is added, with a fixed value of 1, corresponding to the

. New York: Cambridge University Press. pp. 6–37.

, p. 4, "He did not say how he fitted the curves."

{\displaystyle \Pr(Y=0\mid X;\theta )=1-h_{\theta }(X)}

In linear regression the squared multiple correlation,

This functional form is commonly called a single-layer

at the end. This term, as it turns out, serves as the

. Cambridge, UK New York: Cambridge University Press.

International Journal of Machine Tools and Manufacture

be the probability, given explanatory variable vector

) are fitted to a proposed model function of the form

can be found using the following iterative algorithm:

The logistic model has an equivalent formulation as a

and, as in the example above, two categorical values (

{\displaystyle \mu =-\beta _{0}/\beta _{1}\approx 2.7}

-intercept and slope of the log-odds as a function of

Vittinghoff, E.; McCulloch, C. E. (12 January 2007).

"How to Interpret Odds Ratio in Logistic Regression?"

so that the normalization term in the Lagrangian is:
25402:
25365:
25336:
25306:
25150:
25066:
24999:
24979:
24841:
24811:
24784:
24757:
24660:
24581:
24420:
24400:
24285:
24169:
24118:
24046:
23972:
23862:
23821:
23773:
23625:
23564:
23520:
23443:
23413:
23276:
23229:
23080:
23034:
22954:
22925:
22765:
22710:
22681:
22650:
22585:
22546:
22519:
22450:
22311:
22276:
22242:
22202:
22182:
22150:
22123:
22023:
21959:
21843:
21787:
21697:
21691:
of participants in the study will require a total of
21677:
21653:
21500:
21392:
21227:
21160:
21087:
20923:
20898:
20815:
20715:
20635:
20593:
20567:
20171:
19964:
19857:
19731:
19601:
19515:
19331:
19232:
19036:
18943:
18900:
18176:
18075:
18033:
17991:
17859:
17716:
17392:
17292:
17076:
17040:
16850:
15693:
15617:
15564:
15511:
15360:
15245:
15109:
14942:
14536:
14460:
14237:
14159:
14084:
13670:
13492:
13455:
13349:
13167:
12911:
12842:
12813:
12779:
12596:
12517:
12383:
12353:
12324:
12241:
12186:
12160:
12040:
11988:
11946:
11917:
11879:
11721:
11676:
11548:
11497:
11467:
11430:
11388:
11357:
11315:
11284:
11257:
11227:
11201:
11174:
11154:
11127:
11094:
11066:
11035:
11009:
10982:
10962:
10935:
10902:
10847:
10801:
10775:
10745:
10719:
10673:
10627:
10601:
10558:
10529:
10217:
10124:
10088:
10055:
10019:
9993:
9967:
9788:
9588:
9559:
9532:
9499:
9469:
9438:
9416:
9390:
9364:
9337:
9317:
9286:
9133:
9104:
9019:
8920:
8833:
8796:
8761:
8723:
8574:
8541:
8498:
To begin with, we may consider a logistic model with
8457:
8366:
8196:
8084:
8048:
7390:
7195:
7053:
6918:
6862:
6835:
6668:
6625:
6560:
6523:
6496:
6205:
6124:
6101:
6073:
6044:
6011:
5978:
5947:
5917:
5888:
5862:
5818:
5798:
5693:
5507:
5468:
5436:
5416:
5386:
5338:
5311:
5288:
5259:
5152:
5103:
5051:
5028:
5004:
4984:
4961:
4856:
4840:{\displaystyle \sigma :\mathbb {R} \rightarrow (0,1)}
4807:
4775:
4734:
4687:
4658:
4604:
4570:
4407:
4372:
4246:
4167:
4096:
4014:
3985:
3953:
3922:
3865:
3807:
3760:
3723:
3683:
3652:
3618:
3587:
3469:
3363:
3331:
3300:
3261:
3230:
3095:
2831:
2788:
2757:
2726:
2699:
2628:
2562:
2440:
2393:
2362:
2327:
2294:
2261:
2228:
2195:
2162:
2129:
2096:
2063:
2032:
1875:
1843:
1790:
1759:
1718:
1687:
1656:
1605:
1522:
1471:
1418:
1369:
1313:
1221:
1167:
1076:
991:
965:
13445:for a unit change in the
13337:probability distributions
11489:categories, we will need
9961:Consider an example with
8418:using the notation for a
8032:linear predictor function
8013:Linear predictor function
7975:probability mass function
4320:
4313:
3790:which yields a value for
1857:{\displaystyle \ell _{k}}
749:Logistic regression is a
737:and safer design for the
727:Conditional random fields
555:Least absolute deviations
34291:Environmental statistics
33813:Elliptical distributions
33606:Generalized linear model
33535:Simple linear regression
33305:Hodges–Lehmann estimator
32762:Probability distribution
32671:Stochastic approximation
32233:Coefficient of variation
31632:Gaddum, John H. (1933).
31372:. In P. Zarembka (ed.).
30847:10.1186/1471-2288-14-137
29576:conditional random field
29364:Pierre François Verhulst
29362:and named "logistic" by
29240:generalized linear model
27747:generalized linear model
25396:data points, indexed by
25282:
25056:of the probability, the
24609:Coefficient significance
24403:likelihood of null model
24340:likelihood of null model
24206:likelihood of null model
24102:chi-squared distribution
23939:chi-squared distribution
23856:so that the deviance is
22675:is the probability that
22442:chi-squared distribution
22189:{\displaystyle \varphi }
20709:, explanatory variables
17269:probability distribution
16798:This clearly shows that
16665:Talk:Logistic regression
14429:generalized linear model
14059:, there is a continuous
13655:probability distribution
13333:generalized linear model
12894:and from other types of
11221:increase by a factor of
11029:increase by a factor of
7890:probability distribution
3206:This method is known as
2024:" of the actual outcome
1036:
1029:variable is called the "
1021:variable is called the "
303:Generalized linear model
33951:Cross-correlation (XCF)
33559:Non-standard predictors
32993:Lehmann–Scheffé theorem
32666:Adaptive clinical trial
31795:Agresti, Alan. (2002).
30609:Journal of Econometrics
30555:10.3115/1118853.1118871
30540:Malouf, Robert (2002).
30437:Everitt, Brian (1998).
29901:Hepato-Gastroenterology
29772:(2nd ed.). Wiley.
29643:Multinomial logit model
29438:, and the model fit by
29404:Pearl & Reed (1920)
29265:{\displaystyle y\mid x}
27762:{\displaystyle \theta }
25934:The log-likelihood is:
25385:{\displaystyle x_{0}=1}
25060:is defined as follows:
24091:In the above equation,
24005:Goodness of fit summary
21623:expectation propagation
19832:) that germinate after
19281:Most treatments of the
16816:As a "log-linear" model
13657:(specifically, using a
13384:Both the probabilities
12807:-th coefficient of the
11982:is 1. The selection of
11422:In general, if we have
11251:Note how the effect of
11244:{\displaystyle 10^{2}.}
10762:{\displaystyle 10^{-3}}
9987:explanatory variables,
9493:measurements, defining
8565:explanatory variables:
8502:explanatory variables,
8238:regression coefficients
8040:regression coefficients
7366:are described as being
7357:Formally, the outcomes
6517:: The odds multiply by
5970:ranges between 0 and 1.
5843:{\displaystyle g(p(x))}
5253:In the logistic model,
5143:can now be written as:
4847:is defined as follows:
4583:{\displaystyle p=0.017}
2314:{\displaystyle y_{k}=0}
2248:{\displaystyle y_{k}=1}
2215:{\displaystyle y_{k}=0}
2182:{\displaystyle p_{k}=0}
2149:{\displaystyle y_{k}=1}
2116:{\displaystyle p_{k}=1}
1738:{\displaystyle 1-p_{k}}
1363:-intercept of the line
735:disaster managing plans
34347:Mathematics portal
34168:Engineering statistics
34076:Nelson–Aalen estimator
33653:Analysis of covariance
33540:Ordinary least squares
33464:Pearson product-moment
32868:Statistical functional
32779:Empirical distribution
32612:Controlled experiments
32341:Frequency distribution
32119:Descriptive statistics
31923:Hosmer, David (2013).
31861:"The Simple Dichotomy"
31841:. Marcel Dekker, Inc.
31603:Fisher, R. A. (1935).
31557:Cramer, J. S. (2002).
31118:. Springer. p. 6.
31095:Rodríguez, G. (2007).
31009:. New York: Springer.
30586:(2nd ed.). SAGE.
30525:
30402:10.1098/rsta.1933.0009
30204:Natural Hazards Review
29963:Critical Care Medicine
29710:10.1001/jama.2016.7653
29408:L. Gustave du Pasquier
29274:Bernoulli distribution
29266:
29242:and thus analogous to
29220:
29189:
29151:
28699:
28620:
28578:
28529:
28431:
28162:
28039:
27998:
27957:
27876:
27763:
27736:
27697:
27638:
27609:
27580:
27547:
27516:
27484:
27483:{\displaystyle p_{nk}}
27454:
27423:
27389:
27357:
27315:
27216:
27157:
27072:
27071:{\displaystyle p_{nk}}
27031:
26970:
26931:
26866:
26727:
26618:
26595:
26553:
26488:
26464:
26414:
26336:
26302:
26281:
26196:
26118:
26046:
25991:
25970:
25925:
25882:
25861:
25773:
25712:
25661:
25633:
25542:
25499:
25470:
25469:{\displaystyle x_{mk}}
25440:
25386:
25351:
25320:
25209:
25132:
25016:
24987:
24964:
24826:
24799:
24772:
24724:
24596:
24480:
24267:
24147:
24082:
23995:
23925:
23850:
23809:
23743:
23603:
23535:
23505:
23428:
23398:
23257:
23214:
23114:
23062:
23016:
22939:
22910:
22792:
22747:
22695:
22667:
22633:
22563:
22534:
22470:
22400:
22359:
22296:
22257:
22230:
22190:
22170:
22138:
22101:
22057:
21983:
21944:
21877:
21824:
21716:
21685:
21661:
21594:posterior distribution
21582:Gaussian distributions
21569:
21558:
21517:
21482:with a scaled inverse
21460:
21377:
21212:
21148:
21072:
20908:
20886:
20803:
20703:
20607:
20581:
20482:
20153:
19946:
19814:
19679:
19563:
19458:
19272:
19217:
19018:
18928:
18885:
18139:
18063:
18021:
17976:
17833:
17698:
17374:
17245:
17057:
17056:{\displaystyle -\ln Z}
17021:
16625:
15678:
15598:
15551:
15496:rational choice theory
15459:
15342:
15216:
15091:
14895:
14518:
14439:(CDF) of the standard
14364:
14199:
14141:
14026:
13644:
13470:
13375:
13322:
13149:
12860:
12859:{\displaystyle x_{mk}}
12830:
12797:
12764:
12718:
12656:
12547:
12502:
12431:
12410:
12368:
12341:
12297:
12229:
12174:
12145:
12014:
11972:
11934:
11905:
11864:
11829:
11772:
11708:
11664:
11629:
11511:
11481:
11444:
11413:
11376:
11329:
11299:
11272:
11245:
11215:
11189:
11162:
11142:
11121:means that increasing
11115:
11080:
11050:
11049:{\displaystyle 10^{1}}
11023:
10997:
10970:
10950:
10929:means that increasing
10923:
10889:
10835:
10789:
10763:
10733:
10707:
10661:
10621:, when the predictors
10615:
10582:
10543:
10509:
10204:
10109:
10076:
10043:
10007:
9981:
9931:
9892:
9845:
9762:
9689:
9615:
9573:
9547:
9516:
9483:
9457:
9426:
9404:
9378:
9352:
9325:
9301:
9270:
9118:
9085:
9046:
8994:
8907:
8824:-dimensional vectors:
8816:
8769:
8738:
8704:
8555:
8489:
8431:
8409:
8230:
8181:
8065:
7903:Bernoulli distribution
7873:
7209:
7180:
7035:
6900:
6850:
6815:
6790:
6656:
6585:
6545:
6511:
6481:
6198:
6173:
6109:
6081:
6063:Definition of the odds
6052:
6029:
5993:
5964:
5934:
5905:
5870:
5844:
5806:
5775:
5675:
5492:
5458:We can now define the
5444:
5443:{\displaystyle \beta }
5430:and shared parameters
5424:
5401:
5374:
5326:
5296:
5276:
5244:
5137:
5088:
5036:
5012:
4992:
4969:
4939:
4841:
4783:
4749:
4742:
4722:
4675:
4618:
4584:
4424:
4386:
4299:
4231:
4149:
4081:
3999:
3968:
3937:
3900:
3852:
3781:
3747:
3698:
3667:
3633:
3602:
3570:
3526:
3454:
3420:
3346:
3315:
3276:
3245:
3197:
3073:
2972:
2799:
2798:{\displaystyle -\ell }
2772:
2741:
2710:
2709:{\displaystyle -\ell }
2682:
2616:
2543:
2420:
2385:is either 0 or 1, but
2377:
2348:
2315:
2282:
2249:
2216:
2183:
2150:
2117:
2086:, and is a measure of
2078:
2047:
2011:
1858:
1805:
1774:
1749:Bernoulli distribution
1739:
1702:
1671:
1642:
1564:
1551:
1510:
1447:
1406:
1345:
1298:
1198:
1144:
1060:
1011:
1010:{\displaystyle k=K=20}
979:
777:
699:coronary heart disease
634:Mathematics portal
560:Iteratively reweighted
229:ordinary least squares
210:§ Maximum entropy
206:Bernoulli distribution
138:for a worked example.
106:, hence the name. The
36:
34263:Population statistics
34205:System identification
33939:Autocorrelation (ACF)
33867:Exponential smoothing
33781:Discriminant analysis
33776:Canonical correlation
33640:Partition of variance
33502:Regression validation
33346:(Jonckheere–Terpstra)
33245:Likelihood-ratio test
32934:Frequentist inference
32846:Location–scale family
32767:Sampling distribution
32732:Statistical inference
32699:Cross-sectional study
32686:Observational studies
32645:Randomized experiment
32474:Stem-and-leaf display
32276:Central limit theorem
32053:for teaching purposes
31857:Gouriéroux, Christian
31820:Advanced Econometrics
31075:"CS229 Lecture Notes"
30526:
30340:"Logistic Regression"
29864:The Journal of Trauma
29633:Jarrow–Turnbull model
29594:observational studies
29549:polytomous regression
29322:logistic distribution
29278:Gaussian distribution
29267:
29221:
29190:
29152:
28679:
28621:
28619:{\displaystyle (x,y)}
28579:
28509:
28432:
28168:We now calculate the
28163:
28040:
27999:
27958:
27877:
27764:
27737:
27698:
27639:
27610:
27581:
27548:
27517:
27485:
27455:
27424:
27422:{\displaystyle (M+1)}
27390:
27358:
27295:
27217:
27158:
27073:
27032:
26950:
26932:
26846:
26728:
26619:
26575:
26533:
26489:
26444:
26415:
26316:
26282:
26261:
26197:
26098:
26047:
25971:
25950:
25926:
25862:
25841:
25774:
25713:
25673:ranging from 0 to N.
25662:
25634:
25550:-dimensional vector
25543:
25541:{\displaystyle (M+1)}
25500:
25498:{\displaystyle y_{k}}
25471:
25441:
25387:
25352:
25350:{\displaystyle x_{m}}
25321:
25210:
25133:
25017:
24988:
24965:
24827:
24800:
24773:
24742:Case-control sampling
24725:
24632:likelihood-ratio test
24626:Likelihood ratio test
24597:
24481:
24268:
24148:
24083:
24035:likelihood-ratio test
23996:
23926:
23851:
23810:
23744:
23604:
23536:
23506:
23429:
23399:
23258:
23215:
23094:
23063:
23017:
22940:
22911:
22772:
22748:
22696:
22668:
22634:
22564:
22535:
22471:
22469:{\displaystyle 2-1=1}
22401:
22339:
22297:
22258:
22231:
22191:
22171:
22139:
22102:
22037:
21984:
21945:
21857:
21825:
21717:
21715:{\displaystyle 10k/p}
21686:
21669:myocardial infarction
21662:
21559:
21518:
21477:
21461:
21378:
21213:
21149:
21073:
20909:
20887:
20804:
20704:
20623:Bernoulli distributed
20608:
20582:
20483:
20154:
19947:
19815:
19720:binomial distribution
19680:
19564:
19459:
19273:
19218:
19019:
18929:
18886:
18140:
18064:
18022:
17977:
17834:
17699:
17375:
17246:
17058:
17022:
16832:of the probabilities
16809:polynomial regression
16626:
15679:
15599:
15552:
15460:
15343:
15217:
15092:
14896:
14519:
14441:logistic distribution
14365:
14214:logistic distribution
14200:
14142:
14042:latent-variable model
14027:
13645:
13471:
13376:
13323:
13150:
12861:
12831:
12798:
12765:
12698:
12636:
12548:
12503:
12411:
12390:
12369:
12367:{\displaystyle y_{k}}
12342:
12298:
12230:
12175:
12146:
12015:
11973:
11935:
11906:
11865:
11809:
11752:
11709:
11665:
11609:
11512:
11482:
11445:
11414:
11377:
11330:
11300:
11298:{\displaystyle x_{1}}
11273:
11271:{\displaystyle x_{2}}
11246:
11216:
11190:
11188:{\displaystyle x_{2}}
11163:
11143:
11141:{\displaystyle x_{2}}
11116:
11081:
11051:
11024:
10998:
10996:{\displaystyle x_{1}}
10971:
10951:
10949:{\displaystyle x_{1}}
10924:
10890:
10836:
10790:
10764:
10734:
10708:
10662:
10616:
10583:
10544:
10510:
10205:
10110:
10077:
10044:
10008:
9982:
9932:
9872:
9825:
9763:
9669:
9595:
9574:
9548:
9546:{\displaystyle y_{k}}
9526:-th measurement, and
9517:
9484:
9458:
9427:
9405:
9379:
9353:
9326:
9302:
9300:{\displaystyle S_{b}}
9271:
9119:
9086:
9026:
8995:
8908:
8817:
8815:{\displaystyle (M+1)}
8770:
8739:
8705:
8556:
8490:
8428:
8422:between two vectors.
8410:
8231:
8182:
8066:
7874:
7368:Bernoulli-distributed
7310:Explanatory variables
7256:independent variables
7231:consists of a set of
7210:
7181:
7036:
6901:
6851:
6816:
6770:
6657:
6586:
6546:
6512:
6482:
6195:
6174:
6110:
6082:
6053:
6030:
5994:
5965:
5935:
5906:
5871:
5850:illustrates that the
5845:
5807:
5776:
5676:
5493:
5445:
5425:
5402:
5400:{\displaystyle X_{i}}
5375:
5327:
5325:{\displaystyle Y_{i}}
5297:
5277:
5245:
5138:
5089:
5037:
5013:
4993:
4970:
4940:
4842:
4784:
4743:
4723:
4676:
4651:
4619:
4596:likelihood-ratio test
4585:
4425:
4387:
4300:
4232:
4150:
4082:
4000:
3969:
3938:
3901:
3853:
3782:
3748:
3699:
3668:
3634:
3603:
3571:
3506:
3455:
3400:
3347:
3316:
3277:
3246:
3198:
3074:
2952:
2800:
2773:
2742:
2711:
2683:
2617:
2544:
2421:
2378:
2376:{\displaystyle y_{k}}
2349:
2316:
2283:
2250:
2217:
2184:
2151:
2118:
2079:
2077:{\displaystyle p_{k}}
2048:
2046:{\displaystyle y_{k}}
2012:
1859:
1831:The log loss for the
1806:
1775:
1740:
1703:
1701:{\displaystyle y_{k}}
1672:
1670:{\displaystyle p_{k}}
1643:
1569:The usual measure of
1552:
1511:
1448:
1407:
1346:
1299:
1199:
1145:
1044:
1012:
980:
773:
755:binary classification
705:, results of various
591:Regression validation
570:Bayesian multivariate
287:Polynomial regression
151:categorical variables
96:independent variables
65:independent variables
29:
34398:Predictive analytics
34186:Probabilistic design
33771:Principal components
33614:Exponential families
33566:Nonlinear regression
33545:General linear model
33507:Mixed effects models
33497:Errors and residuals
33474:Confounding variable
33376:Bayesian probability
33354:Van der Waerden test
33344:Ordered alternative
33109:Multiple comparisons
32988:Rao–Blackwellization
32951:Estimating equations
32907:Statistical distance
32625:Factorial experiment
32158:Arithmetic-Geometric
32023:at Wikimedia Commons
31767:10.1073/pnas.29.2.79
31705:10.1073/pnas.6.6.275
30881:Econometric Analysis
30469:
30387:(694–706): 289–337,
30187:. Wiley. p. 10.
29653:Hosmer–Lemeshow test
29463:Edwin Bidwell Wilson
29424:Chester Ittner Bliss
29250:
29203:
29164:
28637:
28598:
28447:
28179:
28049:
28008:
27970:
27892:
27776:
27753:
27726:
27648:
27619:
27590:
27561:
27528:
27500:
27464:
27435:
27401:
27370:
27239:
27173:
27085:
27052:
26947:
26743:
26641:
26504:
26441:
26235:
26062:
25941:
25812:
25722:
25696:
25645:
25554:
25520:
25482:
25450:
25400:
25363:
25334:
25304:
25289:Lagrange multipliers
25241:exponential function
25148:
25064:
25030:Like other forms of
24997:
24986:{\displaystyle \pi }
24977:
24839:
24809:
24782:
24755:
24658:
24579:
24572:Hosmer–Lemeshow test
24566:Hosmer–Lemeshow test
24283:
24167:
24116:
24044:
23970:
23860:
23819:
23771:
23623:
23562:
23518:
23441:
23411:
23274:
23227:
23078:
23032:
22952:
22923:
22763:
22708:
22679:
22666:{\displaystyle p(x)}
22648:
22583:
22544:
22517:
22513:is analogous to the
22448:
22309:
22274:
22240:
22200:
22180:
22148:
22121:
22021:
21957:
21841:
21785:
21695:
21675:
21651:
21527:
21498:
21390:
21225:
21158:
21085:
20921:
20896:
20813:
20713:
20633:
20591:
20565:
20169:
19962:
19855:
19729:
19599:
19513:
19479:. The derivative of
19329:
19230:
19034:
18941:
18898:
18174:
18073:
18031:
17989:
17857:
17714:
17390:
17290:
17074:
17038:
16848:
15691:
15615:
15562:
15509:
15358:
15243:
15107:
14940:
14534:
14458:
14235:
14157:
14082:
14071:(i.e. an unobserved
13668:
13490:
13453:
13403:maximum a posteriori
13347:
13165:
12909:
12840:
12811:
12777:
12594:
12515:
12381:
12351:
12322:
12239:
12184:
12158:
12038:
11986:
11978:over all categories
11944:
11915:
11877:
11719:
11674:
11546:
11527:will be in category
11495:
11465:
11428:
11386:
11355:
11313:
11282:
11255:
11225:
11199:
11172:
11152:
11125:
11092:
11064:
11033:
11007:
10980:
10960:
10933:
10900:
10845:
10799:
10773:
10743:
10717:
10671:
10625:
10599:
10556:
10527:
10215:
10122:
10086:
10053:
10017:
10006:{\displaystyle b=10}
9991:
9965:
9947:is the value of the
9786:
9586:
9557:
9530:
9497:
9467:
9436:
9414:
9388:
9362:
9335:
9315:
9284:
9131:
9102:
9017:
8918:
8831:
8794:
8759:
8721:
8717:is the log-odds and
8572:
8539:
8535:) of the event that
8455:
8364:
8283:For each data point
8194:
8082:
8064:{\displaystyle f(i)}
8046:
7388:
7332:continuous variables
7193:
7051:
6916:
6860:
6833:
6666:
6623:
6558:
6521:
6494:
6203:
6122:
6099:
6071:
6042:
6009:
5976:
5963:{\displaystyle p(x)}
5945:
5933:{\displaystyle p(x)}
5915:
5904:{\displaystyle p(x)}
5886:
5869:{\displaystyle \ln }
5860:
5816:
5796:
5691:
5505:
5466:
5434:
5414:
5384:
5336:
5309:
5286:
5275:{\displaystyle p(x)}
5257:
5150:
5101:
5049:
5026:
5002:
4982:
4977:explanatory variable
4959:
4854:
4805:
4773:
4732:
4685:
4656:
4602:
4568:
4405:
4370:
4244:
4165:
4094:
4012:
3983:
3951:
3920:
3863:
3805:
3758:
3721:
3681:
3650:
3616:
3585:
3467:
3361:
3329:
3298:
3259:
3228:
3214:Parameter estimation
3093:
2829:
2786:
2755:
2724:
2697:
2626:
2560:
2438:
2391:
2360:
2325:
2292:
2259:
2226:
2193:
2160:
2127:
2094:
2061:
2030:
1873:
1841:
1788:
1757:
1716:
1685:
1654:
1603:
1520:
1469:
1416:
1367:
1351:and is known as the
1311:
1219:
1165:
1074:
1031:categorical variable
1023:explanatory variable
989:
963:
616:Gauss–Markov theorem
611:Studentized residual
601:Errors and residuals
435:Principal components
405:Nonlinear regression
292:General linear model
225:§ Model fitting
221:linear least squares
34393:Logistic regression
34258:Official statistics
34181:Methods engineering
33862:Seasonal adjustment
33630:Poisson regressions
33550:Bayesian regression
33489:Regression analysis
33469:Partial correlation
33441:Regression analysis
33040:Prediction interval
33035:Likelihood interval
33025:Confidence interval
33017:Interval estimation
32978:Unbiased estimators
32796:Model specification
32676:Up-and-down designs
32364:Partial correlation
32320:Index of dispersion
32238:Interquartile range
32021:Logistic regression
31758:1943PNAS...29...79W
31696:1920PNAS....6..275P
31572:10.2139/ssrn.360300
31487:1934Sci....79...38B
31082:CS229 Lecture Notes
31073:Ng, Andrew (2000).
30393:1933RSPTA.231..289N
30255:2013RiskA..33.1021W
29678:Logistic model tree
29518:Luce's choice axiom
29326:normal distribution
29197:conditional entropy
28170:likelihood function
27557:section above, the
27515:{\displaystyle N+1}
25711:{\displaystyle y=n}
25660:{\displaystyle N+1}
25319:{\displaystyle M+1}
25298:, we will consider
25032:regression analysis
24865:
24716:
24690:
24139:
23686:
22938:{\displaystyle y=1}
22694:{\displaystyle y=1}
22335:
22291:
22263:is the mean of the
22165:
21590:likelihood function
21578:prior distributions
21574:Bayesian statistics
21492:normal distribution
20606:{\displaystyle y=1}
20580:{\displaystyle y=0}
20258:
16738:
15874:
15853:
15793:
15772:
15430:
15409:
15034:
14964:
14914:normal distribution
14608:
14283:
14099:
13423:quasi-Newton method
12896:regression analysis
12173:{\displaystyle N=1}
11510:{\displaystyle N+1}
11480:{\displaystyle N+1}
11443:{\displaystyle M+1}
11328:{\displaystyle y=1}
11214:{\displaystyle y=1}
11079:{\displaystyle y=1}
11022:{\displaystyle y=1}
10841:can be computed as
10788:{\displaystyle y=1}
10732:{\displaystyle y=1}
10614:{\displaystyle y=1}
10542:{\displaystyle y=1}
10013:, and coefficients
9980:{\displaystyle M=2}
9572:{\displaystyle M=1}
9482:{\displaystyle y=1}
9403:{\displaystyle y=1}
9377:{\displaystyle y=1}
9117:{\displaystyle y=1}
8554:{\displaystyle y=1}
8529:linear relationship
7823:
7347:indicator variables
7227:points. Each point
7223:A dataset contains
7208:{\displaystyle b=e}
6906:are all estimated.
6823:multiple regression
6607:are cells in a 2×2
4955:Let us assume that
3998:{\displaystyle x=2}
3084:likelihood function
2088:information content
1710:will equal one and
978:{\displaystyle k=1}
785:regression analysis
461:Errors-in-variables
328:Logistic regression
318:Binomial regression
263:Regression analysis
257:Part of a series on
190:§ Alternatives
143:§ Applications
108:unit of measurement
100:continuous variable
73:logistic regression
69:regression analysis
35:for worked details.
34278:Spatial statistics
34158:Medical statistics
34058:First hitting time
34012:Whittle likelihood
33663:Degrees of freedom
33658:Multivariate ANOVA
33591:Heteroscedasticity
33403:Bayesian estimator
33368:Bayesian inference
33217:Kolmogorov–Smirnov
33102:Randomization test
33072:Testing hypotheses
33045:Tolerance interval
32956:Maximum likelihood
32851:Exponential family
32784:Density estimation
32744:Statistical theory
32704:Natural experiment
32650:Scientific control
32567:Survey methodology
32253:Standard deviation
31135:Metodološki Zvezki
31048:Mount, J. (2011).
30806:10.1093/aje/kwk052
30548:. pp. 49–55.
30521:
29615:Mathematics portal
29418:In the 1930s, the
29262:
29216:
29185:
29147:
29145:
28911:
28891:
28786:
28766:
28665:
28616:
28574:
28427:
28425:
28318:
28260:
28158:
28035:
27994:
27953:
27872:
27759:
27732:
27693:
27634:
27605:
27576:
27543:
27512:
27480:
27450:
27419:
27385:
27353:
27212:
27153:
27068:
27027:
26927:
26723:
26614:
26484:
26410:
26220:. There are then (
26192:
26042:
25921:
25769:
25708:
25657:
25629:
25538:
25495:
25466:
25436:
25382:
25359:and which include
25347:
25316:
25265:exponential family
25257:Poisson regression
25205:
25128:
25012:
24983:
24960:
24842:
24822:
24795:
24768:
24720:
24695:
24676:
24592:
24476:
24474:
24429:
24409:
24263:
24261:
24155:degrees of freedom
24143:
24119:
24078:
23991:
23921:
23846:
23805:
23739:
23663:
23599:
23531:
23501:
23424:
23394:
23263:at the maximum of
23253:
23210:
23058:
23012:
22935:
22906:
22743:
22691:
22663:
22629:
22559:
22530:
22466:
22396:
22312:
22292:
22277:
22253:
22226:
22186:
22166:
22151:
22134:
22097:
21979:
21940:
21820:
21712:
21681:
21657:
21639:Widely used, the "
21570:
21554:
21513:
21456:
21373:
21367:
21208:
21144:
21068:
20904:
20882:
20799:
20699:
20603:
20577:
20478:
20244:
20149:
19942:
19810:
19675:
19559:
19454:
19268:
19213:
19014:
18924:
18881:
18879:
18135:
18059:
18017:
17972:
17829:
17791:
17694:
17692:
17370:
17241:
17239:
17065:normalizing factor
17053:
17017:
17015:
16736:
16656:possibly contains
16621:
16619:
15857:
15836:
15776:
15755:
15674:
15594:
15547:
15455:
15450:
15413:
15392:
15338:
15212:
15210:
15087:
15085:
15017:
14947:
14891:
14889:
14594:
14514:
14360:
14355:
14269:
14195:
14137:
14085:
14022:
13640:
13466:
13414:prior distribution
13371:
13318:
13145:
12856:
12826:
12793:
12760:
12557:which equals 1 if
12555:indicator function
12543:
12498:
12364:
12337:
12293:
12225:
12170:
12141:
12010:
11968:
11930:
11901:
11860:
11704:
11660:
11507:
11477:
11440:
11409:
11372:
11325:
11295:
11268:
11241:
11211:
11185:
11158:
11138:
11111:
11076:
11046:
11019:
10993:
10966:
10946:
10919:
10885:
10831:
10785:
10759:
10739:are 1-to-1000, or
10729:
10703:
10657:
10611:
10578:
10539:
10505:
10200:
10105:
10072:
10039:
10003:
9977:
9927:
9758:
9569:
9543:
9512:
9479:
9453:
9422:
9400:
9374:
9348:
9321:
9297:
9266:
9114:
9081:
8990:
8903:
8812:
8765:
8734:
8700:
8551:
8485:
8432:
8405:
8226:
8177:
8061:
8036:linear combination
7869:
7867:
7809:
7724:
7336:discrete variables
7273:dependent variable
7205:
7176:
7031:
6896:
6846:
6811:
6662:can be revised to
6652:
6581:
6541:
6507:
6477:
6199:
6169:
6105:
6077:
6048:
6025:
5989:
5960:
5930:
5901:
5866:
5840:
5802:
5771:
5671:
5488:
5440:
5420:
5397:
5370:
5322:
5304:response variables
5292:
5272:
5240:
5133:
5084:
5032:
5020:linear combination
5008:
4988:
4965:
4935:
4837:
4801:logistic function
4793:and having output
4779:
4765:, which takes any
4750:
4738:
4718:
4671:
4614:
4580:
4420:
4418:
4382:
4295:
4227:
4145:
4077:
3995:
3964:
3933:
3896:
3848:
3777:
3743:
3694:
3663:
3629:
3598:
3566:
3450:
3342:
3311:
3272:
3241:
3193:
3170:
3130:
3069:
2920:
2866:
2818:the loss, one can
2795:
2768:
2737:
2706:
2678:
2612:
2539:
2416:
2373:
2344:
2311:
2278:
2245:
2212:
2179:
2146:
2113:
2074:
2043:
2007:
2002:
1854:
1822:squared error loss
1801:
1770:
1735:
1698:
1667:
1638:
1547:
1506:
1443:
1402:
1341:
1294:
1194:
1159:location parameter
1140:
1061:
1007:
975:
348:Multinomial probit
161:, one can use the
92:indicator variable
88:dependent variable
61:linear combination
37:
34403:Regression models
34380:
34379:
34318:
34317:
34314:
34313:
34253:National accounts
34223:Actuarial science
34215:Social statistics
34108:
34107:
34104:
34103:
34100:
34099:
34035:Survival function
34020:
34019:
33882:Granger causality
33723:Contingency table
33698:Survival analysis
33675:
33674:
33671:
33670:
33527:Linear regression
33422:
33421:
33418:
33417:
33393:Credible interval
33362:
33361:
33145:
33144:
32961:Method of moments
32830:Parametric family
32791:Statistical model
32721:
32720:
32717:
32716:
32635:Random assignment
32557:Statistical power
32491:
32490:
32487:
32486:
32336:Contingency table
32306:
32305:
32173:Generalized/power
32019:Media related to
31975:(12): 1373–1379.
31953:978-0-495-59786-5
31934:978-0-470-58247-3
31915:978-1-4200-7575-5
31893:978-0-13-066189-0
31886:. Prentice Hall.
31874:978-0-521-58985-7
31848:978-0-8247-8587-1
31829:978-0-631-13345-2
31806:978-0-471-36093-3
31016:978-1-4419-2918-1
30952:Allison, Paul D.
30938:978-0-8058-2223-6
30890:978-0-13-066189-0
30679:978-0-262-01802-9
30593:978-0-7619-2208-7
30452:978-0-521-59346-5
30115:978-3-319-19424-0
30035:David A. Freedman
29779:978-0-471-35632-5
29623:Logistic function
29541:multinomial logit
29360:population growth
29330:sigmoid functions
29328:of errors. Other
29244:linear regression
29213:
29097:
29028:
28892:
28872:
28767:
28747:
28647:
28309:
28251:
27837:
27735:{\displaystyle Y}
27351:
26788:
26093:
25269:natural parameter
25253:probit regression
25107:
25101:
25009:
24958:
24954:
24935:
24913:
24879:
24852:
24805:. We can correct
24718:
24519:Likelihood ratio
24467:
24466:
24463:
24435:
24428:
24427:
24424:
24408:
24407:
24404:
24364:
24363:
24360:
24345:
24344:
24341:
24310:
24297:
24254:
24253:
24250:
24224:
24211:
24210:
24207:
24181:
24073:
24072:
24069:
23900:
23884:
23831:
23784:
23727:
23711:
23687:
23673:
23654:
23590:
23574:
23529:
23495:
23492:
23476:
23386:
23358:
23333:
23314:
23287:
23251:
23010:
22627:
22556:
22371:
22322:
22251:
22224:
21970:
21684:{\displaystyle p}
21660:{\displaystyle k}
21546:
21545:
21480:logistic function
20892:, the parameters
20880:
20518:multicollinearity
20448:
20381:
20325:
20236:
20162:Or equivalently:
20116:
20017:
19912:
19787:
19669:
19624:
19582:analytic function
19557:
19486:with respect to
19448:
19292:political science
19283:multinomial logit
19198:
19143:
18872:
18751:
18568:
18364:
17844:multinomial logit
17827:
17782:
17685:
17538:
17201:
17121:
16823:multinomial logit
16796:
16795:
16705:
16704:
16697:
16658:original research
16492:
16433:
16423:
16422:(substitute
16361:
16353:
16352:(substitute
15609:degree of freedom
15471:multinomial logit
15446:
15390:
14885:
14804:
14445:logistic function
14351:
14296:
14292:
14267:
14020:
13928:
13840:
13638:
13481:logistic function
13289:
13062:
12892:linear regression
12625:
12112:
11858:
11658:
11161:{\displaystyle 2}
10969:{\displaystyle 1}
10503:
10413:
10272:
10160:
9814:
9324:{\displaystyle b}
9242:
9201:
8768:{\displaystyle b}
8610:
8019:linear regression
7999:or 1 −
7711:
7675:
7451:
7353:Outcome variables
7271:(also known as a
7262:outcome variable
7174:
6941:
6609:contingency table
6579:
6455:
6370:
6364:
6318:
6259:
6128:
6108:{\displaystyle x}
6080:{\displaystyle x}
6051:{\displaystyle e}
5878:natural logarithm
5805:{\displaystyle g}
5730:
5633:
5423:{\displaystyle X}
5295:{\displaystyle Y}
5238:
5035:{\displaystyle t}
5011:{\displaystyle t}
4991:{\displaystyle x}
4968:{\displaystyle t}
4933:
4902:
4782:{\displaystyle t}
4759:logistic function
4741:{\displaystyle t}
4558:
4557:
4476:
4475:
4417:
4293:
4279:
4143:
4129:
3501:
3395:
3142:
3102:
2892:
2838:
1982:
1921:
1457:): these are the
1292:
1138:
1065:logistic function
939:
938:
739:built environment
670:
669:
323:Binary regression
282:Simple regression
277:Linear regression
202:natural parameter
175:binary classifier
167:§ Extensions
147:binary regression
132:§ Definition
128:§ Background
104:logistic function
59:of an event as a
53:statistical model
34410:
34368:
34367:
34356:
34355:
34345:
34344:
34330:
34329:
34233:Crime statistics
34127:
34126:
34114:
34113:
34031:
34030:
33997:Fourier analysis
33984:Frequency domain
33964:
33911:
33877:Structural break
33837:
33836:
33786:Cluster analysis
33733:Log-linear model
33706:
33705:
33681:
33680:
33622:
33596:Homoscedasticity
33452:
33451:
33428:
33427:
33347:
33339:
33331:
33330:(Kruskal–Wallis)
33315:
33300:
33255:Cross validation
33240:
24294:
24290:
24286:
24284:
24281:
24280:
24260:
24259:
24246:
24227:
24221:
24217:
24214:
24213:
24203:
24184:
24178:
24174:
24170:
24168:
24165:
24164:
24134:
24123:
24117:
24114:
24113:
24097:
24092:
24065:
24045:
24042:
24041:
24022:
24010:Goodness of fit
24007:
23971:
23968:
23967:
23964:null hypothesis
23956:
23949:
23903:
23892:
23891:
23890:
23876:
23875:
23861:
23858:
23857:
23823:
23822:
23820:
23817:
23816:
23787:
23776:
23775:
23774:
23772:
23769:
23768:
23762:
23730:
23719:
23718:
23717:
23703:
23702:
23681:
23676:
23665:
23664:
23657:
23646:
23645:
23644:
23642:
23638:
23624:
23621:
23620:
23593:
23582:
23581:
23580:
23566:
23565:
23563:
23560:
23559:
23553:
23546:
23521:
23519:
23516:
23515:
23484:
23477:
23467:
23463:
23448:
23444:
23442:
23439:
23438:
23418:
23414:
23412:
23409:
23408:
23378:
23350:
23325:
23306:
23290:
23279:
23278:
23277:
23275:
23272:
23271:
23243:
23234:
23230:
23228:
23225:
23224:
23196:
23192:
23168:
23164:
23143:
23139:
23124:
23120:
23119:
23115:
23109:
23098:
23085:
23081:
23079:
23076:
23075:
23052:
23048:
23039:
23035:
23033:
23030:
23029:
23001:
22997:
22993:
22989:
22982:
22977:
22959:
22955:
22953:
22950:
22949:
22924:
22921:
22920:
22889:
22885:
22855:
22851:
22827:
22823:
22802:
22798:
22797:
22793:
22787:
22776:
22764:
22761:
22760:
22734:
22730:
22721:
22717:
22709:
22706:
22705:
22680:
22677:
22676:
22649:
22646:
22645:
22643:
22617:
22613:
22606:
22601:
22584:
22581:
22580:
22548:
22547:
22545:
22542:
22541:
22524:
22520:
22518:
22515:
22514:
22492:
22485:
22449:
22446:
22445:
22438:
22431:
22424:
22414:
22390:
22386:
22380:
22376:
22363:
22354:
22343:
22330:
22325:
22314:
22313:
22310:
22307:
22306:
22286:
22281:
22275:
22272:
22271:
22268:
22243:
22241:
22238:
22237:
22216:
22207:
22203:
22201:
22198:
22197:
22181:
22178:
22177:
22160:
22155:
22149:
22146:
22145:
22128:
22124:
22122:
22119:
22118:
22116:
22088:
22084:
22078:
22074:
22065:
22061:
22052:
22041:
22028:
22024:
22022:
22019:
22018:
22013:
22002:
21973:
21962:
21961:
21960:
21958:
21955:
21954:
21931:
21927:
21921:
21917:
21908:
21904:
21898:
21894:
21885:
21881:
21872:
21861:
21848:
21844:
21842:
21839:
21838:
21811:
21807:
21798:
21794:
21786:
21783:
21782:
21779:
21772:
21738:
21733:
21704:
21696:
21693:
21692:
21676:
21673:
21672:
21652:
21649:
21648:
21641:one in ten rule
21637:
21635:One in ten rule
21631:
21586:conjugate prior
21584:. There is no
21536:
21528:
21525:
21524:
21499:
21496:
21495:
21484:probit function
21472:
21450:
21446:
21393:
21391:
21388:
21387:
21366:
21365:
21360:
21355:
21349:
21348:
21343:
21328:
21324:
21322:
21307:
21303:
21301:
21295:
21294:
21289:
21274:
21270:
21268:
21253:
21249:
21247:
21237:
21236:
21228:
21226:
21223:
21222:
21161:
21159:
21156:
21155:
21088:
21086:
21083:
21082:
21057:
21052:
21051:
21043:
21034:
21029:
21028:
21023:
21017:
21012:
21011:
21010:
21006:
21000:
20995:
20994:
20985:
20975:
20969:
20964:
20963:
20957:
20952:
20951:
20950:
20946:
20945:
20930:
20925:
20924:
20922:
20919:
20918:
20899:
20897:
20894:
20893:
20863:
20857:
20852:
20851:
20847:
20843:
20836:
20831:
20814:
20811:
20810:
20793:
20789:
20768:
20764:
20746:
20742:
20716:
20714:
20711:
20710:
20684:
20680:
20671:
20667:
20658:
20654:
20642:
20637:
20636:
20634:
20631:
20630:
20627:Newton's method
20592:
20589:
20588:
20566:
20563:
20562:
20559:
20510:Newton's method
20502:
20497:
20460:
20456:
20455:
20439:
20434:
20433:
20425:
20421:
20417:
20410:
20405:
20398:
20394:
20393:
20387:
20372:
20367:
20366:
20358:
20354:
20350:
20343:
20338:
20334:
20333:
20326:
20316:
20312:
20306:
20305:
20304:
20287:
20283:
20282:
20278:
20272:
20268:
20253:
20248:
20237:
20227:
20223:
20217:
20216:
20215:
20203:
20198:
20197:
20182:
20178:
20170:
20167:
20166:
20139:
20134:
20133:
20125:
20109:
20105:
20098:
20092:
20088:
20086:
20082:
20064:
20060:
20032:
20027:
20026:
20011:
20007:
20001:
19997:
19995:
19994:
19991:
19990:
19986:
19977:
19976:
19975:
19971:
19963:
19960:
19959:
19927:
19922:
19921:
19906:
19902:
19896:
19892:
19890:
19889:
19886:
19885:
19881:
19872:
19871:
19862:
19858:
19856:
19853:
19852:
19846:expected values
19840:
19831:
19786: for
19784:
19772:
19768:
19759:
19755:
19736:
19732:
19730:
19727:
19726:
19717:
19705:
19691:
19660:
19659:
19650:
19649:
19647:
19615:
19614:
19605:
19604:
19602:
19600:
19597:
19596:
19590:backpropagation
19538:
19534:
19527:
19522:
19514:
19511:
19510:
19505:
19496:
19484:
19430:
19426:
19420:
19416:
19395:
19391:
19385:
19381:
19372:
19368:
19361:
19357:
19350:
19345:
19336:
19332:
19330:
19327:
19326:
19320:
19296:discrete choice
19262:
19257:
19256:
19247:
19242:
19241:
19233:
19231:
19228:
19227:
19207:
19203:
19189:
19184:
19183:
19174:
19169:
19168:
19164:
19160:
19153:
19148:
19134:
19129:
19128:
19119:
19114:
19113:
19112:
19108:
19101:
19093:
19088:
19087:
19078:
19073:
19072:
19071:
19067:
19065:
19047:
19043:
19035:
19032:
19031:
19000:
18995:
18994:
18986:
18985:
18981:
18970:
18965:
18964:
18955:
18950:
18949:
18948:
18944:
18942:
18939:
18938:
18916:
18907:
18902:
18901:
18899:
18896:
18895:
18878:
18877:
18863:
18858:
18857:
18848:
18843:
18842:
18841:
18837:
18826:
18821:
18820:
18811:
18806:
18805:
18804:
18800:
18799:
18791:
18786:
18785:
18776:
18771:
18770:
18769:
18765:
18763:
18754:
18753:
18739:
18734:
18733:
18724:
18719:
18718:
18717:
18713:
18702:
18697:
18696:
18687:
18682:
18681:
18680:
18676:
18665:
18660:
18659:
18651:
18650:
18646:
18645:
18636:
18631:
18630:
18621:
18616:
18615:
18614:
18610:
18602:
18597:
18596:
18588:
18587:
18583:
18582:
18580:
18571:
18570:
18559:
18554:
18553:
18545:
18544:
18540:
18532:
18527:
18526:
18517:
18512:
18511:
18510:
18506:
18495:
18490:
18489:
18481:
18480:
18476:
18468:
18463:
18462:
18453:
18448:
18447:
18446:
18442:
18441:
18432:
18427:
18426:
18418:
18417:
18413:
18405:
18400:
18399:
18390:
18385:
18384:
18383:
18379:
18378:
18376:
18367:
18366:
18355:
18350:
18349:
18338:
18329:
18324:
18323:
18319:
18315:
18304:
18299:
18298:
18287:
18278:
18273:
18272:
18268:
18264:
18263:
18255:
18250:
18249:
18238:
18229:
18224:
18223:
18219:
18215:
18213:
18206:
18191:
18187:
18177:
18175:
18172:
18171:
18166:
18157:
18147:nonidentifiable
18114:
18110:
18086:
18082:
18074:
18071:
18070:
18044:
18040:
18032:
18029:
18028:
18002:
17998:
17990:
17987:
17986:
17954:
17949:
17948:
17939:
17934:
17933:
17924:
17919:
17918:
17909:
17904:
17903:
17870:
17866:
17858:
17855:
17854:
17818:
17813:
17812:
17803:
17798:
17797:
17796:
17792:
17786:
17781:
17773:
17768:
17767:
17758:
17753:
17752:
17751:
17747:
17745:
17727:
17723:
17715:
17712:
17711:
17691:
17690:
17676:
17671:
17670:
17661:
17656:
17655:
17654:
17650:
17639:
17634:
17633:
17624:
17619:
17618:
17617:
17613:
17612:
17604:
17599:
17598:
17589:
17584:
17583:
17582:
17578:
17576:
17569:
17554:
17550:
17541:
17540:
17529:
17524:
17523:
17514:
17509:
17508:
17507:
17503:
17492:
17487:
17486:
17477:
17472:
17471:
17470:
17466:
17465:
17457:
17452:
17451:
17442:
17437:
17436:
17435:
17431:
17429:
17422:
17407:
17403:
17393:
17391:
17388:
17387:
17362:
17357:
17356:
17347:
17342:
17341:
17340:
17336:
17325:
17320:
17319:
17310:
17305:
17304:
17303:
17299:
17291:
17288:
17287:
17266:
17238:
17237:
17229:
17224:
17223:
17214:
17209:
17208:
17207:
17203:
17193:
17186:
17171:
17167:
17158:
17157:
17149:
17144:
17143:
17134:
17129:
17128:
17127:
17123:
17113:
17106:
17091:
17087:
17077:
17075:
17072:
17071:
17039:
17036:
17035:
17014:
17013:
16995:
16990:
16989:
16980:
16975:
16974:
16967:
16952:
16948:
16933:
16932:
16914:
16909:
16908:
16899:
16894:
16893:
16886:
16871:
16867:
16851:
16849:
16846:
16845:
16840:
16818:
16717:to secede from
16711:Parti Québécois
16701:
16690:
16684:
16681:
16662:
16650:
16646:
16637:
16618:
16617:
16611:
16607:
16605:
16603:
16597:
16596:
16587:
16582:
16581:
16573:
16558:
16554:
16552:
16550:
16544:
16542:
16533:
16528:
16527:
16519:
16503:
16501:
16495:
16494:
16489:
16486:
16477:
16472:
16471:
16463:
16444:
16442:
16436:
16435:
16432: as above)
16430:
16425:
16420:
16417:
16396:
16391:
16390:
16382:
16372:
16370:
16364:
16363:
16360: as above)
16358:
16350:
16347:
16326:
16321:
16320:
16308:
16303:
16302:
16293:
16288:
16287:
16274:
16272:
16266:
16264:
16246:
16242:
16233:
16229:
16217:
16212:
16211:
16199:
16194:
16193:
16184:
16179:
16178:
16165:
16163:
16157:
16155:
16135:
16131:
16122:
16118:
16103:
16098:
16097:
16088:
16083:
16082:
16073:
16068:
16067:
16058:
16053:
16052:
16048:
16044:
16037:
16035:
16029:
16027:
16005:
16001:
15992:
15987:
15986:
15977:
15972:
15971:
15970:
15966:
15957:
15953:
15944:
15939:
15938:
15929:
15924:
15923:
15922:
15918:
15911:
15909:
15903:
15901:
15890:
15885:
15884:
15866:
15861:
15845:
15840:
15835:
15831:
15824:
15822:
15816:
15814:
15803:
15798:
15797:
15785:
15780:
15764:
15759:
15754:
15750:
15743:
15741:
15729:
15724:
15723:
15708:
15704:
15694:
15692:
15689:
15688:
15641:
15637:
15628:
15624:
15616:
15613:
15612:
15588:
15584:
15575:
15571:
15563:
15560:
15559:
15541:
15536:
15535:
15526:
15521:
15520:
15512:
15510:
15507:
15506:
15484:discrete choice
15449:
15448:
15443:
15441:
15435:
15434:
15422:
15417:
15401:
15396:
15387:
15385:
15375:
15374:
15365:
15361:
15359:
15356:
15355:
15327:
15323:
15319:
15315:
15306:
15302:
15284:
15280:
15256:
15252:
15244:
15241:
15240:
15231:
15209:
15208:
15184:
15180:
15173:
15167:
15163:
15160:
15159:
15135:
15131:
15124:
15118:
15114:
15110:
15108:
15105:
15104:
15084:
15083:
15076:
15072:
15063:
15058:
15057:
15048:
15043:
15042:
15035:
15026:
15021:
15014:
15013:
15006:
15002:
14993:
14988:
14987:
14978:
14973:
14972:
14965:
14956:
14951:
14943:
14941:
14938:
14937:
14931:
14906:discrete choice
14888:
14887:
14882:
14879:
14873:
14869:
14860:
14858:
14849:
14844:
14843:
14835:
14820:
14816:
14807:
14806:
14801:
14798:
14789:
14784:
14783:
14775:
14766:
14762:
14747:
14746:
14737:
14732:
14731:
14723:
14711:
14707:
14692:
14691:
14676:
14672:
14663:
14658:
14657:
14649:
14634:
14633:
14624:
14619:
14618:
14603:
14598:
14581:
14572:
14567:
14566:
14551:
14547:
14537:
14535:
14532:
14531:
14493:
14489:
14471:
14467:
14459:
14456:
14455:
14420:
14407:
14390:
14354:
14353:
14348:
14346:
14340:
14339:
14330:
14325:
14324:
14316:
14306:
14302:
14298:
14293:
14278:
14273:
14264:
14262:
14252:
14251:
14242:
14238:
14236:
14233:
14232:
14227:
14164:
14160:
14158:
14155:
14154:
14130:
14126:
14117:
14112:
14111:
14103:
14094:
14089:
14083:
14080:
14079:
14073:random variable
14070:
14061:latent variable
14046:discrete choice
14038:
14011:
14006:
14005:
13997:
13996:
13992:
13985:
13971:
13966:
13965:
13957:
13956:
13952:
13950:
13935:
13919:
13914:
13913:
13905:
13904:
13900:
13893:
13885:
13880:
13879:
13871:
13870:
13866:
13864:
13857:
13853:
13852:
13846:
13831:
13826:
13825:
13817:
13816:
13812:
13805:
13797:
13792:
13791:
13783:
13782:
13778:
13776:
13772:
13771:
13756:
13752:
13746:
13742:
13727:
13720:
13716:
13715:
13714:
13702:
13697:
13696:
13681:
13677:
13669:
13666:
13665:
13629:
13624:
13623:
13615:
13611:
13607:
13600:
13595:
13583:
13578:
13577:
13569:
13554:
13550:
13541:
13537:
13525:
13520:
13519:
13510:
13506:
13494:
13493:
13491:
13488:
13487:
13460:
13456:
13454:
13451:
13450:
13440:
13392:
13348:
13345:
13344:
13312:
13307:
13306:
13298:
13282:
13278:
13271:
13265:
13261:
13259:
13255:
13237:
13233:
13209:
13204:
13203:
13194:
13190:
13178:
13177:
13166:
13163:
13162:
13133:
13129:
13123:
13119:
13098:
13094:
13088:
13084:
13075:
13071:
13055:
13051:
13044:
13038:
13034:
13032:
13028:
13010:
13006:
12976:
12972:
12951:
12947:
12938:
12934:
12922:
12921:
12910:
12907:
12906:
12888:
12880:
12878:Interpretations
12847:
12843:
12841:
12838:
12837:
12820:
12815:
12814:
12812:
12809:
12808:
12784:
12780:
12778:
12775:
12774:
12751:
12747:
12738:
12733:
12732:
12723:
12719:
12713:
12702:
12686:
12682:
12673:
12669:
12651:
12640:
12615:
12611:
12607:
12599:
12597:
12595:
12592:
12591:
12581:
12570:
12562:
12534:
12530:
12516:
12513:
12512:
12486:
12481:
12480:
12471:
12467:
12448:
12444:
12426:
12415:
12405:
12394:
12382:
12379:
12378:
12358:
12354:
12352:
12349:
12348:
12331:
12326:
12325:
12323:
12320:
12319:
12285:
12276:
12272:
12255:
12246:
12242:
12240:
12237:
12236:
12217:
12208:
12204:
12193:
12185:
12182:
12181:
12159:
12156:
12155:
12136:
12127:
12122:
12121:
12103:
12094:
12090:
12089:
12080:
12071:
12067:
12066:
12064:
12060:
12045:
12041:
12039:
12036:
12035:
12029:
12002:
11993:
11989:
11987:
11984:
11983:
11960:
11951:
11947:
11945:
11942:
11941:
11924:
11919:
11918:
11916:
11913:
11912:
11893:
11884:
11880:
11878:
11875:
11874:
11850:
11841:
11836:
11835:
11834:
11830:
11824:
11813:
11802:
11797:
11786:
11777:
11773:
11767:
11756:
11735:
11726:
11722:
11720:
11717:
11716:
11675:
11672:
11671:
11650:
11641:
11636:
11635:
11634:
11630:
11624:
11613:
11602:
11595:
11586:
11581:
11580:
11579:
11575:
11573:
11562:
11553:
11549:
11547:
11544:
11543:
11496:
11493:
11492:
11490:
11466:
11463:
11462:
11460:
11457:
11429:
11426:
11425:
11423:
11401:
11387:
11384:
11383:
11364:
11356:
11353:
11352:
11348:
11342:
11314:
11311:
11310:
11289:
11285:
11283:
11280:
11279:
11262:
11258:
11256:
11253:
11252:
11232:
11228:
11226:
11223:
11222:
11200:
11197:
11196:
11179:
11175:
11173:
11170:
11169:
11153:
11150:
11149:
11132:
11128:
11126:
11123:
11122:
11099:
11095:
11093:
11090:
11089:
11065:
11062:
11061:
11040:
11036:
11034:
11031:
11030:
11008:
11005:
11004:
10987:
10983:
10981:
10978:
10977:
10961:
10958:
10957:
10940:
10936:
10934:
10931:
10930:
10907:
10903:
10901:
10898:
10897:
10877:
10851:
10846:
10843:
10842:
10819:
10815:
10806:
10802:
10800:
10797:
10796:
10774:
10771:
10770:
10750:
10746:
10744:
10741:
10740:
10718:
10715:
10714:
10691:
10687:
10678:
10674:
10672:
10669:
10668:
10645:
10641:
10632:
10628:
10626:
10623:
10622:
10600:
10597:
10596:
10563:
10559:
10557:
10554:
10553:
10528:
10525:
10524:
10491:
10487:
10481:
10477:
10468:
10464:
10458:
10454:
10445:
10441:
10434:
10430:
10423:
10418:
10404:
10400:
10394:
10390:
10381:
10377:
10371:
10367:
10358:
10354:
10353:
10349:
10342:
10334:
10330:
10324:
10320:
10311:
10307:
10301:
10297:
10288:
10284:
10283:
10279:
10277:
10258:
10257:
10253:
10246:
10239:
10231:
10230:
10226:
10224:
10216:
10213:
10212:
10194:
10190:
10178:
10174:
10149:
10144:
10135:
10131:
10123:
10120:
10119:
10093:
10089:
10087:
10084:
10083:
10060:
10056:
10054:
10051:
10050:
10024:
10020:
10018:
10015:
10014:
9992:
9989:
9988:
9966:
9963:
9962:
9952:
9945:
9918:
9914:
9905:
9900:
9899:
9887:
9876:
9860:
9856:
9850:
9846:
9840:
9829:
9807:
9803:
9799:
9791:
9789:
9787:
9784:
9783:
9745:
9741:
9740:
9716:
9712:
9703:
9699:
9684:
9673:
9653:
9649:
9648:
9630:
9626:
9620:
9616:
9610:
9599:
9587:
9584:
9583:
9558:
9555:
9554:
9537:
9533:
9531:
9528:
9527:
9506:
9501:
9500:
9498:
9495:
9494:
9468:
9465:
9464:
9445:
9437:
9434:
9433:
9417:
9415:
9412:
9411:
9389:
9386:
9385:
9363:
9360:
9359:
9342:
9338:
9336:
9333:
9332:
9316:
9313:
9312:
9291:
9287:
9285:
9282:
9281:
9251:
9247:
9234:
9226:
9222:
9218:
9211:
9206:
9193:
9185:
9184:
9180:
9173:
9166:
9158:
9157:
9153:
9151:
9140:
9132:
9129:
9128:
9103:
9100:
9099:
9070:
9061:
9057:
9051:
9047:
9041:
9030:
9018:
9015:
9014:
9008:
8981:
8977:
8962:
8958:
8949:
8945:
8936:
8932:
8921:
8919:
8916:
8915:
8894:
8890:
8875:
8871:
8862:
8858:
8849:
8845:
8834:
8832:
8829:
8828:
8795:
8792:
8791:
8789:
8760:
8757:
8756:
8728:
8724:
8722:
8719:
8718:
8694:
8690:
8684:
8680:
8665:
8661:
8655:
8651:
8642:
8638:
8632:
8628:
8619:
8615:
8599:
8594:
8585:
8581:
8573:
8570:
8569:
8540:
8537:
8536:
8521:
8514:
8507:
8456:
8453:
8452:
8448:
8444:
8437:
8396:
8391:
8390:
8382:
8365:
8362:
8361:
8353: + 1.
8346:
8339:
8330:
8320:
8307:
8296:
8280: + 1.
8269:
8260:
8253:
8220:
8216:
8201:
8197:
8195:
8192:
8191:
8162:
8158:
8152:
8148:
8127:
8123:
8117:
8113:
8104:
8100:
8083:
8080:
8079:
8075:is written as:
8047:
8044:
8043:
8029:
8007:
7998:
7989:
7969:
7960:
7951:
7942:
7926:
7913:
7900:
7866:
7865:
7847:
7843:
7837:
7833:
7818:
7813:
7802:
7787:
7783:
7762:
7758:
7743:
7739:
7730:
7729:
7723:
7722:
7708:
7706:
7700:
7696:
7687:
7686:
7672:
7670:
7664:
7660:
7653:
7652:
7645:
7630:
7626:
7605:
7601:
7586:
7582:
7573:
7572:
7566:
7562:
7555:
7540:
7536:
7515:
7511:
7502:
7498:
7486:
7485:
7482:
7481:
7472:
7468:
7452:
7437:
7433:
7412:
7408:
7399:
7395:
7391:
7389:
7386:
7385:
7378:
7365:
7343:dummy variables
7305:
7296:
7286:
7270:
7253:
7244:
7221:
7194:
7191:
7190:
7162:
7158:
7152:
7148:
7133:
7129:
7123:
7119:
7110:
7106:
7100:
7096:
7087:
7083:
7076:
7072:
7065:
7060:
7052:
7049:
7048:
7025:
7021:
7015:
7011:
6996:
6992:
6986:
6982:
6973:
6969:
6963:
6959:
6950:
6946:
6930:
6925:
6917:
6914:
6913:
6861:
6858:
6857:
6840:
6836:
6834:
6831:
6830:
6805:
6801:
6795:
6791:
6785:
6774:
6761:
6757:
6748:
6744:
6738:
6734:
6719:
6715:
6709:
6705:
6696:
6692:
6686:
6682:
6673:
6669:
6667:
6664:
6663:
6643:
6639:
6630:
6626:
6624:
6621:
6620:
6617:
6571:
6563:
6561:
6559:
6556:
6555:
6533:
6529:
6528:
6524:
6522:
6519:
6518:
6501:
6497:
6495:
6492:
6491:
6469:
6465:
6464:
6460:
6444:
6440:
6431:
6427:
6426:
6422:
6399:
6395:
6386:
6382:
6381:
6377:
6375:
6344:
6330:
6328:
6324:
6292:
6272:
6270:
6266:
6264:
6242:
6219:
6217:
6206:
6204:
6201:
6200:
6185:
6155:
6151:
6142:
6138:
6137:
6133:
6125:
6123:
6120:
6119:
6100:
6097:
6096:
6072:
6069:
6068:
6065:
6043:
6040:
6039:
6016:
6012:
6010:
6007:
6006:
5983:
5979:
5977:
5974:
5973:
5946:
5943:
5942:
5916:
5913:
5912:
5887:
5884:
5883:
5861:
5858:
5857:
5817:
5814:
5813:
5797:
5794:
5793:
5787:
5757:
5753:
5744:
5740:
5739:
5735:
5710:
5696:
5694:
5692:
5689:
5688:
5659:
5655:
5646:
5642:
5613:
5599:
5597:
5593:
5536:
5532:
5506:
5503:
5502:
5479:
5475:
5467:
5464:
5463:
5456:
5435:
5432:
5431:
5415:
5412:
5411:
5391:
5387:
5385:
5382:
5381:
5349:
5345:
5337:
5334:
5333:
5316:
5312:
5310:
5307:
5306:
5287:
5284:
5283:
5258:
5255:
5254:
5223:
5219:
5210:
5206:
5199:
5195:
5188:
5183:
5151:
5148:
5147:
5110:
5102:
5099:
5098:
5075:
5071:
5062:
5058:
5050:
5047:
5046:
5027:
5024:
5023:
5003:
5000:
4999:
4983:
4980:
4979:
4960:
4957:
4956:
4923:
4919:
4912:
4907:
4889:
4885:
4884:
4878:
4874:
4872:
4855:
4852:
4851:
4814:
4806:
4803:
4802:
4774:
4771:
4770:
4755:
4733:
4730:
4729:
4686:
4683:
4682:
4657:
4654:
4653:
4646:
4634:
4632:Generalizations
4603:
4600:
4599:
4569:
4566:
4565:
4541:
4517:
4481:
4408:
4406:
4403:
4402:
4371:
4368:
4367:
4365:
4317:
4315:
4290:
4269:
4265:
4258:
4253:
4245:
4242:
4241:
4194:
4190:
4178:
4174:
4166:
4163:
4162:
4140:
4119:
4115:
4108:
4103:
4095:
4092:
4091:
4041:
4037:
4025:
4021:
4013:
4010:
4009:
3984:
3981:
3980:
3958:
3954:
3952:
3949:
3948:
3946:
3927:
3923:
3921:
3918:
3917:
3915:
3912:
3884:
3880:
3875:
3864:
3861:
3860:
3836:
3832:
3827:
3821:
3817:
3806:
3803:
3802:
3765:
3761:
3759:
3756:
3755:
3728:
3724:
3722:
3719:
3718:
3706:which maximize
3688:
3684:
3682:
3679:
3678:
3676:
3657:
3653:
3651:
3648:
3647:
3645:
3623:
3619:
3617:
3614:
3613:
3611:
3592:
3588:
3586:
3583:
3582:
3580:
3560:
3556:
3547:
3543:
3534:
3530:
3521:
3510:
3494:
3490:
3486:
3478:
3476:
3468:
3465:
3464:
3441:
3437:
3428:
3424:
3415:
3404:
3388:
3384:
3380:
3372:
3370:
3362:
3359:
3358:
3336:
3332:
3330:
3327:
3326:
3324:
3305:
3301:
3299:
3296:
3295:
3293:
3266:
3262:
3260:
3257:
3256:
3254:
3235:
3231:
3229:
3226:
3225:
3223:
3216:
3184:
3180:
3157:
3153:
3146:
3135:
3131:
3117:
3113:
3106:
3094:
3091:
3090:
3055:
3051:
3027:
3023:
3002:
2998:
2983:
2979:
2977:
2973:
2967:
2956:
2940:
2936:
2907:
2903:
2896:
2880:
2876:
2853:
2849:
2842:
2830:
2827:
2826:
2787:
2784:
2783:
2781:
2762:
2758:
2756:
2753:
2752:
2750:
2731:
2727:
2725:
2722:
2721:
2719:
2698:
2695:
2694:
2692:
2672:
2671:
2662:
2658:
2640:
2636:
2630:
2629:
2627:
2624:
2623:
2606:
2605:
2596:
2592:
2574:
2570:
2564:
2563:
2561:
2558:
2557:
2527:
2523:
2499:
2495:
2477:
2473:
2461:
2457:
2445:
2441:
2439:
2436:
2435:
2404:
2400:
2392:
2389:
2388:
2386:
2367:
2363:
2361:
2358:
2357:
2355:
2332:
2328:
2326:
2323:
2322:
2299:
2295:
2293:
2290:
2289:
2266:
2262:
2260:
2257:
2256:
2233:
2229:
2227:
2224:
2223:
2200:
2196:
2194:
2191:
2190:
2167:
2163:
2161:
2158:
2157:
2134:
2130:
2128:
2125:
2124:
2101:
2097:
2095:
2092:
2091:
2068:
2064:
2062:
2059:
2058:
2056:
2037:
2033:
2031:
2028:
2027:
2025:
2001:
2000:
1988:
1984:
1979:
1977:
1968:
1964:
1943:
1942:
1927:
1923:
1918:
1916:
1910:
1906:
1890:
1889:
1880:
1876:
1874:
1871:
1870:
1848:
1844:
1842:
1839:
1838:
1836:
1818:
1795:
1791:
1789:
1786:
1785:
1783:
1764:
1760:
1758:
1755:
1754:
1752:
1729:
1725:
1717:
1714:
1713:
1711:
1692:
1688:
1686:
1683:
1682:
1680:
1661:
1657:
1655:
1652:
1651:
1649:
1629:
1625:
1610:
1606:
1604:
1601:
1600:
1597:
1590:
1571:goodness of fit
1567:
1541:
1537:
1532:
1521:
1518:
1517:
1500:
1496:
1491:
1485:
1481:
1470:
1467:
1466:
1435:
1423:
1419:
1417:
1414:
1413:
1393:
1389:
1380:
1376:
1368:
1365:
1364:
1333:
1318:
1314:
1312:
1309:
1308:
1277:
1273:
1264:
1260:
1253:
1249:
1242:
1237:
1220:
1217:
1216:
1210:scale parameter
1186:
1166:
1163:
1162:
1127:
1108:
1104:
1097:
1092:
1075:
1072:
1071:
1057:
1050:
1039:
990:
987:
986:
964:
961:
960:
953:
946:
873:
801:
787:could be used.
769:
764:
747:
703:body mass index
680:
675:
666:
626:
606:Goodness of fit
313:Discrete choice
243:, beginning in
63:of one or more
24:
17:
12:
11:
5:
34416:
34406:
34405:
34400:
34395:
34378:
34377:
34375:
34374:
34362:
34350:
34336:
34323:
34320:
34319:
34316:
34315:
34312:
34311:
34309:
34308:
34303:
34298:
34293:
34288:
34282:
34280:
34274:
34273:
34271:
34270:
34265:
34260:
34255:
34250:
34245:
34240:
34235:
34230:
34225:
34219:
34217:
34211:
34210:
34208:
34207:
34202:
34197:
34188:
34183:
34178:
34172:
34170:
34164:
34163:
34161:
34160:
34155:
34150:
34141:
34139:Bioinformatics
34135:
34133:
34123:
34122:
34110:
34109:
34106:
34105:
34102:
34101:
34098:
34097:
34095:
34094:
34088:
34086:
34082:
34081:
34079:
34078:
34072:
34070:
34064:
34063:
34061:
34060:
34055:
34050:
34045:
34039:
34037:
34028:
34022:
34021:
34018:
34017:
34015:
34014:
34009:
34004:
33999:
33994:
33988:
33986:
33980:
33979:
33977:
33976:
33971:
33966:
33958:
33953:
33948:
33947:
33946:
33944:partial (PACF)
33935:
33933:
33927:
33926:
33924:
33923:
33918:
33913:
33905:
33900:
33894:
33892:
33891:Specific tests
33888:
33887:
33885:
33884:
33879:
33874:
33869:
33864:
33859:
33854:
33849:
33843:
33841:
33834:
33828:
33827:
33825:
33824:
33823:
33822:
33821:
33820:
33805:
33804:
33803:
33793:
33791:Classification
33788:
33783:
33778:
33773:
33768:
33763:
33757:
33755:
33749:
33748:
33746:
33745:
33740:
33738:McNemar's test
33735:
33730:
33725:
33720:
33714:
33712:
33702:
33701:
33677:
33676:
33673:
33672:
33669:
33668:
33666:
33665:
33660:
33655:
33650:
33644:
33642:
33636:
33635:
33633:
33632:
33616:
33610:
33608:
33602:
33601:
33599:
33598:
33593:
33588:
33583:
33578:
33576:Semiparametric
33573:
33568:
33562:
33560:
33556:
33555:
33553:
33552:
33547:
33542:
33537:
33531:
33529:
33523:
33522:
33520:
33519:
33514:
33509:
33504:
33499:
33493:
33491:
33485:
33484:
33482:
33481:
33476:
33471:
33466:
33460:
33458:
33448:
33447:
33444:
33443:
33438:
33432:
33424:
33423:
33420:
33419:
33416:
33415:
33413:
33412:
33411:
33410:
33400:
33395:
33390:
33389:
33388:
33383:
33372:
33370:
33364:
33363:
33360:
33359:
33357:
33356:
33351:
33350:
33349:
33341:
33333:
33317:
33314:(Mann–Whitney)
33309:
33308:
33307:
33294:
33293:
33292:
33281:
33279:
33273:
33272:
33270:
33269:
33268:
33267:
33262:
33257:
33247:
33242:
33239:(Shapiro–Wilk)
33234:
33229:
33224:
33219:
33214:
33206:
33200:
33198:
33192:
33191:
33189:
33188:
33180:
33171:
33159:
33153:
33151:Specific tests
33147:
33146:
33143:
33142:
33140:
33139:
33134:
33129:
33123:
33121:
33115:
33114:
33112:
33111:
33106:
33105:
33104:
33094:
33093:
33092:
33082:
33076:
33074:
33068:
33067:
33065:
33064:
33063:
33062:
33057:
33047:
33042:
33037:
33032:
33027:
33021:
33019:
33013:
33012:
33010:
33009:
33004:
33003:
33002:
32997:
32996:
32995:
32990:
32975:
32974:
32973:
32968:
32963:
32958:
32947:
32945:
32936:
32930:
32929:
32927:
32926:
32921:
32916:
32915:
32914:
32904:
32899:
32898:
32897:
32887:
32886:
32885:
32880:
32875:
32865:
32860:
32855:
32854:
32853:
32848:
32843:
32827:
32826:
32825:
32820:
32815:
32805:
32804:
32803:
32798:
32788:
32787:
32786:
32776:
32775:
32774:
32764:
32759:
32754:
32748:
32746:
32736:
32735:
32723:
32722:
32719:
32718:
32715:
32714:
32712:
32711:
32706:
32701:
32696:
32690:
32688:
32682:
32681:
32679:
32678:
32673:
32668:
32662:
32660:
32656:
32655:
32653:
32652:
32647:
32642:
32637:
32632:
32627:
32622:
32616:
32614:
32608:
32607:
32605:
32604:
32602:Standard error
32599:
32594:
32589:
32588:
32587:
32582:
32571:
32569:
32563:
32562:
32560:
32559:
32554:
32549:
32544:
32539:
32534:
32532:Optimal design
32529:
32524:
32518:
32516:
32506:
32505:
32493:
32492:
32489:
32488:
32485:
32484:
32482:
32481:
32476:
32471:
32466:
32461:
32456:
32451:
32446:
32441:
32436:
32431:
32426:
32421:
32416:
32411:
32405:
32403:
32397:
32396:
32394:
32393:
32388:
32387:
32386:
32381:
32371:
32366:
32360:
32358:
32352:
32351:
32349:
32348:
32343:
32338:
32332:
32330:
32329:Summary tables
32326:
32325:
32323:
32322:
32316:
32314:
32308:
32307:
32304:
32303:
32301:
32300:
32299:
32298:
32293:
32288:
32278:
32272:
32270:
32264:
32263:
32261:
32260:
32255:
32250:
32245:
32240:
32235:
32230:
32224:
32222:
32216:
32215:
32213:
32212:
32207:
32202:
32201:
32200:
32195:
32190:
32185:
32180:
32175:
32170:
32165:
32163:Contraharmonic
32160:
32155:
32144:
32142:
32133:
32123:
32122:
32110:
32109:
32107:
32106:
32101:
32095:
32092:
32091:
32084:
32083:
32076:
32069:
32061:
32055:
32054:
32049:: software in
32044:
32039:
32024:
32010:
32009:External links
32007:
32005:
32004:
31995:
31958:
31952:
31939:
31933:
31920:
31914:
31898:
31892:
31879:
31873:
31853:
31847:
31834:
31828:
31811:
31805:
31791:
31790:
31728:
31690:(6): 275–288.
31675:
31646:
31629:
31627:on 2014-04-30.
31600:
31599:
31598:
31588:(4): 613–626.
31578:Published in:
31554:
31542:
31524:(2): 215–242.
31518:J R Stat Soc B
31510:
31468:
31442:(4): 327–339.
31431:
31401:
31399:
31396:
31394:
31393:
31354:
31339:
31327:
31315:
31303:
31301:, p. 7–9.
31291:
31279:
31277:, p. 6–7.
31267:
31255:
31243:
31231:
31202:
31190:
31156:
31144:
31121:
31104:
31087:
31065:
31037:
31022:
31015:
30997:
30978:(9): 965–980.
30962:
30944:
30937:
30896:
30889:
30871:
30820:
30799:(6): 710–718.
30779:
30758:(12): 1373–9.
30736:
30685:
30678:
30660:
30626:
30599:
30592:
30561:
30532:
30518:
30514:
30510:
30507:
30504:
30501:
30498:
30495:
30492:
30489:
30486:
30483:
30480:
30477:
30474:
30458:
30451:
30429:
30415:
30368:Pearson, E. S.
30355:
30331:
30306:Safety Science
30292:
30229:
30190:
30175:
30158:Safety Science
30148:
30137:(6): 673–682.
30121:
30114:
30083:
30048:
30045:. p. 128.
30023:
29988:
29953:
29918:
29907:(37): 147–51.
29891:
29870:(4): 370–378.
29850:
29838:
29800:
29785:
29778:
29739:
29687:
29685:
29682:
29681:
29680:
29675:
29670:
29660:
29655:
29650:
29645:
29640:
29635:
29630:
29625:
29619:
29618:
29602:
29599:
29598:
29597:
29579:
29572:
29566:
29552:
29529:
29526:
29483:Berkson (1951)
29479:Berkson (1944)
29475:Joseph Berkson
29467:Jane Worcester
29351:
29348:
29318:error function
29310:logit function
29293:
29290:
29276:rather than a
29261:
29258:
29255:
29235:
29232:
29209:
29184:
29181:
29178:
29175:
29172:
29169:
29158:
29157:
29142:
29139:
29136:
29133:
29130:
29127:
29124:
29121:
29116:
29112:
29108:
29105:
29102:
29093:
29089:
29086:
29084:
29080:
29077:
29076:
29072:
29068:
29065:
29062:
29059:
29056:
29053:
29050:
29047:
29044:
29041:
29038:
29035:
29032:
29026:
29023:
29020:
29017:
29014:
29011:
29008:
29005:
29002:
28999:
28996:
28993:
28988:
28985:
28982:
28979:
28976:
28973:
28970:
28967:
28964:
28961:
28955:
28952:
28949:
28945:
28941:
28938:
28935:
28932:
28929:
28926:
28923:
28920:
28917:
28914:
28907:
28902:
28899:
28895:
28887:
28882:
28879:
28875:
28871:
28869:
28865:
28862:
28861:
28858:
28855:
28852:
28849:
28846:
33906:
33904:
33901:
33899:
33898:Dickey–Fuller
33896:
33895:
33893:
33889:
33883:
33880:
33878:
33875:
33873:
33872:Cointegration
33870:
33868:
33865:
33863:
33860:
33858:
33855:
33853:
33850:
33848:
33847:Decomposition
33845:
33844:
33842:
33838:
33835:
33833:
33829:
33819:
33816:
33815:
33814:
33811:
33810:
33809:
33806:
33802:
33799:
33798:
33797:
33794:
33792:
33789:
33787:
33784:
33782:
33779:
33777:
33774:
33772:
33769:
33767:
33764:
33762:
33759:
33758:
33756:
33754:
33750:
33744:
33741:
33739:
33736:
33734:
33731:
33729:
33726:
33724:
33721:
33719:
33718:Cohen's kappa
33716:
33715:
33713:
33711:
33707:
33703:
33699:
33695:
33691:
33687:
33682:
33678:
33664:
33661:
33659:
33656:
33654:
33651:
33649:
33646:
33645:
33643:
33641:
33637:
33631:
33627:
33623:
33617:
33615:
33612:
33611:
33609:
33607:
33603:
33597:
33594:
33592:
33589:
33587:
33584:
33582:
33579:
33577:
33574:
33572:
33571:Nonparametric
33569:
33567:
33564:
33563:
33561:
33557:
33551:
33548:
33546:
33543:
33541:
33538:
33536:
33533:
33532:
33530:
33528:
33524:
33518:
33515:
33513:
33510:
33508:
33505:
33503:
33500:
33498:
33495:
33494:
33492:
33490:
33486:
33480:
33477:
33475:
33472:
33470:
33467:
33465:
33462:
33461:
33459:
33457:
33453:
33449:
33442:
33439:
33437:
33434:
33433:
33429:
33425:
33409:
33406:
33405:
33404:
33401:
33399:
33396:
33394:
33391:
33387:
33384:
33382:
33379:
33378:
33377:
33374:
33373:
33371:
33369:
33365:
33355:
33352:
33348:
33342:
33340:
33334:
33332:
33326:
33325:
33324:
33321:
33320:Nonparametric
33318:
33316:
33310:
33306:
33303:
33302:
33301:
33295:
33291:
33290:Sample median
33288:
33287:
33286:
33283:
33282:
33280:
33278:
33274:
33266:
33263:
33261:
33258:
33256:
33253:
33252:
33251:
33248:
33246:
33243:
33241:
33235:
33233:
33230:
33228:
33225:
33223:
33220:
33218:
33215:
33213:
33211:
33207:
33205:
33202:
33201:
33199:
33197:
33193:
33187:
33185:
33181:
33179:
33177:
33172:
33170:
33165:
33161:
33160:
33157:
33154:
33152:
33148:
33138:
33135:
33133:
33130:
33128:
33125:
33124:
33122:
33120:
33116:
33110:
33107:
33103:
33100:
33099:
33098:
33095:
33091:
33088:
33087:
33086:
33083:
33081:
33078:
33077:
33075:
33073:
33069:
33061:
33058:
33056:
33053:
33052:
33051:
33048:
33046:
33043:
33041:
33038:
33036:
33033:
33031:
33028:
33026:
33023:
33022:
33020:
33018:
33014:
33008:
33005:
33001:
32998:
32994:
32991:
32989:
32986:
32985:
32984:
32981:
32980:
32979:
32976:
32972:
32969:
32967:
32964:
32962:
32959:
32957:
32954:
32953:
32952:
32949:
32948:
32946:
32944:
32940:
32937:
32935:
32931:
32925:
32922:
32920:
32917:
32913:
32910:
32909:
32908:
32905:
32903:
32900:
32896:
32895:loss function
32893:
32892:
32891:
32888:
32884:
32881:
32879:
32876:
32874:
32871:
32870:
32869:
32866:
32864:
32861:
32859:
32856:
32852:
32849:
32847:
32844:
32842:
32836:
32833:
32832:
32831:
32828:
32824:
32821:
32819:
32816:
32814:
32811:
32810:
32809:
32806:
32802:
32799:
32797:
32794:
32793:
32792:
32789:
32785:
32782:
32781:
32780:
32777:
32773:
32770:
32769:
32768:
32765:
32763:
32760:
32758:
32755:
32753:
32750:
32749:
32747:
32745:
32741:
32737:
32733:
32728:
32724:
32710:
32707:
32705:
32702:
32700:
32697:
32695:
32692:
32691:
32689:
32687:
32683:
32677:
32674:
32672:
32669:
32667:
32664:
32663:
32661:
32657:
32651:
32648:
32646:
32643:
32641:
32638:
32636:
32633:
32631:
32628:
32626:
32623:
32621:
32618:
32617:
32615:
32613:
32609:
32603:
32600:
32598:
32597:Questionnaire
32595:
32593:
32590:
32586:
32583:
32581:
32578:
32577:
32576:
32573:
32572:
32570:
32568:
32564:
32558:
32555:
32553:
32550:
32548:
32545:
32543:
32540:
32538:
32535:
32533:
32530:
32528:
32525:
32523:
32520:
32519:
32517:
32515:
32511:
32507:
32503:
32498:
32494:
32480:
32477:
32475:
32472:
32470:
32467:
32465:
32462:
32460:
32457:
32455:
32452:
32450:
32447:
32445:
32442:
32440:
32437:
32435:
32432:
32430:
32427:
32425:
32424:Control chart
32422:
32420:
32417:
32415:
32412:
32410:
32407:
32406:
32404:
32402:
32398:
32392:
32389:
32385:
32382:
32380:
32377:
32376:
32375:
32372:
32370:
32367:
32365:
32362:
32361:
32359:
32357:
32353:
32347:
32344:
32342:
32339:
32337:
32334:
32333:
32331:
32327:
32321:
32318:
32317:
32315:
32313:
32309:
32297:
32294:
32292:
32289:
32287:
32284:
32283:
32282:
32279:
32277:
32274:
32273:
32271:
32269:
32265:
32259:
32256:
32254:
32251:
32249:
32246:
32244:
32241:
32239:
32236:
32234:
32231:
32229:
32226:
32225:
32223:
32221:
32217:
32211:
32208:
32206:
32203:
32199:
32196:
32194:
32191:
32189:
32186:
32184:
32181:
32179:
32176:
32174:
32171:
32169:
32166:
32164:
32161:
32159:
32156:
32154:
32151:
32150:
32149:
32146:
32145:
32143:
32141:
32137:
32134:
32132:
32128:
32124:
32120:
32115:
32111:
32105:
32102:
32100:
32097:
32096:
32093:
32089:
32082:
32077:
32075:
32070:
32068:
32063:
32062:
32059:
32052:
32048:
32045:
32043:
32040:
32038:
32034:
32030:
32025:
32022:
32017:
32013:
32012:
32001:
31996:
31992:
31988:
31983:
31978:
31974:
31970:
31969:
31964:
31959:
31955:
31949:
31945:
31940:
31936:
31930:
31926:
31921:
31917:
31911:
31907:
31903:
31899:
31895:
31889:
31885:
31880:
31876:
31870:
31866:
31862:
31858:
31854:
31850:
31844:
31840:
31835:
31831:
31825:
31821:
31817:
31812:
31808:
31802:
31798:
31793:
31792:
31787:
31783:
31778:
31773:
31768:
31763:
31759:
31755:
31751:
31747:
31746:
31741:
31737:
31736:Worcester, J.
31733:
31729:
31725:
31721:
31716:
31711:
31706:
31701:
31697:
31693:
31689:
31685:
31681:
31676:
31672:
31668:
31664:
31660:
31657:(3): 251–59.
31656:
31652:
31647:
31643:
31639:
31635:
31630:
31626:
31622:
31618:
31614:
31610:
31606:
31601:
31595:
31591:
31587:
31583:
31577:
31576:
31573:
31569:
31562:
31561:
31555:
31551:
31547:
31546:Cox, David R.
31543:
31539:
31535:
31531:
31527:
31523:
31519:
31515:
31514:Cox, David R.
31511:
31508:
31504:
31500:
31496:
31492:
31488:
31484:
31480:
31476:
31475:
31469:
31465:
31461:
31457:
31453:
31449:
31445:
31441:
31437:
31432:
31428:
31424:
31420:
31416:
31412:
31408:
31403:
31402:
31383:on 2018-11-27
31379:
31375:
31368:
31364:
31358:
31352:, p. 13.
31351:
31346:
31344:
31337:, p. 11.
31336:
31331:
31324:
31319:
31312:
31307:
31300:
31295:
31288:
31283:
31276:
31271:
31264:
31259:
31252:
31247:
31240:
31235:
31221:
31217:
31213:
31206:
31199:
31194:
31178:
31174:
31167:
31160:
31153:
31148:
31140:
31136:
31132:
31125:
31117:
31116:
31108:
31100:
31099:
31091:
31083:
31076:
31069:
31051:
31044:
31042:
31034:
31029:
31027:
31018:
31012:
31008:
31001:
30993:
30989:
30985:
30981:
30977:
30973:
30966:
30955:
30948:
30940:
30934:
30930:
30926:
30919:
30917:
30915:
30913:
30911:
30909:
30907:
30905:
30903:
30901:
30892:
30886:
30882:
30875:
30867:
30863:
30858:
30853:
30848:
30843:
30839:
30835:
30831:
30824:
30816:
30812:
30807:
30802:
30798:
30794:
30790:
30783:
30775:
30771:
30766:
30761:
30757:
30753:
30752:
30747:
30740:
30732:
30728:
30723:
30718:
30713:
30708:
30704:
30700:
30696:
30689:
30681:
30675:
30671:
30664:
30656:
30652:
30648:
30644:
30637:
30630:
30622:
30618:
30614:
30610:
30603:
30595:
30589:
30585:
30578:
30576:
30574:
30572:
30570:
30568:
30566:
30556:
30551:
30547:
30543:
30536:
30516:
30508:
30505:
30502:
30496:
30493:
30490:
30484:
30481:
30478:
30462:
30454:
30448:
30443:
30442:
30433:
30425:
30419:
30412:
30408:
30403:
30398:
30394:
30390:
30386:
30382:
30381:
30373:
30369:
30365:
30359:
30345:
30341:
30335:
30327:
30323:
30319:
30315:
30311:
30307:
30303:
30296:
30288:
30284:
30280:
30276:
30272:
30268:
30264:
30260:
30256:
30252:
30248:
30244:
30243:Risk Analysis
30240:
30233:
30225:
30221:
30217:
30213:
30209:
30205:
30201:
30194:
30186:
30179:
30171:
30167:
30163:
30159:
30152:
30144:
30140:
30136:
30132:
30125:
30117:
30111:
30107:
30103:
30099:
30092:
30090:
30088:
30079:
30075:
30071:
30067:
30064:(7): 511–24.
30063:
30059:
30052:
30044:
30040:
30036:
30030:
30028:
30019:
30015:
30011:
30007:
30003:
29999:
29992:
29984:
29980:
29976:
29972:
29968:
29964:
29957:
29949:
29945:
29941:
29937:
29934:(6): 635–42.
29933:
29929:
29922:
29914:
29910:
29906:
29902:
29895:
29887:
29883:
29878:
29873:
29869:
29865:
29861:
29854:
29847:
29842:
29834:
29830:
29826:
29822:
29818:
29814:
29807:
29805:
29797:
29792:
29790:
29781:
29775:
29771:
29764:
29762:
29760:
29758:
29756:
29754:
29752:
29750:
29748:
29746:
29744:
29735:
29731:
29727:
29723:
29719:
29715:
29711:
29707:
29703:
29699:
29692:
29688:
29679:
29676:
29674:
29671:
29668:
29665:- contains a
29664:
29661:
29659:
29656:
29654:
29651:
29649:
29648:Ordered logit
29646:
29644:
29641:
29639:
29636:
29634:
29631:
29629:
29626:
29624:
29621:
29620:
29616:
29610:
29605:
29595:
29591:
29587:
29583:
29580:
29577:
29573:
29570:
29567:
29564:
29560:
29559:ordered logit
29556:
29553:
29550:
29546:
29542:
29538:
29535:
29534:
29533:
29525:
29523:
29519:
29515:
29511:
29507:
29503:
29498:
29496:
29492:
29487:
29484:
29480:
29476:
29472:
29468:
29464:
29459:
29457:
29453:
29449:
29448:Fisher (1935)
29445:
29441:
29437:
29436:Gaddum (1933)
29433:
29429:
29425:
29421:
29416:
29413:
29409:
29405:
29401:
29397:
29396:Raymond Pearl
29392:
29389:
29385:
29381:
29380:autocatalysis
29376:
29373:
29369:
29365:
29361:
29357:
29356:Cramer (2002)
29347:
29345:
29340:
29338:
29333:
29331:
29327:
29323:
29319:
29315:
29311:
29307:
29306:link function
29303:
29299:
29289:
29287:
29283:
29279:
29275:
29259:
29256:
29253:
29245:
29241:
29231:
29229:
29207:
29198:
29179:
29176:
29173:
29167:
29137:
29134:
29131:
29125:
29122:
29114:
29110:
29106:
29103:
29091:
29087:
29085:
29078:
29070:
29063:
29060:
29057:
29054:
29051:
29048:
29045:
29036:
29033:
29030:
29021:
29018:
29015:
29012:
29009:
29006:
29003:
29000:
28997:
28983:
28980:
28977:
28974:
28971:
28968:
28965:
28953:
28950:
28947:
28943:
28936:
28933:
28930:
28927:
28924:
28921:
28918:
28900:
28897:
28893:
28880:
28877:
28873:
28870:
28863:
28853:
28850:
28847:
28844:
28841:
28838:
28835:
28832:
28829:
28820:
28817:
28811:
28808:
28805:
28802:
28799:
28796:
28793:
28775:
28772:
28768:
28755:
28752:
28748:
28744:
28738:
28735:
28730:
28726:
28722:
28717:
28713:
28703:
28700:
28695:
28690:
28687:
28684:
28680:
28674:
28671:
28667:
28658:
28652:
28645:
28633:
28632:
28631:
28629:
28610:
28607:
28604:
28594:Assuming the
28592:
28590:
28568:
28565:
28560:
28556:
28552:
28547:
28543:
28533:
28530:
28525:
28520:
28517:
28514:
28510:
28504:
28501:
28497:
28493:
28487:
28484:
28481:
28478:
28475:
28469:
28466:
28463:
28458:
28455:
28451:
28443:
28442:
28441:
28413:
28409:
28405:
28402:
28386:
28382:
28373:
28369:
28365:
28362:
28352:
28348:
28337:
28333:
28324:
28320:
28314:
28310:
28306:
28304:
28293:
28290:
28285:
28281:
28277:
28272:
28268:
28256:
28252:
28248:
28246:
28235:
28232:
28229:
28226:
28223:
28214:
28212:
28204:
28201:
28198:
28195:
28192:
28186:
28175:
28174:
28173:
28171:
28155:
28147:
28144:
28141:
28127:
28119:
28115:
28111:
28108:
28100:
28092:
28084:
28080:
28076:
28070:
28067:
28064:
28061:
28058:
28029:
28026:
28023:
28020:
28017:
27988:
27985:
27982:
27976:
27973:
27947:
27939:
27935:
27931:
27928:
27925:
27919:
27916:
27913:
27910:
27907:
27904:
27901:
27888:
27887:
27886:
27866:
27863:
27860:
27857:
27854:
27851:
27848:
27839:
27831:
27826:
27822:
27818:
27814:
27810:
27807:
27803:
27798:
27792:
27784:
27780:
27772:
27771:
27770:
27756:
27748:
27743:
27729:
27721:
27716:
27714:
27713:cross-entropy
27704:
27688:
27678:
27673:
27663:
27658:
27629:
27600:
27571:
27556:
27538:
27509:
27506:
27503:
27493:
27475:
27472:
27468:
27445:
27413:
27410:
27407:
27380:
27343:
27333:
27328:
27317:
27311:
27306:
27303:
27300:
27296:
27287:
27277:
27272:
27261:
27255:
27250:
27247:
27243:
27235:
27234:
27233:
27231:
27205:
27201:
27197:
27194:
27190:
27186:
27181:
27177:
27169:
27168:
27167:
27148:
27144:
27139:
27131:
27121:
27116:
27105:
27101:
27096:
27093:
27089:
27081:
27080:
27079:
27063:
27060:
27056:
27047:
27043:
27022:
27012:
27007:
26997:
26992:
26989:
26985:
26979:
26976:
26972:
26966:
26961:
26958:
26955:
26951:
26943:
26942:
26941:
26921:
26918:
26913:
26909:
26900:
26897:
26893:
26889:
26883:
26879:
26876:
26871:
26862:
26857:
26854:
26851:
26847:
26843:
26840:
26837:
26828:
26825:
26820:
26817:
26812:
26805:
26802:
26799:
26796:
26793:
26790:
26781:
26778:
26773:
26770:
26765:
26739:
26738:
26737:
26718:
26715:
26712:
26709:
26697:
26692:
26689:
26686:
26674:
26669:
26666:
26663:
26651:
26637:
26636:
26635:
26633:
26610:
26604:
26601:
26597:
26591:
26586:
26583:
26580:
26576:
26572:
26569:
26565:
26559:
26555:
26549:
26544:
26541:
26538:
26534:
26530:
26525:
26522:
26519:
26516:
26500:
26499:
26498:
26481:
26478:
26473:
26470:
26466:
26460:
26455:
26452:
26449:
26445:
26437:
26436:
26435:
26433:
26429:
26402:
26399:
26395:
26386:
26382:
26378:
26375:
26366:
26361:
26358:
26354:
26348:
26345:
26341:
26332:
26327:
26324:
26321:
26317:
26311:
26308:
26304:
26298:
26293:
26290:
26287:
26283:
26277:
26272:
26269:
26266:
26262:
26258:
26253:
26250:
26247:
26231:
26230:
26229:
26227:
26223:
26219:
26211:
26184:
26181:
26177:
26168:
26164:
26160:
26157:
26148:
26143:
26140:
26136:
26130:
26127:
26123:
26114:
26109:
26106:
26103:
26099:
26095:
26087:
26084:
26080:
26071:
26058:
26057:
26056:
26034:
26031:
26027:
26020:
26017:
26009:
26005:
26001:
25998:
25987:
25982:
25979:
25976:
25972:
25966:
25961:
25958:
25955:
25951:
25947:
25944:
25937:
25936:
25935:
25913:
25910:
25906:
25899:
25896:
25891:
25888:
25884:
25878:
25873:
25870:
25867:
25863:
25857:
25852:
25849:
25846:
25842:
25838:
25835:
25830:
25827:
25824:
25808:
25807:
25806:
25804:
25799:
25796:
25788:
25786:
25782:
25761:
25746:
25742:
25738:
25733:
25730:
25726:
25705:
25702:
25699:
25691:
25687:
25685:
25674:
25672:
25654:
25651:
25648:
25621:
25618:
25614:
25610:
25607:
25604:
25599:
25596:
25592:
25588:
25583:
25580:
25576:
25569:
25564:
25532:
25529:
25526:
25513:
25490:
25486:
25461:
25458:
25454:
25430:
25427:
25424:
25421:
25418:
25415:
25412:
25406:
25403:
25395:
25379:
25376:
25371:
25367:
25342:
25338:
25313:
25310:
25307:
25297:
25292:
25290:
25280:
25279:for details.
25278:
25274:
25270:
25266:
25262:
25258:
25254:
25244:
25242:
25229:
25215:
25202:
25197:
25193:
25189:
25184:
25180:
25176:
25170:
25164:
25154:
25151:
25143:
25142:link function
25138:
25125:
25121:
25118:
25115:
25112:
25109:
25097:
25094:
25091:
25087:
25082:
25079:
25076:
25073:
25070:
25067:
25054:
25049:
25045:
25041:
25037:
25033:
25023:
25003:
24980:
24948:
24942:
24939:
24929:
24921:
24918:
24915:
24909:
24906:
24903:
24899:
24894:
24891:
24888:
24883:
24876:
24873:
24866:
24861:
24856:
24849:
24846:
24835:
24834:
24833:
24817:
24813:
24790:
24786:
24763:
24759:
24748:
24739:
24737:
24736:Type-II error
24712:
24705:
24701:
24696:
24692:
24686:
24681:
24677:
24671:
24666:
24662:
24654:
24653:
24652:
24650:
24646:
24636:
24633:
24623:
24621:
24617:
24606:
24603:
24587:
24583:
24573:
24554:
24545:
24536:
24527:
24518:
24517:
24516:
24513:
24504:
24494:
24469:
24456:
24453:
24450:
24447:
24444:
24442:
24431:
24416:
24411:
24396:
24390:
24387:
24384:
24381:
24378:
24376:
24367:
24353:
24350:
24347:
24334:
24331:
24327:
24323:
24320:
24317:
24315:
24304:
24300:
24291:
24279:
24278:
24277:
24256:
24243:
24240:
24237:
24234:
24231:
24229:
24218:
24200:
24197:
24194:
24191:
24188:
24186:
24175:
24163:
24162:
24161:
24158:
24156:
24140:
24135:
24130:
24127:
24124:
24120:
24109:
24105:
24103:
24075:
24062:
24059:
24056:
24053:
24050:
24047:
24040:
24039:
24038:
24036:
24031:
24027:
24017:
24015:
24011:
24002:
23985:
23982:
23979:
23976:
23973:
23965:
23961:
23957:
23950:
23942:
23940:
23936:
23931:
23918:
23915:
23912:
23904:
23894:
23887:
23878:
23869:
23866:
23863:
23843:
23840:
23837:
23834:
23825:
23802:
23799:
23796:
23793:
23788:
23778:
23765:
23763:
23756:
23731:
23721:
23714:
23705:
23696:
23693:
23689:
23682:
23677:
23667:
23658:
23648:
23639:
23635:
23632:
23629:
23626:
23619:
23618:
23617:
23615:
23594:
23584:
23577:
23568:
23558:
23557:
23556:
23554:
23547:
23523:
23497:
23486:
23481:
23478:
23470:
23464:
23460:
23457:
23454:
23449:
23445:
23437:
23436:
23435:
23419:
23415:
23380:
23375:
23372:
23366:
23363:
23352:
23347:
23344:
23338:
23327:
23319:
23316:
23308:
23299:
23296:
23291:
23281:
23270:
23269:
23268:
23266:
23245:
23240:
23235:
23231:
23206:
23197:
23193:
23189:
23186:
23180:
23177:
23169:
23165:
23161:
23158:
23152:
23144:
23140:
23133:
23130:
23125:
23121:
23116:
23110:
23105:
23102:
23099:
23095:
23091:
23086:
23082:
23074:
23073:
23072:
23053:
23049:
23045:
23040:
23036:
23028:
23027:
23026:
23002:
22998:
22994:
22990:
22986:
22983:
22979:
22974:
22968:
22960:
22956:
22948:
22947:
22946:
22945:is given by:
22932:
22929:
22926:
22902:
22890:
22886:
22879:
22876:
22873:
22867:
22864:
22856:
22852:
22848:
22845:
22839:
22828:
22824:
22817:
22811:
22808:
22803:
22799:
22794:
22788:
22783:
22780:
22777:
22773:
22769:
22766:
22759:
22758:
22757:
22740:
22735:
22731:
22727:
22722:
22718:
22714:
22711:
22704:
22703:
22702:
22688:
22685:
22682:
22657:
22651:
22621:
22618:
22614:
22610:
22607:
22603:
22598:
22592:
22586:
22579:
22578:
22577:
22575:
22570:
22550:
22525:
22521:
22512:
22508:
22504:
22499:
22497:
22493:
22486:
22479:
22463:
22460:
22457:
22454:
22451:
22443:
22439:
22432:
22425:
22417:
22416:data points.
22415:
22391:
22381:
22377:
22373:
22365:
22355:
22350:
22347:
22344:
22340:
22336:
22331:
22326:
22316:
22305:
22304:
22303:
22287:
22282:
22278:
22269:
22245:
22218:
22213:
22208:
22204:
22183:
22161:
22156:
22152:
22129:
22125:
22113:
22094:
22089:
22079:
22075:
22071:
22066:
22062:
22053:
22048:
22045:
22042:
22038:
22034:
22029:
22025:
22017:
22016:
22015:
22010:
22007: =
22006:
21998:
21994:
21989:
21974:
21964:
21937:
21932:
21922:
21918:
21914:
21909:
21905:
21899:
21895:
21891:
21886:
21882:
21873:
21868:
21865:
21862:
21858:
21854:
21849:
21845:
21837:
21836:
21835:
21833:
21817:
21812:
21808:
21804:
21799:
21795:
21791:
21788:
21780:
21773:
21767:data points (
21766:
21761:
21759:
21755:
21751:
21746:
21744:
21728:
21724:
21709:
21705:
21701:
21698:
21678:
21670:
21654:
21646:
21642:
21636:
21629:"Rule of ten"
21626:
21624:
21620:
21615:
21611:
21607:
21603:
21599:
21595:
21591:
21587:
21583:
21579:
21575:
21567:
21566:heavier tails
21548:
21542:
21539:
21507:
21501:
21494:), comparing
21493:
21489:
21485:
21481:
21476:
21467:
21451:
21443:
21440:
21434:
21428:
21425:
21419:
21413:
21407:
21401:
21368:
21362:
21357:
21352:
21345:
21337:
21329:
21325:
21316:
21308:
21304:
21298:
21291:
21283:
21275:
21271:
21262:
21254:
21250:
21244:
21238:
21233:
21221:
21220:
21219:
21202:
21199:
21193:
21187:
21184:
21178:
21172:
21166:
21132:
21126:
21123:
21120:
21111:
21105:
21099:
21096:
21093:
21064:
21058:
21048:
21040:
21035:
21018:
21007:
21001:
20989:
20986:
20981:
20970:
20958:
20947:
20942:
20937:
20934:
20931:
20917:
20916:
20915:
20871:
20858:
20848:
20844:
20840:
20837:
20833:
20828:
20822:
20816:
20794:
20786:
20783:
20777:
20769:
20765:
20761:
20755:
20747:
20743:
20739:
20736:
20730:
20724:
20693:
20690:
20685:
20681:
20677:
20672:
20668:
20664:
20659:
20655:
20648:
20643:
20628:
20624:
20620:
20616:
20600:
20597:
20594:
20574:
20571:
20568:
20550:
20546:
20542:
20538:
20535:
20531:
20530:
20529:
20527:
20523:
20519:
20513:
20511:
20507:
20495:Model fitting
20492:
20475:
20469:
20466:
20461:
20457:
20451:
20440:
20430:
20422:
20418:
20414:
20411:
20407:
20402:
20399:
20395:
20388:
20383:
20373:
20363:
20355:
20351:
20347:
20344:
20340:
20335:
20322:
20317:
20313:
20301:
20296:
20293:
20288:
20284:
20273:
20269:
20265:
20262:
20254:
20249:
20245:
20233:
20228:
20224:
20212:
20204:
20194:
20191:
20188:
20183:
20179:
20165:
20164:
20163:
20146:
20140:
20130:
20122:
20118:
20110:
20106:
20102:
20099:
20093:
20089:
20083:
20079:
20076:
20073:
20065:
20061:
20054:
20051:
20048:
20044:
20039:
20033:
20021:
20012:
20008:
20002:
19998:
19987:
19983:
19972:
19968:
19965:
19958:
19957:
19956:
19939:
19934:
19928:
19916:
19907:
19903:
19897:
19893:
19882:
19878:
19868:
19863:
19859:
19851:
19850:
19849:
19847:
19842:
19841:are planted.
19839:
19835:
19830:
19826:
19807:
19804:
19801:
19798:
19795:
19792:
19789:
19781:
19773:
19769:
19765:
19760:
19756:
19749:
19746:
19743:
19737:
19733:
19725:
19724:
19723:
19721:
19716:
19712:
19708:
19704:
19700:
19696:
19671:
19665:
19655:
19641:
19638:
19635:
19629:
19626:
19620:
19610:
19595:
19594:
19593:
19591:
19587:
19583:
19579:
19575:
19548:
19542:
19539:
19535:
19531:
19528:
19524:
19519:
19516:
19509:
19508:
19507:
19504:
19500:
19493:
19489:
19485:
19478:
19477:step function
19474:
19470:
19450:
19437:
19434:
19431:
19427:
19421:
19417:
19413:
19410:
19407:
19402:
19399:
19396:
19392:
19386:
19382:
19378:
19373:
19369:
19362:
19358:
19354:
19351:
19347:
19342:
19337:
19333:
19325:
19324:
19323:
19315:
19313:
19309:
19305:
19301:
19297:
19293:
19289:
19284:
19279:
19263:
19253:
19248:
19238:
19208:
19204:
19200:
19190:
19180:
19175:
19165:
19161:
19157:
19154:
19150:
19145:
19135:
19125:
19120:
19109:
19105:
19102:
19094:
19084:
19079:
19068:
19062:
19056:
19053:
19048:
19044:
19030:
19029:
19028:
19011:
19008:
19001:
18991:
18982:
18978:
18971:
18961:
18956:
18945:
18937:
18936:
18935:
18921:
18913:
18908:
18874:
18864:
18854:
18849:
18838:
18834:
18827:
18817:
18812:
18801:
18792:
18782:
18777:
18766:
18760:
18758:
18740:
18730:
18725:
18714:
18710:
18703:
18693:
18688:
18677:
18666:
18656:
18647:
18637:
18627:
18622:
18611:
18603:
18593:
18584:
18577:
18575:
18560:
18550:
18541:
18533:
18523:
18518:
18507:
18503:
18496:
18486:
18477:
18469:
18459:
18454:
18443:
18433:
18423:
18414:
18406:
18396:
18391:
18380:
18373:
18371:
18356:
18346:
18335:
18330:
18316:
18312:
18305:
18295:
18284:
18279:
18265:
18256:
18246:
18235:
18230:
18216:
18210:
18208:
18200:
18197:
18192:
18188:
18170:
18169:
18168:
18163:
18162:
18154:
18153:
18148:
18132:
18129:
18123:
18120:
18115:
18111:
18101:
18095:
18092:
18087:
18083:
18053:
18050:
18045:
18041:
18011:
18008:
18003:
17999:
17969:
17963:
17960:
17955:
17945:
17940:
17930:
17925:
17915:
17910:
17900:
17897:
17891:
17888:
17885:
17879:
17876:
17871:
17867:
17853:
17852:
17851:
17849:
17845:
17819:
17809:
17804:
17793:
17787:
17783:
17774:
17764:
17759:
17748:
17742:
17736:
17733:
17728:
17724:
17710:
17709:
17708:
17687:
17677:
17667:
17662:
17651:
17647:
17640:
17630:
17625:
17614:
17605:
17595:
17590:
17579:
17573:
17571:
17563:
17560:
17555:
17551:
17530:
17520:
17515:
17504:
17500:
17493:
17483:
17478:
17467:
17458:
17448:
17443:
17432:
17426:
17424:
17416:
17413:
17408:
17404:
17386:
17385:
17384:
17363:
17353:
17348:
17337:
17333:
17326:
17316:
17311:
17300:
17296:
17293:
17286:
17285:
17284:
17283:". That is:
17282:
17278:
17274:
17270:
17267:is in fact a
17265:
17261:
17257:
17230:
17220:
17215:
17204:
17198:
17195:
17190:
17188:
17180:
17177:
17172:
17168:
17150:
17140:
17135:
17124:
17118:
17115:
17110:
17108:
17100:
17097:
17092:
17088:
17070:
17069:
17068:
17066:
17050:
17047:
17044:
17041:
17033:
17010:
17007:
17004:
17001:
16996:
16986:
16981:
16971:
16969:
16961:
16958:
16953:
16949:
16939:
16936:
16929:
16926:
16923:
16920:
16915:
16905:
16900:
16890:
16888:
16880:
16877:
16872:
16868:
16858:
16855:
16844:
16843:
16842:
16839:
16835:
16831:
16826:
16824:
16810:
16805:
16801:
16800:
16799:
16791:
16788:
16785:
16782:
16781:
16777:
16774:
16771:
16768:
16767:
16763:
16760:
16757:
16754:
16753:
16750:Secessionist
16749:
16746:
16743:
16741:
16740:
16734:
16731:
16728:
16724:
16720:
16716:
16712:
16699:
16696:
16688:
16678:
16674:
16670:
16666:
16660:
16659:
16654:This example
16652:
16643:
16642:
16639:
16638:
16612:
16608:
16600:
16588:
16578:
16567:
16562:
16559:
16555:
16547:
16534:
16524:
16516:
16513:
16505:
16498:
16478:
16468:
16460:
16457:
16454:
16446:
16439:
16411:
16408:
16405:
16402:
16397:
16387:
16374:
16367:
16355:
16341:
16338:
16335:
16332:
16327:
16317:
16309:
16299:
16294:
16276:
16269:
16258:
16255:
16247:
16243:
16239:
16234:
16230:
16223:
16218:
16208:
16200:
16190:
16185:
16167:
16160:
16151:
16147:
16144:
16136:
16132:
16128:
16123:
16119:
16112:
16104:
16094:
16089:
16079:
16074:
16064:
16059:
16045:
16039:
16032:
16023:
16019:
16016:
16012:
16006:
16002:
15998:
15993:
15983:
15978:
15967:
15963:
15958:
15954:
15950:
15945:
15935:
15930:
15919:
15913:
15906:
15897:
15891:
15881:
15878:
15875:
15870:
15867:
15862:
15858:
15854:
15849:
15846:
15841:
15837:
15832:
15826:
15819:
15810:
15804:
15794:
15789:
15786:
15781:
15777:
15773:
15768:
15765:
15760:
15756:
15751:
15745:
15738:
15730:
15720:
15717:
15714:
15709:
15705:
15687:
15686:
15685:
15671:
15665:
15662:
15659:
15653:
15650:
15647:
15642:
15638:
15634:
15629:
15625:
15621:
15618:
15610:
15589:
15585:
15581:
15576:
15572:
15568:
15565:
15558:
15542:
15532:
15527:
15517:
15505:
15504:
15503:
15499:
15497:
15493:
15488:
15485:
15481:
15477:
15472:
15438:
15431:
15426:
15423:
15418:
15414:
15410:
15405:
15402:
15397:
15393:
15382:
15376:
15371:
15366:
15362:
15354:
15353:
15352:
15331:
15328:
15324:
15320:
15316:
15310:
15307:
15303:
15299:
15293:
15290:
15285:
15281:
15271:
15265:
15262:
15257:
15253:
15239:
15238:
15237:
15235:
15228:
15202:
15199:
15196:
15190:
15185:
15181:
15177:
15175:
15168:
15164:
15153:
15150:
15147:
15141:
15136:
15132:
15128:
15126:
15119:
15115:
15103:
15102:
15101:
15077:
15073:
15069:
15064:
15054:
15049:
15039:
15037:
15030:
15027:
15022:
15018:
15007:
15003:
14999:
14994:
14984:
14979:
14969:
14967:
14960:
14957:
14952:
14948:
14936:
14935:
14934:
14926:
14924:
14920:
14919:heavier tails
14915:
14911:
14907:
14874:
14870:
14866:
14864:
14850:
14840:
14829:
14824:
14821:
14817:
14813:
14811:
14790:
14780:
14772:
14767:
14763:
14753:
14751:
14738:
14728:
14720:
14717:
14712:
14708:
14698:
14696:
14685:
14682:
14677:
14673:
14669:
14664:
14654:
14640:
14638:
14625:
14615:
14612:
14609:
14604:
14599:
14595:
14585:
14583:
14573:
14563:
14560:
14557:
14552:
14548:
14530:
14529:
14528:
14508:
14502:
14497:
14494:
14490:
14486:
14480:
14477:
14472:
14468:
14454:
14453:
14452:
14450:
14446:
14442:
14438:
14434:
14430:
14425:
14422:
14419:
14415:
14411:
14406:
14402:
14398:
14394:
14389:
14385:
14381:
14377:
14343:
14336:
14331:
14321:
14313:
14307:
14303:
14299:
14287:
14284:
14279:
14274:
14270:
14259:
14253:
14248:
14243:
14239:
14231:
14230:
14229:
14226:
14222:
14217:
14215:
14211:
14188:
14185:
14182:
14176:
14173:
14170:
14165:
14161:
14153:
14152:
14151:
14131:
14127:
14123:
14118:
14108:
14100:
14095:
14090:
14086:
14078:
14077:
14076:
14074:
14069:
14065:
14062:
14058:
14053:
14051:
14047:
14043:
14012:
14002:
13993:
13989:
13986:
13980:
13977:
13972:
13962:
13953:
13947:
13942:
13939:
13936:
13931:
13920:
13910:
13901:
13897:
13894:
13886:
13876:
13867:
13861:
13858:
13854:
13847:
13842:
13832:
13822:
13813:
13809:
13806:
13798:
13788:
13779:
13773:
13768:
13763:
13760:
13757:
13747:
13743:
13739:
13736:
13728:
13721:
13717:
13711:
13703:
13693:
13690:
13687:
13682:
13678:
13664:
13663:
13662:
13660:
13656:
13630:
13620:
13612:
13608:
13604:
13601:
13597:
13592:
13584:
13574:
13563:
13558:
13555:
13551:
13547:
13542:
13538:
13534:
13526:
13516:
13511:
13507:
13500:
13486:
13485:
13484:
13482:
13477:
13461:
13457:
13448:
13444:
13439:
13435:
13430:
13428:
13427:L-BFGS method
13424:
13420:
13415:
13412:
13408:
13404:
13400:
13396:
13391:
13387:
13382:
13362:
13359:
13353:
13340:
13338:
13334:
13313:
13303:
13295:
13291:
13283:
13279:
13275:
13272:
13266:
13262:
13256:
13252:
13249:
13246:
13238:
13234:
13227:
13224:
13221:
13210:
13200:
13195:
13191:
13184:
13171:
13168:
13161:
13160:
13159:
13140:
13137:
13134:
13130:
13124:
13120:
13116:
13113:
13110:
13105:
13102:
13099:
13095:
13089:
13085:
13081:
13076:
13072:
13068:
13064:
13056:
13052:
13048:
13045:
13039:
13035:
13029:
13025:
13022:
13019:
13011:
13007:
13000:
12997:
12994:
12983:
12980:
12977:
12973:
12969:
12966:
12963:
12958:
12955:
12952:
12948:
12944:
12939:
12935:
12928:
12915:
12912:
12905:
12904:
12903:
12901:
12900:binary-valued
12897:
12893:
12883:
12875:
12873:
12869:
12851:
12848:
12844:
12821:
12806:
12788:
12785:
12781:
12755:
12752:
12748:
12739:
12724:
12720:
815:1.25
812:1.00
809:0.75
806:0.50
130:and
75:(or
47:(or
33265:BIC
33260:AIC
32035:by
32031:on
31977:doi
31772:PMC
31762:doi
31710:PMC
31700:doi
31659:doi
31617:doi
31590:doi
31568:doi
31526:doi
31491:doi
31444:doi
31415:doi
30980:doi
30852:PMC
30842:doi
30801:doi
30797:165
30760:doi
30717:PMC
30707:doi
30651:doi
30647:108
30617:doi
30550:doi
30397:doi
30385:231
30314:doi
30259:doi
30212:doi
30166:doi
30139:doi
30102:doi
30066:doi
30006:doi
30002:270
29971:doi
29936:doi
29932:191
29872:doi
29821:doi
29706:doi
29702:316
29667:C++
29588:or
29469:in
29446:in
29442:by
29434:in
29034:log
28951:log
28818:log
28701:log
28649:lim
28531:log
28464:log
24919:log
24892:log
24551:McF
21612:or
21488:CDF
20587:or
19747:Bin
19584:in
16671:by
13661:):
12578:1-y
12563:= n
11529:y=n
11309:of
11060:of
10133:log
9714:log
9628:log
8583:log
8445:, x
8337:m,i
7303:m,i
7251:m,i
6920:log
4551:2.4
4545:1.5
4524:1.8
4380:2.7
4225:1.9
4219:1.5
4207:4.1
4075:1.1
4066:1.5
4054:4.1
3846:2.7
3775:1.5
3741:4.1
2807:is
2288:or
1565:Fit
985:to
119:log
39:In
34389::
31985:.
31973:49
31971:.
31965:.
31863:.
31818:.
31780:.
31770:.
31760:.
31750:29
31748:.
31742:.
31734:;
31718:.
31708:.
31698:.
31686:.
31682:.
31665:.
31655:10
31653:.
31613:22
31611:.
31607:.
31586:35
31584:.
31532:.
31522:20
31520:.
31505:.
31497:.
31489:.
31479:79
31477:.
31458:.
31450:.
31438:.
31421:.
31411:39
31409:.
31342:^
31220:18
31218:.
31177:10
31175:.
31171:.
31137:.
31133:.
31080:.
31040:^
31025:^
30986:.
30976:16
30974:.
30899:^
30860:.
30850:.
30838:14
30836:.
30832:.
30809:.
30795:.
30791:.
30768:.
30756:49
30754:.
30748:.
30725:.
30715:.
30703:16
30701:.
30697:.
30645:.
30641:.
30613:17
30611:.
30564:^
30544:.
30405:,
30395:,
30383:,
30377:,
30366:;
30342:.
30320:.
30310:62
30308:.
30304:.
30281:.
30273:.
30265:.
30257:.
30247:33
30245:.
30241:.
30218:.
30208:14
30206:.
30202:.
30162:47
30160:.
30135:46
30133:.
30108:.
30086:^
30072:.
30062:20
30060:.
30041:.
30026:^
30012:.
30000:.
29977:.
29967:23
29965:.
29942:.
29930:.
29905:48
29903:.
29880:.
29868:27
29866:.
29862:.
29827:.
29817:54
29815:.
29803:^
29788:^
29742:^
29728:.
29720:.
29712:.
29700:.
29497:.
29346:.
29212:KL
29096:KL
29040:Pr
28992:Pr
28960:Pr
28913:Pr
28824:Pr
28788:Pr
28707:Pr
28630:,
28591:.
28537:Pr
28262:Pr
28218:Pr
28053:Pr
28012:Pr
27896:Pr
27843:Pr
27769:,
27703:.
26803:ln
26427:nm
26217:nk
26209:nk
26018:ln
25897:ln
25805::
25794:nk
25787:.
25511:mk
25255:,
25080:ln
24533:CS
24454:ln
24388:ln
24351:ln
24332:ln
24241:ln
24198:ln
24060:ln
24037::
23633:ln
23458:ln
23364:ln
23317:ln
23178:ln
23131:ln
22865:ln
22809:ln
22569:.
21774:,
21699:10
21625:.
21608:,
21604:,
21600:,
20528:.
20520:,
20173:Pr
20077:ln
19722::
19314:.
19038:Pr
18182:Pr
18105:Pr
18077:Pr
18035:Pr
17993:Pr
17861:Pr
17718:Pr
17545:Pr
17398:Pr
17162:Pr
17082:Pr
17045:ln
17005:ln
16943:Pr
16937:ln
16924:ln
16862:Pr
16856:ln
16825:.
16508:Pr
16449:Pr
16377:Pr
16279:Pr
16170:Pr
16042:Pr
15916:Pr
15829:Pr
15748:Pr
15699:Pr
15498:.
15275:Pr
15247:Pr
15227:EV
15182:EV
15133:EV
14757:Pr
14702:Pr
14644:Pr
14589:Pr
14542:Pr
14462:Pr
14216:.
14052:.
13672:Pr
13429:.
13250:ln
13023:ln
12459:ln
12303:.
12055:ln
11230:10
11038:10
10748:10
10137:10
10049:,
10001:10
9944:mk
8509:,
8495:.
8326:1,
8321:,
8316:0,
8292:0,
8254:,
7734:Pr
7577:Pr
7338:.
7326:,
7322:,
7318::
7306:.
7292:1,
7240:1,
7215:.
6611:.
6599:,
6595:,
5864:ln
5588:ln
5450:.
4681:;
4542:)
4518:)
3210:.
3037:ln
2990:ln
2922:ln
2868:ln
2811:.
2509:ln
2468:ln
2428:.
1998:0.
1950:ln
1901:ln
1828:.
1557:.
1005:20
935:1
875:)
803:)
741:.
725:.
697:;
251:.
212:.
177:.
123:it
79:)
71:,
33210:G
33184:F
33176:t
33164:Z
32883:V
32878:U
32080:e
32073:t
32066:v
32051:C
31993:.
31979::
31956:.
31937:.
31918:.
31896:.
31877:.
31851:.
31832:.
31809:.
31788:.
31764::
31756::
31726:.
31702::
31694::
31688:6
31673:.
31661::
31644:.
31619::
31596:.
31592::
31574:.
31570::
31540:.
31528::
31493::
31485::
31466:.
31446::
31440:7
31429:.
31417::
31390:.
31228:.
31187:.
31139:1
31062:.
31019:.
30994:.
30982::
30941:.
30893:.
30868:.
30844::
30817:.
30803::
30776:.
30762::
30733:.
30709::
30682:.
30657:.
30653::
30623:.
30619::
30596:.
30558:.
30552::
30517:2
30513:)
30509:n
30503:y
30500:(
30494:1
30491:=
30488:)
30485:y
30482:,
30479:n
30476:(
30455:.
30399::
30391::
30352:.
30328:.
30316::
30289:.
30261::
30253::
30226:.
30214::
30172:.
30168::
30145:.
30141::
30118:.
30104::
30080:.
30068::
30020:.
30008::
29985:.
29973::
29950:.
29938::
29915:.
29888:.
29874::
29835:.
29823::
29782:.
29736:.
29708::
29596:.
29578:.
29551:.
29382:(
29260:x
29254:y
29208:D
29183:)
29180:X
29174:Y
29171:(
29168:H
29141:)
29138:X
29132:Y
29129:(
29126:H
29120:)
29111:Y
29104:Y
29101:(
29092:D
29079:=
29071:)
29067:)
29064:x
29061:=
29058:X
29052:y
29049:=
29046:Y
29043:(
29031:+
29025:)
29019:;
29016:x
29013:=
29010:X
29004:y
29001:=
28998:Y
28995:(
28987:)
28984:x
28981:=
28978:X
28972:y
28969:=
28966:Y
28963:(
28944:(
28940:)
28937:y
28934:=
28931:Y
28928:,
28925:x
28922:=
28919:X
28916:(
28906:Y
28898:y
28886:X
28878:x
28864:=
28857:)
28851:;
28848:x
28845:=
28842:X
28836:y
28833:=
28830:Y
28827:(
28815:)
28812:y
28809:=
28806:Y
28803:,
28800:x
28797:=
28794:X
28791:(
28781:Y
28773:y
28761:X
28753:x
28745:=
28742:)
28736:;
28731:i
28727:x
28718:i
28714:y
28710:(
28696:N
28691:1
28688:=
28685:i
28675:1
28668:N
28659:+
28653:N
28628:N
28614:)
28611:y
28608:,
28605:x
28602:(
28572:)
28566:;
28561:i
28557:x
28548:i
28544:y
28540:(
28526:N
28521:1
28518:=
28515:i
28505:1
28498:N
28494:=
28491:)
28488:x
28485:;
28482:y
28473:(
28470:L
28459:1
28452:N
28419:)
28414:i
28410:y
28403:1
28400:(
28396:)
28392:)
28387:i
28383:x
28379:(
28370:h
28363:1
28360:(
28353:i
28349:y
28344:)
28338:i
28334:x
28330:(
28321:h
28315:i
28307:=
28297:)
28291:;
28286:i
28282:x
28273:i
28269:y
28265:(
28257:i
28249:=
28239:)
28233:;
28230:X
28224:Y
28221:(
28215:=
28208:)
28205:x
28202:;
28199:y
28190:(
28187:L
28156:.
28151:)
28148:y
28142:1
28139:(
28135:)
28131:)
28128:X
28125:(
28116:h
28109:1
28106:(
28101:y
28097:)
28093:X
28090:(
28081:h
28077:=
28074:)
28068:;
28065:X
28059:y
28056:(
28033:)
28027:;
28024:X
28018:y
28015:(
27992:}
27989:1
27986:,
27983:0
27980:{
27974:Y
27951:)
27948:X
27945:(
27936:h
27929:1
27926:=
27923:)
27917:;
27914:X
27908:0
27905:=
27902:Y
27899:(
27870:)
27864:;
27861:X
27855:1
27852:=
27849:Y
27846:(
27840:=
27832:X
27827:T
27815:e
27811:+
27808:1
27804:1
27799:=
27796:)
27793:X
27790:(
27781:h
27730:Y
27689:0
27674:n
27664:=
27659:n
27630:0
27601:n
27572:0
27539:n
27510:1
27507:+
27504:N
27492:N
27476:k
27473:n
27469:p
27446:n
27417:)
27414:1
27411:+
27408:M
27405:(
27381:n
27344:k
27339:x
27329:u
27318:e
27312:N
27307:0
27304:=
27301:u
27288:k
27283:x
27273:n
27262:e
27256:=
27251:k
27248:n
27244:p
27229:k
27227:Z
27206:k
27198:+
27195:1
27191:e
27187:=
27182:k
27178:Z
27149:k
27145:Z
27140:/
27132:k
27127:x
27117:n
27106:e
27102:=
27097:k
27094:n
27090:p
27064:k
27061:n
27057:p
27046:k
27042:n
27023:k
27018:x
27008:n
26998:=
26993:k
26990:m
26986:x
26980:m
26977:n
26967:M
26962:0
26959:=
26956:m
26919:k
26907:)
26898:k
26894:m
26890:x
26884:m
26877:n
26868:(
26863:M
26858:0
26855:=
26852:m
26844:+
26841:1
26835:)
26826:k
26818:n
26813:p
26809:(
26797:=
26794:0
26791:=
26779:k
26771:n
26766:p
26755:L
26719:m
26716:r
26713:o
26710:n
26704:L
26698:+
26693:t
26690:i
26687:f
26681:L
26675:+
26670:t
26667:n
26664:e
26658:L
26652:=
26647:L
26631:k
26629:α
26611:)
26605:k
26602:n
26598:p
26592:N
26587:1
26584:=
26581:n
26570:1
26566:(
26560:k
26550:K
26545:1
26542:=
26539:k
26531:=
26526:m
26523:r
26520:o
26517:n
26511:L
26482:1
26479:=
26474:k
26471:n
26467:p
26461:N
26456:0
26453:=
26450:n
26432:K
26408:)
26403:k
26400:m
26396:x
26392:)
26387:k
26383:y
26379:,
26376:n
26373:(
26362:k
26359:m
26355:x
26349:k
26346:n
26342:p
26338:(
26333:K
26328:1
26325:=
26322:k
26312:m
26309:n
26299:M
26294:0
26291:=
26288:m
26278:N
26273:0
26270:=
26267:n
26259:=
26254:t
26251:i
26248:f
26242:L
26226:N
26222:M
26215:p
26207:p
26190:)
26185:k
26182:m
26178:x
26174:)
26169:k
26165:y
26161:,
26158:n
26155:(
26144:k
26141:m
26137:x
26131:k
26128:n
26124:p
26120:(
26115:K
26110:1
26107:=
26104:k
26096:=
26088:m
26085:n
26040:)
26035:k
26032:n
26028:p
26024:(
26015:)
26010:k
26006:y
26002:,
25999:n
25996:(
25988:N
25983:0
25980:=
25977:n
25967:K
25962:1
25959:=
25956:k
25948:=
25919:)
25914:k
25911:n
25907:p
25903:(
25892:k
25889:n
25885:p
25879:N
25874:0
25871:=
25868:n
25858:K
25853:1
25850:=
25847:k
25836:=
25831:t
25828:n
25825:e
25819:L
25792:p
25785:n
25781:k
25767:)
25762:k
25757:x
25752:(
25747:n
25743:p
25739:=
25734:k
25731:n
25727:p
25706:n
25703:=
25700:y
25690:x
25686:)
25684:x
25682:(
25680:n
25678:p
25671:y
25655:1
25652:+
25649:N
25627:}
25622:k
25619:M
25615:x
25611:,
25605:,
25600:k
25597:1
25593:x
25589:,
25584:k
25581:0
25577:x
25573:{
25570:=
25565:k
25560:x
25536:)
25533:1
25530:+
25527:M
25524:(
25509:x
25491:k
25487:y
25462:k
25459:m
25455:x
25434:}
25431:K
25428:,
25422:,
25419:2
25416:,
25413:1
25410:{
25407:=
25404:k
25394:K
25380:1
25377:=
25372:0
25368:x
25343:m
25339:x
25314:1
25311:+
25308:M
25226:β
25222:x
25218:Y
25203:x
25198:1
25190:+
25185:0
25177:=
25174:)
25171:Y
25168:(
25160:E
25126:.
25122:1
25116:p
25110:0
25098:p
25092:1
25088:p
25077:=
25074:p
24940:1
24904:1
24889:+
24884:0
24867:=
24857:0
24818:0
24791:0
24764:j
24713:2
24706:j
24697:E
24693:S
24687:2
24682:j
24672:=
24667:j
24663:W
24649:t
24620:t
24588:2
24560:T
24557:R
24548:R
24542:N
24539:R
24530:R
24524:L
24521:R
24509:R
24491:F
24470:.
24451:2
24445:=
24432:)
24417:(
24412:)
24397:(
24385:2
24379:=
24368:)
24328:(
24324:2
24318:=
24305:D
24292:D
24257:.
24238:2
24232:=
24219:D
24195:2
24189:=
24176:D
24141:,
24136:2
24131:p
24125:s
24098:D
24093:D
24076:.
24057:2
24051:=
24048:D
24014:R
23980:D
23974:1
23960:x
23955:k
23953:y
23948:k
23946:y
23913:=
23910:)
23873:(
23870:2
23867:=
23864:D
23835:=
23794:=
23761:k
23759:x
23755:K
23753:(
23737:)
23700:(
23697:2
23694:=
23690:)
23683:2
23668:L
23659:2
23649:L
23640:(
23630:=
23627:D
23552:k
23550:y
23545:k
23543:y
23524:y
23498:)
23487:y
23479:1
23471:y
23465:(
23455:=
23450:0
23420:0
23392:)
23389:)
23381:y
23373:1
23370:(
23361:)
23353:y
23345:1
23342:(
23339:+
23336:)
23328:y
23323:(
23309:y
23303:(
23300:K
23297:=
23265:L
23246:y
23241:=
23232:p
23207:)
23203:)
23194:p
23187:1
23184:(
23175:)
23170:k
23166:y
23159:1
23156:(
23153:+
23150:)
23141:p
23137:(
23126:k
23122:y
23117:(
23111:K
23106:1
23103:=
23100:k
23092:=
23054:0
23046:=
23037:t
22999:t
22991:e
22987:+
22984:1
22980:1
22975:=
22972:)
22969:x
22966:(
22957:p
22933:1
22930:=
22927:y
22903:)
22899:)
22896:)
22891:k
22887:x
22883:(
22880:p
22874:1
22871:(
22862:)
22857:k
22853:y
22846:1
22843:(
22840:+
22837:)
22834:)
22829:k
22825:x
22821:(
22818:p
22815:(
22804:k
22800:y
22795:(
22789:K
22784:1
22781:=
22778:k
22770:=
22741:x
22736:1
22728:+
22723:0
22715:=
22712:t
22689:1
22686:=
22683:y
22661:)
22658:x
22655:(
22652:p
22622:t
22615:e
22611:+
22608:1
22604:1
22599:=
22596:)
22593:x
22590:(
22587:p
22574:K
22526:2
22511:L
22507:ℓ
22503:L
22496:x
22491:k
22489:y
22484:k
22482:y
22464:1
22461:=
22458:1
22452:2
22437:k
22435:y
22430:k
22428:x
22423:k
22421:y
22413:k
22411:y
22392:2
22388:)
22382:k
22378:y
22366:y
22361:(
22356:K
22351:1
22348:=
22345:k
22337:=
22332:2
22288:2
22267:k
22265:y
22246:y
22219:y
22214:=
22209:0
22205:b
22162:2
22130:2
22115:0
22112:b
22095:.
22090:2
22086:)
22080:k
22076:y
22067:0
22063:b
22059:(
22054:K
22049:1
22046:=
22043:k
22035:=
22030:2
22012:0
22009:b
22005:y
22001:k
21997:x
21975:2
21938:.
21933:2
21929:)
21923:k
21919:y
21910:k
21906:x
21900:1
21896:b
21892:+
21887:0
21883:b
21879:(
21874:K
21869:1
21866:=
21863:k
21855:=
21850:2
21832:b
21818:x
21813:1
21809:b
21805:+
21800:0
21796:b
21792:=
21789:y
21778:k
21776:y
21771:k
21769:x
21765:K
21710:p
21706:/
21702:k
21679:p
21655:k
21552:)
21549:x
21543:8
21534:(
21511:)
21508:x
21505:(
21452:T
21448:]
21441:,
21438:)
21435:2
21432:(
21429:y
21426:,
21423:)
21420:1
21417:(
21414:y
21411:[
21408:=
21405:)
21402:i
21399:(
21395:y
21369:]
21341:)
21338:2
21335:(
21330:2
21326:x
21320:)
21317:2
21314:(
21309:1
21305:x
21299:1
21287:)
21284:1
21281:(
21276:2
21272:x
21266:)
21263:1
21260:(
21255:1
21251:x
21245:1
21239:[
21234:=
21230:X
21206:]
21200:,
21197:)
21194:2
21191:(
21185:,
21182:)
21179:1
21176:(
21170:[
21167:=
21142:)
21139:)
21136:)
21133:i
21130:(
21121:1
21118:(
21115:)
21112:i
21109:(
21103:(
21094:=
21090:S
21065:)
21059:k
21045:y
21041:+
21036:k
21031:w
21025:X
21019:k
21014:S
21008:(
21002:T
20997:X
20990:1
20982:)
20977:X
20971:k
20966:S
20959:T
20954:X
20948:(
20943:=
20938:1
20935:+
20932:k
20927:w
20901:w
20875:)
20872:i
20869:(
20865:x
20859:T
20854:w
20845:e
20841:+
20838:1
20834:1
20829:=
20826:)
20823:i
20820:(
20795:T
20791:]
20784:,
20781:)
20778:i
20775:(
20770:2
20766:x
20762:,
20759:)
20756:i
20753:(
20748:1
20744:x
20740:,
20737:1
20734:[
20731:=
20728:)
20725:i
20722:(
20718:x
20697:]
20691:,
20686:2
20678:,
20673:1
20665:,
20660:0
20652:[
20649:=
20644:T
20639:w
20601:1
20598:=
20595:y
20575:0
20572:=
20569:y
20476:.
20470:y
20462:i
20458:n
20452:)
20441:i
20436:X
20419:e
20415:+
20412:1
20408:1
20400:1
20396:(
20389:y
20384:)
20374:i
20369:X
20352:e
20348:+
20345:1
20341:1
20336:(
20328:)
20323:y
20318:i
20314:n
20308:(
20302:=
20297:y
20289:i
20285:n
20280:)
20274:i
20270:p
20263:1
20260:(
20255:y
20250:i
20246:p
20239:)
20234:y
20229:i
20225:n
20219:(
20213:=
20210:)
20205:i
20200:X
20192:y
20189:=
20184:i
20180:Y
20176:(
20147:,
20141:i
20136:X
20123:=
20119:)
20111:i
20107:p
20100:1
20094:i
20090:p
20084:(
20074:=
20071:)
20066:i
20062:p
20058:(
20049:=
20045:)
20040:]
20034:i
20029:X
20022:|
20013:i
20009:n
20003:i
19999:Y
19988:[
19979:E
19973:(
19940:,
19935:]
19929:i
19924:X
19917:|
19908:i
19904:n
19898:i
19894:Y
19883:[
19874:E
19869:=
19864:i
19860:p
19838:i
19834:n
19829:i
19825:p
19808:n
19805:,
19799:,
19796:1
19793:=
19790:i
19782:,
19779:)
19774:i
19770:p
19766:,
19761:i
19757:n
19753:(
19738:i
19734:Y
19715:i
19711:Y
19703:i
19699:n
19695:i
19672:.
19666:X
19662:d
19656:f
19652:d
19645:)
19642:y
19636:1
19633:(
19630:y
19627:=
19621:X
19617:d
19611:y
19607:d
19586:X
19578:X
19576:(
19574:f
19552:)
19549:X
19546:(
19543:f
19536:e
19532:+
19529:1
19525:1
19520:=
19517:y
19503:k
19499:x
19495:1
19492:x
19488:X
19483:i
19481:p
19451:.
19443:)
19438:i
19435:,
19432:k
19428:x
19422:k
19414:+
19408:+
19403:i
19400:,
19397:1
19393:x
19387:1
19379:+
19374:0
19366:(
19359:e
19355:+
19352:1
19348:1
19343:=
19338:i
19334:p
19264:0
19249:1
19239:=
19209:i
19205:p
19201:=
19191:i
19186:X
19176:1
19162:e
19158:+
19155:1
19151:1
19146:=
19136:i
19131:X
19121:1
19110:e
19106:+
19103:1
19095:i
19090:X
19080:1
19069:e
19063:=
19060:)
19057:1
19054:=
19049:i
19045:Y
19041:(
19012:1
19009:=
19002:i
18997:X
18988:0
18983:e
18979:=
18972:i
18967:X
18957:0
18946:e
18922:.
18918:0
18914:=
18909:0
18875:.
18865:i
18860:X
18850:1
18839:e
18835:+
18828:i
18823:X
18813:0
18802:e
18793:i
18788:X
18778:1
18767:e
18761:=
18748:)
18741:i
18736:X
18726:1
18715:e
18711:+
18704:i
18699:X
18689:0
18678:e
18674:(
18667:i
18662:X
18653:C
18648:e
18638:i
18633:X
18623:1
18612:e
18604:i
18599:X
18590:C
18585:e
18578:=
18561:i
18556:X
18547:C
18542:e
18534:i
18529:X
18519:1
18508:e
18504:+
18497:i
18492:X
18483:C
18478:e
18470:i
18465:X
18455:0
18444:e
18434:i
18429:X
18420:C
18415:e
18407:i
18402:X
18392:1
18381:e
18374:=
18357:i
18352:X
18344:)
18340:C
18336:+
18331:1
18321:(
18317:e
18313:+
18306:i
18301:X
18293:)
18289:C
18285:+
18280:0
18270:(
18266:e
18257:i
18252:X
18244:)
18240:C
18236:+
18231:1
18221:(
18217:e
18211:=
18204:)
18201:1
18198:=
18193:i
18189:Y
18185:(
18165:1
18161:β
18156:0
18152:β
18133:1
18130:=
18127:)
18124:1
18121:=
18116:i
18112:Y
18108:(
18102:+
18099:)
18096:0
18093:=
18088:i
18084:Y
18080:(
18057:)
18054:1
18051:=
18046:i
18042:Y
18038:(
18015:)
18012:0
18009:=
18004:i
18000:Y
17996:(
17970:.
17967:)
17961:,
17956:i
17951:X
17941:1
17931:,
17926:i
17921:X
17911:0
17901:,
17898:c
17895:(
17886:=
17883:)
17880:c
17877:=
17872:i
17868:Y
17864:(
17820:i
17815:X
17805:h
17794:e
17788:h
17775:i
17770:X
17760:c
17749:e
17743:=
17740:)
17737:c
17734:=
17729:i
17725:Y
17721:(
17688:.
17678:i
17673:X
17663:1
17652:e
17648:+
17641:i
17636:X
17626:0
17615:e
17606:i
17601:X
17591:1
17580:e
17574:=
17567:)
17564:1
17561:=
17556:i
17552:Y
17548:(
17531:i
17526:X
17516:1
17505:e
17501:+
17494:i
17489:X
17479:0
17468:e
17459:i
17454:X
17444:0
17433:e
17427:=
17420:)
17417:0
17414:=
17409:i
17405:Y
17401:(
17364:i
17359:X
17349:1
17338:e
17334:+
17327:i
17322:X
17312:0
17301:e
17297:=
17294:Z
17277:Z
17273:Z
17264:i
17260:Y
17256:Z
17231:i
17226:X
17216:1
17205:e
17199:Z
17196:1
17191:=
17184:)
17181:1
17178:=
17173:i
17169:Y
17165:(
17151:i
17146:X
17136:0
17125:e
17119:Z
17116:1
17111:=
17104:)
17101:0
17098:=
17093:i
17089:Y
17085:(
17051:Z
17011:Z
16997:i
16992:X
16982:1
16972:=
16965:)
16962:1
16959:=
16954:i
16950:Y
16946:(
16930:Z
16916:i
16911:X
16901:0
16891:=
16884:)
16881:0
16878:=
16873:i
16869:Y
16865:(
16838:i
16834:p
16698:)
16692:(
16687:)
16683:(
16661:.
16613:i
16609:p
16601:=
16594:)
16589:i
16584:X
16571:(
16563:1
16548:=
16540:)
16535:i
16530:X
16511:(
16499:=
16484:)
16479:i
16474:X
16452:(
16440:=
16415:)
16412:0
16403:+
16398:i
16393:X
16380:(
16368:=
16345:)
16342:0
16333:+
16328:i
16323:X
16315:)
16310:0
16295:1
16285:(
16282:(
16270:=
16262:)
16259:0
16253:)
16248:0
16235:1
16227:(
16224:+
16219:i
16214:X
16206:)
16201:0
16186:1
16176:(
16173:(
16161:=
16152:)
16148:0
16142:)
16137:0
16124:1
16116:(
16113:+
16110:)
16105:i
16100:X
16090:0
16075:i
16070:X
16060:1
16050:(
16046:(
16033:=
16024:)
16020:0
16013:)
16007:0
15999:+
15994:i
15989:X
15979:0
15968:(
15959:1
15951:+
15946:i
15941:X
15931:1
15920:(
15907:=
15898:)
15892:i
15887:X
15879:0
15868:0
15863:i
15859:Y
15847:1
15842:i
15838:Y
15833:(
15820:=
15811:)
15805:i
15800:X
15787:0
15782:i
15778:Y
15766:1
15761:i
15757:Y
15752:(
15739:=
15736:)
15731:i
15726:X
15718:1
15715:=
15710:i
15706:Y
15702:(
15672:.
15669:)
15666:1
15663:,
15660:0
15657:(
15643:0
15630:1
15622:=
15590:0
15577:1
15569:=
15543:0
15528:1
15518:=
15439:0
15432:,
15424:0
15419:i
15415:Y
15403:1
15398:i
15394:Y
15383:1
15377:{
15372:=
15367:i
15363:Y
15332:x
15325:e
15317:e
15311:x
15304:e
15300:=
15297:)
15294:x
15291:=
15286:1
15278:(
15272:=
15269:)
15266:x
15263:=
15258:0
15250:(
15230:1
15206:)
15203:1
15200:,
15197:0
15194:(
15186:1
15169:1
15157:)
15154:1
15151:,
15148:0
15145:(
15137:1
15120:0
15078:1
15070:+
15065:i
15060:X
15050:1
15040:=
15028:1
15023:i
15019:Y
15008:0
15000:+
14995:i
14990:X
14980:0
14970:=
14958:0
14953:i
14949:Y
14875:i
14871:p
14867:=
14856:)
14851:i
14846:X
14833:(
14825:1
14814:=
14796:)
14791:i
14786:X
14768:i
14760:(
14754:=
14744:)
14739:i
14734:X
14713:i
14705:(
14699:=
14689:)
14686:0
14678:i
14670:+
14665:i
14660:X
14647:(
14641:=
14631:)
14626:i
14621:X
14613:0
14600:i
14596:Y
14592:(
14586:=
14579:)
14574:i
14569:X
14561:1
14558:=
14553:i
14549:Y
14545:(
14512:)
14509:x
14506:(
14498:1
14487:=
14484:)
14481:x
14473:i
14465:(
14418:i
14414:Y
14410:s
14405:i
14401:Y
14397:s
14393:s
14388:i
14384:Y
14380:μ
14376:μ
14344:0
14337:,
14332:i
14327:X
14308:i
14288:0
14275:i
14271:Y
14260:1
14254:{
14249:=
14244:i
14240:Y
14225:i
14221:Y
14192:)
14189:1
14186:,
14183:0
14180:(
14166:i
14132:i
14124:+
14119:i
14114:X
14101:=
14091:i
14087:Y
14068:i
14064:Y
14057:i
14013:i
14008:X
13994:e
13990:+
13987:1
13981:y
13973:i
13968:X
13954:e
13948:=
13943:y
13937:1
13932:)
13921:i
13916:X
13902:e
13898:+
13895:1
13887:i
13882:X
13868:e
13859:1
13855:(
13848:y
13843:)
13833:i
13828:X
13814:e
13810:+
13807:1
13799:i
13794:X
13780:e
13774:(
13769:=
13764:y
13758:1
13754:)
13748:i
13744:p
13737:1
13734:(
13729:y
13722:i
13718:p
13712:=
13709:)
13704:i
13699:X
13691:y
13688:=
13683:i
13679:Y
13675:(
13631:i
13626:X
13609:e
13605:+
13602:1
13598:1
13593:=
13590:)
13585:i
13580:X
13567:(
13559:1
13548:=
13543:i
13539:p
13535:=
13532:]
13527:i
13522:X
13512:i
13508:Y
13504:[
13496:E
13458:e
13447:j
13438:j
13434:β
13390:i
13386:p
13369:)
13363:+
13360:,
13351:(
13314:i
13309:X
13296:=
13292:)
13284:i
13280:p
13273:1
13267:i
13263:p
13257:(
13247:=
13244:)
13239:i
13235:p
13231:(
13222:=
13219:)
13216:]
13211:i
13206:X
13196:i
13192:Y
13188:[
13180:E
13175:(
13141:i
13138:,
13135:m
13131:x
13125:m
13117:+
13111:+
13106:i
13103:,
13100:1
13096:x
13090:1
13082:+
13077:0
13069:=
13065:)
13057:i
13053:p
13046:1
13040:i
13036:p
13030:(
13020:=
13017:)
13012:i
13008:p
13004:(
12995:=
12992:)
12989:]
12984:i
12981:,
12978:m
12974:x
12970:,
12964:,
12959:i
12956:,
12953:1
12949:x
12940:i
12936:Y
12932:[
12924:E
12919:(
12872:k
12868:m
12852:k
12849:m
12845:x
12822:n
12805:m
12789:m
12786:n
12756:k
12753:m
12749:x
12745:)
12740:k
12735:x
12730:(
12725:n
12721:p
12715:K
12710:1
12707:=
12704:k
12691:k
12688:m
12684:x
12680:)
12675:k
12671:y
12667:,
12664:n
12661:(
12653:K
12648:1
12645:=
12642:k
12634:=
12631:0
12628:=
12620:m
12617:n
12585:n
12580:k
12574:n
12569:k
12567:y
12561:k
12559:y
12541:)
12536:k
12532:y
12528:,
12525:n
12522:(
12496:)
12493:)
12488:k
12483:x
12478:(
12473:n
12469:p
12465:(
12455:)
12450:k
12446:y
12442:,
12439:n
12436:(
12428:N
12423:0
12420:=
12417:n
12407:K
12402:1
12399:=
12396:k
12388:=
12360:k
12356:y
12333:k
12328:x
12316:k
12312:k
12308:K
12291:)
12287:x
12283:(
12278:1
12274:p
12267:1
12264:=
12261:)
12257:x
12253:(
12248:0
12244:p
12223:)
12219:x
12215:(
12210:1
12206:p
12202:=
12199:)
12195:x
12191:(
12188:p
12168:1
12165:=
12162:N
12138:x
12129:n
12119:=
12115:)
12109:)
12105:x
12101:(
12096:0
12092:p
12086:)
12082:x
12078:(
12073:n
12069:p
12062:(
12052:=
12047:n
12043:t
12028:n
12026:t
12022:n
12008:)
12004:x
12000:(
11995:0
11991:p
11980:n
11966:)
11962:x
11958:(
11953:n
11949:p
11926:n
11899:)
11895:x
11891:(
11886:0
11882:p
11852:x
11843:u
11832:e
11826:N
11821:1
11818:=
11815:u
11807:+
11804:1
11800:1
11795:=
11792:)
11788:x
11784:(
11779:n
11775:p
11769:N
11764:1
11761:=
11758:n
11747:1
11744:=
11741:)
11737:x
11733:(
11728:0
11724:p
11702:N
11699:,
11693:,
11690:2
11687:,
11684:1
11681:=
11678:n
11652:x
11643:u
11632:e
11626:N
11621:1
11618:=
11615:u
11607:+
11604:1
11597:x
11588:n
11577:e
11571:=
11568:)
11564:x
11560:(
11555:n
11551:p
11537:e
11533:x
11525:y
11521:n
11505:1
11502:+
11499:N
11475:1
11472:+
11469:N
11456:0
11454:x
11438:1
11435:+
11432:M
11407:)
11403:x
11399:(
11396:p
11390:1
11370:)
11366:x
11362:(
11359:p
11323:1
11320:=
11317:y
11291:1
11287:x
11264:2
11260:x
11239:.
11234:2
11209:1
11206:=
11203:y
11181:2
11177:x
11156:2
11134:2
11130:x
11109:2
11106:=
11101:2
11074:1
11071:=
11068:y
11042:1
11017:1
11014:=
11011:y
10989:1
10985:x
10964:1
10942:1
10938:x
10917:1
10914:=
10909:1
10879:/
10875:1
10872:=
10869:)
10866:1
10863:+
10857:(
10853:/
10849:1
10829:0
10826:=
10821:2
10817:x
10813:=
10808:1
10804:x
10783:1
10780:=
10777:y
10755:3
10727:1
10724:=
10721:y
10701:0
10698:=
10693:2
10689:x
10685:=
10680:1
10676:x
10655:0
10652:=
10647:2
10643:x
10639:=
10634:1
10630:x
10609:1
10606:=
10603:y
10591:y
10576:3
10570:=
10565:0
10537:1
10534:=
10531:y
10521:p
10515:,
10498:)
10493:2
10489:x
10483:2
10475:+
10470:1
10466:x
10460:1
10452:+
10447:0
10439:(
10432:b
10428:+
10425:1
10421:1
10416:=
10406:2
10402:x
10396:2
10388:+
10383:1
10379:x
10373:1
10365:+
10360:0
10351:b
10347:+
10344:1
10336:2
10332:x
10326:2
10318:+
10313:1
10309:x
10303:1
10295:+
10290:0
10281:b
10275:=
10267:x
10255:b
10251:+
10248:1
10241:x
10228:b
10222:=
10219:p
10196:2
10192:x
10188:2
10185:+
10180:1
10176:x
10172:+
10169:3
10163:=
10157:p
10151:1
10147:p
10129:=
10126:t
10103:2
10100:=
10095:2
10070:1
10067:=
10062:1
10037:3
10031:=
10026:0
9998:=
9995:b
9975:2
9972:=
9969:M
9951:m
9949:x
9942:x
9923:k
9920:m
9916:x
9912:)
9907:k
9902:x
9897:(
9894:p
9889:K
9884:1
9881:=
9878:k
9865:k
9862:m
9858:x
9852:k
9848:y
9842:K
9837:1
9834:=
9831:k
9823:=
9820:0
9817:=
9809:m
9777:β
9773:β
9756:)
9753:)
9747:k
9743:x
9738:(
9735:p
9729:1
9726:(
9718:b
9710:)
9705:k
9701:y
9694:1
9691:(
9686:K
9681:1
9678:=
9675:k
9667:+
9664:)
9661:)
9655:k
9651:x
9646:(
9643:p
9640:(
9632:b
9622:k
9618:y
9612:K
9607:1
9604:=
9601:k
9593:=
9567:1
9564:=
9561:M
9539:k
9535:y
9524:k
9508:k
9503:x
9491:K
9477:1
9474:=
9471:y
9451:)
9447:x
9443:(
9440:p
9419:x
9398:1
9395:=
9392:y
9372:1
9369:=
9366:y
9344:m
9319:b
9293:b
9289:S
9276:,
9264:)
9261:t
9258:(
9253:b
9249:S
9245:=
9236:x
9220:b
9216:+
9213:1
9209:1
9204:=
9195:x
9182:b
9178:+
9175:1
9168:x
9155:b
9149:=
9146:)
9142:x
9138:(
9135:p
9112:1
9109:=
9106:y
9096:p
9079:x
9068:=
9063:m
9059:x
9053:m
9043:M
9038:0
9035:=
9032:m
9024:=
9021:t
9007:0
9005:x
8988:}
8983:M
8975:,
8969:,
8964:2
8956:,
8951:1
8943:,
8938:0
8930:{
8927:=
8901:}
8896:M
8892:x
8888:,
8882:,
8877:2
8873:x
8869:,
8864:1
8860:x
8856:,
8851:0
8847:x
8843:{
8840:=
8836:x
8810:)
8807:1
8804:+
8801:M
8798:(
8786:β
8778:e
8763:b
8753:e
8746:b
8730:i
8715:t
8696:M
8692:x
8686:M
8678:+
8672:+
8667:2
8663:x
8657:2
8649:+
8644:1
8640:x
8634:1
8626:+
8621:0
8613:=
8607:p
8601:1
8597:p
8587:b
8579:=
8576:t
8563:M
8549:1
8546:=
8543:y
8525:y
8520:M
8518:x
8513:2
8511:x
8506:1
8504:x
8500:M
8480:,
8477:2
8474:,
8471:1
8468:,
8465:0
8462:=
8459:y
8447:2
8443:1
8441:x
8403:,
8398:i
8393:X
8380:=
8377:)
8374:i
8371:(
8368:f
8351:m
8345:i
8343:X
8333:x
8328:i
8323:x
8318:i
8313:x
8308:.
8306:0
8303:β
8294:i
8289:x
8285:i
8278:m
8273:β
8267:m
8263:β
8259:1
8256:β
8252:0
8249:β
8222:m
8214:,
8208:,
8203:0
8175:,
8170:i
8167:,
8164:m
8160:x
8154:m
8146:+
8140:+
8135:i
8132:,
8129:1
8125:x
8119:1
8111:+
8106:0
8098:=
8095:)
8092:i
8089:(
8086:f
8073:i
8059:)
8056:i
8053:(
8050:f
8027:i
8023:p
8005:i
8001:p
7996:i
7992:p
7987:i
7983:Y
7967:i
7963:p
7958:i
7954:p
7949:i
7945:p
7940:i
7936:Y
7924:i
7920:p
7916:i
7911:i
7907:p
7898:i
7894:Y
7861:)
7858:y
7852:1
7849:(
7845:)
7839:i
7835:p
7828:1
7825:(
7820:y
7815:i
7811:p
7807:=
7800:)
7795:i
7792:,
7789:m
7785:x
7781:,
7775:,
7770:i
7767:,
7764:1
7760:x
7753:y
7750:=
7745:i
7741:Y
7737:(
7720:0
7717:=
7714:y
7702:i
7698:p
7691:1
7684:1
7681:=
7678:y
7666:i
7662:p
7655:{
7650:=
7643:)
7638:i
7635:,
7632:m
7628:x
7624:,
7618:,
7613:i
7610:,
7607:1
7603:x
7596:y
7593:=
7588:i
7584:Y
7580:(
7568:i
7564:p
7560:=
7553:]
7548:i
7545:,
7542:m
7538:x
7534:,
7528:,
7523:i
7520:,
7517:1
7513:x
7504:i
7500:Y
7496:[
7488:E
7479:)
7474:i
7470:p
7466:(
7445:i
7442:,
7439:m
7435:x
7431:,
7425:,
7420:i
7417:,
7414:1
7410:x
7401:i
7397:Y
7376:i
7372:p
7363:i
7359:Y
7299:x
7294:i
7289:x
7284:i
7280:Y
7268:i
7264:Y
7247:x
7242:i
7237:x
7233:m
7229:i
7225:N
7203:e
7200:=
7197:b
7169:)
7164:m
7160:x
7154:m
7146:+
7140:+
7135:2
7131:x
7125:2
7117:+
7112:1
7108:x
7102:1
7094:+
7089:0
7081:(
7074:b
7070:+
7067:1
7063:1
7058:=
7055:p
7027:m
7023:x
7017:m
7009:+
7003:+
6998:2
6994:x
6988:2
6980:+
6975:1
6971:x
6965:1
6957:+
6952:0
6944:=
6938:p
6932:1
6928:p
6894:m
6891:,
6885:,
6882:2
6879:,
6876:1
6873:,
6870:0
6867:=
6864:i
6842:i
6827:m
6807:i
6803:x
6797:i
6787:m
6782:1
6779:=
6776:i
6768:+
6763:0
6755:=
6750:m
6746:x
6740:m
6732:+
6726:+
6721:2
6717:x
6711:2
6703:+
6698:1
6694:x
6688:1
6680:+
6675:0
6650:x
6645:1
6637:+
6632:0
6605:d
6601:c
6597:b
6593:a
6576:c
6573:b
6568:d
6565:a
6535:1
6526:e
6503:1
6471:1
6462:e
6458:=
6451:x
6446:1
6438:+
6433:0
6424:e
6418:)
6415:1
6412:+
6409:x
6406:(
6401:1
6393:+
6388:0
6379:e
6373:=
6367:)
6361:)
6358:x
6355:(
6352:p
6346:1
6341:)
6338:x
6335:(
6332:p
6326:(
6321:)
6315:)
6312:1
6309:+
6306:x
6303:(
6300:p
6294:1
6289:)
6286:1
6283:+
6280:x
6277:(
6274:p
6268:(
6262:=
6256:)
6253:x
6250:(
6239:)
6236:1
6233:+
6230:x
6227:(
6215:=
6211:R
6208:O
6167:.
6162:x
6157:1
6149:+
6144:0
6135:e
6131:=
6103:x
6075:x
6046:e
6023:x
6018:1
5985:0
5958:)
5955:x
5952:(
5949:p
5928:)
5925:x
5922:(
5919:p
5899:)
5896:x
5893:(
5890:p
5880:.
5838:)
5835:)
5832:x
5829:(
5826:p
5823:(
5820:g
5800:g
5769:.
5764:x
5759:1
5751:+
5746:0
5737:e
5733:=
5727:)
5724:x
5721:(
5718:p
5712:1
5707:)
5704:x
5701:(
5698:p
5669:,
5666:x
5661:1
5653:+
5648:0
5640:=
5636:)
5630:)
5627:x
5624:(
5621:p
5615:1
5610:)
5607:x
5604:(
5601:p
5595:(
5585:=
5582:)
5579:x
5576:(
5573:p
5564:=
5561:)
5558:)
5555:x
5552:(
5549:p
5546:(
5541:1
5530:=
5527:)
5524:)
5521:x
5518:(
5515:p
5512:(
5509:g
5484:1
5473:=
5470:g
5418:X
5393:i
5389:X
5368:)
5365:X
5359:1
5356:=
5351:i
5347:Y
5343:(
5340:P
5318:i
5314:Y
5290:Y
5270:)
5267:x
5264:(
5261:p
5233:)
5230:x
5225:1
5217:+
5212:0
5204:(
5197:e
5193:+
5190:1
5186:1
5181:=
5178:)
5175:t
5172:(
5166:=
5163:)
5160:x
5157:(
5154:p
5131:)
5128:1
5125:,
5122:0
5119:(
5112:R
5108::
5105:p
5082:x
5077:1
5069:+
5064:0
5056:=
5053:t
5030:t
5006:t
4986:x
4963:t
4950:t
4928:t
4921:e
4917:+
4914:1
4910:1
4905:=
4899:1
4896:+
4891:t
4887:e
4880:t
4876:e
4870:=
4867:)
4864:t
4861:(
4835:)
4832:1
4829:,
4826:0
4823:(
4816:R
4812::
4777:t
4748:.
4736:t
4716:)
4713:1
4710:,
4707:0
4704:(
4698:)
4695:t
4692:(
4669:)
4666:t
4663:(
4606:p
4592:p
4575:=
4572:p
4540:1
4537:β
4516:0
4513:β
4504:p
4498:z
4463:5
4449:4
4435:3
4415:2
4412:1
4399:1
4396:0
4351:2
4337:1
4288:=
4274:t
4267:e
4263:+
4260:1
4256:1
4251:=
4248:p
4222:=
4213:4
4210:+
4196:1
4188:4
4185:+
4180:0
4172:=
4169:t
4138:=
4124:t
4117:e
4113:+
4110:1
4106:1
4101:=
4098:p
4069:=
4060:2
4057:+
4043:1
4035:2
4032:+
4027:0
4019:=
4016:t
3993:2
3990:=
3987:x
3960:1
3929:0
3886:1
3877:/
3873:1
3870:=
3867:s
3838:1
3829:/
3823:0
3812:=
3796:s
3792:μ
3767:1
3730:0
3712:L
3708:ℓ
3690:1
3659:0
3625:1
3594:0
3562:k
3558:x
3554:)
3549:k
3545:p
3536:k
3532:y
3528:(
3523:K
3518:1
3515:=
3512:k
3504:=
3496:1
3474:=
3471:0
3448:)
3443:k
3439:p
3430:k
3426:y
3422:(
3417:K
3412:1
3409:=
3406:k
3398:=
3390:0
3368:=
3365:0
3338:1
3307:0
3290:ℓ
3286:ℓ
3268:1
3237:0
3220:ℓ
3191:)
3186:k
3182:p
3175:1
3172:(
3167:0
3164:=
3159:k
3155:y
3151::
3148:k
3137:k
3133:p
3127:1
3124:=
3119:k
3115:y
3111::
3108:k
3100:=
3097:L
3066:)
3062:)
3057:k
3053:p
3046:1
3043:(
3034:)
3029:k
3025:y
3018:1
3015:(
3012:+
3009:)
3004:k
3000:p
2996:(
2985:k
2981:y
2975:(
2969:K
2964:1
2961:=
2958:k
2950:=
2947:)
2942:k
2938:p
2931:1
2928:(
2917:0
2914:=
2909:k
2905:y
2901::
2898:k
2890:+
2887:)
2882:k
2878:p
2874:(
2863:1
2860:=
2855:k
2851:y
2847::
2844:k
2836:=
2764:1
2733:0
2674:)
2669:)
2664:k
2660:y
2653:1
2650:(
2647:,
2642:k
2638:y
2632:(
2608:)
2603:)
2598:k
2594:p
2587:1
2584:(
2581:,
2576:k
2572:p
2566:(
2537:.
2534:)
2529:k
2525:p
2518:1
2515:(
2506:)
2501:k
2497:y
2490:1
2487:(
2479:k
2475:p
2463:k
2459:y
2452:=
2447:k
2414:1
2406:k
2402:p
2395:0
2369:k
2365:y
2342:1
2334:k
2330:p
2309:0
2306:=
2301:k
2297:y
2276:0
2268:k
2264:p
2243:1
2240:=
2235:k
2231:y
2210:0
2207:=
2202:k
2198:y
2177:0
2174:=
2169:k
2165:p
2144:1
2141:=
2136:k
2132:y
2111:1
2108:=
2103:k
2099:p
2070:k
2066:p
2039:k
2035:y
1995:=
1990:k
1986:y
1975:)
1970:k
1966:p
1959:1
1956:(
1940:,
1937:1
1934:=
1929:k
1925:y
1912:k
1908:p
1892:{
1887:=
1882:k
1850:k
1833:k
1817:k
1815:y
1797:1
1766:0
1731:k
1727:p
1720:1
1694:k
1690:y
1663:k
1659:p
1636:)
1631:k
1627:x
1623:(
1620:p
1617:=
1612:k
1608:p
1596:k
1594:y
1589:k
1587:x
1543:1
1534:/
1530:1
1527:=
1524:s
1502:1
1493:/
1487:0
1476:=
1463:x
1459:y
1441:s
1437:/
1433:1
1430:=
1425:1
1400:x
1395:1
1387:+
1382:0
1374:=
1371:y
1361:y
1339:s
1335:/
1325:=
1320:0
1287:)
1284:x
1279:1
1271:+
1266:0
1258:(
1251:e
1247:+
1244:1
1240:1
1235:=
1232:)
1229:x
1226:(
1223:p
1206:s
1192:2
1188:/
1184:1
1181:=
1178:)
1172:(
1169:p
1155:μ
1133:s
1129:/
1125:)
1116:x
1113:(
1106:e
1102:+
1099:1
1095:1
1090:=
1087:)
1084:x
1081:(
1078:p
1056:m
1054:y
1052:,
1049:m
1047:x
1027:y
1019:x
1002:=
999:K
996:=
993:k
973:1
970:=
967:k
957:k
952:k
950:y
945:k
943:x
932:1
929:1
926:1
923:1
920:1
917:0
914:1
911:0
908:1
905:0
902:1
899:0
896:1
893:0
890:0
887:0
884:0
881:0
878:0
872:k
870:y
800:k
798:x
661:e
654:t
647:v
23:.
Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.