
Multilevel model


research with sufficient power, large sample sizes are required in multilevel models. However, the number of individual observations in groups is not as important as the number of groups in a study. In order to detect cross-level interactions, given that the group sizes are not too small, recommendations have been made that at least 20 groups are needed, although many fewer can be used if one is only interested in inference on the fixed effects and the random effects are control, or "nuisance", variables. The issue of statistical power in multilevel models is complicated by the fact that power varies as a function of effect size and intraclass correlations, it differs for fixed effects versus random effects, and it changes depending on the number of groups and the number of individual observations per group.
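Whether a given design has enough groups can be checked directly by simulation. The sketch below is a minimal illustration under assumed effect sizes and variance components (the variable names, the numbers, and the use of statsmodels' MixedLM are illustrative choices, not taken from the text); it estimates how often a group-level effect is detected with few large groups versus many small groups.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

def detection_rate(n_groups, group_size, gamma01=0.3, tau=0.5, sigma=1.0, reps=100):
    # Simulate a two-level data set with a group-level predictor w and a random
    # intercept per group, fit a mixed model, and count significant w effects.
    hits = 0
    for _ in range(reps):
        w = rng.normal(size=n_groups)                  # level-2 predictor (one value per group)
        u0 = rng.normal(scale=tau, size=n_groups)      # random group intercepts
        g = np.repeat(np.arange(n_groups), group_size)
        y = gamma01 * w[g] + u0[g] + rng.normal(scale=sigma, size=g.size)
        data = pd.DataFrame({"y": y, "w": w[g], "g": g})
        fit = smf.mixedlm("y ~ w", data, groups=data["g"]).fit(reml=False)
        hits += fit.pvalues["w"] < 0.05
    return hits / reps

# Detection of the level-2 effect is driven mainly by the number of groups,
# not by the number of observations per group:
print(detection_rate(n_groups=10, group_size=50))
print(detection_rate(n_groups=50, group_size=10))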
example, assign class variables to the individual level). The problem with this approach is that it would violate the assumption of independence, and thus could bias our results. This is known as atomistic fallacy. Another way to analyze the data using traditional statistical approaches is to aggregate individual level variables to higher-order variables and then to conduct an analysis on this higher level. The problem with this approach is that it discards all within-group information (because it takes the average of the individual level variables). As much as 80–90% of the variance could be wasted, and the relationship between aggregated variables is inflated, and thus distorted. This is known as
. However, it would also predict, for example, that a white person might have an average income $7,000 above a black person, and a 65-year-old might have an income $3,000 below a 45-year-old, in both cases regardless of location. A multilevel model, however, would allow for different regression coefficients for each predictor in each location. Essentially, it would assume that people in a given location have correlated incomes generated by a single set of regression coefficients, whereas people in another location have incomes generated by a different set of coefficients. Meanwhile, the coefficients themselves are assumed to be correlated and generated from a single set of
population of group intercepts and slopes. This allows for an analysis in which one can assume that slopes are fixed but intercepts are allowed to vary. However this presents a problem, as individual components are independent but group components are independent between groups, but dependent within groups. This also allows for an analysis in which the slopes are random; however, the correlations of the error terms (disturbances) are dependent on the values of the individual-level variables. Thus, the problem with using a random-coefficients model in order to analyze hierarchical data is that it is still not possible to incorporate higher order variables.
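The population-of-coefficients idea described above can be made concrete with a short simulation; every number below (group counts, means, variances) is an illustrative assumption.

import numpy as np

rng = np.random.default_rng(1)
n_groups, n_per_group = 30, 20

gamma00, gamma10 = 2.0, 0.5                         # population-average intercept and slope
beta0 = rng.normal(gamma00, 0.8, n_groups)          # group intercepts drawn from a population
beta1 = rng.normal(gamma10, 0.3, n_groups)          # group slopes drawn from a population

x = rng.normal(size=(n_groups, n_per_group))             # level-1 predictor
e = rng.normal(scale=1.0, size=(n_groups, n_per_group))  # independent level-1 errors
y = beta0[:, None] + beta1[:, None] * x + e              # y_ij = beta0_j + beta1_j * x_ij + e_ij

# Observations in the same group share beta0_j and beta1_j, so they are correlated
# within groups even though the level-1 errors are independent.
print(y.shape, beta0.var(), beta1.var())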
\begin{aligned}
= {} & \pi\bigl(\{y_{ij}\}_{i=1,j=1}^{N,M_{i}} \,\big|\, \{\theta_{li}\}_{i=1,l=1}^{N,K}, \sigma^{2}\bigr) && \text{Stage 1: Individual-Level Model} \\
\times {} & \pi\bigl(\{\theta_{li}\}_{i=1,l=1}^{N,K} \,\big|\, \{\alpha_{l}\}_{l=1}^{K}, \{\beta_{lb}\}_{l=1,b=1}^{K,P}, \{\omega_{l}\}_{l=1}^{K}\bigr) && \text{Stage 2: Population Model} \\
\times {} & p\bigl(\sigma^{2}, \{\alpha_{l}\}_{l=1}^{K}, \{\beta_{lb}\}_{l=1,b=1}^{K,P}, \{\omega_{l}\}_{l=1}^{K}\bigr) && \text{Stage 3: Prior}
\end{aligned}

\begin{aligned}
& \sigma^{2} \sim \pi(\sigma^{2}), \\
& \alpha_{l} \sim \pi(\alpha_{l}), \\
& (\beta_{l1}, \ldots, \beta_{lb}, \ldots, \beta_{lP}) \sim \pi(\beta_{l1}, \ldots, \beta_{lb}, \ldots, \beta_{lP}), \\
& \omega_{l}^{2} \sim \pi(\omega_{l}^{2}), \\
& l = 1, \ldots, K.
\end{aligned}

The units of analysis are usually individuals (at a lower level) who are nested within contextual/aggregate units (at a higher level). While the lowest level of data in multilevel models is usually an individual, repeated measurements of individuals may also be examined. As such, multilevel models provide an alternative type of analysis for univariate or

, which assesses the difference between models. The likelihood-ratio test can be employed for model building in general, for examining what happens when effects in a model are allowed to vary, and when testing a dummy-coded categorical variable as a single effect. However, the test can only be used when models are
review, defining a problem and specifying the research question and hypothesis. Bayesian-specific workflow comprises three sub-steps: (b)–(i) formalizing prior distributions based on background knowledge and prior elicitation; (b)–(ii) determining the likelihood function based on a nonlinear function
A random intercepts model is a model in which intercepts are allowed to vary, and therefore, the scores on the dependent variable for each individual observation are predicted by the intercept that varies across groups. This model assumes that slopes are fixed (the same across different contexts). In
Cross-level interactions may also be of substantive interest; for example, when a slope is allowed to vary randomly, a level-2 predictor may be included in the slope formula for the level-1 covariate. For example, one may estimate the interaction of race and neighborhood to obtain an estimate of the
Independence is an assumption of general linear models, which states that cases are random samples from the population and that scores on the dependent variable are independent of each other. One of the main purposes of multilevel models is to deal with cases where the assumption of independence is
Multilevel modeling is frequently used in diverse applications and can be formulated within the Bayesian framework. In particular, Bayesian nonlinear mixed-effects models have recently received significant attention. A basic version of the Bayesian nonlinear mixed-effects model is represented as the
Multilevel models have been used in education research or geographical research, to estimate separately the variance between pupils within the same school, and the variance between schools. In psychological applications, the multiple levels are items in an instrument, individuals, and families. In
As a simple example, consider a basic linear regression model that predicts income as a function of age, class, gender and race. It might then be observed that income levels also vary depending on the city and state of residence. A simple way to incorporate this into the regression model would be
The assumption of normality states that the error terms at every level of the model are normally distributed. However, most statistical software allows one to specify different distributions for the variance terms, such as Poisson, binomial, or logistic distributions. The multilevel modelling approach can be used
Another way to analyze hierarchical data would be through a random-coefficients model. This model assumes that each group has a different regression model—with its own intercept and slope. Because groups are sampled, the model assumes that the intercepts and slopes are also randomly sampled from a
Statistical power for multilevel models differs depending on whether it is level 1 or level 2 effects that are being examined. Power for level 1 effects is dependent upon the number of individual observations, whereas the power for level 2 effects is dependent upon the number of groups. To conduct
The assumption of linearity states that there is a rectilinear (straight-line, as opposed to non-linear or U-shaped) relationship between variables. However, the model can be extended to nonlinear relationships. Particularly, when the mean part of the level 1 regression equation is replaced with a
In order to conduct a multilevel model analysis, one would start with fixed coefficients (slopes and intercepts). One aspect would be allowed to vary at a time (that is, would be changed), and compared with the previous model in order to assess better model fit. There are three different questions
can also be computed. When computing a t-test, it is important to keep in mind the degrees of freedom, which will depend on the level of the predictor (e.g., level 1 predictor or level 2 predictor). For a level 1 predictor, the degrees of freedom are based on the number of level 1 predictors, the
Before conducting a multilevel model analysis, a researcher must decide on several aspects, including which predictors are to be included in the analysis, if any. Second, the researcher must decide whether parameter values (i.e., the elements that will be estimated) will be fixed or random. Fixed
The accompanying figure displays the Bayesian research cycle using the Bayesian nonlinear mixed-effects model. A research cycle using this model comprises two steps: (a) the standard research cycle and (b) a Bayesian-specific workflow. The standard research cycle involves literature
At Level 1, both the intercepts and slopes in the groups can be either fixed (meaning that all groups have the same values, although in the real world this would be a rare occurrence), non-randomly varying (meaning that the intercepts and/or slopes are predictable from an independent variable at
to account for the location (i.e. a set of additional binary predictors and associated regression coefficients, one per location). This would have the effect of shifting the mean income up or down—but it would still assume, for example, that the effect of race and gender on income is the same
Multilevel models have two error terms, which are also known as disturbances. The individual components are all independent, but there are also group components, which are independent between groups but correlated within groups. However, variance components can differ, as some groups are more
There are several alternative ways of analyzing hierarchical data, although most of them have some problems. First, traditional statistical techniques can be used. One could disaggregate higher-order variables to the individual level, and thus conduct an analysis on this individual level (for
A random slopes model is a model in which slopes are allowed to vary according to a correlation matrix, and therefore the slopes differ across a grouping variable such as time or individuals. This model assumes that intercepts are fixed (the same across different contexts).
, where scores on the dependent variable are adjusted for covariates (e.g. individual differences) before testing treatment differences. Multilevel models are able to analyze these experiments without the assumption of homogeneity of regression slopes that is required by ANCOVA.

\begin{aligned}
& y_{ij} = f(t_{ij}; \theta_{1i}, \theta_{2i}, \ldots, \theta_{li}, \ldots, \theta_{Ki}) + \epsilon_{ij}, \\
& \epsilon_{ij} \sim N(0, \sigma^{2}), \\
& i = 1, \ldots, N, \quad j = 1, \ldots, M_{i}.
\end{aligned}
A model that includes both random intercepts and random slopes is likely the most realistic type of model, although it is also the most complex. In this model, both intercepts and slopes are allowed to vary across groups, meaning that they are different in different contexts.
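A minimal sketch of fitting the random-intercepts and the random-intercepts-and-slopes variants with statsmodels' MixedLM follows; the simulated data frame and the column names (score, hours, school) are illustrative assumptions, not values from the text.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
school = np.repeat(np.arange(25), 40)
hours = rng.normal(size=school.size)
score = (50 + rng.normal(0, 3, 25)[school]              # school-specific intercepts
         + (2 + rng.normal(0, 0.5, 25)[school]) * hours  # school-specific slopes
         + rng.normal(0, 5, school.size))
df = pd.DataFrame({"school": school, "hours": hours, "score": score})

# Random intercepts only: each school gets its own intercept, a common slope.
ri = smf.mixedlm("score ~ hours", df, groups=df["school"]).fit()

# Random intercepts and slopes: the effect of hours also varies by school.
ris = smf.mixedlm("score ~ hours", df, groups=df["school"], re_formula="~hours").fit()

print(ri.summary())
print(ris.cov_re)   # estimated variances/covariance of the random intercept and slope

The re_formula argument is what turns the random-intercepts fit into the random-intercepts-and-slopes fit.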
parameters are composed of a constant over all the groups, whereas a random parameter has a different value for each of the groups. Additionally, the researcher must decide whether to employ a maximum likelihood estimation or a restricted maximum likelihood estimation type.
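In statsmodels this choice between the two estimators is made at fit time; a minimal sketch with simulated, illustrative data is shown below. REML is the default, and ML is typically used when nested models with different fixed effects will be compared.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
g = np.repeat(np.arange(20), 15)
x = rng.normal(size=g.size)
y = 1.0 + 0.4 * x + rng.normal(0, 0.7, 20)[g] + rng.normal(size=g.size)
data = pd.DataFrame({"y": y, "x": x, "g": g})

model = smf.mixedlm("y ~ x", data, groups=data["g"])
fit_reml = model.fit(reml=True)    # restricted maximum likelihood (default)
fit_ml = model.fit(reml=False)     # maximum likelihood
print(fit_reml.llf, fit_ml.llf)    # the two criteria give different log-likelihood values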
. This assumption is testable but often ignored, rendering the estimator inconsistent. If this assumption is violated, the random effect must be modeled explicitly in the fixed part of the model, either by using dummy variables or by including cluster means of all
The type of statistical test that is employed in multilevel models depends on whether one is examining fixed effects or variance components. When examining fixed effects, the estimate is compared with its standard error, which results in a
everywhere. In reality, this is unlikely to be the case—different local laws, different retirement policies, differences in level of racial prejudice, etc. are likely to cause all of the predictors to have different sorts of effects in different locales.
that vary at more than one level. An example could be a model of student performance that contains measures for individual students as well as measures for classrooms within which the students are grouped. These models can be seen as generalizations of
\begin{aligned}
& \theta_{li} = \alpha_{l} + \sum_{b=1}^{P} \beta_{lb} x_{ib} + \eta_{li}, \\
& \eta_{li} \sim N(0, \omega_{l}^{2}), \\
& i = 1, \ldots, N, \quad l = 1, \ldots, K.
\end{aligned}

, also known as homogeneity of variance, assumes equality of population variances. However, a different variance-correlation matrix can be specified to account for this, and the heterogeneity of variance can itself be modeled.
Multilevel models can be used on data with many levels, although 2-level models are the most common and the rest of this article deals only with these. The dependent variable must be examined at the lowest level of analysis.
. Additional levels are possible: for example, people might be grouped by cities, the city-level regression coefficients grouped by state, and the state-level coefficients generated from a single hyper-hyperparameter.
Different covariables may be relevant on different levels. They can be used for longitudinal studies, as with growth studies, to separate changes within one individual and differences between individuals.
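A small sketch of the longitudinal case follows: repeated measures are simulated within individuals, and person-mean centering is used to separate within-individual change from between-individual differences. All names and sizes are illustrative assumptions.

import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
n_people, n_waves = 100, 5

person = np.repeat(np.arange(n_people), n_waves)
time = np.tile(np.arange(n_waves), n_people)
intercepts = rng.normal(60, 8, n_people)     # stable between-person differences
slopes = rng.normal(1.5, 0.6, n_people)      # person-specific rates of change
y = intercepts[person] + slopes[person] * time + rng.normal(0, 2, person.size)

df = pd.DataFrame({"person": person, "time": time, "y": y})

# Person-mean centering: the person mean captures between-person differences,
# the deviation from it captures within-person change over time.
df["y_between"] = df.groupby("person")["y"].transform("mean")
df["y_within"] = df["y"] - df["y_between"]
print(df.head())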
1263: 1195: 889: 628: 1928:
the level at which it was measured. In this example "test score" might be measured at pupil level, "teacher experience" at class level, "school funding" at school level, and "urban" at district level.
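Variables measured at higher levels are simply repeated across the lower-level rows of their unit, and a cross-level interaction is a product term between predictors from two levels. The sketch below uses simulated data and illustrative names (ses, funding, school), not values from the text.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_schools, n_pupils = 40, 25
school = np.repeat(np.arange(n_schools), n_pupils)
funding = rng.normal(size=n_schools)            # measured once per school (level 2)
ses = rng.normal(size=school.size)              # measured per pupil (level 1)
u0 = rng.normal(0, 1.0, n_schools)              # random school intercepts
score = (50 + 2 * ses + 3 * funding[school] + 0.8 * ses * funding[school]
         + u0[school] + rng.normal(0, 4, school.size))

df = pd.DataFrame({"score": score, "ses": ses,
                   "funding": funding[school], "school": school})

# "ses * funding" expands to ses + funding + ses:funding (the cross-level interaction).
fit = smf.mixedlm("score ~ ses * funding", df, groups=df["school"],
                  re_formula="~ses").fit()
print(fit.params)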
3340: 1777:
violated; multilevel models do, however, assume that 1) the level 1 and level 2 residuals are uncorrelated and 2) The errors (as measured by the residuals) at the highest level are uncorrelated.
4102: 2703: 2410: 2075: 3412: 4086:{\displaystyle \propto \pi (\{y_{ij}\}_{i=1,j=1}^{N,M_{i}},\{\theta _{li}\}_{i=1,l=1}^{N,K},\sigma ^{2},\{\alpha _{l}\}_{l=1}^{K},\{\beta _{lb}\}_{l=1,b=1}^{K,P},\{\omega _{l}\}_{l=1}^{K})} 1688:
that a researcher would ask in assessing a model. First, is it a good model? Second, is a more complex model better? Third, what contribution do individual predictors make to the model?
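One common way to answer the second and third questions is a likelihood-ratio test between nested models fitted by maximum likelihood; the sketch below uses simulated, illustrative data.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(4)
g = np.repeat(np.arange(30), 20)
x = rng.normal(size=g.size)
y = 1.0 + 0.3 * x + rng.normal(0, 0.8, 30)[g] + rng.normal(size=g.size)
data = pd.DataFrame({"y": y, "x": x, "g": g})

# Fit both models by maximum likelihood (reml=False) so their likelihoods are comparable.
m0 = smf.mixedlm("y ~ 1", data, groups=data["g"]).fit(reml=False)  # intercept only
m1 = smf.mixedlm("y ~ x", data, groups=data["g"]).fit(reml=False)  # adds the level-1 predictor

lr = 2 * (m1.llf - m0.llf)          # likelihood-ratio statistic
p_value = stats.chi2.sf(lr, 1)      # one additional parameter
print(lr, p_value)
# For non-nested models, information criteria such as AIC = 2k - 2*log-likelihood
# can be compared instead.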
3791:{\displaystyle \pi (\{\theta _{li}\}_{i=1,l=1}^{N,K},\sigma ^{2},\{\alpha _{l}\}_{l=1}^{K},\{\beta _{lb}\}_{l=1,b=1}^{K,P},\{\omega _{l}\}_{l=1}^{K}|\{y_{ij}\}_{i=1,j=1}^{N,M_{i}})} 1863:
number of groups and the number of individual observations. For a level 2 predictor, the degrees of freedom are based on the number of level 2 predictors and the number of groups.
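A minimal sketch of how the test statistic itself is formed from a fitted model is shown below (the data are simulated and illustrative); choosing the reference degrees of freedom, as described above, is still left to the analyst.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
g = np.repeat(np.arange(40), 10)
x = rng.normal(size=g.size)
y = 0.5 + 0.25 * x + rng.normal(0, 0.6, 40)[g] + rng.normal(size=g.size)
data = pd.DataFrame({"y": y, "x": x, "g": g})

fit = smf.mixedlm("y ~ x", data, groups=data["g"]).fit()
# Wald-type statistic for each fixed effect: estimate divided by its standard error.
print(fit.params / fit.bse)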
1845:
regressors. This assumption is probably the most important assumption the estimator makes, but one that is misunderstood by most applied researchers using these types of models.
1114: 3462: 1992:
research, data from individuals must often be nested within teams or other functional units. They are often used in ecological research as well under the more general term
1572: 1545: 1487: 1458: 757: 725: 3492: 894:
Level 2), or randomly varying (meaning that the intercepts and/or slopes are different in the different groups, and that each have their own overall mean and variance).
960:
cannot be described by a linear relationship, then one can find some nonlinear functional relationship between the response and the predictor, and extend the model to
3239: 3209: 3159: 1843: 1813: 1699:(meaning that a more complex model includes all of the effects of a simpler model). When testing non-nested models, comparisons between models can be made using the 1636: 1604: 1042: 992: 958: 928: 819: 789: 693: 661: 1516: 4763: 3432: 3360: 3279: 3259: 3179: 1062: 1012: 663:
refers to the score on the dependent variable for an individual observation at Level 1 (subscript i refers to individual case, subscript j refers to the group).
502:), although they can also extend to non-linear models. These models became much more popular after sufficient computing power and software became available. 5589:
Bryk, Anthony S.; Raudenbush, Stephen W. (1 January 1988). "Heterogeneity of variance in experimental studies: A challenge to conventional interpretations".
1460:
refers to the overall intercept. This is the grand mean of the scores on the dependent variable across all the groups when all the predictors are equal to 0.
5386:"Fixed effects models versus mixed effects models for clustered data: Reviewing the approaches, disentangling the differences, and making recommendations" 5138:
Lee, Se Yoon; Mallick, Bani (2021). "Bayesian Hierarchical Modeling: Application Towards Production Results in the Eagle Ford Shale of South Texas".
5304: 5544: 5254: 4915: 4853: 505:
Multilevel models are particularly appropriate for research designs where data for participants are organized at more than one level (i.e.,
17: 5488: 1352: 1269: 897:
When there are multiple level 1 independent variables, the model can be expanded by substituting vectors and matrices in the equation.
2053:
Bayesian research cycle using Bayesian nonlinear mixed effects model: (a) standard research cycle and (b) Bayesian-specific workflow.
2013: 1989: 437: 1201: 1133: 827: 542: 347: 5823:"Should I use fixed effects or random effects when I have fewer than five levels of a grouping factor in a mixed-effects model?" 4969:"Should I use fixed effects or random effects when I have fewer than five levels of a grouping factor in a mixed-effects model?" 5797: 5744: 5720: 5701: 5680: 5570: 5528: 5498: 5238: 5182: 5050: 4938: 4899: 4837: 1128:
The dependent variables are the intercepts and the slopes for the independent variables at Level 1 in the groups of Level 2.
4765:; and (b)–(iii) making a posterior inference. The resulting posterior inference can be used to start a new research cycle. 337: 3284: 1971:
and arbitrary relationships among the different variables. Multilevel analysis has been extended to include multilevel
3365: 759:
refers to the slope for the relationship in group j (Level 2) between the Level 1 predictor and the dependent variable.
3503:
A central task in the application of the Bayesian nonlinear mixed-effect models is to evaluate the posterior density:
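A minimal sketch of what evaluating that density involves for the three-stage model is given below, assuming a logistic curve for the nonlinear function f, no population-level covariates, and simple illustrative priors and toy data; a real analysis would hand this log density to an MCMC sampler rather than evaluate it once.

import numpy as np
from scipy import stats

def f(t, theta):                       # assumed nonlinear trajectory: logistic curve
    a, b = theta                       # theta_1i = asymptote, theta_2i = rate
    return a / (1.0 + np.exp(-b * (t - 5.0)))

# Toy data: N = 3 subjects, M_i = 6 time points each (illustrative values).
t = np.arange(6.0)
y = np.array([[0.4, 1.0, 2.2, 3.7, 4.6, 4.9],
              [0.2, 0.7, 1.6, 2.9, 3.8, 4.1],
              [0.5, 1.3, 2.8, 4.4, 5.5, 5.8]])

def log_posterior(params):
    # params: per-subject thetas (3 x 2), population means alpha (2),
    # log residual sd, log population sds omega (2); all layouts are illustrative.
    theta = params[:6].reshape(3, 2)
    alpha = params[6:8]
    sigma = np.exp(params[8])
    omega = np.exp(params[9:11])
    # Stage 1: individual-level model, y_ij ~ N(f(t_ij; theta_i), sigma^2)
    lp = sum(stats.norm.logpdf(y[i], f(t, theta[i]), sigma).sum() for i in range(3))
    # Stage 2: population model, theta_li ~ N(alpha_l, omega_l^2) (no covariates here)
    lp += stats.norm.logpdf(theta, alpha, omega).sum()
    # Stage 3: priors (weakly informative, illustrative choices)
    lp += stats.norm.logpdf(alpha, 0.0, 10.0).sum()
    lp += stats.norm.logpdf(np.log(sigma), 0.0, 1.0)
    lp += stats.norm.logpdf(np.log(omega), 0.0, 1.0).sum()
    return lp

start = np.r_[np.tile([5.0, 1.0], 3), 5.0, 1.0,
              np.log(0.3), np.log(0.5), np.log(0.3)]
print(log_posterior(start))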
1988:
sociological applications, multilevel models are used to examine individuals embedded within regions or countries. In
5775: 4779: 1948:
In other words, a simple linear regression model might, for example, predict that a given randomly sampled person in
1638:
refers to the deviation in group j from the average slope between the dependent variable and the Level 1 predictor.
301: 1691:
In order to assess models, different model fit statistics would be examined. One such statistic is the chi-square
352: 290: 110: 85: 1704: 212: 2028:, and statistically, this type of analysis results in decreased power in addition to the loss of information. 5887: 4794: 1744: 961: 171: 1907:
However, if one were studying multiple schools and multiple school districts, a 4-level model could include
1972: 1727:), but some of the assumptions are modified for the hierarchical nature of the design (i.e., nested data). 1700: 430: 5442:"Bridging Methodological Divides Between Macro- and Microresearch: Endogeneity and Methods for Panel Data" 5311: 1964: 373: 5621:"Bayesian Nonlinear Models for Repeated Measurement Data: An Overview, Implementation, and Applications" 5268:
Goldstein, Harvey (1991). "Nonlinear Multilevel Models, with an Application to Discrete Response Data".
791:
refers to the random errors of prediction for the Level 1 equation (it is also sometimes referred to as
5892: 5335:"On Ignoring the Random Effects Assumption in Multilevel Models: Review, Critique, and Recommendations" 342: 311: 238: 1067: 518: 332: 321: 285: 192: 3437: 4799: 1696: 393: 264: 187: 80: 59: 1550: 1523: 1465: 1436: 5334: 1724: 732: 700: 423: 316: 3467: 1925: 1658: 280: 275: 217: 5670: 5204: 3434:
is a `nonlinear' function and describes the temporal trajectory of individuals. In the model,
1661:, which are helpful in determining whether multilevel models are required in the first place. 1692: 1574:
refer to the effect of the Level 2 predictor on the Level 1 intercept and slope respectively.
510: 368: 64: 3494:
describe within-individual variability and between-individual variability, respectively. If
3214: 3184: 3134: 1818: 1788: 1611: 1579: 1017: 967: 933: 903: 794: 764: 668: 636: 5441: 5089: 4789: 1941: 1938: 1893: 1494: 388: 378: 259: 227: 182: 161: 69: 8: 4784: 3500:
is not considered, then the model reduces to a frequentist nonlinear mixed-effect model.
306: 207: 202: 156: 105: 95: 40: 5093: 5068:"Estimation of COVID-19 spread curves integrating global data and borrowing information" 1719:
Multilevel models have the same assumptions as other major general linear models (e.g.,
5849: 5822: 5632: 5538: 5469: 5422: 5385: 5365: 5285: 5248: 5192: 5155: 5112: 5079: 5067: 4995: 4968: 4909: 4847: 4748: 3417: 3345: 3264: 3244: 3164: 2025: 1976: 1047: 997: 411: 140: 125: 5440:
Bliese, Paul D.; Schepker, Donald J.; Essman, Spenser M.; Ployhart, Robert E. (2020).
5854: 5793: 5785: 5771: 5740: 5716: 5697: 5676: 5566: 5524: 5494: 5473: 5461: 5414: 5406: 5369: 5357: 5234: 5227: 5178: 5159: 5117: 5046: 5000: 4934: 4895: 4833: 1489:
refers to the average slope between the dependent variable and the Level 1 predictor.
1117: 514: 499: 486: 406: 197: 100: 54: 5788:; Tavlas, George S. (2001). "Random Coefficient Models". In Baltagi, Badi H. (ed.). 5426: 5844: 5834: 5642: 5598: 5453: 5398: 5349: 5277: 5147: 5107: 5097: 4990: 4980: 1765: 222: 5765: 5734: 5691: 5102: 1968: 1953: 1743:
non-linear parametric function, then such a model framework is widely called the
1708: 521:
may be examined. Furthermore, multilevel models can be used as an alternative to
383: 90: 5602: 5814: 5151: 4774: 1957: 135: 5767:
Multilevel Analysis: an Introduction to Basic and Advanced Multilevel Modeling
5881: 5666: 5465: 5457: 5410: 5361: 5353: 5281: 1952:
would have an average yearly income $ 10,000 higher than a similar person in
254: 130: 5229:
Using SPSS for Windows and Macintosh : analyzing and understanding data
5043:
Applied multiple regression/correlation analysis for the behavioral sciences
3281:-th subject. Parameters involved in the model are written in Greek letters. 2004:
interaction between an individual's characteristics and the social context.
5858: 5418: 5121: 5004: 495: 120: 2049: 1866: 537:
When there is a single level 1 independent variable, the level 1 model is
5647: 5620: 1993: 506: 166: 115: 5839: 4985: 4830:
Hierarchical linear models : applications and data analysis methods
2007: 727:
refers to the intercept of the dependent variable for individual case i.
5402: 5289: 1772:
Independence of observations (No Autocorrelation of Model's Residuals)
490: 5490:
Econometric Analysis of Cross Section and Panel Data, second edition
5730: 5637: 5558: 5084: 1735: 5756:
Hierarchical Linear Models: Applications and Data Analysis Methods
2018: 5672:
Data Analysis Using Regression and Multilevel/Hierarchical Models
5614: 5612: 1949: 1424:{\displaystyle \beta _{1j}=\gamma _{10}+\gamma _{11}w_{j}+u_{1j}} 1341:{\displaystyle \beta _{0j}=\gamma _{00}+\gamma _{01}w_{j}+u_{0j}} 27:
Statistical models of parameters that vary at more than one level
5872: 1859: 1855: 522: 5609: 1606:
refers to the deviation in group j from the overall intercept.
1892:
The concept of level is the keystone of this approach. In an
1720: 5439: 5302: 4832:(2. ed.,  ed.). Thousand Oaks, CA : Sage Publications. 1258:{\displaystyle u_{1j}\sim {\mathcal {N}}(0,\sigma _{3}^{2})} 1190:{\displaystyle u_{0j}\sim {\mathcal {N}}(0,\sigma _{2}^{2})} 884:{\displaystyle e_{ij}\sim {\mathcal {N}}(0,\sigma _{1}^{2})} 5333:
Antonakis, John; Bastardoz, Nicolas; Rönkkö, Mikko (2021).
5233:(4th ed.). Upper Saddle River, NJ: Pearson Education. 4564: 4312: 4114: 1785:
The regressors must not correlate with the random effects,
623:{\displaystyle Y_{ij}=\beta _{0j}+\beta _{1j}X_{ij}+e_{ij}} 5675:. New York: Cambridge University Press. pp. 235–299. 5588: 5175:
Hierarchical linear modeling : guide and applications
2044: 5332: 5563:
Multilevel analysis : techniques and applications
4894:(5th ed.). Boston; Montreal: Pearson/A & B. 4751: 4100: 3805: 3511: 3470: 3440: 3420: 3368: 3348: 3287: 3267: 3247: 3217: 3187: 3167: 3137: 2701: 2408: 2073: 2008:
Applications to longitudinal (repeated measures) data
1821: 1791: 1673: 1614: 1582: 1553: 1526: 1497: 1468: 1439: 1355: 1272: 1204: 1136: 1070: 1050: 1020: 1000: 970: 936: 906: 830: 797: 767: 735: 703: 671: 639: 545: 3335:{\displaystyle f(t;\theta _{1},\ldots ,\theta _{K})} 1967:, which are general models with multiple levels of 5806: 5384: 5226: 4757: 4734: 4085: 3790: 3486: 3456: 3426: 3406: 3354: 3334: 3273: 3253: 3233: 3203: 3173: 3153: 3121: 2679: 2386: 1837: 1807: 1630: 1598: 1566: 1539: 1510: 1481: 1452: 1423: 1340: 1257: 1189: 1108: 1056: 1036: 1006: 986: 952: 922: 883: 813: 783: 751: 719: 687: 655: 622: 5523:(Repr. ed.). London: Sage Publications Ltd. 5305:"Introduction to Multilevel Modeling Using HLM 6" 3407:{\displaystyle (\theta _{1},\ldots ,\theta _{K})} 1896:example, the levels for a 2-level model might be 5879: 5736:Multilevel Analysis: Techniques and Applications 5066:Lee, Se Yoon; Lei, Bowen; Mallick, Bani (2020). 4890:Fidell, Barbara G. Tabachnick, Linda S. (2007). 4828:Bryk, Stephen W. Raudenbush, Anthony S. (2002). 1682: 1657:addition, this model provides information about 2019:Alternative ways of analyzing hierarchical data 5763: 5753: 5710: 4933:(3. repr. ed.). Thousand Oaks, CA: Sage. 1123: 994:is the cumulative infection trajectory of the 532: 5382: 5065: 1781:Orthogonality of regressors to random effects 1116:for each country may show a shape similar to 431: 5543:: CS1 maint: multiple names: authors list ( 5253:: CS1 maint: multiple names: authors list ( 5177:. Thousand Oaks, Calif.: Sage Publications. 5133: 5131: 4914:: CS1 maint: multiple names: authors list ( 4852:: CS1 maint: multiple names: authors list ( 4695: 4681: 4640: 4623: 4600: 4586: 4490: 4476: 4435: 4418: 4395: 4381: 4338: 4321: 4207: 4190: 4140: 4123: 4060: 4046: 4005: 3988: 3965: 3951: 3897: 3880: 3832: 3815: 3740: 3723: 3698: 3684: 3643: 3626: 3603: 3589: 3535: 3518: 1756:for all forms of Generalized Linear models. 5784: 5565:(Reprint. ed.). Mahwah, NJ : Erlbaum. 4962: 4960: 4958: 4956: 4954: 4952: 4950: 900:When the relationship between the response 5764:Snijders, T. A. B.; Bosker, R. J. (2011). 5665: 5486: 5225:Salkind, Samuel B. Green, Neil J. (2004). 5137: 1651: 438: 424: 5848: 5838: 5809:Linear Mixed Models for Longitudinal Data 5689: 5646: 5636: 5487:Wooldridge, Jeffrey M. (1 October 2010). 5267: 5173:editor, G. David Garson (10 April 2012). 5128: 5111: 5101: 5083: 4994: 4984: 4885: 4883: 3342:is a known function parameterized by the 2648: 2348: 2014:Multilevel Modeling for Repeated Measures 5758:(2nd ed.). Thousand Oaks, CA: Sage. 5036: 5034: 4947: 4881: 4879: 4877: 4875: 4873: 4871: 4869: 4867: 4865: 4863: 2048: 1865: 1734: 5792:. Oxford: Blackwell. pp. 410–429. 5790:A Companion to Theoretical Econometrics 5754:Raudenbush, S. W.; Bryk, A. S. (2002). 5584: 5582: 5224: 5220: 5218: 5216: 5214: 5032: 5030: 5028: 5026: 5024: 5022: 5020: 5018: 5016: 5014: 3161:denotes the continuous response of the 1924:The researcher must establish for each 1064:-th time points, then the ordered pair 14: 5880: 5514: 5512: 5510: 5172: 4889: 4823: 4821: 4819: 4817: 4815: 2045:Bayesian nonlinear mixed-effects model 1664: 5821:Gomes, Dylan G.E. (20 January 2022). 5820: 5807:Verbeke, G.; Molenberghs, G. (2013). 5739:(2nd ed.). New York: Routledge. 5518: 5383:McNeish, Daniel; Kelley, Ken (2019). 5328: 5326: 5324: 5166: 5045:(3. ed.). Mahwah, NJ : Erlbaum. 5040: 4967:Gomes, Dylan G.E. (20 January 2022). 4966: 4860: 5711:Hedeker, D.; Gibbons, R. D. (2012). 
5579: 5211: 5011: 4928: 4827: 1963:Multilevel models are a subclass of 1873: 1848: 5729: 5618: 5557: 5551: 5507: 4922: 4812: 24: 5659: 5321: 5303:ATS Statistical Consulting Group. 1707:(BIC), among others. See further 1674:Random intercepts and slopes model 1642: 1223: 1155: 849: 25: 5904: 5866: 5715:(2nd ed.). New York: Wiley. 5519:Leeuw, Ita Kreft, Jan de (1998). 4780:Mixed-design analysis of variance 1979:, and other more general models. 964:. For example, when the response 1518:refers to the Level 2 predictor. 695:refers to the Level 1 predictor. 405: 5873:Centre for Multilevel Modelling 5760:This concentrates on education. 5696:(4th ed.). London: Wiley. 5521:Introducing multilevel modeling 5480: 5433: 5376: 5342:Organizational Research Methods 5296: 5261: 5041:Cohen, Jacob (3 October 2003). 4268:Stage 1: Individual-Level Model 2063:Stage 1: Individual-Level Model 1882: 1109:{\displaystyle (X_{ij},Y_{ij})} 353:Least-squares spectral analysis 291:Generalized estimating equation 111:Multinomial logistic regression 86:Vector generalized linear model 5770:(2nd ed.). London: Sage. 5059: 4715: 4570: 4510: 4377: 4318: 4258: 4186: 4120: 4080: 3812: 3785: 3719: 3515: 3457:{\displaystyle \epsilon _{ij}} 3401: 3369: 3329: 3291: 3181:-th subject at the time point 3052: 3034: 2974: 2914: 2905: 2845: 2804: 2791: 2736: 2723: 2585: 2561: 2285: 2266: 2192: 2100: 2035: 1714: 1705:Bayesian information criterion 1252: 1228: 1184: 1160: 1103: 1071: 878: 854: 13: 1: 5693:Multilevel Statistical Models 4892:Using multivariate statistics 4805: 4795:Nonlinear mixed-effects model 1745:nonlinear mixed-effects model 1683:Developing a multilevel model 962:nonlinear mixed-effects model 172:Nonlinear mixed-effects model 5103:10.1371/journal.pone.0236860 1973:structural equation modeling 1965:hierarchical Bayesian models 1701:Akaike information criterion 1567:{\displaystyle \gamma _{11}} 1540:{\displaystyle \gamma _{01}} 1482:{\displaystyle \gamma _{10}} 1453:{\displaystyle \gamma _{00}} 517:. Individual differences in 18:Hierarchical linear modeling 7: 5603:10.1037/0033-2909.104.3.396 4768: 1124:Level 2 regression equation 752:{\displaystyle \beta _{1j}} 720:{\displaystyle \beta _{0j}} 533:Level 1 regression equation 374:Mean and predicted response 10: 5909: 5713:Longitudinal Data Analysis 5152:10.1007/s13571-020-00245-8 3487:{\displaystyle \eta _{li}} 2011: 1931: 455:hierarchical linear models 167:Linear mixed-effects model 4929:Luke, Douglas A. (2004). 4520:Stage 2: Population Model 2398:Stage 2: Population Model 2041:homogeneous than others. 1990:organizational psychology 459:linear mixed-effect model 333:Least absolute deviations 5458:10.1177/0149206319868016 5354:10.1177/1094428119877457 4800:Restricted randomization 1887: 81:Generalized linear model 2058:following three-stage: 1982: 1659:intraclass correlations 1652:Random intercepts model 479:random parameter models 5690:Goldstein, H. (2011). 
5591:Psychological Bulletin 5282:10.1093/biomet/78.1.45 4759: 4736: 4087: 3792: 3488: 3458: 3428: 3408: 3356: 3336: 3275: 3255: 3235: 3234:{\displaystyle x_{ib}} 3205: 3204:{\displaystyle t_{ij}} 3175: 3155: 3154:{\displaystyle y_{ij}} 3123: 2681: 2463: 2388: 2054: 1870: 1839: 1838:{\displaystyle X_{ij}} 1809: 1808:{\displaystyle u_{0j}} 1739: 1632: 1631:{\displaystyle u_{1j}} 1600: 1599:{\displaystyle u_{0j}} 1568: 1541: 1512: 1483: 1454: 1425: 1342: 1259: 1191: 1110: 1058: 1038: 1037:{\displaystyle X_{ij}} 1008: 988: 987:{\displaystyle Y_{ij}} 954: 953:{\displaystyle X_{ij}} 924: 923:{\displaystyle Y_{ij}} 885: 815: 814:{\displaystyle r_{ij}} 785: 784:{\displaystyle e_{ij}} 753: 721: 689: 688:{\displaystyle X_{ij}} 657: 656:{\displaystyle Y_{ij}} 624: 412:Mathematics portal 338:Iteratively reweighted 5619:Lee, Se Yoon (2022). 5446:Journal of Management 5391:Psychological Methods 4760: 4737: 4088: 3793: 3489: 3459: 3429: 3409: 3357: 3337: 3276: 3261:-th covariate of the 3256: 3236: 3206: 3176: 3156: 3124: 2682: 2443: 2389: 2052: 2012:Further information: 1977:latent class modeling 1937:to add an additional 1869: 1840: 1810: 1738: 1693:likelihood-ratio test 1633: 1601: 1569: 1542: 1513: 1511:{\displaystyle w_{j}} 1484: 1455: 1426: 1343: 1260: 1192: 1111: 1059: 1039: 1009: 989: 955: 925: 886: 816: 786: 754: 722: 690: 658: 625: 511:multivariate analysis 475:random-effects models 369:Regression validation 348:Bayesian multivariate 65:Polynomial regression 5888:Analysis of variance 5648:10.3390/math10060898 5317:on 31 December 2010. 4790:Random effects model 4749: 4548: 4296: 4098: 3803: 3509: 3468: 3438: 3418: 3366: 3362:-dimensional vector 3346: 3285: 3265: 3245: 3215: 3185: 3165: 3135: 3083: 3005: 2835: 2767: 2699: 2616: 2534: 2406: 2316: 2239: 2071: 1942:categorical variable 1894:educational research 1819: 1789: 1612: 1580: 1551: 1524: 1495: 1466: 1437: 1353: 1270: 1202: 1134: 1068: 1048: 1018: 998: 968: 934: 904: 828: 795: 765: 733: 701: 669: 637: 543: 394:Gauss–Markov theorem 389:Studentized residual 379:Errors and residuals 213:Principal components 183:Nonlinear regression 70:General linear model 5840:10.7717/peerj.12794 5669:; Hill, J. (2007). 5094:2020PLoSO..1536860L 4986:10.7717/peerj.12794 4931:Multilevel modeling 4785:Multiscale modeling 4714: 4677: 4619: 4529: 4509: 4472: 4414: 4375: 4277: 4244: 4184: 4079: 4042: 3984: 3934: 3876: 3784: 3717: 3680: 3622: 3572: 3064: 3051: 3027: 2986: 2816: 2748: 2597: 2584: 2515: 2297: 2220: 1665:Random slopes model 1251: 1183: 877: 239:Errors-in-variables 106:Logistic regression 96:Binomial regression 41:Regression analysis 35:Part of a series on 5786:Swamy, P. A. V. B. 
5403:10.1037/met0000182 5203:has generic name ( 4755: 4732: 4730: 4694: 4639: 4599: 4489: 4434: 4394: 4337: 4206: 4139: 4083: 4059: 4004: 3964: 3896: 3831: 3788: 3739: 3697: 3642: 3602: 3534: 3484: 3454: 3424: 3404: 3352: 3332: 3271: 3251: 3231: 3201: 3171: 3151: 3119: 3117: 3037: 3013: 2677: 2675: 2570: 2384: 2382: 2055: 2026:ecological fallacy 1871: 1835: 1805: 1764:The assumption of 1740: 1628: 1596: 1564: 1537: 1508: 1479: 1450: 1421: 1338: 1255: 1237: 1187: 1169: 1106: 1054: 1034: 1004: 984: 950: 920: 881: 863: 811: 781: 749: 717: 685: 653: 620: 487:statistical models 483:split-plot designs 471:random coefficient 467:nested data models 126:Multinomial probit 5893:Regression models 5799:978-0-631-21254-6 5746:978-1-84872-845-5 5722:978-0-470-88918-3 5703:978-0-470-74865-7 5682:978-0-521-68689-1 5572:978-0-8058-3219-8 5530:978-0-7619-5141-4 5500:978-0-262-29679-3 5240:978-0-13-146597-8 5184:978-1-4129-9885-7 5052:978-0-8058-2223-6 4940:978-0-7619-2879-9 4901:978-0-205-45938-4 4839:978-0-7619-1904-9 4758:{\displaystyle f} 4726: 4562: 4521: 4310: 4269: 4112: 3427:{\displaystyle f} 3355:{\displaystyle K} 3274:{\displaystyle i} 3254:{\displaystyle b} 3174:{\displaystyle i} 1874:Statistical power 1849:Statistical tests 1118:logistic function 1057:{\displaystyle j} 1014:-th country, and 1007:{\displaystyle i} 515:repeated measures 500:linear regression 451:Multilevel models 448: 447: 101:Binary regression 60:Simple regression 55:Linear regression 16:(Redirected from 5900: 5862: 5852: 5842: 5812: 5803: 5781: 5759: 5750: 5726: 5707: 5686: 5653: 5652: 5650: 5640: 5616: 5607: 5606: 5586: 5577: 5576: 5555: 5549: 5548: 5542: 5534: 5516: 5505: 5504: 5484: 5478: 5477: 5437: 5431: 5430: 5388: 5380: 5374: 5373: 5339: 5330: 5319: 5318: 5316: 5310:. Archived from 5309: 5300: 5294: 5293: 5265: 5259: 5258: 5252: 5244: 5232: 5222: 5209: 5208: 5202: 5198: 5196: 5188: 5170: 5164: 5163: 5135: 5126: 5125: 5115: 5105: 5087: 5063: 5057: 5056: 5038: 5009: 5008: 4998: 4988: 4964: 4945: 4944: 4926: 4920: 4919: 4913: 4905: 4887: 4858: 4857: 4851: 4843: 4825: 4764: 4762: 4761: 4756: 4741: 4739: 4738: 4733: 4731: 4727: 4724: 4722: 4718: 4713: 4708: 4693: 4692: 4676: 4665: 4638: 4637: 4618: 4613: 4598: 4597: 4582: 4581: 4560: 4550: 4549: 4522: 4519: 4517: 4513: 4508: 4503: 4488: 4487: 4471: 4460: 4433: 4432: 4413: 4408: 4393: 4392: 4380: 4374: 4363: 4336: 4335: 4308: 4298: 4297: 4270: 4267: 4265: 4261: 4257: 4256: 4243: 4232: 4205: 4204: 4189: 4183: 4182: 4181: 4165: 4138: 4137: 4110: 4092: 4090: 4089: 4084: 4078: 4073: 4058: 4057: 4041: 4030: 4003: 4002: 3983: 3978: 3963: 3962: 3947: 3946: 3933: 3922: 3895: 3894: 3875: 3874: 3873: 3857: 3830: 3829: 3797: 3795: 3794: 3789: 3783: 3782: 3781: 3765: 3738: 3737: 3722: 3716: 3711: 3696: 3695: 3679: 3668: 3641: 3640: 3621: 3616: 3601: 3600: 3585: 3584: 3571: 3560: 3533: 3532: 3493: 3491: 3490: 3485: 3483: 3482: 3463: 3461: 3460: 3455: 3453: 3452: 3433: 3431: 3430: 3425: 3413: 3411: 3410: 3405: 3400: 3399: 3381: 3380: 3361: 3359: 3358: 3353: 3341: 3339: 3338: 3333: 3328: 3327: 3309: 3308: 3280: 3278: 3277: 3272: 3260: 3258: 3257: 3252: 3240: 3238: 3237: 3232: 3230: 3229: 3210: 3208: 3207: 3202: 3200: 3199: 3180: 3178: 3177: 3172: 3160: 3158: 3157: 3152: 3150: 3149: 3128: 3126: 3125: 3120: 3118: 3089: 3085: 3084: 3050: 3045: 3026: 3021: 3011: 3007: 3006: 2973: 2972: 2951: 2950: 2929: 2928: 2904: 2903: 2882: 2881: 2860: 2859: 2841: 2837: 2836: 2803: 2802: 2784: 2783: 2773: 2769: 2768: 2735: 2734: 2716: 2715: 2705: 2686: 2684: 2683: 2678: 2676: 2622: 2618: 2617: 2583: 2578: 2554: 
2553: 2540: 2536: 2535: 2505: 2504: 2489: 2488: 2476: 2475: 2462: 2457: 2439: 2438: 2426: 2425: 2412: 2393: 2391: 2390: 2385: 2383: 2376: 2375: 2322: 2318: 2317: 2284: 2283: 2259: 2258: 2245: 2241: 2240: 2210: 2209: 2191: 2190: 2169: 2168: 2147: 2146: 2131: 2130: 2115: 2114: 2093: 2092: 2084: 2077: 1969:random variables 1844: 1842: 1841: 1836: 1834: 1833: 1814: 1812: 1811: 1806: 1804: 1803: 1766:homoscedasticity 1760:Homoscedasticity 1637: 1635: 1634: 1629: 1627: 1626: 1605: 1603: 1602: 1597: 1595: 1594: 1573: 1571: 1570: 1565: 1563: 1562: 1546: 1544: 1543: 1538: 1536: 1535: 1517: 1515: 1514: 1509: 1507: 1506: 1488: 1486: 1485: 1480: 1478: 1477: 1459: 1457: 1456: 1451: 1449: 1448: 1430: 1428: 1427: 1422: 1420: 1419: 1404: 1403: 1394: 1393: 1381: 1380: 1368: 1367: 1347: 1345: 1344: 1339: 1337: 1336: 1321: 1320: 1311: 1310: 1298: 1297: 1285: 1284: 1264: 1262: 1261: 1256: 1250: 1245: 1227: 1226: 1217: 1216: 1196: 1194: 1193: 1188: 1182: 1177: 1159: 1158: 1149: 1148: 1115: 1113: 1112: 1107: 1102: 1101: 1086: 1085: 1063: 1061: 1060: 1055: 1043: 1041: 1040: 1035: 1033: 1032: 1013: 1011: 1010: 1005: 993: 991: 990: 985: 983: 982: 959: 957: 956: 951: 949: 948: 929: 927: 926: 921: 919: 918: 890: 888: 887: 882: 876: 871: 853: 852: 843: 842: 820: 818: 817: 812: 810: 809: 790: 788: 787: 782: 780: 779: 758: 756: 755: 750: 748: 747: 726: 724: 723: 718: 716: 715: 694: 692: 691: 686: 684: 683: 662: 660: 659: 654: 652: 651: 629: 627: 626: 621: 619: 618: 603: 602: 590: 589: 574: 573: 558: 557: 498:(in particular, 440: 433: 426: 410: 409: 317:Ridge regression 152:Multilevel model 32: 31: 21: 5908: 5907: 5903: 5902: 5901: 5899: 5898: 5897: 5878: 5877: 5869: 5800: 5778: 5747: 5723: 5704: 5683: 5662: 5660:Further reading 5657: 5656: 5617: 5610: 5587: 5580: 5573: 5556: 5552: 5536: 5535: 5531: 5517: 5508: 5501: 5485: 5481: 5438: 5434: 5381: 5377: 5337: 5331: 5322: 5314: 5307: 5301: 5297: 5266: 5262: 5246: 5245: 5241: 5223: 5212: 5200: 5199: 5190: 5189: 5185: 5171: 5167: 5136: 5129: 5078:(7): e0236860. 
5064: 5060: 5053: 5039: 5012: 4965: 4948: 4941: 4927: 4923: 4907: 4906: 4902: 4888: 4861: 4845: 4844: 4840: 4826: 4813: 4808: 4771: 4750: 4747: 4746: 4729: 4728: 4723: 4709: 4698: 4688: 4684: 4666: 4643: 4630: 4626: 4614: 4603: 4593: 4589: 4577: 4573: 4566: 4563: 4558: 4552: 4551: 4528: 4527: 4524: 4523: 4518: 4504: 4493: 4483: 4479: 4461: 4438: 4425: 4421: 4409: 4398: 4388: 4384: 4376: 4364: 4341: 4328: 4324: 4314: 4311: 4306: 4300: 4299: 4276: 4275: 4272: 4271: 4266: 4252: 4248: 4233: 4210: 4197: 4193: 4185: 4177: 4173: 4166: 4143: 4130: 4126: 4116: 4113: 4108: 4101: 4099: 4096: 4095: 4074: 4063: 4053: 4049: 4031: 4008: 3995: 3991: 3979: 3968: 3958: 3954: 3942: 3938: 3923: 3900: 3887: 3883: 3869: 3865: 3858: 3835: 3822: 3818: 3804: 3801: 3800: 3777: 3773: 3766: 3743: 3730: 3726: 3718: 3712: 3701: 3691: 3687: 3669: 3646: 3633: 3629: 3617: 3606: 3596: 3592: 3580: 3576: 3561: 3538: 3525: 3521: 3510: 3507: 3506: 3475: 3471: 3469: 3466: 3465: 3445: 3441: 3439: 3436: 3435: 3419: 3416: 3415: 3395: 3391: 3376: 3372: 3367: 3364: 3363: 3347: 3344: 3343: 3323: 3319: 3304: 3300: 3286: 3283: 3282: 3266: 3263: 3262: 3246: 3243: 3242: 3222: 3218: 3216: 3213: 3212: 3192: 3188: 3186: 3183: 3182: 3166: 3163: 3162: 3142: 3138: 3136: 3133: 3132: 3116: 3115: 3087: 3086: 3063: 3062: 3059: 3058: 3046: 3041: 3022: 3017: 3009: 3008: 2985: 2984: 2981: 2980: 2965: 2961: 2943: 2939: 2921: 2917: 2896: 2892: 2874: 2870: 2852: 2848: 2839: 2838: 2815: 2814: 2811: 2810: 2798: 2794: 2779: 2775: 2771: 2770: 2747: 2746: 2743: 2742: 2730: 2726: 2711: 2707: 2702: 2700: 2697: 2696: 2674: 2673: 2620: 2619: 2596: 2595: 2592: 2591: 2579: 2574: 2546: 2542: 2538: 2537: 2514: 2513: 2510: 2509: 2497: 2493: 2481: 2477: 2468: 2464: 2458: 2447: 2434: 2430: 2418: 2414: 2409: 2407: 2404: 2403: 2381: 2380: 2371: 2367: 2320: 2319: 2296: 2295: 2292: 2291: 2279: 2275: 2251: 2247: 2243: 2242: 2219: 2218: 2215: 2214: 2202: 2198: 2183: 2179: 2161: 2157: 2139: 2135: 2123: 2119: 2107: 2103: 2085: 2080: 2079: 2074: 2072: 2069: 2068: 2047: 2038: 2021: 2016: 2010: 1985: 1958:hyperparameters 1954:Mobile, Alabama 1934: 1890: 1885: 1876: 1851: 1826: 1822: 1820: 1817: 1816: 1796: 1792: 1790: 1787: 1786: 1717: 1709:Model selection 1685: 1676: 1667: 1654: 1645: 1643:Types of models 1619: 1615: 1613: 1610: 1609: 1587: 1583: 1581: 1578: 1577: 1558: 1554: 1552: 1549: 1548: 1531: 1527: 1525: 1522: 1521: 1502: 1498: 1496: 1493: 1492: 1473: 1469: 1467: 1464: 1463: 1444: 1440: 1438: 1435: 1434: 1412: 1408: 1399: 1395: 1389: 1385: 1376: 1372: 1360: 1356: 1354: 1351: 1350: 1329: 1325: 1316: 1312: 1306: 1302: 1293: 1289: 1277: 1273: 1271: 1268: 1267: 1246: 1241: 1222: 1221: 1209: 1205: 1203: 1200: 1199: 1178: 1173: 1154: 1153: 1141: 1137: 1135: 1132: 1131: 1126: 1094: 1090: 1078: 1074: 1069: 1066: 1065: 1049: 1046: 1045: 1044:represents the 1025: 1021: 1019: 1016: 1015: 999: 996: 995: 975: 971: 969: 966: 965: 941: 937: 935: 932: 931: 911: 907: 905: 902: 901: 872: 867: 848: 847: 835: 831: 829: 826: 825: 802: 798: 796: 793: 792: 772: 768: 766: 763: 762: 740: 736: 734: 731: 730: 708: 704: 702: 699: 698: 676: 672: 670: 667: 666: 644: 640: 638: 635: 634: 611: 607: 595: 591: 582: 578: 566: 562: 550: 546: 544: 541: 540: 535: 453:(also known as 444: 404: 384:Goodness of fit 91:Discrete choice 28: 23: 22: 15: 12: 11: 5: 5906: 5896: 5895: 5890: 5876: 5875: 5868: 5867:External links 5865: 5864: 5863: 5818: 5804: 5798: 5782: 5776: 5761: 5751: 5745: 5727: 5721: 5708: 5702: 5687: 5681: 5661: 5658: 5655: 5654: 5608: 5597:(3): 396–404. 
5578: 5571: 5550: 5529: 5506: 5499: 5479: 5432: 5375: 5348:(2): 443–483. 5320: 5295: 5260: 5239: 5210: 5183: 5165: 5127: 5058: 5051: 5010: 4946: 4939: 4921: 4900: 4859: 4838: 4810: 4809: 4807: 4804: 4803: 4802: 4797: 4792: 4787: 4782: 4777: 4775:Hyperparameter 4770: 4767: 4754: 4725:Stage 3: Prior 4721: 4717: 4712: 4707: 4704: 4701: 4697: 4691: 4687: 4683: 4680: 4675: 4672: 4669: 4664: 4661: 4658: 4655: 4652: 4649: 4646: 4642: 4636: 4633: 4629: 4625: 4622: 4617: 4612: 4609: 4606: 4602: 4596: 4592: 4588: 4585: 4580: 4576: 4572: 4569: 4565: 4559: 4557: 4554: 4553: 4547: 4544: 4541: 4538: 4535: 4532: 4526: 4525: 4516: 4512: 4507: 4502: 4499: 4496: 4492: 4486: 4482: 4478: 4475: 4470: 4467: 4464: 4459: 4456: 4453: 4450: 4447: 4444: 4441: 4437: 4431: 4428: 4424: 4420: 4417: 4412: 4407: 4404: 4401: 4397: 4391: 4387: 4383: 4379: 4373: 4370: 4367: 4362: 4359: 4356: 4353: 4350: 4347: 4344: 4340: 4334: 4331: 4327: 4323: 4320: 4317: 4313: 4307: 4305: 4302: 4301: 4295: 4292: 4289: 4286: 4283: 4280: 4274: 4273: 4264: 4260: 4255: 4251: 4247: 4242: 4239: 4236: 4231: 4228: 4225: 4222: 4219: 4216: 4213: 4209: 4203: 4200: 4196: 4192: 4188: 4180: 4176: 4172: 4169: 4164: 4161: 4158: 4155: 4152: 4149: 4146: 4142: 4136: 4133: 4129: 4125: 4122: 4119: 4115: 4109: 4107: 4104: 4103: 4082: 4077: 4072: 4069: 4066: 4062: 4056: 4052: 4048: 4045: 4040: 4037: 4034: 4029: 4026: 4023: 4020: 4017: 4014: 4011: 4007: 4001: 3998: 3994: 3990: 3987: 3982: 3977: 3974: 3971: 3967: 3961: 3957: 3953: 3950: 3945: 3941: 3937: 3932: 3929: 3926: 3921: 3918: 3915: 3912: 3909: 3906: 3903: 3899: 3893: 3890: 3886: 3882: 3879: 3872: 3868: 3864: 3861: 3856: 3853: 3850: 3847: 3844: 3841: 3838: 3834: 3828: 3825: 3821: 3817: 3814: 3811: 3808: 3787: 3780: 3776: 3772: 3769: 3764: 3761: 3758: 3755: 3752: 3749: 3746: 3742: 3736: 3733: 3729: 3725: 3721: 3715: 3710: 3707: 3704: 3700: 3694: 3690: 3686: 3683: 3678: 3675: 3672: 3667: 3664: 3661: 3658: 3655: 3652: 3649: 3645: 3639: 3636: 3632: 3628: 3625: 3620: 3615: 3612: 3609: 3605: 3599: 3595: 3591: 3588: 3583: 3579: 3575: 3570: 3567: 3564: 3559: 3556: 3553: 3550: 3547: 3544: 3541: 3537: 3531: 3528: 3524: 3520: 3517: 3514: 3497:Stage 3: Prior 3481: 3478: 3474: 3451: 3448: 3444: 3423: 3403: 3398: 3394: 3390: 3387: 3384: 3379: 3375: 3371: 3351: 3331: 3326: 3322: 3318: 3315: 3312: 3307: 3303: 3299: 3296: 3293: 3290: 3270: 3250: 3228: 3225: 3221: 3198: 3195: 3191: 3170: 3148: 3145: 3141: 3114: 3111: 3108: 3105: 3102: 3099: 3096: 3093: 3090: 3088: 3082: 3079: 3076: 3073: 3070: 3067: 3061: 3060: 3057: 3054: 3049: 3044: 3040: 3036: 3033: 3030: 3025: 3020: 3016: 3012: 3010: 3004: 3001: 2998: 2995: 2992: 2989: 2983: 2982: 2979: 2976: 2971: 2968: 2964: 2960: 2957: 2954: 2949: 2946: 2942: 2938: 2935: 2932: 2927: 2924: 2920: 2916: 2913: 2910: 2907: 2902: 2899: 2895: 2891: 2888: 2885: 2880: 2877: 2873: 2869: 2866: 2863: 2858: 2855: 2851: 2847: 2844: 2842: 2840: 2834: 2831: 2828: 2825: 2822: 2819: 2813: 2812: 2809: 2806: 2801: 2797: 2793: 2790: 2787: 2782: 2778: 2774: 2772: 2766: 2763: 2760: 2757: 2754: 2751: 2745: 2744: 2741: 2738: 2733: 2729: 2725: 2722: 2719: 2714: 2710: 2706: 2704: 2691:Stage 3: Prior 2672: 2669: 2666: 2663: 2660: 2657: 2654: 2651: 2647: 2644: 2641: 2638: 2635: 2632: 2629: 2626: 2623: 2621: 2615: 2612: 2609: 2606: 2603: 2600: 2594: 2593: 2590: 2587: 2582: 2577: 2573: 2569: 2566: 2563: 2560: 2557: 2552: 2549: 2545: 2541: 2539: 2533: 2530: 2527: 2524: 2521: 2518: 2512: 2511: 2508: 2503: 2500: 2496: 2492: 2487: 2484: 2480: 2474: 2471: 2467: 2461: 2456: 2453: 2450: 2446: 2442: 2437: 2433: 2429: 2424: 
2421: 2417: 2413: 2411: 2379: 2374: 2370: 2366: 2363: 2360: 2357: 2354: 2351: 2347: 2344: 2341: 2338: 2335: 2332: 2329: 2326: 2323: 2321: 2315: 2312: 2309: 2306: 2303: 2300: 2294: 2293: 2290: 2287: 2282: 2278: 2274: 2271: 2268: 2265: 2262: 2257: 2254: 2250: 2246: 2244: 2238: 2235: 2232: 2229: 2226: 2223: 2217: 2216: 2213: 2208: 2205: 2201: 2197: 2194: 2189: 2186: 2182: 2178: 2175: 2172: 2167: 2164: 2160: 2156: 2153: 2150: 2145: 2142: 2138: 2134: 2129: 2126: 2122: 2118: 2113: 2110: 2106: 2102: 2099: 2096: 2091: 2088: 2083: 2078: 2076: 2046: 2043: 2037: 2034: 2020: 2017: 2009: 2006: 1984: 1981: 1933: 1930: 1922: 1921: 1918: 1915: 1912: 1905: 1904: 1901: 1889: 1886: 1884: 1881: 1875: 1872: 1850: 1847: 1832: 1829: 1825: 1802: 1799: 1795: 1783: 1782: 1774: 1773: 1762: 1761: 1753: 1752: 1733: 1732: 1716: 1713: 1684: 1681: 1675: 1672: 1666: 1663: 1653: 1650: 1644: 1641: 1640: 1639: 1625: 1622: 1618: 1607: 1593: 1590: 1586: 1575: 1561: 1557: 1534: 1530: 1519: 1505: 1501: 1490: 1476: 1472: 1461: 1447: 1443: 1418: 1415: 1411: 1407: 1402: 1398: 1392: 1388: 1384: 1379: 1375: 1371: 1366: 1363: 1359: 1335: 1332: 1328: 1324: 1319: 1315: 1309: 1305: 1301: 1296: 1292: 1288: 1283: 1280: 1276: 1254: 1249: 1244: 1240: 1236: 1233: 1230: 1225: 1220: 1215: 1212: 1208: 1186: 1181: 1176: 1172: 1168: 1165: 1162: 1157: 1152: 1147: 1144: 1140: 1125: 1122: 1105: 1100: 1097: 1093: 1089: 1084: 1081: 1077: 1073: 1053: 1031: 1028: 1024: 1003: 981: 978: 974: 947: 944: 940: 930:and predictor 917: 914: 910: 880: 875: 870: 866: 862: 859: 856: 851: 846: 841: 838: 834: 823: 822: 808: 805: 801: 778: 775: 771: 760: 746: 743: 739: 728: 714: 711: 707: 696: 682: 679: 675: 664: 650: 647: 643: 617: 614: 610: 606: 601: 598: 594: 588: 585: 581: 577: 572: 569: 565: 561: 556: 553: 549: 534: 531: 446: 445: 443: 442: 435: 428: 420: 417: 416: 415: 414: 399: 398: 397: 396: 391: 386: 381: 376: 371: 363: 362: 358: 357: 356: 355: 350: 345: 340: 335: 327: 326: 325: 324: 319: 314: 309: 304: 296: 295: 294: 293: 288: 283: 278: 270: 269: 268: 267: 262: 257: 249: 248: 244: 243: 242: 241: 233: 232: 231: 230: 225: 220: 215: 210: 205: 200: 195: 193:Semiparametric 190: 185: 177: 176: 175: 174: 169: 164: 162:Random effects 159: 154: 146: 145: 144: 143: 138: 136:Ordered probit 133: 128: 123: 118: 113: 108: 103: 98: 93: 88: 83: 75: 74: 73: 72: 67: 62: 57: 49: 48: 44: 43: 37: 36: 26: 9: 6: 4: 3: 2: 5905: 5894: 5891: 5889: 5886: 5885: 5883: 5874: 5871: 5870: 5860: 5856: 5851: 5846: 5841: 5836: 5832: 5828: 5824: 5819: 5816: 5810: 5805: 5801: 5795: 5791: 5787: 5783: 5779: 5777:9781446254332 5773: 5769: 5768: 5762: 5757: 5752: 5748: 5742: 5738: 5737: 5732: 5728: 5724: 5718: 5714: 5709: 5705: 5699: 5695: 5694: 5688: 5684: 5678: 5674: 5673: 5668: 5664: 5663: 5649: 5644: 5639: 5634: 5630: 5626: 5622: 5615: 5613: 5604: 5600: 5596: 5592: 5585: 5583: 5574: 5568: 5564: 5560: 5554: 5546: 5540: 5532: 5526: 5522: 5515: 5513: 5511: 5502: 5496: 5493:. MIT Press. 
5492: 5491: 5483: 5475: 5471: 5467: 5463: 5459: 5455: 5451: 5447: 5443: 5436: 5428: 5424: 5420: 5416: 5412: 5408: 5404: 5400: 5396: 5392: 5387: 5379: 5371: 5367: 5363: 5359: 5355: 5351: 5347: 5343: 5336: 5329: 5327: 5325: 5313: 5306: 5299: 5291: 5287: 5283: 5279: 5275: 5271: 5264: 5256: 5250: 5242: 5236: 5231: 5230: 5221: 5219: 5217: 5215: 5206: 5194: 5186: 5180: 5176: 5169: 5161: 5157: 5153: 5149: 5145: 5141: 5134: 5132: 5123: 5119: 5114: 5109: 5104: 5099: 5095: 5091: 5086: 5081: 5077: 5073: 5069: 5062: 5054: 5048: 5044: 5037: 5035: 5033: 5031: 5029: 5027: 5025: 5023: 5021: 5019: 5017: 5015: 5006: 5002: 4997: 4992: 4987: 4982: 4978: 4974: 4970: 4963: 4961: 4959: 4957: 4955: 4953: 4951: 4942: 4936: 4932: 4925: 4917: 4911: 4903: 4897: 4893: 4886: 4884: 4882: 4880: 4878: 4876: 4874: 4872: 4870: 4868: 4866: 4864: 4855: 4849: 4841: 4835: 4831: 4824: 4822: 4820: 4818: 4816: 4811: 4801: 4798: 4796: 4793: 4791: 4788: 4786: 4783: 4781: 4778: 4776: 4773: 4772: 4766: 4752: 4742: 4719: 4710: 4705: 4702: 4699: 4689: 4685: 4678: 4673: 4670: 4667: 4662: 4659: 4656: 4653: 4650: 4647: 4644: 4634: 4631: 4627: 4620: 4615: 4610: 4607: 4604: 4594: 4590: 4583: 4578: 4574: 4567: 4555: 4545: 4542: 4539: 4536: 4533: 4530: 4514: 4505: 4500: 4497: 4494: 4484: 4480: 4473: 4468: 4465: 4462: 4457: 4454: 4451: 4448: 4445: 4442: 4439: 4429: 4426: 4422: 4415: 4410: 4405: 4402: 4399: 4389: 4385: 4371: 4368: 4365: 4360: 4357: 4354: 4351: 4348: 4345: 4342: 4332: 4329: 4325: 4315: 4303: 4293: 4290: 4287: 4284: 4281: 4278: 4262: 4253: 4249: 4245: 4240: 4237: 4234: 4229: 4226: 4223: 4220: 4217: 4214: 4211: 4201: 4198: 4194: 4178: 4174: 4170: 4167: 4162: 4159: 4156: 4153: 4150: 4147: 4144: 4134: 4131: 4127: 4117: 4105: 4093: 4075: 4070: 4067: 4064: 4054: 4050: 4043: 4038: 4035: 4032: 4027: 4024: 4021: 4018: 4015: 4012: 4009: 3999: 3996: 3992: 3985: 3980: 3975: 3972: 3969: 3959: 3955: 3948: 3943: 3939: 3935: 3930: 3927: 3924: 3919: 3916: 3913: 3910: 3907: 3904: 3901: 3891: 3888: 3884: 3877: 3870: 3866: 3862: 3859: 3854: 3851: 3848: 3845: 3842: 3839: 3836: 3826: 3823: 3819: 3809: 3806: 3798: 3778: 3774: 3770: 3767: 3762: 3759: 3756: 3753: 3750: 3747: 3744: 3734: 3731: 3727: 3713: 3708: 3705: 3702: 3692: 3688: 3681: 3676: 3673: 3670: 3665: 3662: 3659: 3656: 3653: 3650: 3647: 3637: 3634: 3630: 3623: 3618: 3613: 3610: 3607: 3597: 3593: 3586: 3581: 3577: 3573: 3568: 3565: 3562: 3557: 3554: 3551: 3548: 3545: 3542: 3539: 3529: 3526: 3522: 3512: 3504: 3501: 3499: 3498: 3479: 3476: 3472: 3449: 3446: 3442: 3421: 3414:. 

Index

Hierarchical linear modeling
Regression analysis
Linear regression
Simple regression
Polynomial regression
General linear model
Generalized linear model
Vector generalized linear model
Discrete choice
Binomial regression
Binary regression
Logistic regression
Multinomial logistic regression
Mixed logit
Probit
Multinomial probit
Ordered logit
Ordered probit
Poisson
Multilevel model
Fixed effects
Random effects
Linear mixed-effects model
Nonlinear mixed-effects model
Nonlinear regression
Nonparametric
Semiparametric
Robust
Quantile
Isotonic
