, i.e. the actual categorical distribution that generated the data. For example, if the observed data contain 3 categories in the ratio 40:5:55, then ignoring the effect of the prior distribution, the true parameter – i.e. the true, underlying distribution that generated our observed data – would be expected to have the average value of (0.40, 0.05, 0.55), which is indeed what the posterior reveals. However, the true distribution might actually be (0.35, 0.07, 0.58) or (0.42, 0.04, 0.54) or various other nearby possibilities. The amount of uncertainty involved here is specified by the
{\displaystyle {\begin{aligned}p({\tilde {x}}=i\mid \mathbb {X} ,{\boldsymbol {\alpha }})&=\int _{\mathbf {p} }p({\tilde {x}}=i\mid \mathbf {p} )\,p(\mathbf {p} \mid \mathbb {X} ,{\boldsymbol {\alpha }})\,{\textrm {d}}\mathbf {p} \\&=\,\operatorname {E} _{\mathbf {p} \mid \mathbb {X} ,{\boldsymbol {\alpha }}}\left[p({\tilde {x}}=i\mid \mathbf {p} )\right]\\&=\,\operatorname {E} _{\mathbf {p} \mid \mathbb {X} ,{\boldsymbol {\alpha }}}\left[p_{i}\right]\\&=\,\operatorname {E} \left[p_{i}\mid \mathbb {X} ,{\boldsymbol {\alpha }}\right].\end{aligned}}}
{\displaystyle {\begin{array}{lclcl}{\boldsymbol {\alpha }}&=&(\alpha _{1},\ldots ,\alpha _{K})&=&{\text{concentration hyperparameter}}\\\mathbf {p} \mid {\boldsymbol {\alpha }}&=&(p_{1},\ldots ,p_{K})&\sim &\operatorname {Dir} (K,{\boldsymbol {\alpha }})\\\mathbb {X} \mid \mathbf {p} &=&(x_{1},\ldots ,x_{N})&\sim &\operatorname {Cat} (K,\mathbf {p} )\end{array}}}
{\displaystyle {\begin{array}{lclcl}\mathbf {c} &=&(c_{1},\ldots ,c_{K})&=&{\text{number of occurrences of category }}i{\text{, so that }}c_{i}=\sum _{j=1}^{N}[x_{j}=i]\\\mathbf {p} \mid \mathbb {X} ,{\boldsymbol {\alpha }}&\sim &\operatorname {Dir} (K,\mathbf {c} +{\boldsymbol {\alpha }})&=&\operatorname {Dir} (K,c_{1}+\alpha _{1},\ldots ,c_{K}+\alpha _{K})\end{array}}}
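As a minimal sketch of this conjugate update (the helper name and list representation are illustrative, not from the article), the posterior parameters are simply the prior concentrations plus the observed category counts:

```python
from collections import Counter

def dirichlet_posterior(alpha, observations):
    """Parameters of Dir(c + alpha) after categorical observations.

    alpha: list of K prior concentration parameters.
    observations: iterable of category labels in {0, ..., K-1}.
    """
    counts = Counter(observations)
    return [a + counts.get(i, 0) for i, a in enumerate(alpha)]

# Uniform prior Dir(1,1,1) plus data with counts (4, 1, 5):
posterior = dirichlet_posterior([1, 1, 1], [0]*4 + [1] + [2]*5)
# posterior == [5, 2, 6]
```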
{\displaystyle {\begin{aligned}p(\mathbb {X} \mid {\boldsymbol {\alpha }})&=\int _{\mathbf {p} }p(\mathbb {X} \mid \mathbf {p} )p(\mathbf {p} \mid {\boldsymbol {\alpha }}){\textrm {d}}\mathbf {p} \\&={\frac {\Gamma \left(\sum _{k}\alpha _{k}\right)}{\Gamma \left(N+\sum _{k}\alpha _{k}\right)}}\prod _{k=1}^{K}{\frac {\Gamma (c_{k}+\alpha _{k})}{\Gamma (\alpha _{k})}}\end{aligned}}}
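This ratio of gamma functions can be evaluated stably in log space with `math.lgamma`; the helper below is an illustrative sketch, not part of the article:

```python
import math

def log_marginal_likelihood(counts, alpha):
    """log p(X | alpha) for category counts c under a Dir(alpha) prior:
    lgamma(sum a) - lgamma(N + sum a) + sum_k [lgamma(c_k + a_k) - lgamma(a_k)].
    """
    n, a = sum(counts), sum(alpha)
    out = math.lgamma(a) - math.lgamma(n + a)
    for c, al in zip(counts, alpha):
        out += math.lgamma(c + al) - math.lgamma(al)
    return out

# Sanity check: a single observation under a uniform Dir(1,1) prior
# has marginal probability 1/2.
p = math.exp(log_marginal_likelihood([1, 0], [1, 1]))  # ≈ 0.5
```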
{\displaystyle {\begin{aligned}p({\tilde {x}}=i\mid \mathbb {X} ,{\boldsymbol {\alpha }})&=\int _{\mathbf {p} }p({\tilde {x}}=i\mid \mathbf {p} )\,p(\mathbf {p} \mid \mathbb {X} ,{\boldsymbol {\alpha }})\,{\textrm {d}}\mathbf {p} \\&=\,{\frac {c_{i}+\alpha _{i}}{N+\sum _{k}\alpha _{k}}}\\&=\,\operatorname {E} \left[p_{i}\mid \mathbb {X} ,{\boldsymbol {\alpha }}\right]\\&\propto \,c_{i}+\alpha _{i}.\end{aligned}}}
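Numerically, the posterior predictive reduces to smoothed relative frequencies; this small helper (illustrative, not from the article) computes it from counts and prior concentrations:

```python
def posterior_predictive(counts, alpha):
    """Probability of each category for the next draw:
    (c_i + alpha_i) / (N + sum(alpha))."""
    total = sum(counts) + sum(alpha)
    return [(c + a) / total for c, a in zip(counts, alpha)]

# Counts (4, 1, 5) with a uniform Dir(1,1,1) prior: 13 total
# observations including pseudo-observations.
probs = posterior_predictive([4, 1, 5], [1, 1, 1])
# probs == [5/13, 2/13, 6/13]
```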
, that is a constant equal to 1 in the categorical-style PMF. Confusing the two can easily lead to incorrect results in settings where this extra factor is not constant with respect to the distributions of interest. The factor is frequently constant in the complete conditionals used in Gibbs sampling and the optimal distributions in
function draw_categorical(n) // where n is the number of samples to draw from the categorical distribution
  r = 1
  s = 0
  for i from 1 to k // where k is the number of categories
    v = draw from a binomial(n, p[i] / r) distribution // where p[i] is the probability of category i
    for j from 1 to v
The posterior predictive probability of seeing a particular category is the same as the relative proportion of previous observations in that category (including the pseudo-observations of the prior). This makes logical sense — intuitively, we would expect to see a particular category according to the
of the parameter, after incorporating the knowledge gained from the observed data, is also a
Dirichlet. Intuitively, in such a case, starting from what is known about the parameter prior to observing the data point, knowledge can then be updated based on the data point, yielding a new distribution of
of the same variables with the same
Dirichlet-multinomial distribution has two different forms depending on whether it is characterized as a distribution whose domain is over individual categorical nodes or over multinomial-style counts of nodes in each particular category (similar to the distinction
among the various discrete distributions generated by the posterior distribution is simply equal to the proportion of occurrences of that category actually seen in the data, including the pseudocounts in the prior distribution. This makes a great deal of intuitive sense: if, for example, there are
Observe data points one by one and each time consider their predictive probability before observing the data point and updating the posterior. For any given data point, the probability of that point assuming a given category depends on the number of data points already in that category. In this
The crucial line above is the third. The second follows directly from the definition of expected value. The third line is particular to the categorical distribution, and follows from the fact that, in the categorical distribution specifically, the expected value of seeing a particular value
are conflated, and it is common to speak of a "multinomial distribution" when a "categorical distribution" would be more precise. This imprecise usage stems from the fact that it is sometimes convenient to express the outcome of a categorical distribution as a
If it is necessary to draw many values from the same categorical distribution, the following approach is more efficient. It draws n samples in O(n) time (assuming an O(1) approximation is used to draw values from the binomial distribution).
possible categories, with the probability of each category separately specified. There is no innate underlying ordering of these outcomes, but numerical labels are often attached for convenience in describing the distribution, (e.g. 1 to
{\displaystyle {\begin{aligned}p(x_{n}=i\mid \mathbb {X} ^{(-n)},{\boldsymbol {\alpha }})&=\,{\frac {c_{i}^{(-n)}+\alpha _{i}}{N-1+\sum _{i}\alpha _{i}}}&\propto \,c_{i}^{(-n)}+\alpha _{i}\end{aligned}}}
. Logically, a flat distribution of this sort represents total ignorance, corresponding to no observations of any sort. However, the mathematical updating of the posterior works fine if we ignore the
{\displaystyle \operatorname {arg\,max} \limits _{\mathbf {p} }p(\mathbf {p} \mid \mathbb {X} )={\frac {\alpha _{i}+c_{i}-1}{\sum _{i}(\alpha _{i}+c_{i}-1)}},\qquad \forall i\;\alpha _{i}+c_{i}>1}
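As a sketch of this formula (the helper name is hypothetical), the MAP estimate subtracts one from each posterior concentration and renormalizes, and is only defined when every α_i + c_i exceeds 1:

```python
def map_estimate(counts, alpha):
    """MAP estimate (alpha_i + c_i - 1) / sum_j (alpha_j + c_j - 1)."""
    w = [a + c - 1 for a, c in zip(alpha, counts)]
    if min(w) <= 0:
        raise ValueError("MAP estimate requires alpha_i + c_i > 1 for every i")
    s = sum(w)
    return [x / s for x in w]

# Counts (4, 1, 5) under a Dir(2,2,2) prior give weights (5, 2, 6):
est = map_estimate([4, 1, 5], [2, 2, 2])
# est == [5/13, 2/13, 6/13]
```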
scenario, if a category has a high frequency of occurrence, then new data points are more likely to join that category — further enriching the same category. This type of scenario is often termed a
. The fourth line is simply a rewriting of the third in a different notation, using the notation farther up for an expectation taken with respect to the posterior distribution of the parameters.
(or "rich get richer") model. This models many real-world processes, and in such cases the choices made by the first few data points have an outsize influence on the rest of the data points.
three possible categories, and category 1 is seen in the observed data 40% of the time, one would expect on average to see category 1 40% of the time in the posterior distribution as well.
the same form as the old one. As such, knowledge of a parameter can be successively updated by incorporating new observations one at a time, without running into mathematical difficulties.
As a result, this formula can be expressed as simply "the posterior predictive probability of seeing a category is proportional to the total observed count of that category", or as "the
is a special case. The parameters specifying the probabilities of each possible outcome are constrained only by the fact that each must be in the range 0 to 1, and all must sum to 1.
having the property that exactly one element has the value 1 and the others have the value 0. The particular element having the value 1 indicates which category has been chosen. The
, i.e. as representing the number of observations in each category that we have already seen. Then we simply add in the counts for all the new observations (the vector
Occasionally, the categorical distribution is termed the "discrete distribution". However, this properly refers not to one particular family of distributions but to a
of the multinomial distribution (the number of sampled items) is fixed at 1. In this formulation, the sample space can be considered to be the set of 1-of-
1260:
5920:
2980:
of the posterior, which is controlled by the total number of observations – the more data observed, the less uncertainty about the true parameter.)
      z[s++] = i // where z is an array in which the results are stored
    n = n - v
    r = r - p[i]
  shuffle (randomly re-order) the elements in z
  return z
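A direct Python rendering of this procedure might look as follows. The binomial draw is simulated naively with Bernoulli trials here (the O(n) bound in the text assumes an O(1) binomial sampler), so treat this as an illustrative sketch:

```python
import random

def draw_categorical(p, n):
    """Draw n samples from Categorical(p): for each category, draw how many
    of the remaining samples fall in it (a binomial), then shuffle labels."""
    z = []
    remaining, r = n, 1.0
    for i, pi in enumerate(p):
        if remaining == 0:
            break
        q = pi / r if r > pi else 1.0  # conditional probability of category i
        # Naive binomial(remaining, q) via Bernoulli trials; an O(1) binomial
        # sampler would be needed for the stated O(n) bound.
        v = sum(random.random() < q for _ in range(remaining))
        z.extend([i] * v)
        remaining -= v
        r -= pi
    z.extend([len(p) - 1] * remaining)  # guard against float round-off
    random.shuffle(z)
    return z

samples = draw_categorical([0.2, 0.3, 0.5], 1000)  # 1000 labels in {0, 1, 2}
```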
of a category is the same as the total observed count of the category", where "observed count" is taken to include the pseudo-observations of the prior.
article, the formula for the posterior predictive probability has the form of an expected value taken with respect to the posterior distribution:
independent observations is the set of counts (or, equivalently, proportion) of observations in each category, where the total number of trials (=
(PMFs), which both make reference to multinomial-style counts of nodes in a category. However, the multinomial-style PMF has an extra factor, a
, which can then be sampled using the techniques described above. There is however a more direct sampling method that uses samples from the
). One of the reasons for doing this is that in such a case, the distribution of one categorical node given the others is exactly the
" vector (a vector with one element containing a 1 and all other elements containing a 0) rather than as an integer in the range 1 to
) of the network, which introduces dependencies among the various categorical nodes dependent on a given prior (specifically, their
. The posterior distribution in general describes the parameter in question, and in this case the parameter itself is a discrete
The reason for the equivalence between posterior predictive probability and the expected value of the posterior distribution of
; in this form, a categorical distribution is equivalent to a multinomial distribution for a single observation (see below).
Locate the greatest number in the CDF whose value is less than or equal to the number just chosen. This can be done in time
is taken to be a finite sequence of integers. The exact integers used as labels are unimportant; they might be {0, 1, ...,
). This means that in a model consisting of a data point having a categorical distribution with unknown parameter vector
, which arises commonly in natural language processing models (although not usually with this name) as a result of
by treating the categorical distribution as a special case of the multinomial distribution in which the parameter
Another formulation that appears more complex but facilitates mathematical manipulations is as follows, using the
vector as directly representing a set of pseudocounts. Furthermore, doing this avoids the issue of interpreting
, in that it gives the probabilities of potential outcomes of a single drawing rather than multiple drawings.
where each variable is conditioned on all the others. In networks that include categorical variables with
The distribution is a special case of a "multivariate
Bernoulli distribution" in which exactly one of the
However, conflating the categorical and multinomial distributions can lead to problems. For example, in a
random variable, i.e. for a discrete variable with more than two possible outcomes, such as the roll of a
(CDF) by replacing each value with the sum of all of the previous values. This can be done in time
(This intuition is ignoring the effect of the prior distribution. Furthermore, the posterior is a
= 2 this reduces to the possible probabilities of the Bernoulli distribution being the 1-simplex,
and models including mixture components), the
Dirichlet distributions are often "collapsed out" (
has a distribution which is a special case of the multinomial distribution with parameter
} or any other arbitrary set of values. In the following descriptions, we use {1, 2, ...,
Assume a distribution is expressed as "proportional to" some expression, with unknown
{\displaystyle \operatorname {E} \left[p_{i}\mid \mathbb {X} ,{\boldsymbol {\alpha }}\right]={\frac {c_{i}+\alpha _{i}}{N+\sum _{k}\alpha _{k}}}}
The distribution is completely given by the probabilities associated with each number
Agresti, A., An Introduction to Categorical Data Analysis, Wiley-Interscience, 2007,
{\displaystyle c=\operatorname {arg\,max} \limits _{i}\left(\gamma _{i}+g_{i}\right)}
{\displaystyle f(\mathbf {x} \mid {\boldsymbol {p}})=\prod _{i=1}^{k}p_{i}^{x_{i}},}
5148:, but the most common way to sample from a categorical distribution uses a type of
3931:
of a new observation in the above model is the distribution that a new observation
posterior observations. This reflects the fact that a
Dirichlet distribution with
Impose some sort of order on the categories (e.g. by an index that runs from 1 to
Yet another formulation makes explicit the connection between the categorical and
that describes the possible results of a random variable that can take on one of
637:-dimensional categorical distribution is the most general distribution over a
In many practical applications, the only way to guarantee the condition that
be the realisation from a categorical distribution. Define the random vector
However, Bishop does not explicitly use the term categorical distribution.
There are various relationships among this formula and the previous ones:
, and (in standard Bayesian style) we choose to treat this parameter as a
is evident with re-examination of the above formula. As explained in the
, it is very important to distinguish categorical from multinomial. The
, 0 otherwise. There are various advantages of this formulation, e.g.:
} for convenience, although this disagrees with the convention for the
Compute the unnormalized value of the distribution for each category.
, Dirichlet prior distributions are often marginalized out. See the
{\displaystyle f(x\mid {\boldsymbol {p}})=\prod _{i=1}^{k}p_{i}^{[x=i]},}
{\displaystyle i{\text{ such that }}p_{i}=\max(p_{1},\ldots ,p_{k})}
. Before taking any samples, one prepares some values as follows:
The possible probabilities for the categorical distribution with
will be a sample from the desired categorical distribution. (If
independent draws from the standard Gumbel distribution, then
individually identified items. It is the generalization of the
independent and identically distributed such random variables
. The possible sets of probabilities are exactly those in the
This says that the expected probability of seeing a category
Bayesian inference, entropy and the multinomial distribution
of the posterior distribution. This is explained more below.
it is typical to parametrize the categorical distribution,
Sum them up and divide each value by this sum, in order to
distribution of the categorical distribution (and also the
constructed from a categorical distribution with parameter
Formally, this can be expressed as follows. Given a model
It connects the categorical distribution with the related
1808:{\displaystyle \operatorname {E} \left={\boldsymbol {p}}}
4323:
The posterior predictive probability is the same as the
The indicator function of an observation having a value
641:-way event; any other discrete distribution over a size-
462:{\displaystyle p(x)=\cdot p_{1}\,+\cdots +\,\cdot p_{k}}
5230:
. The resulting value for the first category will be 0.
{\displaystyle {\boldsymbol {p}}=(p_{1},\ldots ,p_{k})}
. On the other hand, the categorical distribution is a
of the posterior distribution (see the article on the
1756:{\displaystyle p_{1}+p_{2}=1,0\leq p_{1},p_{2}\leq 1.}
1599:
1386:
976:
5626:
5595:
5505:
5452:
5398:
5378:
5323:
5291:
5245:
5093:
4880:
4839:
4812:
4790:
4740:
4355:
3999:
3966:
3937:
3610:
3546:
3493:
3293:
3243:
3213:
3188:{\displaystyle {\boldsymbol {\alpha }}=(1,1,\ldots )}
3155:
3109:
3069:
3049:
3016:
2989:
2848:
2509:
2241:
2160:
2123:
2087:
2020:
2000:
1974:
1950:
1924:
1855:
1823:
1777:
1685:
1647:
1598:
1542:
1474:
1448:
1385:
1354:
1263:
1169:
1137:
1043:
975:
944:
884:
822:
525:
479:
376:
285:
236:
182:
112:
66:
35:
5928:
5684:
is a sample from the standard Gumbel distribution.)
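The Gumbel-max trick described here — perturb each log-probability with standard Gumbel noise and take the argmax — can be sketched as follows (helper name is illustrative, not from the article):

```python
import math
import random

def gumbel_max_sample(p):
    """Sample a category index by adding standard Gumbel noise
    g_i = -log(-log u_i) to log p_i and taking the argmax."""
    keys = []
    for pi in p:
        u = random.random()
        # Guard the measure-zero case u == 0, where the noise would be -inf.
        g = -math.log(-math.log(u)) if u > 0.0 else float("-inf")
        keys.append(math.log(pi) + g)
    return max(range(len(p)), key=keys.__getitem__)

idx = gumbel_max_sample([0.2, 0.8])  # index 0 or 1
```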
Return the category corresponding to this CDF value.
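The prepare-once, sample-many procedure above can be sketched with a prefix-sum table and binary search (`bisect` supplies the O(log k) lookup); function names are illustrative:

```python
import bisect
import itertools
import random

def make_cdf(p):
    """One-time preparation: cumulative sums of the category probabilities."""
    return list(itertools.accumulate(p))

def sample(cdf):
    """Draw a uniform number in [0, 1) and binary-search the CDF table."""
    return bisect.bisect_right(cdf, random.random())

cdf = make_cdf([0.2, 0.3, 0.5])
draws = [sample(cdf) for _ in range(5)]  # five category indices in {0, 1, 2}
```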
of a categorical distribution given a collection of
Then, each time it is necessary to sample a value:
{\displaystyle \textstyle {\sum _{i=1}^{k}p_{i}=1}}
{\displaystyle f(x=i\mid {\boldsymbol {p}})=p_{i},}
is directly specified by the associated parameter
"The Gumbel–Max Trick for Discrete Distributions"
Johnson, N.L., Kotz, S., Balakrishnan, N. (1997)
is any real constant. Given this representation,
{\displaystyle \forall i\;\alpha _{i}+c_{i}>1}
) in order to derive the posterior distribution.
distribution of a categorical distribution is a
of the categorical distribution, and allows the
function, similar to but less general than the
{\displaystyle \gamma _{i}=\log p_{i}+\alpha }
of the observations, with the prior parameter
{\displaystyle (p_{i}\geq 0,\,\Sigma p_{i}=1)}
Machine learning: a probabilistic perspective
This distribution plays an important role in
has a completely flat shape — essentially, a
{\displaystyle Y_{i}=I({\boldsymbol {X}}=i),}
{\displaystyle \textstyle {\sum _{i}p_{i}=1}}
{\displaystyle \textstyle {\sum _{i}p_{i}=1}}
represents the probability of seeing element
represents the probability of seeing element
frequency already observed of that category.
mode of the posterior Dirichlet distribution
In one formulation of the distribution, the
node). Both forms have very similar-looking
{\displaystyle p(x)=p_{1}^{[x=1]}\cdots p_{k}^{[x=k]}}
Minka, T. (2003), op. cit. Minka uses the
categorical observations. As shown in the
A categorical distribution is a discrete
Pattern Recognition and Machine Learning
{\displaystyle g_{i}=-\log(-\log u_{i})}
, if the node in question is denoted as
. Then, the updated posterior parameter
should actually be seen as representing
Bayesian inference using conjugate prior
via an unconstrained representation in
is the number of nodes having category
over such models using methods such as
, which uses {0, 1}. In this case, the
. Technical report Microsoft Research.
samples. Intuitively, we can view the
to estimate the underlying parameter
. This is the formulation adopted by
Sampling via the Gumbel distribution
article, it has a very simple form:
The categorical distribution is the
Discrete Multivariate Distributions
{\displaystyle g_{1},\ldots ,g_{k}}
{\displaystyle p_{1},\ldots ,p_{k}}
{\displaystyle p_{1},\ldots ,p_{k}}
{\displaystyle \mathbb {X} ^{(-n)}}
, one typically needs to draw from
{\displaystyle c_{i}+\alpha _{i}-1}
{\displaystyle p_{1}+p_{2}+p_{3}=1}
of the parameters to be calculated.
independent identically distributed
{\displaystyle x\in \{1,\dots ,k\}}
{\displaystyle p_{1},\ldots ,p_{k}}
Dirichlet-multinomial distribution
Dirichlet-multinomial distribution
Posterior conditional distribution
Dirichlet-multinomial distribution
Dirichlet-multinomial distribution
(Technically, the prior parameter
{\displaystyle {\boldsymbol {p}}.}
0-1 variables takes the value one.
Dirichlet-multinomial distribution
generalized Bernoulli distribution
, whose components are given by:
posterior predictive distribution
posterior predictive distribution
posterior predictive distribution
Posterior predictive distribution
in the above model is simply the
{\displaystyle c_{i}+\alpha _{i}}
Further intuition comes from the
{\displaystyle {\boldsymbol {p}}}
{\displaystyle {\boldsymbol {X}}}
discrete probability distribution
Discrete probability distribution
{\displaystyle \mathbb {R} ^{k}}
cumulative distribution function
among the nodes other than node
{\displaystyle \alpha _{i}>1}
prior observations of category
distribution over distributions
 − 1} or {1, 2, ...,
is a sample from the standard
of the observations (i.e. the
It is easier to write out the
is the number of categories).
maximum-a-posteriori estimate
term and simply think of the
{\displaystyle \alpha _{i}-1}
This relationship is used in
{\displaystyle \delta _{xi},}
{\displaystyle c_{i}^{(-n)}}
{\displaystyle \mathbb {X} }
That is, for a set of nodes
{\displaystyle \mathbb {X} }
{\displaystyle {\tilde {x}}}
article on this distribution
hierarchical Bayesian models
concentration hyperparameter
as composed of the elements:
{\displaystyle p_{i}=P(X=i)}
{\displaystyle p(x=i)=p_{i}}
can be recovered using the
{\displaystyle \alpha _{i}}
{\displaystyle \alpha _{i}}
hierarchical Bayesian model
natural language processing
inverse transform sampling
then the following holds:
probability mass functions
, p. 35. MIT press.
conditional distributions
would take given the set
{\displaystyle \cdots -1}
multinomially distributed
probability mass function
multinomial distributions
probability mass function
Formulating distributions
multinomial distributions
Multinomial distribution
Convert the values to a
of the remaining nodes.
In the above model, the
probability distribution
multinomial distribution
in this formulation is:
multinomial distribution
probability distribution
collapsed Gibbs sampling
In some fields, such as
multinomial distribution
multinoulli distribution
categorical distribution
{\displaystyle \alpha }
number between 0 and 1.
preferential attachment
encoded random vectors
multinomial coefficient
are collapsed out of a
Dirichlet distributions
Murphy, K. P. (2012).
Bernoulli distribution
Dirichlet distribution
There are a number of
Dirichlet distribution
posterior distribution
Dirichlet distribution
Dirichlet distribution
{\displaystyle p_{i}.}
Dirichlet distribution
, embedded in 3-space.
posterior distribution
Dirichlet distribution
categorical variables.
Bernoulli distribution
Bernoulli distribution
, the categorical and
Bernoulli distribution
number of categories (
{\displaystyle k>0}
Related distributions
uniformly distributed
and the remainder as
{\displaystyle x_{n}}
, because when doing
values less than 1.)
Bernoulli distributed
{\displaystyle (k-1)}
{\displaystyle p_{i}}
{\displaystyle p_{i}}
Bernoulli-distributed
Categorical variable
normalizing constant
uniform distribution
, equivalent to the
sufficient statistic
for more discussion.
-dimensional simplex
binomial-distributed
event probabilities
Gumbel distribution
marginal likelihood
Marginal likelihood
Bayesian statistics
Bayesian statistics
{\displaystyle n=1}
{\displaystyle k=3}
likelihood function
{\displaystyle x=i}
variational methods
nodes and a single
joint distribution
in multi-variable
for more details.
joint distribution
prior distribution
indicator function
are the 2-simplex
732:joint distribution
602:probability theory
585:
495:
459:
358:
331:
301:
268:
210:
158:
96:
45:
19:
5875:978-0-471-22618-5
5791:Minka, T. (2003)
5025:
5004:
4563:
4493:
4437:
4375:
4211:
4190:
4137:
4081:
4019:
3947:
3913:variational Bayes
3880:
3800:
3774:
3733:
3704:
3432:
3386:
3276:of the parameter
3056:{\displaystyle i}
2945:
2924:
2582:
2571:
2303:
2007:{\displaystyle n}
1957:{\displaystyle n}
1601:
1388:
1211:It shows why the
783:random variable.
735:between a set of
598:
597:
532:
5947:
5923:
5916:
5909:
5900:
5899:
5893:
5892:
5884:
5878:
5867:
5861:
5846:
5840:
5822:
5813:
5802:
5796:
5789:
5772:
5757:
5740:
5737:
5683:
5681:
5680:
5675:
5670:
5669:
5636:
5635:
5615:
5613:
5612:
5607:
5605:
5604:
5585:
5583:
5582:
5577:
5575:
5571:
5570:
5569:
5557:
5556:
5539:
5538:
5533:
5491:
5489:
5488:
5483:
5481:
5480:
5462:
5461:
5440:softmax function
5437:
5435:
5434:
5429:
5427:
5426:
5408:
5407:
5391:
5389:
5388:
5383:
5368:
5366:
5365:
5360:
5352:
5351:
5333:
5332:
5313:
5311:
5310:
5305:
5303:
5302:
5297:
5284:
5282:
5281:
5276:
5274:
5273:
5255:
5254:
5237:machine learning
5127:
5125:
5124:
5119:
5116:
5102:
5083:
5081:
5080:
5075:
5073:
5069:
5068:
5055:
5041:
5026:
5024:
5023:
5022:
5012:
4990:
4989:
4988:
4975:
4961:
4951:
4938:
4930:
4929:
4915:
4900:
4899:
4870:
4868:
4867:
4862:
4860:
4859:
4845:
4832:
4830:
4829:
4824:
4822:
4821:
4805:
4803:
4802:
4797:
4795:
4767:marginalized out
Posterior predictive distribution

The posterior predictive distribution of a new observation in the above model is the distribution that a new observation x̃ would take, given the set X of N categorical observations. Integrating out the parameter vector p:

    p(x̃ = i ∣ X, α) = ∫_p p(x̃ = i ∣ p) p(p ∣ X, α) dp
                     = E_{p ∣ X, α}[p_i]
                     = (c_i + α_i) / (N + Σ_k α_k)

In other words, the probability of a new observation taking category i is the posterior mean of p_i: the observed count of that category plus its pseudocount α_i, normalized so that the probabilities sum to 1. The equivalence between the posterior predictive probability and the expected value of the posterior distribution of p holds because the predictive probability of seeing category i depends linearly on p_i, and an expectation is exactly such a linear operation. The posterior predictive probability of seeing a particular category therefore equals the relative proportion of previous observations in that category, counting the pseudo-observations of the prior.
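The predictive formula above also yields the marginal likelihood by the chain rule: feeding observations in one at a time and multiplying the successive predictive probabilities gives p(X ∣ α). A minimal Python sketch (the function name is illustrative, not from any library), using exact rational arithmetic:

```python
from fractions import Fraction

def predictive(counts, alpha):
    """p(x~ = i | X, alpha) = (c_i + a_i) / (N + sum alpha) for each category i."""
    n, a_sum = sum(counts), sum(alpha)
    return [Fraction(c + a, n + a_sum) for c, a in zip(counts, alpha)]

# By the chain rule, the product of successive predictive probabilities
# equals the marginal likelihood p(X | alpha).
alpha = [1, 1, 1]                 # flat Dirichlet prior, K = 3 categories
counts = [0, 0, 0]
p_data = Fraction(1)
for x in [1, 3, 3]:               # observations, labelled 1..3
    p_data *= predictive(counts, alpha)[x - 1]
    counts[x - 1] += 1

print(predictive(counts, alpha))  # posterior predictive after the data
print(p_data)                     # exact marginal likelihood p(X | alpha) = 1/30
```

The result agrees with the closed-form Dirichlet-multinomial expression for the marginal likelihood, which is a useful consistency check.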
4198:
4182:
4181:
4180:
4168:
4167:
4157:
4148:
4144:
4139:
4138:
4135:
4128:
4120:
4112:
4097:
4083:
4082:
4074:
4065:
4064:
4063:
4043:
4035:
4021:
4020:
4012:
3981:
3979:
3978:
3973:
3971:
3959:
3957:
3956:
3951:
3949:
3948:
3940:
3895:
3893:
3892:
3887:
3885:
3881:
3879:
3875:
3874:
3858:
3854:
3853:
3841:
3840:
3824:
3821:
3816:
3801:
3799:
3798:
3794:
3793:
3792:
3782:
3758:
3757:
3753:
3752:
3751:
3741:
3723:
3715:
3711:
3706:
3705:
3702:
3696:
3688:
3674:
3666:
3655:
3654:
3653:
3633:
3625:
3595:marginalized out
3572:
3570:
3569:
3564:
3556:
3555:
3539:
3537:
3536:
3531:
3523:
3522:
3510:
3509:
3483:
3481:
3480:
3475:
3467:
3466:
3454:
3453:
3433:
3431:
3421:
3420:
3408:
3407:
3394:
3384:
3377:
3376:
3364:
3363:
3353:
3345:
3337:
3323:
3322:
3321:
3315:
3263:
3261:
3260:
3255:
3253:
3252:
3232:
3230:
3229:
3224:
3194:
3192:
3191:
3186:
3160:
3148:
3146:
3145:
3140:
3132:
3131:
3119:
3118:
3102:
3100:
3099:
3094:
3092:
3091:
3079:
3078:
3062:
3060:
3059:
3054:
3042:
3040:
3039:
3034:
3026:
3025:
3009:
3007:
3006:
3001:
2999:
2998:
2956:
2954:
2953:
2948:
2946:
2944:
2943:
2942:
2932:
2916:
2915:
2914:
2902:
2901:
2891:
2883:
2875:
2867:
2866:
Bayesian inference using conjugate prior

In Bayesian statistics, the Dirichlet distribution is the conjugate prior distribution of the categorical distribution (and also of the multinomial distribution). This means that in a model consisting of a data point having a categorical distribution with unknown parameter vector p, where p is treated as a random variable and given a prior distribution defined using a Dirichlet distribution, the posterior distribution of the parameter, after incorporating the knowledge gained from the observed data, is also a Dirichlet distribution. Successive observations can therefore be incorporated one at a time, without running into mathematical difficulties. Formally:

    α = (α₁, …, α_K) = concentration hyperparameter
    p ∣ α = (p₁, …, p_K) ~ Dir(K, α)
    X ∣ p = (x₁, …, x_N) ~ Cat(K, p)

Then the following holds:

    c = (c₁, …, c_K) = number of occurrences of category i, so that c_i = Σ_{j=1}^N [x_j = i]
    p ∣ X, α ~ Dir(K, c + α) = Dir(K, c₁ + α₁, …, c_K + α_K)

This relationship is used in Bayesian statistics to estimate the underlying parameter p of a categorical distribution given a collection of N samples.
2146:
2141:
2136:
2135:
2112:
2110:
2109:
2106:{\displaystyle }
2104:
2038:
2036:
2035:
2030:
2025:
2013:
2011:
2010:
2005:
1994:with parameters
1989:
1987:
1986:
1981:
1979:
1963:
1961:
1960:
1955:
1943:
1941:
1940:
1935:
1901:
1899:
1898:
1893:
1879:
1865:
1864:
1838:
1836:
1835:
1830:
1828:
1814:
1812:
1811:
1806:
1804:
1796:
1792:
1762:
1760:
1759:
1754:
1746:
1745:
1733:
1732:
1708:
1707:
1695:
1694:
1672:
1670:
1669:
1664:
1638:
1636:
1635:
1630:
1627:
1620:
1619:
1609:
1583:
1581:
1580:
1575:
1552:
1551:
1526:
1524:
1523:
1518:
1510:
1509:
1497:
1496:
1484:
1483:
1467:
1465:
1464:
1459:
1425:
1423:
1422:
1417:
1414:
1407:
1406:
1396:
1374:
1372:
1371:
1366:
1364:
1363:
1344:
1342:
1341:
1336:
1330:
1329:
1328:
1318:
1308:
1303:
1282:
1274:
1188:
1186:
1185:
1180:
1162:
1160:
1159:
1156:{\displaystyle }
1154:
1127:
1125:
1124:
1119:
1113:
1096:
1086:
1081:
1060:
1026:
1024:
1023:
1018:
1015:
1008:
1007:
997:
992:
964:
962:
961:
956:
954:
953:
937:
935:
934:
929:
924:
923:
905:
904:
889:
874:
872:
871:
866:
861:
860:
845:
692:machine learning
594:
592:
591:
586:
581:
580:
562:
561:
543:
542:
533:
530:
504:
502:
501:
498:{\displaystyle }
496:
469:
468:
466:
465:
460:
458:
457:
419:
418:
367:
365:
364:
359:
356:
339:
326:
309:
277:
275:
274:
269:
267:
266:
219:
217:
216:
211:
167:
165:
164:
159:
148:
147:
125:
124:
105:
103:
102:
97:
95:
94:
76:
75:
54:
52:
51:
46:
22:
18:
5897:
5896:
5885:
5881:
5868:
5864:
5847:
5843:
5823:
5816:
5810:Iverson bracket
5806:Kronecker delta
5803:
5799:
5790:
5775:
5758:
5754:
5749:
5744:
5743:
5738:
5734:
5729:
5702:
5690:
5665:
5661:
5631:
5627:
5625:
5622:
5621:
5600:
5596:
5594:
5591:
5590:
5565:
5561:
5552:
5548:
5547:
5543:
5534:
5513:
5512:
5504:
5501:
5500:
5476:
5472:
5457:
5453:
5451:
5448:
5447:
5422:
5418:
5403:
5399:
5397:
5394:
5393:
5377:
5374:
5373:
5347:
5343:
5328:
5324:
5322:
5319:
5318:
5298:
5293:
5292:
5290:
5287:
5286:
5269:
5265:
5250:
5246:
5244:
5241:
5240:
5233:
5228:
5142:
In Gibbs sampling, it is common in multi-variable Bayes networks for Dirichlet priors (e.g. in mixture models and models containing mixture components) to be collapsed, i.e. marginalized out, provided the only nodes depending on them are categorically distributed. The collapsing happens separately for each Dirichlet-distribution node, and occurs regardless of any other nodes that may depend on the categorical distributions. The result is a joint distribution over the categorical variables in which the conditional distribution of one variable x_n given the others takes a very simple form:

    p(x_n = i ∣ X^{(−n)}, α) = (c_i^{(−n)} + α_i) / (N − 1 + Σ_i α_i) ∝ c_i^{(−n)} + α_i

where c_i^{(−n)} is the number of nodes having category i among the nodes other than node n.
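The collapsed conditional above can be sketched as follows; this is a minimal illustration under the stated model (function names are illustrative, and categories are indexed 0..K−1 for convenience):

```python
import random

def collapsed_conditional(n, assignments, alpha):
    """Conditional p(x_n = i | x_{-n}, alpha) ∝ c_i^{(-n)} + alpha_i,
    where c_i^{(-n)} counts category i among all assignments except x_n."""
    weights = list(alpha)
    for m, x in enumerate(assignments):
        if m != n:
            weights[x] += 1
    total = sum(weights)          # equals N - 1 + sum(alpha)
    return [w / total for w in weights]

def resample(n, assignments, alpha, rng):
    """One collapsed-Gibbs step: redraw x_n from its conditional in place."""
    probs = collapsed_conditional(n, assignments, alpha)
    assignments[n] = rng.choices(range(len(alpha)), weights=probs)[0]

rng = random.Random(0)
z = [0, 2, 2, 1, 2]               # current assignments, K = 3
print(collapsed_conditional(0, z, alpha=[1.0, 1.0, 1.0]))  # [1/7, 2/7, 4/7]
resample(0, z, [1.0, 1.0, 1.0], rng)
```

Note that the weights never require the Dirichlet parameter p itself: only the counts of the other assignments plus the pseudocounts, which is exactly why the prior can be collapsed.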
4727:
4701:
4700:
4689:
4681:
4672:
4668:
4649:
4648:
4638:
4634:
4630:
4620:
4612:
4604:
4603:
4599:
4589:
4588:
4575:
4555:
4554:
4547:
4543:
4533:
4525:
4517:
4516:
4512:
4502:
4501:
4496:
4490:
4489:
4480:
4472:
4464:
4449:
4429:
4428:
4415:
4414:
4410:
4403:
4395:
4387:
4367:
4366:
4356:
4354:
4351:
4350:
4299:
4298:
4289:
4285:
4276:
4272:
4262:
4261:
4253:
4245:
4236:
4232:
4224:
4214:
4213:
4204:
4200:
4194:
4183:
4176:
4172:
4163:
4159:
4158:
4156:
4146:
4145:
4140:
4134:
4133:
4124:
4116:
4108:
4093:
4073:
4072:
4059:
4058:
4054:
4047:
4039:
4031:
4011:
4010:
4000:
3998:
3995:
3994:
3967:
3965:
3962:
3961:
3939:
3938:
3936:
3933:
3932:
Marginal likelihood

In the above model, the marginal likelihood of the observations (i.e. the joint distribution of the observations, with the prior parameter marginalized out) is a Dirichlet-multinomial distribution:

    p(X ∣ α) = ∫_p p(X ∣ p) p(p ∣ α) dp
             = [Γ(Σ_k α_k) / Γ(N + Σ_k α_k)] ∏_{k=1}^K [Γ(c_k + α_k) / Γ(α_k)]

This distribution plays an important role in hierarchical Bayesian models, because when doing inference over such models using methods such as Gibbs sampling or variational Bayes, Dirichlet prior distributions are often marginalized out. See the article on the Dirichlet-multinomial distribution for more details.
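The ratio of Gamma functions above is best evaluated in log space. A minimal sketch using the standard log-gamma function (the function name is illustrative, not from any library):

```python
from math import lgamma

def log_marginal_likelihood(counts, alpha):
    """log p(X | alpha) for the Dirichlet-categorical model:
    Gamma(sum a) / Gamma(N + sum a) * prod_k Gamma(c_k + a_k) / Gamma(a_k),
    computed with lgamma for numerical stability."""
    n, a_sum = sum(counts), sum(alpha)
    out = lgamma(a_sum) - lgamma(n + a_sum)
    for c, a in zip(counts, alpha):
        out += lgamma(c + a) - lgamma(a)
    return out

# Sanity check: one observation of category 1 under a flat prior over
# K = 2 categories should have probability exactly 1/2.
print(log_marginal_likelihood([1, 0], [1.0, 1.0]))   # log(1/2) ≈ -0.693
```

Working in log space matters because the Gamma factors overflow floating point even for modest N.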
3583:
3551:
3547:
3545:
3542:
3541:
3518:
3514:
3505:
3501:
3492:
3489:
3488:
3462:
3458:
3449:
3445:
3416:
3412:
3403:
3399:
3390:
3385:
3372:
3368:
3359:
3355:
3354:
3352:
3341:
3333:
3317:
3316:
3295:
3294:
3292:
3289:
3288:
3270:
3248:
3244:
3242:
3239:
3238:
3212:
3209:
3208:
3156:
3154:
3151:
3150:
3127:
3123:
3114:
3110:
3108:
3105:
3104:
3087:
3083:
3074:
3070:
3068:
3065:
3064:
3048:
3045:
3044:
3021:
3017:
3015:
3012:
3011:
2994:
2990:
2988:
2985:
2984:
2938:
2934:
2928:
2917:
2910:
2906:
2897:
2893:
2892:
2890:
2879:
2871:
2862:
2858:
2847:
2844:
2843:
2784:
2783:
2774:
2770:
2761:
2757:
2742:
2738:
2729:
2725:
2708:
2703:
2695:
2687:
2670:
2665:
2660:
2652:
2644:
2641:
2640:
2625:
2621:
2612:
2601:
2588:
2584:
2579:
2568:
2566:
2561:
2552:
2548:
2533:
2529:
2524:
2519:
2514:
2510:
2508:
2505:
2504:
2484:
2483:
2475:
2458:
2453:
2444:
2440:
2425:
2421:
2416:
2411:
2406:
2398:
2395:
2394:
2386:
2369:
2364:
2355:
2351:
2336:
2332:
2327:
2322:
2317:
2309:
2306:
2305:
2300:
2298:
2293:
2284:
2280:
2265:
2261:
2256:
2251:
2246:
2242:
2240:
2237:
2236:
2214:random variable
2202:conjugate prior
This 1-of-K vector formulation is the one adopted by Bishop for the likelihood of a categorical variable; it makes the connection to the multinomial distribution explicit.

Properties

- The distribution is completely given by the probabilities associated with each number i: p_i = P(X = i), i = 1, …, k, where 0 ≤ p_i ≤ 1 and Σ_{i=1}^k p_i = 1.
- The possible sets of probabilities are exactly those in the standard (k − 1)-dimensional simplex; for k = 3, the possible parameter vectors are the 2-simplex {(p₁, p₂, p₃) : p₁ + p₂ + p₃ = 1, p_i ≥ 0} embedded in 3-dimensional space.
- If X has a categorical distribution, then the indicator [X = i] (where [·] is the Iverson bracket or, equivalently, a Kronecker delta) is a Bernoulli random variable with parameter p_i; in particular E([X = i]) = p_i.
- The categorical distribution is the special case of the multinomial distribution with n = 1. The sum of n independent trials, each encoded as a 1-of-k vector with the same parameter p, is multinomial-distributed with parameters n and p.
- The sufficient statistic from n independent observations is the set of counts (or, equivalently, the proportions) of observations in each category, given that the total number of observations (= n) is fixed.
- The conjugate prior distribution of the parameters of a categorical distribution is a Dirichlet distribution. See the section below for more discussion.
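Two of these properties — that the category counts are the sufficient statistic, and that E([X = i]) = p_i — can be checked empirically with a short simulation (a sketch using only the standard library; the seed and sample size are arbitrary):

```python
import random
from collections import Counter

rng = random.Random(7)
p = [0.1, 0.6, 0.3]                          # categorical parameters, k = 3
draws = rng.choices([1, 2, 3], weights=p, k=20_000)

counts = Counter(draws)                      # sufficient statistic from n draws
indicator_mean = counts[2] / len(draws)      # empirical E[[X = 2]]
print(counts, indicator_mean)                # indicator_mean should be near p_2 = 0.6
assert sum(counts.values()) == len(draws)    # counts partition the n observations
```

The empirical mean of the indicator converges to p_i at the usual Monte Carlo rate, consistent with the Bernoulli([X = i]) property.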
1217:conjugate prior
1168:
1165:
1164:
1136:
1133:
1132:
1097:
1092:
1082:
1071:
1056:
1042:
1039:
1038:
1032:Iverson bracket
1003:
999:
993:
982:
977:
974:
971:
970:
949:
945:
943:
940:
939:
919:
915:
900:
896:
885:
883:
880:
879:
856:
852:
841:
821:
818:
817:
761:
681:
612:(also called a
576:
572:
557:
553:
538:
534:
529:
524:
521:
520:
507:Iverson bracket
478:
475:
474:
453:
449:
414:
410:
375:
372:
371:
369:
368:
340:
335:
310:
305:
284:
281:
280:
278:
262:
258:
235:
232:
231:
181:
178:
177:
143:
139:
120:
116:
111:
108:
107:
90:
86:
71:
67:
65:
62:
61:
60:
34:
31:
30:
17:
12:
11:
5:
5895:
5894:
5879:
5862:
5841:
5814:
5797:
5773:
5751:
5750:
5748:
5745:
5742:
5741:
5731:
5730:
5728:
5725:
5724:
5723:
5718:
5713:
5708:
5701:
5698:
5697:
5696:
5689:
5686:
5673:
5668:
5664:
5660:
5657:
5654:
5651:
5648:
5645:
5642:
5639:
5634:
5630:
5603:
5599:
5587:
5586:
5574:
5568:
5564:
5560:
5555:
5551:
5546:
5542:
5537:
5532:
5529:
5526:
5522:
5519:
5516:
5511:
5508:
5479:
5475:
5471:
5468:
5465:
5460:
5456:
5425:
5421:
5417:
5414:
5411:
5406:
5402:
5381:
5370:
5369:
5358:
5355:
5350:
5346:
5342:
5339:
5336:
5331:
5327:
5301:
5296:
5272:
5268:
5264:
5261:
5258:
5253:
5249:
5232:
5229:
5225:
5220:
5219:
5216:
5205:
5194:
5193:
5182:
5171:
5164:
5141:
5138:
5115:
5112:
5109:
5106:
5101:
5097:
5085:
5084:
5067:
5063:
5059:
5054:
5051:
5048:
5045:
5040:
5036:
5031:
5028:
5021:
5017:
5011:
5007:
5003:
5000:
4997:
4994:
4987:
4983:
4979:
4974:
4971:
4968:
4965:
4960:
4956:
4948:
4945:
4943:
4941:
4937:
4933:
4928:
4925:
4922:
4919:
4914:
4909:
4906:
4903:
4898:
4894:
4890:
4887:
4884:
4883:
4858:
4855:
4852:
4849:
4844:
4820:
4816:
4794:
4763:mixture models
4755:Bayes networks
4747:Gibbs sampling
4742:
4739:
4725:
4715:
4714:
4699:
4696:
4692:
4688:
4684:
4680:
4675:
4671:
4667:
4664:
4661:
4657:
4654:
4652:
4650:
4646:
4641:
4637:
4633:
4629:
4623:
4619:
4615:
4611:
4607:
4602:
4597:
4594:
4592:
4590:
4586:
4582:
4578:
4574:
4571:
4568:
4562:
4559:
4553:
4550:
4546:
4542:
4536:
4532:
4528:
4524:
4520:
4515:
4510:
4507:
4505:
4503:
4499:
4487:
4483:
4479:
4475:
4471:
4467:
4463:
4460:
4456:
4452:
4448:
4445:
4442:
4436:
4433:
4427:
4424:
4418:
4413:
4409:
4406:
4404:
4402:
4398:
4394:
4390:
4386:
4383:
4380:
4374:
4371:
4365:
4362:
4359:
4358:
4336:
4335:
4332:expected count
4328:
4325:expected value
4321:
4313:
4312:
4297:
4292:
4288:
4284:
4279:
4275:
4270:
4267:
4265:
4263:
4260:
4256:
4252:
4248:
4244:
4239:
4235:
4231:
4227:
4222:
4219:
4217:
4215:
4207:
4203:
4197:
4193:
4189:
4186:
4179:
4175:
4171:
4166:
4162:
4154:
4151:
4149:
4147:
4143:
4131:
4127:
4123:
4119:
4115:
4111:
4107:
4104:
4100:
4096:
4092:
4089:
4086:
4080:
4077:
4071:
4068:
4062:
4057:
4053:
4050:
4048:
4046:
4042:
4038:
4034:
4030:
4027:
4024:
4018:
4015:
4009:
4006:
4003:
4002:
3970:
3946:
3943:
3924:
3921:
3909:Gibbs sampling
3897:
3896:
3878:
3873:
3869:
3865:
3862:
3857:
3852:
3848:
3844:
3839:
3835:
3831:
3828:
3820:
3815:
3812:
3809:
3805:
3797:
3791:
3787:
3781:
3777:
3773:
3770:
3766:
3762:
3756:
3750:
3746:
3740:
3736:
3731:
3727:
3721:
3718:
3716:
3714:
3710:
3699:
3695:
3691:
3687:
3683:
3680:
3677:
3673:
3669:
3665:
3661:
3658:
3652:
3647:
3643:
3640:
3638:
3636:
3632:
3628:
3624:
3620:
3617:
3614:
3613:
3582:
3579:
3562:
3559:
3554:
3550:
3529:
3526:
3521:
3517:
3513:
3508:
3504:
3499:
3496:
3485:
3484:
3473:
3470:
3465:
3461:
3457:
3452:
3448:
3443:
3440:
3436:
3430:
3427:
3424:
3419:
3415:
3411:
3406:
3402:
3398:
3393:
3389:
3383:
3380:
3375:
3371:
3367:
3362:
3358:
3351:
3348:
3344:
3340:
3336:
3332:
3329:
3326:
3320:
3314:
3311:
3308:
3304:
3301:
3298:
3269:
3268:MAP estimation
3266:
3251:
3247:
3222:
3219:
3216:
3184:
3181:
3178:
3175:
3172:
3169:
3166:
3163:
3159:
3138:
3135:
3130:
3126:
3122:
3117:
3113:
3090:
3086:
3082:
3077:
3073:
3052:
3032:
3029:
3024:
3020:
2997:
2993:
2958:
2957:
2941:
2937:
2931:
2927:
2923:
2920:
2913:
2909:
2905:
2900:
2896:
2889:
2886:
2882:
2878:
2874:
2870:
2865:
2861:
2857:
2854:
2851:
2833:expected value
2798:
2797:
2782:
2777:
2773:
2769:
2764:
2760:
2756:
2753:
2750:
2745:
2741:
2737:
2732:
2728:
2724:
2721:
2718:
2715:
2712:
2709:
2707:
2704:
2702:
2698:
2694:
2690:
2686:
2683:
2680:
2677:
2674:
2671:
2669:
2666:
2663:
2659:
2655:
2651:
2647:
2643:
2642:
2639:
2636:
2633:
2628:
2624:
2620:
2615:
2610:
2607:
2604:
2600:
2596:
2591:
2587:
2578:
2575:
2567:
2565:
2562:
2560:
2555:
2551:
2547:
2544:
2541:
2536:
2532:
2528:
2525:
2523:
2520:
2517:
2513:
2512:
2498:
2497:
2482:
2478:
2474:
2471:
2468:
2465:
2462:
2459:
2457:
2454:
2452:
2447:
2443:
2439:
2436:
2433:
2428:
2424:
2420:
2417:
2415:
2412:
2409:
2405:
2401:
2397:
2396:
2393:
2389:
2385:
2382:
2379:
2376:
2373:
2370:
2368:
2365:
2363:
2358:
2354:
2350:
2347:
2344:
2339:
2335:
2331:
2328:
2326:
2323:
2320:
2316:
2312:
2308:
2307:
2299:
2297:
2294:
2292:
2287:
2283:
2279:
2276:
2273:
2268:
2264:
2260:
2257:
2255:
2252:
2249:
2245:
2244:
2216:and give it a
2189:
2186:
2185:
2184:
2173:
2168:
2164:
2139:
2134:
2131:
2127:
2102:
2099:
2096:
2093:
2090:
2071:
2056:
2040:
2039:
2028:
2024:
2003:
1978:
1953:
1933:
1930:
1927:
1904:
1903:
1902:
1891:
1888:
1885:
1882:
1878:
1874:
1871:
1868:
1863:
1859:
1845:
1844:
1827:
1815:
1803:
1799:
1795:
1791:
1787:
1783:
1780:
1770:
1763:
1752:
1749:
1744:
1740:
1736:
1731:
1727:
1723:
1720:
1717:
1714:
1711:
1706:
1702:
1698:
1693:
1689:
1662:
1659:
1656:
1653:
1650:
1626:
1623:
1618:
1614:
1608:
1604:
1573:
1570:
1567:
1564:
1561:
1558:
1555:
1550:
1546:
1516:
1513:
1508:
1504:
1500:
1495:
1491:
1487:
1482:
1478:
1457:
1454:
1451:
1435:
1432:
1413:
1410:
1405:
1401:
1395:
1391:
1362:
1358:
1346:
1345:
1334:
1327:
1323:
1317:
1313:
1307:
1302:
1299:
1296:
1292:
1288:
1285:
1281:
1277:
1273:
1269:
1266:
1225:
1224:
1209:
1202:
1178:
1175:
1172:
1152:
1149:
1146:
1143:
1140:
1129:
1128:
1117:
1112:
1109:
1106:
1103:
1100:
1095:
1091:
1085:
1080:
1077:
1074:
1070:
1066:
1063:
1059:
1055:
1052:
1049:
1046:
1014:
1011:
1006:
1002:
996:
991:
988:
985:
981:
952:
948:
927:
922:
918:
914:
911:
908:
903:
899:
895:
892:
888:
876:
875:
864:
859:
855:
851:
848:
844:
840:
837:
834:
831:
828:
825:
771:is the set of
760:
757:
680:
677:
653:generalization
Categorical distribution

Parameters: k > 0 — number of categories (integer); p₁, …, p_k — event probabilities (p_i ≥ 0, Σ p_i = 1)
Support: x ∈ {1, …, k}
PMF: (1) p(x = i) = p_i
     (2) p(x) = p₁^{[x = 1]} ⋯ p_k^{[x = k]}, where [x = i] is the Iverson bracket
     (3) p(x) = p₁^{x₁} ⋯ p_k^{x_k}, where x is a 1-of-k binary vector
Mode: the i such that p_i = max(p₁, …, p_k)
References

Minka, T. (2003). Bayesian inference, entropy and the multinomial distribution. Technical report, Microsoft Research.
Bishop, C. (2006). Pattern Recognition and Machine Learning. Springer. ISBN 0-387-31073-8.
Johnson, N. L., Kotz, S. and Balakrishnan, N. (1997). Discrete Multivariate Distributions. Wiley. ISBN 0-471-12844-9. (p. 105)
Adams, Ryan. "The Gumbel–Max Trick for Discrete Distributions".
5722:
5719:
5717:
5714:
5712:
5709:
5707:
5704:
5703:
Sampling

The most common way to sample from a categorical distribution uses a type of inverse transform sampling. Assume the distribution is expressed as "proportional to" some expression, with unknown normalizing constant. Before taking any samples, one prepares some values: first compute the unnormalized value of the distribution for each category; then normalize them by summing the values and dividing each by the sum, so that they sum to 1; finally compute the cumulative distribution function by accumulating the normalized values into a running total. Then, each time a value needs to be sampled:

1. Pick a uniformly distributed number between 0 and 1.
2. Locate the greatest number in the CDF whose value is less than or equal to the number just chosen. This can be done in time O(log(k)), by binary search.
3. Return the category corresponding to this CDF value.

Sampling via the Gumbel distribution

In machine learning it is typical to parametrize the categorical distribution p₁, …, p_k via an unconstrained representation in R^k, whose components are given by

    γ_i = log p_i + c

where c is an arbitrary constant. Given this representation, the p_i can be recovered with the softmax function, after which the techniques above apply. There is, however, a more direct sampling method that uses samples from the Gumbel distribution. Let g₁, …, g_k be k independent draws from the standard Gumbel distribution; then

    c = argmax_i (γ_i + g_i)

is a sample from the desired categorical distribution. (If u_i is a sample from the standard uniform distribution, then g_i = −log(−log u_i) is a sample from the standard Gumbel distribution.)
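Both sampling methods can be sketched in a few lines of standard-library Python (function names are illustrative; `bisect` performs the O(log k) CDF search):

```python
import bisect, itertools, math, random

def make_sampler(weights, rng):
    """Inverse-CDF sampler: precompute the CDF once, then each draw is one
    uniform variate plus an O(log k) binary search. Categories are 1..k."""
    total = sum(weights)                      # normalizing constant
    cdf = list(itertools.accumulate(w / total for w in weights))
    return lambda: bisect.bisect_left(cdf, rng.random()) + 1

def gumbel_max(log_weights, rng):
    """Gumbel-max trick: argmax_i (gamma_i + g_i) with g_i standard Gumbel,
    where g_i = -log(-log(u_i)) for uniform u_i."""
    draws = [lw - math.log(-math.log(rng.random())) for lw in log_weights]
    return draws.index(max(draws)) + 1

rng = random.Random(42)
sample = make_sampler([2.0, 3.0, 5.0], rng)        # p = (0.2, 0.3, 0.5)
xs = [sample() for _ in range(10_000)]
print([xs.count(i) / len(xs) for i in (1, 2, 3)])  # close to [0.2, 0.3, 0.5]
```

Note that `gumbel_max` never normalizes the log-weights: adding any constant c to all of them leaves the argmax distribution unchanged, which is the point of the trick.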
5110:
5107:
5099:
5095:
5065:
5061:
5057:
5049:
5046:
5038:
5034:
5029:
5019:
5015:
5009:
5005:
5001:
4998:
4995:
4992:
4985:
4981:
4977:
4969:
4966:
4958:
4954:
4946:
4944:
4931:
4923:
4920:
4907:
4904:
4901:
4896:
4892:
4885:
4874:
4873:
4872:
4853:
4850:
4818:
4814:
4782:
4780:
4776:
4772:
4768:
4764:
4761:priors (e.g.
4760:
4756:
4752:
4748:
4738:
4736:
4730:
4728:
4721:
4697:
4686:
4678:
4673:
4669:
4662:
4655:
4653:
4644:
4639:
4635:
4631:
4627:
4617:
4609:
4595:
4593:
4584:
4572:
4569:
4566:
4557:
4548:
4544:
4540:
4530:
4522:
4508:
4506:
4477:
4469:
4458:
4446:
4443:
4440:
4431:
4422:
4411:
4407:
4405:
4392:
4384:
4381:
4378:
4369:
4360:
4349:
4348:
4347:
4345:
4341:
4333:
4329:
4326:
4322:
4318:
4317:
4316:
4295:
4290:
4286:
4282:
4277:
4273:
4268:
4266:
4250:
4242:
4237:
4233:
4220:
4218:
4205:
4201:
4195:
4191:
4187:
4184:
4177:
4173:
4169:
4164:
4160:
4152:
4150:
4121:
4113:
4102:
4090:
4087:
4084:
4075:
4066:
4055:
4051:
4049:
4036:
4028:
4025:
4022:
4013:
4004:
3993:
3992:
3991:
3989:
3985:
3941:
3930:
3920:
3918:
3914:
3910:
3906:
3902:
3871:
3867:
3850:
3846:
3842:
3837:
3833:
3818:
3813:
3810:
3807:
3803:
3795:
3789:
3785:
3779:
3775:
3771:
3768:
3764:
3754:
3748:
3744:
3738:
3734:
3729:
3719:
3717:
3689:
3678:
3667:
3656:
3645:
3641:
3639:
3626:
3615:
3604:
3603:
3602:
3600:
3596:
3592:
3588:
Intuitively, the hyperparameter vector α can be viewed as pseudocounts, i.e. as representing the number of observations in each category that have effectively already been seen; the counts of the actual observations are then simply added in to derive the posterior. The expected value of the posterior distribution of each category probability is

    E[p_i ∣ X, α] = (c_i + α_i) / (N + Σ_k α_k) ,

i.e. the smoothed proportion of occurrences of category i, counting the pseudo-observations contributed by the prior. With a uniform prior over the simplex (α₁ = ⋯ = α_K = 1), this is the familiar add-one (Laplace) smoothing of the observed proportions.

The posterior mean can be interpreted as a point estimate of the true parameter p, i.e. the actual categorical distribution that generated the data. For example, if 3 categories in the ratio 40:5:55 are in the observed data, then ignoring the effect of the prior distribution, the true parameter – i.e. the true, underlying distribution that generated our observed data – would be expected to have the average value of (0.40, 0.05, 0.55), which is indeed what the posterior reveals. However, the true distribution might actually be (0.35, 0.07, 0.58) or (0.42, 0.04, 0.54) or various other nearby possibilities. The amount of uncertainty involved here is specified by the variance of the posterior, which is controlled by the Dirichlet distribution and shrinks as more and more data is observed.

MAP estimation

The maximum-a-posteriori estimate of the parameter p in the above model is simply the mode of the posterior Dirichlet distribution, i.e.

    argmax_p p(p ∣ X, α):   p_i = (c_i + α_i − 1) / (N + Σ_k α_k − K) ,

which is well defined whenever c_k + α_k > 1 for all k (for example, whenever α_k > 1 ∀k). With a uniform prior (α_k = 1 ∀k) this reduces to the maximum-likelihood estimate p_i = c_i / N.
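The posterior mean and mode are one-line computations. The sketch below (function names are illustrative) uses the 40:5:55 example with a flat prior, under which the mode recovers the raw proportions exactly:

```python
def posterior_mean(counts, alpha):
    """E[p_i | X, alpha] = (c_i + a_i) / (N + sum(alpha))."""
    n, a_sum = sum(counts), sum(alpha)
    return [(c + a) / (n + a_sum) for c, a in zip(counts, alpha)]

def posterior_mode(counts, alpha):
    """MAP estimate, i.e. the mode of Dir(c + alpha):
    p_i = (c_i + a_i - 1) / (N + sum(alpha) - K), valid when all c_i + a_i > 1."""
    n, a_sum, k = sum(counts), sum(alpha), len(alpha)
    assert all(c + a > 1 for c, a in zip(counts, alpha))
    return [(c + a - 1) / (n + a_sum - k) for c, a in zip(counts, alpha)]

counts = [40, 5, 55]                    # observed category counts, N = 100
flat = [1.0, 1.0, 1.0]                  # uniform prior over the simplex
print(posterior_mean(counts, flat))     # slightly smoothed toward uniform
print(posterior_mode(counts, flat))     # [0.4, 0.05, 0.55]: exactly c_i / N
```

With the flat prior, the mean is pulled slightly toward the uniform distribution (each count gains one pseudocount), while the mode coincides with the maximum-likelihood proportions.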
2775:
2771:
2767:
2762:
2758:
2754:
2751:
2748:
2743:
2739:
2735:
2730:
2726:
2722:
2719:
2713:
2710:
2705:
2692:
2684:
2681:
2675:
2672:
2667:
2657:
2649:
2634:
2631:
2626:
2622:
2613:
2608:
2605:
2602:
2598:
2594:
2589:
2585:
2576:
2573:
2563:
2553:
2549:
2545:
2542:
2539:
2534:
2530:
2521:
2503:
2502:
2501:
2472:
2469:
2463:
2460:
2455:
2445:
2441:
2437:
2434:
2431:
2426:
2422:
2413:
2403:
2383:
2380:
2374:
2371:
2366:
2356:
2352:
2348:
2345:
2342:
2337:
2333:
2324:
2314:
2295:
2285:
2281:
2277:
2274:
2271:
2266:
2262:
2253:
2235:
2234:
2233:
2230:
2227:
2223:
2219:
2215:
2211:
2207:
2203:
2199:
2195:
2171:
2166:
2162:
2153:
2137:
2132:
2129:
2125:
2116:
2097:
2094:
2091:
2080:
2076:
2072:
2069:
2065:
2061:
2057:
2054:
2053:section below
2050:
2046:
2042:
2041:
2026:
2001:
1993:
1967:
1951:
1944:. The sum of
1931:
1928:
1925:
1917:
1913:
1909:
1905:
1889:
1883:
1880:
1869:
1866:
1861:
1857:
1849:
1848:
1847:
1846:
1842:
1816:
1797:
1793:
1785:
1781:
1771:
1768:
1764:
1750:
1747:
1742:
1738:
1734:
1729:
1725:
1721:
1718:
1715:
1712:
1709:
1704:
1700:
1696:
1691:
1687:
1678:
1674:
1657:
1654:
1651:
1624:
1621:
1616:
1612:
1606:
1602:
1591:
1587:
1568:
1565:
1562:
1556:
1553:
1548:
1544:
1535:
1531:
1530:
1514:
1511:
1506:
1502:
1498:
1493:
1489:
1485:
1480:
1476:
1455:
1452:
1449:
1440:
1431:
1429:
1411:
1408:
1403:
1399:
1393:
1389:
1378:
1360:
1356:
1332:
1325:
1321:
1315:
1311:
1305:
1300:
1297:
1294:
1290:
1286:
1275:
1264:
In probability theory and statistics, a categorical distribution (also called a generalized Bernoulli distribution or multinoulli distribution) is a discrete probability distribution that describes the possible results of a random variable that can take on one of K possible categories, with the probability of each category separately specified. There is no innate underlying ordering of these outcomes, but numerical labels are often attached for convenience in describing the distribution (e.g., 1 to K). The K-dimensional categorical distribution is the most general distribution over a K-way event; any other discrete distribution over a size-K sample space is a special case. The parameters specifying the probabilities of each possible outcome are constrained only by the fact that each must be in the range 0 to 1, and all must sum to 1.

The categorical distribution is the generalization of the Bernoulli distribution for a categorical random variable, i.e. for a discrete variable with more than two possible outcomes, such as the roll of a die. On the other hand, the categorical distribution is a special case of the multinomial distribution, in that it gives the probabilities of potential outcomes of a single drawing rather than multiple drawings.

Terminology

Occasionally, the categorical distribution is termed the "discrete distribution". However, this properly refers not to one particular family of distributions but to a general class of distributions.

In some fields, such as machine learning and natural language processing, the categorical and multinomial distributions are conflated, and it is common to speak of a "multinomial distribution" when a "categorical distribution" would be more precise. This imprecise usage stems from the fact that it is sometimes convenient to express the outcome of a categorical distribution as a "1-of-K" vector (a vector with one element containing a 1 and all other elements containing a 0) rather than as an integer in the range 1 to K; in this form, a categorical distribution is equivalent to a multinomial distribution over a single observation.

However, conflating the two distributions can lead to problems. The joint distribution of a set of categorically distributed variables has two different forms depending on whether it is characterized over the individual outcomes or over multinomial-style counts of outcomes in each category (similar to the distinction between a set of Bernoulli-distributed nodes and a single binomial-distributed node). Both forms have very similar-looking probability mass functions, but the multinomial-style one carries an extra factor, a multinomial coefficient, that is a constant equal to 1 in the categorical-style form. Confusing the two can easily lead to incorrect results in settings where this extra factor is not constant with respect to the distributions of interest (the factor is, however, frequently constant in the complete conditionals used in Gibbs sampling and in the optimal distributions used in variational methods).

Formulation

The probability mass function f of a categorical distribution, over a sample space of k categories labelled 1, …, k and with parameter vector p = (p₁, …, p_k), is

    f(x = i ∣ p) = p_i ,

where p_i represents the probability of seeing element i and Σ_{i=1}^k p_i = 1.

Another formulation that appears more complicated but facilitates mathematical manipulation uses the Iverson bracket:

    f(x ∣ p) = ∏_{i=1}^k p_i^{[x = i]} ,

where [x = i] evaluates to 1 if x = i and 0 otherwise. This formulation makes it easier to write out the likelihood function of a set of independent identically distributed categorical variables, connects the categorical distribution to the related multinomial distribution, and shows why the Dirichlet distribution is the conjugate prior of the categorical distribution.

Yet another formulation treats the outcome as a 1-of-K binary random vector x = (x₁, …, x_k) of dimension k, in which exactly one element x_i equals 1 (for the observed category i) and the rest equal 0. Then

    f(x ∣ p) = ∏_{i=1}^k p_i^{x_i} .
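The equivalence of the direct and Iverson-bracket forms of the PMF is easy to check numerically; the following is a minimal Python sketch (function names are illustrative, not from any library):

```python
import math

def categorical_pmf(i, p):
    """Direct form of the PMF: f(x = i | p) = p_i.
    Categories are labelled 1..k, so p[i - 1] is the probability of category i."""
    assert abs(sum(p) - 1.0) < 1e-9, "parameters must sum to 1"
    return p[i - 1]

def categorical_pmf_iverson(x, p):
    """Iverson-bracket form: f(x | p) = prod_i p_i^[x = i].
    All factors with i != x are p_i^0 = 1, leaving just p_x."""
    return math.prod(pi ** (1 if x == i else 0) for i, pi in enumerate(p, start=1))

p = [0.2, 0.3, 0.5]
print(categorical_pmf(2, p))           # 0.3
print(categorical_pmf_iverson(2, p))   # 0.3, via the product form
```

The product form is clearly wasteful computationally; its value is notational, since products of such factors over many observations collapse into powers of counts.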
577:
573:
569:
566:
563:
558:
554:
544:
539:
535:
526:
517:
513:
508:
489:
486:
483:
472:
471:
470:
454:
450:
446:
440:
437:
434:
427:
424:
421:
415:
411:
407:
401:
398:
395:
389:
383:
377:
350:
347:
344:
336:
332:
328:
320:
317:
314:
306:
302:
298:
292:
286:
263:
259:
255:
249:
246:
243:
237:
226:
222:
204:
201:
198:
195:
192:
186:
183:
174:
170:
152:
149:
144:
140:
132:
129:
126:
121:
117:
91:
87:
83:
80:
77:
72:
68:
58:
42:
39:
36:
27:
23:
5882:
5865:
5849:
5844:
5831:, Springer.
5828:
5800:
5760:
5755:
5735:
5588:
5493:
5371:
5234:
5221:
5208:
5195:
5189:
5178:
5174:
5154:
5143:
5133:
5129:
5086:
4783:
4744:
4731:
4723:
4719:
4716:
4339:
4337:
4331:
4314:
3983:
3926:
3898:
3584:
3574:
3486:
3277:
3271:
3234:
3204:
2982:
2969:
2967:
2961:
2959:
2830:
2825:
2822:pseudocounts
2817:
2809:
2805:
2799:
2499:
2231:
2209:
2191:
2074:
2067:
2063:
1965:
1915:
1907:
1840:
1766:
1676:
1589:
1585:
1533:
1376:
1347:
1251:
1244:
1240:
1236:
1232:
1226:
1197:of a set of
1130:
1029:
966:
877:
810:
799:
795:
791:
788:sample space
785:
772:
769:sample space
762:
713:
708:
704:
689:
682:
669:special case
650:
646:sample space
642:
638:
634:
630:
625:
617:
613:
609:
599:
229:
3103:represents
2224:, then the
2070:) is fixed.
781:categorical
679:Terminology
661:categorical
20:Categorical
5825:Bishop, C.
5769:0262018020
5747:References
3540:is to set
2814:hyperprior
2051:. See the
1434:Properties
606:statistics
26:Parameters
5852:, Wiley.
5659:
5653:−
5647:
5641:−
5550:γ
5541:
5467:…
5413:…
5380:α
5357:α
5341:
5326:γ
5260:…
5209:O(log(k))
5168:normalize
5108:−
5062:α
5047:−
5030:∝
5016:α
5006:∑
4996:−
4982:α
4967:−
4936:α
4921:−
4908:∣
4851:−
4759:Dirichlet
4691:α
4679:∣
4663:
4628:
4622:α
4610:∣
4573:∣
4561:~
4541:
4535:α
4523:∣
4482:α
4470:∣
4447:∣
4435:~
4412:∫
4397:α
4385:∣
4373:~
4287:α
4269:∝
4255:α
4243:∣
4202:α
4192:∑
4174:α
4126:α
4114:∣
4091:∣
4079:~
4056:∫
4041:α
4029:∣
4017:~
3945:~
3905:inference
3868:α
3861:Γ
3847:α
3827:Γ
3804:∏
3786:α
3776:∑
3761:Γ
3745:α
3735:∑
3726:Γ
3694:α
3690:∣
3668:∣
3646:∫
3631:α
3627:∣
3549:α
3503:α
3495:∀
3447:α
3439:∀
3423:−
3401:α
3388:∑
3379:−
3357:α
3339:∣
3325:
3246:α
3218:−
3215:⋯
3199:over the
3180:…
3158:α
3134:−
3125:α
3085:α
3028:−
3019:α
2992:α
2936:α
2926:∑
2908:α
2881:α
2869:∣
2853:
2772:α
2752:…
2740:α
2714:
2697:α
2676:
2668:∼
2662:α
2650:∣
2599:∑
2543:…
2464:
2456:∼
2435:…
2404:∣
2388:α
2375:
2367:∼
2346:…
2319:α
2315:∣
2282:α
2275:…
2263:α
2248:α
2126:δ
2117:function
2081:function
1782:
1748:≤
1722:≤
1655:−
1641:standard
1603:∑
1390:∑
1291:∏
1276:∣
1069:∏
1054:∣
980:∑
910:…
839:∣
567:…
447:⋅
425:⋯
408:⋅
329:⋯
199:…
187:∈
137:Σ
127:≥
81:…
5877:, pp. 25
5688:See also
5177:, where
5140:Sampling
3573:for all
3284:, i.e.,
2978:variance
1592:, where
1588:= 1,...,
5827:(2006)
5620:, then
5200:Pick a
5146:methods
4871:, then
3597:) is a
3201:simplex
2816:vector
2200:is the
2113:or the
1914:. Then
1910:is the
1215:is the
671:of the
655:of the
633:). The
620:) is a
505:is the
173:Support
57:integer
5873:
5856:
5835:
5767:
5446:. Let
5372:where
5087:where
2196:, the
1906:where
1675:; for
1428:Bishop
1348:where
1131:where
878:where
779:for a
767:whose
722:where
703:"1-of-
659:for a
473:where
6904:Ewens
6730:Voigt
6702:Slash
6483:Lomax
6478:Log-t
6383:Gamma
6330:Hyper
6320:Davis
6315:Dagum
6171:Bates
6161:ARGUS
6045:Borel
5727:Notes
5211:, by
5170:them.
4773:is a
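The multinomial connection can be illustrated by simulation: drawing X ~ Cat(p) and forming the one-hot vector Y with Y_i = I(X = i) gives a vector whose mean approximates p, matching E[x] = p (a sketch with an assumed seed and made-up probabilities):

```python
import random

random.seed(0)
p = [0.2, 0.5, 0.3]
k, n = len(p), 100_000

# draw categorical samples and accumulate the one-hot vectors Y
totals = [0] * k
for _ in range(n):
    x = random.choices(range(k), weights=p)[0]  # X ~ Cat(p)
    y = [1 if i == x else 0 for i in range(k)]  # Y_i = I(X = i): multinomial with n = 1
    for i in range(k):
        totals[i] += y[i]

means = [t / n for t in totals]  # empirical E[Y], approximately equal to p
```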
Bayesian inference using conjugate prior

In Bayesian statistics, the Dirichlet distribution is the conjugate prior distribution of the categorical distribution (and also of the multinomial distribution): if the parameter vector p of a categorical likelihood is given a Dirichlet prior, the posterior distribution of p is again a Dirichlet. Formally, the model is:

 α = (α_1, …, α_K) = concentration hyperparameter
 p ∣ α = (p_1, …, p_K) ∼ Dir(K, α)
 X ∣ p = (x_1, …, x_N) ∼ Cat(K, p)

Collecting the data into counts

 c = (c_1, …, c_K), where c_i is the number of occurrences of category i, so that c_i = ∑_{j=1}^N [x_j = i],

the posterior distribution is:

 p ∣ X, α ∼ Dir(K, c + α) = Dir(K, c_1 + α_1, …, c_K + α_K).

Intuitively, the hyperparameter vector α can be viewed as pseudocounts: the components α_i behave as if α_i observations of category i had already been seen before the data X were collected. The expected value of the posterior distribution is:

 E[p_i ∣ X, α] = (c_i + α_i) / (N + ∑_k α_k).

The true distribution that generated the data may deviate from this posterior mean; the amount of uncertainty involved is quantified by the variance of the posterior. The mode of the posterior involves the shifted counts c_i + α_i − 1; with a uniform prior α = (1, 1, …, 1), i.e. a uniform distribution over the simplex of probability vectors p, the mode coincides with the maximum likelihood estimate c_i/N.
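The conjugate update is just vector addition of counts and pseudocounts; a minimal sketch (the data and hyperparameters are made up):

```python
from collections import Counter

K = 3
alpha = [1.0, 1.0, 1.0]          # Dirichlet concentration hyperparameters (pseudocounts)
data = [1, 3, 2, 2, 3, 3, 2, 2]  # observed categories x_1..x_N, each in {1..K}

counts = Counter(data)
c = [counts.get(i + 1, 0) for i in range(K)]       # c_i = occurrences of category i

posterior = [ci + ai for ci, ai in zip(c, alpha)]  # p | X, alpha ~ Dir(K, c + alpha)
N = len(data)
mean = [a / (N + sum(alpha)) for a in posterior]   # E[p_i | X, alpha]

print(posterior)  # [2.0, 5.0, 4.0]
print(mean)       # posterior mean; the components sum to 1
```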
MAP estimation

The maximum-a-posteriori (MAP) estimate of the parameter p is the mode of the posterior Dirichlet distribution, i.e.,

 arg max_p p(p ∣ X) = (α_i + c_i − 1) / (∑_i (α_i + c_i − 1)), ∀ i: α_i + c_i > 1.

In many practical applications, the only way to guarantee the condition α_i + c_i > 1 is to set α_i > 1 for all i.
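The MAP formula translates directly into code; a sketch that, as the condition above requires, checks that α_i + c_i > 1 for every i:

```python
def map_estimate(c, alpha):
    """Mode of the Dirichlet posterior Dir(c + alpha):
    p_i = (alpha_i + c_i - 1) / sum_j (alpha_j + c_j - 1),
    defined only when alpha_i + c_i > 1 for all i."""
    a = [ai + ci for ai, ci in zip(alpha, c)]
    if any(x <= 1 for x in a):
        raise ValueError("MAP undefined unless alpha_i + c_i > 1 for all i")
    denom = sum(x - 1 for x in a)
    return [(x - 1) / denom for x in a]

# counts c and a prior with alpha_i > 1, so the condition holds even for c_i = 0
est = map_estimate([1, 4, 3], [2.0, 2.0, 2.0])
print(est)  # components sum to 1
```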
Marginal likelihood

In the above model, the marginal likelihood of the observations (i.e. the joint distribution of the observations, with the parameter vector p marginalized out) is a Dirichlet-multinomial distribution:

 p(X ∣ α) = ∫_p p(X ∣ p) p(p ∣ α) dp
          = Γ(∑_k α_k) / Γ(N + ∑_k α_k) · ∏_{k=1}^K Γ(c_k + α_k) / Γ(α_k).

This distribution plays an important role in hierarchical Bayesian models, because when doing inference over such models using methods such as Gibbs sampling or variational Bayes, Dirichlet prior distributions are often marginalized out.

Posterior predictive distribution

The posterior predictive distribution of a new observation x̃, given the set X of N categorical observations, is found by integrating the parameter p against its posterior:

 p(x̃ = i ∣ X, α) = ∫_p p(x̃ = i ∣ p) p(p ∣ X, α) dp = E[p_i ∣ X, α] = (c_i + α_i) / (N + ∑_k α_k) ∝ c_i + α_i.

In other words, the probability that a new observation falls in category i is simply the posterior expected value of p_i.
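Both the Dirichlet-multinomial marginal likelihood and the posterior predictive reduce to a few lines; a sketch using log-gamma for numerical stability (the counts and hyperparameters are made up):

```python
from math import lgamma, exp

def log_marginal_likelihood(c, alpha):
    # log p(X | alpha) = log G(sum a) - log G(N + sum a)
    #                    + sum_k [log G(c_k + a_k) - log G(a_k)],  G = Gamma
    N, A = sum(c), sum(alpha)
    return (lgamma(A) - lgamma(N + A)
            + sum(lgamma(ck + ak) - lgamma(ak) for ck, ak in zip(c, alpha)))

def posterior_predictive(c, alpha):
    # p(x~ = i | X, alpha) = (c_i + a_i) / (N + sum_k a_k)
    N, A = sum(c), sum(alpha)
    return [(ci + ai) / (N + A) for ci, ai in zip(c, alpha)]

c, alpha = [1, 4, 3], [1.0, 1.0, 1.0]
print(exp(log_marginal_likelihood(c, alpha)))
print(posterior_predictive(c, alpha))  # [2/11, 5/11, 4/11]
```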
In larger models where several categorical observations share a Dirichlet prior, the conditional distribution of one observation x_n given all the others takes an especially simple form, which makes schemes such as collapsed Gibbs sampling efficient. If X = (x_1, …, x_N) is a set of observations in the above model and X^{(−n)} denotes X with x_n removed, then

 p(x_n = i ∣ X^{(−n)}, α) = (c_i^{(−n)} + α_i) / (N − 1 + ∑_i α_i) ∝ c_i^{(−n)} + α_i,

where c_i^{(−n)} is the number of observations of category i among all data points other than x_n.

Sampling

The most common way to sample from a categorical distribution uses a form of inverse transform sampling:

1. Compute the cumulative distribution function of the probabilities p_1, …, p_k.
2. Pick a uniformly distributed number between 0 and 1.
3. Locate the interval of the cumulative distribution function that contains the picked number, by a linear or binary search, and return the corresponding category.

This takes O(k) time to prepare, where k is the number of categories.
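The steps above can be sketched directly (a linear scan over the CDF; for repeated sampling, a precomputed CDF plus binary search would be the usual refinement):

```python
import random
from itertools import accumulate

def sample_categorical(p, rng=random):
    """Inverse transform sampling: O(k) scan of the CDF.
    Returns a category label in 1..k."""
    u = rng.random()                      # uniform number in [0, 1)
    for i, cdf in enumerate(accumulate(p)):
        if u < cdf:                       # first CDF entry exceeding u
            return i + 1
    return len(p)                         # guard against rounding at the top end

random.seed(1)
draws = [sample_categorical([0.2, 0.5, 0.3]) for _ in range(10_000)]
freqs = [draws.count(i) / len(draws) for i in (1, 2, 3)]
# the empirical frequencies approximate (0.2, 0.5, 0.3)
```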
Sampling via the Gumbel distribution

In machine learning it is typical to parametrize the categorical distribution p_1, …, p_k via an unconstrained representation in R^k whose components are given by

 γ_i = log p_i + C,

where C is an arbitrary constant. Given this representation, p_1, …, p_k can be recovered using the softmax function, and the distribution can then be sampled using the techniques described above. There is, however, a more direct sampling method that uses samples from the Gumbel distribution. Let g_1, …, g_k be k independent draws from the standard Gumbel distribution. Then

 c = arg max_i (γ_i + g_i)

will be a sample from the desired categorical distribution. (If u_i is a sample from the standard uniform distribution, then g_i = −log(−log u_i) is a sample from the standard Gumbel distribution.)
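The Gumbel-max trick above in a few lines; the logits γ are log-probabilities shifted by an arbitrary constant, which cancels in the arg max (seed and probabilities are illustrative):

```python
import math
import random

def gumbel_max_sample(gamma, rng=random):
    """Sample a category as argmax_i (gamma_i + g_i), where the g_i are
    independent standard Gumbel draws, g_i = -log(-log u_i)."""
    g = [-math.log(-math.log(rng.random())) for _ in gamma]
    return max(range(len(gamma)), key=lambda i: gamma[i] + g[i])

random.seed(2)
p = [0.2, 0.5, 0.3]
gamma = [math.log(pi) + 7.0 for pi in p]  # any additive constant C cancels
draws = [gumbel_max_sample(gamma) for _ in range(20_000)]
freqs = [draws.count(i) / len(draws) for i in range(3)]
# freqs approximates p
```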
References

- Bishop, Christopher M. (2006). Pattern Recognition and Machine Learning. Springer.