
Random forest


-based models. This interpretability is one of the most desirable qualities of decision trees. It allows developers to confirm that the model has learned realistic information from the data and allows end-users to have trust and confidence in the decisions made by the model. For example, following the path that a decision tree takes to make its decision is quite trivial, but following the paths of tens or hundreds of trees is much harder. To achieve both performance and interpretability, some model compression techniques allow transforming a random forest into a minimal "born-again" decision tree that faithfully reproduces the same decision function. If it is established that the predictive attributes are linearly correlated with the target variable, using a random forest may not enhance the accuracy of the base learner. Furthermore, in problems with multiple categorical variables, a random forest may not be able to increase the accuracy of the base learner.
suitably generated synthetic data. The observed data are the original unlabeled data and the synthetic data are drawn from a reference distribution. A random forest dissimilarity can be attractive because it handles mixed variable types very well, is invariant to monotonic transformations of the input variables, and is robust to outlying observations. The random forest dissimilarity easily deals with a large number of semi-continuous variables due to its intrinsic variable selection; for example, the "Addcl 1" random forest dissimilarity weighs the contribution of each variable according to how dependent it is on other variables. The random forest dissimilarity has been used in a variety of applications, e.g. to find clusters of patients based on tissue marker data.
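The leaf-co-membership idea behind this dissimilarity can be illustrated with an ordinary supervised forest. The sketch below (assuming scikit-learn and a labeled toy dataset; it does not reproduce the observed-versus-synthetic "Addcl 1" construction described above) builds a proximity matrix from how often two observations fall in the same leaf and converts it to a dissimilarity:

    # Proximity from leaf co-membership: fraction of trees in which two samples share a leaf.
    # Supervised illustration only; the unsupervised observed-vs-synthetic variant is not shown.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier

    X, y = load_iris(return_X_y=True)
    forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

    leaves = forest.apply(X)                        # (n_samples, n_trees): leaf index of each sample in each tree
    proximity = (leaves[:, None, :] == leaves[None, :, :]).mean(axis=2)
    dissimilarity = np.sqrt(1.0 - proximity)        # one common monotone transform into a distance-like measure
    print(dissimilarity.shape)                      # (150, 150)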
Forest Kernel and show that it can empirically outperform state-of-the-art kernel methods. Scornet first defined KeRF estimates and gave the explicit link between KeRF estimates and random forests. He also gave explicit expressions for kernels based on the centered random forest and the uniform random forest, two simplified models of random forest. He named these two KeRFs Centered KeRF and Uniform KeRF, and proved upper bounds on their rates of consistency.
A random regression forest has two levels of averaging: first over the samples in the target cell of a tree, then over all trees. Consequently the contributions of observations that lie in cells with a high density of data points are smaller than those of observations belonging to less populated cells. To improve the random forest method and compensate for this misestimation, Scornet defined KeRF by

or ExtraTrees. While similar to ordinary random forests in that they are an ensemble of individual trees, there are two main differences: first, each tree is trained on the whole learning sample (rather than a bootstrap sample), and second, the top-down splitting in the tree learner is randomized. Instead of computing the locally optimal

Random forests are a way of averaging multiple deep decision trees, trained on different parts of the same training set, with the goal of reducing the variance. This comes at the expense of a small increase in the bias and some loss of interpretability, but generally greatly boosts the performance of the final model.

K_k^{cc}(\mathbf{x},\mathbf{z}) = \sum_{k_1,\ldots,k_d,\;\sum_{j=1}^{d}k_j = k} \frac{k!}{k_1!\cdots k_d!}\left(\frac{1}{d}\right)^{k} \prod_{j=1}^{d} \mathbf{1}_{\lceil 2^{k_j}x_j\rceil = \lceil 2^{k_j}z_j\rceil}, \qquad \text{for all } \mathbf{x},\mathbf{z}\in[0,1]^d.

K_k^{uf}(\mathbf{0},\mathbf{x}) = \sum_{k_1,\ldots,k_d,\;\sum_{j=1}^{d}k_j = k} \frac{k!}{k_1!\cdots k_d!}\left(\frac{1}{d}\right)^{k} \prod_{m=1}^{d}\left(1 - |x_m|\sum_{j=0}^{k_m-1}\frac{\left(-\ln|x_m|\right)^{j}}{j!}\right), \qquad \text{for all } \mathbf{x}\in[0,1]^d.
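As a small numerical check on the centered connection function K_k^{cc} above, the following Python sketch evaluates it directly by summing over all compositions of k; the helper name and the example points are illustrative only, not from any library:

    # Direct evaluation of the centered connection function K_k^cc(x, z) defined above.
    # Hypothetical helper, not a library function; inputs are toy values in [0, 1]^d.
    from itertools import product
    from math import ceil, factorial

    def centered_kerf_kernel(x, z, k):
        d = len(x)
        total = 0.0
        for ks in product(range(k + 1), repeat=d):          # candidate (k_1, ..., k_d)
            if sum(ks) != k:
                continue
            coef = factorial(k) * (1.0 / d) ** k
            for kj in ks:
                coef /= factorial(kj)
            same_cell = all(ceil(2 ** kj * xj) == ceil(2 ** kj * zj)
                            for kj, xj, zj in zip(ks, x, z))
            if same_cell:
                total += coef
        return total

    print(centered_kerf_kernel((0.3, 0.7), (0.31, 0.69), k=4))  # nearby points: larger value
    print(centered_kerf_kernel((0.3, 0.7), (0.9, 0.1), k=4))    # distant points: smaller value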
monotonically is in sharp contrast to the common belief that the complexity of a classifier can only grow to a certain level of accuracy before being hurt by overfitting. The explanation of the forest method's resistance to overtraining can be found in Kleinberg's theory of stochastic discrimination.
cut-point is selected. This value is drawn from a uniform distribution over the feature's empirical range (in the tree's training set). Then, of all the randomly generated splits, the split that yields the highest score is chosen to split the node. As with ordinary random forests, the number
The general method of random decision forests was first proposed by Salzberg and Heath in 1993, with a method that used a randomized decision tree algorithm to generate multiple different trees and then combine them using majority voting. This idea was developed further by Ho in 1995. Ho established
As part of their construction, random forest predictors naturally lead to a dissimilarity measure among the observations. One can also define a random forest dissimilarity measure between unlabeled data: the idea is to construct a random forest predictor that distinguishes the "observed" data from
The basic random forest procedure may not work well in situations where there are a large number of features but only a small proportion of these features are informative with respect to sample classification. This can be addressed by encouraging the procedure to focus mainly on features and trees
random vectors in the tree construction are equivalent to a kernel acting on the true margin. Lin and Jeon established the connection between random forests and adaptive nearest neighbor, implying that random forests can be seen as adaptive kernel estimates. Davies and Ghahramani proposed Random
of the model, without increasing the bias. This means that while the predictions of a single tree are highly sensitive to noise in its training set, the average of many trees is not, as long as the trees are not correlated. Simply training many trees on a single training set would give strongly
dimensions. A subsequent work along the same lines concluded that other splitting methods behave similarly, as long as they are randomly forced to be insensitive to some feature dimensions. Note that this observation of a more complex classifier (a larger forest) getting more accurate nearly
Centered forest is a simplified model for Breiman's original random forest, which uniformly selects an attribute among all attributes and performs splits at the center of the cell along the pre-chosen attribute. The algorithm stops when a fully binary tree of level
at training time. For classification tasks, the output of the random forest is the class selected by most trees. For regression tasks, the mean prediction of the individual trees is returned. Random decision forests correct for decision trees' habit of
This feature importance for random forests is the default implementation in scikit-learn and R. It is described in the book "Classification and Regression Trees" by Leo Breiman. Variables that produce large decreases in impurity across splits are considered important:
The early development of Breiman's notion of random forests was influenced by the work of Amit and Geman who introduced the idea of searching over a random subset of the available decisions when splitting a node, in the context of growing a single
The idea of random subspace selection from Ho was also influential in the design of random forests. In this method a forest of trees is grown, and variation among the trees is introduced by projecting the training data into a randomly chosen subspace
Li, H. B., Wang, W., Ding, H. W., & Dong, J. (2010, 10-12 Nov. 2010). Trees weighting random forest method for classifying high-dimensional noisy data. Paper presented at the 2010 IEEE 7th International Conference on E-Business Engineering.
\tilde{m}_{M,n}(\mathbf{x},\Theta_1,\ldots,\Theta_M) = \frac{1}{\sum_{j=1}^{M} N_n(\mathbf{x},\Theta_j)} \sum_{j=1}^{M}\sum_{i=1}^{n} Y_i \mathbf{1}_{\mathbf{X}_i \in A_n(\mathbf{x},\Theta_j)}

m_{M,n}(\mathbf{x},\Theta_1,\ldots,\Theta_M) = \frac{1}{M}\sum_{j=1}^{M}\left(\sum_{i=1}^{n}\frac{Y_i \mathbf{1}_{\mathbf{X}_i \in A_n(\mathbf{x},\Theta_j)}}{N_n(\mathbf{x},\Theta_j)}\right)
The above procedure describes the original bagging algorithm for trees. Random forests also include another type of bagging scheme: they use a modified tree learning algorithm that selects, at each candidate split in the learning process, a
Uniform forest is another simplified model for Breiman's original random forest, which uniformly selects a feature among all features and performs splits at a point uniformly drawn on the side of the cell, along the preselected feature.
depends in a complex way on the structure of the trees, and thus on the structure of the training set. Lin and Jeon show that the shape of the neighborhood used by a random forest adapts to the local importance of each feature.
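A rough way to see this neighborhood in practice is to read off, for a fitted forest, the weight each training point receives for a given query point: within each tree, the points sharing the query's leaf get weight one over the leaf size, and the weights are averaged over trees. The sketch below assumes scikit-learn; the variable names and toy data are illustrative:

    # For a fitted forest and a query point, compute the weight each training point receives:
    # 1/(leaf size) inside the query's leaf of each tree, 0 elsewhere, averaged over trees.
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor

    X, y = make_regression(n_samples=200, n_features=5, random_state=0)
    forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

    x_query = X[:1]                               # use the first sample as the query point
    train_leaves = forest.apply(X)                # (n_samples, n_trees)
    query_leaves = forest.apply(x_query)[0]       # (n_trees,)

    n_trees = len(forest.estimators_)
    weights = np.zeros(len(X))
    for t in range(n_trees):
        in_leaf = train_leaves[:, t] == query_leaves[t]
        weights[in_leaf] += 1.0 / (in_leaf.sum() * n_trees)

    print(weights.sum())                          # 1.0: a weighted neighborhood scheme
    # (weights * y).sum() is close to, but not exactly, forest.predict(x_query),
    # because scikit-learn leaves average bootstrap samples rather than all training points.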
before fitting each tree or each node. Finally, the idea of randomized node optimization, where the decision at each node is selected by a randomized procedure rather than by deterministic optimization, was first introduced by
Random forests can be used to rank the importance of variables in a regression or classification problem in a natural way. The following technique was described in Breiman's original paper and is implemented in the
Features which produce large values for this score are ranked as more important than features which produce small values. The statistical definition of the variable importance measure was given and analyzed by Zhu
|m_{\infty,n}(\mathbf{x}) - \tilde{m}_{\infty,n}(\mathbf{x})| \le \frac{b_n - a_n}{a_n}\,\tilde{m}_{\infty,n}(\mathbf{x}) + n\varepsilon_n\left(\max_{1\le i\le n} Y_i\right).
Ye, Y., Li, H., Deng, X., and Huang, J. (2008) Feature weighting random forest for detection of hidden web search interfaces. Journal of Computational Linguistics and Chinese Language Processing, 13, 387–404.
1830: 1038:
that forests of trees splitting with oblique hyperplanes can gain accuracy as they grow without suffering from overtraining, as long as the forests are randomly restricted to be sensitive to only selected
7185: 9112:
Winham, Stacey & Freimuth, Robert & Biernacka, Joanna. (2013). A weighted random forests approach to improve predictive performance. Statistical Analysis and Data Mining. 6. 10.1002/sam.11196.
3772: 3922: 4358: 3919:. This random variable can be used to describe the randomness induced by node splitting and the sampling procedure for tree construction. The trees are combined to form the finite forest estimate 2881: 1899:-th feature is computed by averaging the difference in out-of-bag error before and after the permutation over all trees. The score is normalized by the standard deviation of these differences. 1393:
correlated trees (or even the same tree many times, if the training algorithm is deterministic); bootstrap sampling is a way of de-correlating the trees by showing them different training sets.
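A minimal sketch of this bagging loop for regression trees, assuming scikit-learn and NumPy (the dataset and number of trees are arbitrary choices for illustration), is:

    # Bagging by hand: each tree is fit on a bootstrap sample; predictions are averaged.
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    X, y = make_regression(n_samples=300, n_features=10, noise=10.0, random_state=0)

    n_trees = 50
    trees = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(X), size=len(X))       # sample n points with replacement
        trees.append(DecisionTreeRegressor(random_state=0).fit(X[idx], y[idx]))

    per_tree = np.stack([t.predict(X) for t in trees])   # (n_trees, n_samples)
    bagged = per_tree.mean(axis=0)                       # averaged (bagged) prediction
    spread = per_tree.std(axis=0)                        # disagreement across trees
    print(bagged[:3], spread[:3])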
6987: 3699: 1121:
Decision trees are a popular method for various machine learning tasks. Tree learning "comes closest to meeting the requirements for serving as an off-the-shelf procedure for data mining", say
\text{unnormalized average importance}(x) = \frac{1}{n_T}\sum_{i=1}^{n_T}\;\sum_{\text{node } j \in T_i \,\mid\, \text{split variable}(j) = x} p_{T_i}(j)\,\Delta i_{T_i}(j),
Additionally, the permutation procedure may fail to identify important features when there are collinear features. In this case permuting groups of correlated features together is a remedy.
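A hedged sketch of both variants, permuting a single column and permuting a group of columns jointly on held-out data (the column indices and the toy dataset are purely illustrative), might look like:

    # Permutation importance on held-out data; a group of columns can be shuffled jointly.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, n_features=8, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    baseline = forest.score(X_te, y_te)

    def permutation_drop(columns, n_repeats=10, seed=0):
        # Mean accuracy drop when the listed columns are permuted with the same shuffle.
        rng = np.random.default_rng(seed)
        drops = []
        for _ in range(n_repeats):
            X_perm = X_te.copy()
            rows = rng.permutation(len(X_te))
            X_perm[:, columns] = X_te[rows][:, columns]
            drops.append(baseline - forest.score(X_perm, y_te))
        return float(np.mean(drops))

    print(permutation_drop([0]))       # single-feature importance
    print(permutation_drop([0, 1]))    # joint importance of a (possibly correlated) group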
1711:
Enriched Random Forest (ERF): Use weighted random sampling instead of simple random sampling at each node of each tree, giving greater weight to features that appear to be more informative.
3592: 2508: 8150: 7855: 7055: 3513: 7695: 8109: 7814: 4536: 2344: 4353: 3917: 3886: 3360: 9076:
Dessi, N. & Milia, G. & Pes, B. (2013). Enhancing random forests performance in microarray data classification. Conference paper, 99-103. 10.1007/978-3-642-38326-7_15.
3547: 2691: 1403: 6545: 3174:{\displaystyle {\hat {y}}={\frac {1}{m}}\sum _{j=1}^{m}\sum _{i=1}^{n}W_{j}(x_{i},x')\,y_{i}=\sum _{i=1}^{n}\left({\frac {1}{m}}\sum _{j=1}^{m}W_{j}(x_{i},x')\right)\,y_{i}.} 1075:. In addition, this paper combines several ingredients, some previously known and some novel, which form the basis of the modern practice of random forests, in particular: 9718:
Prinzie, Anita (2007). "Random Multiclass Classification: Generalizing Random Forests to Random MNL and Random NB". In Roland Wagner; Norman Revell; GĂĽnther Pernul (eds.).
1579:
trees, causing them to become correlated. An analysis of how bagging and random subspace projection contribute to accuracy gains under different conditions is given by Ho.
9103:
Ghosh D, Cabrera J. (2022) Enriched random forest for high dimensional genomic data. IEEE/ACM Trans Comput Biol Bioinform. 19(5):2817-2828. doi:10.1109/TCBB.2021.3089417.
7717: 7662: 7640: 5302: 5280: 5101: 4295: 3794: 3634: 4322: 1653: 1602:(rounded down) with a minimum node size of 5 as the default. In practice, the best values for these parameters should be tuned on a case-to-case basis for every problem. 7888: 2537: 3264:. In cases that the relationship between the predictors and the target variable is linear, the base learners may have an equally high accuracy as the ensemble learner. 8176: 1524:, is a free parameter. Typically, a few hundred to several thousand trees are used, depending on the size and nature of the training set. An optimal number of trees 873: 7191: 5079: 3212: 2931: 2730: 2195: 2168: 1396:
Additionally, an estimate of the uncertainty of the prediction can be made as the standard deviation of the predictions from all the individual regression trees on
7756: 1913:
For data including categorical variables with different numbers of levels, random forests are biased in favor of those attributes with more levels. Methods such as
911: 1571:. This process is sometimes called "feature bagging". The reason for doing this is the correlation of the trees in an ordinary bootstrap sample: if one or a few 7908: 7776: 6912: 5562: 5121: 3814: 3719: 3612: 3332: 3232: 2384: 2364: 2298: 2215: 2141: 1897: 1877: 1857: 1693: 1673: 6705:{\displaystyle a_{n}\leq N_{n}(\mathbf {x} ,\Theta )\leq b_{n}{\text{ and }}a_{n}\leq {\frac {1}{M}}\sum _{m=1}^{M}N_{n}{\mathbf {x} ,\Theta _{m}}\leq b_{n}.} 9239:
Piryonesi S. Madeh; El-Diraby Tamer E. (2020-06-01). "Role of Data Analytics in Infrastructure Asset Management: Overcoming Data Size and Quality Problems".
8353:
present in decision trees. Decision trees are among a fairly small family of machine learning models that are easily interpretable along with linear models,
8679: 8569: 1836:
for each data point is recorded and averaged over the forest (errors on an independent test set can be substituted if bagging is not used during training).
868: 6914:
goes to infinity, then we have infinite random forest and infinite KeRF. Their estimates are close if the number of observations in each cell is bounded:
8439:. Proceedings of the 3rd International Conference on Document Analysis and Recognition, Montreal, QC, 14–16 August 1995. pp. 278–282. Archived from 858: 4217:{\displaystyle m_{n}=\sum _{i=1}^{n}{\frac {Y_{i}\mathbf {1} _{\mathbf {X} _{i}\in A_{n}(\mathbf {x} ,\Theta _{j})}}{N_{n}(\mathbf {x} ,\Theta _{j})}}} 1879:-th feature are permuted in the out-of-bag samples and the out-of-bag error is again computed on this perturbed data set. The importance score for the 1299: 6009: 5567: 1157:
Illustration of training a Random Forest model. The training dataset (in this case, of 250 rows and 100 columns) is randomly sampled with replacement
8849: 3183:
This shows that the whole forest is again a weighted neighborhood scheme, with weights that average those of the individual trees. The neighbors of
2416:
it uses training statistics and therefore does not "reflect the ability of the feature to be useful to make predictions that generalize to the test set"
2550: 699: 9475: 3382: 906: 8986: 7913: 5253:{\displaystyle K_{M,n}(\mathbf {x} ,\mathbf {z} )={\frac {1}{M}}\sum _{j=1}^{M}\mathbf {1} _{\mathbf {z} \in A_{n}(\mathbf {x} ,\Theta _{j})}} 9440: 8828:"RANDOM FORESTS Trademark of Health Care Productivity, Inc. - Registration Number 3185828 - Serial Number 78642027 :: Justia Trademarks" 1002:, which, in Ho's formulation, is a way to implement the "stochastic discrimination" approach to classification proposed by Eugene Kleinberg. 10061:
The Application of Data Analytics to Asset Management: Deterioration and Climate Change Adaptation in Ontario Roads (Doctoral dissertation)
8181: 3819: 2409:
The sci-kit learn default implementation of Mean Decrease in Impurity Feature Importance is susceptible to misleading feature importances:
863: 714: 8431: 6879:{\displaystyle |m_{M,n}(\mathbf {x} )-{\tilde {m}}_{M,n}(\mathbf {x} )|\leq {\frac {b_{n}-a_{n}}{a_{n}}}{\tilde {m}}_{M,n}(\mathbf {x} ).} 1920:
If the data contain groups of correlated features of similar relevance for the output, then smaller groups are favored over larger groups.
9720:
Database and Expert Systems Applications: 18th International Conference, DEXA 2007, Regensburg, Germany, September 3-7, 2007, Proceedings
445: 946: 749: 2406:
The normalized importance is then obtained by normalizing over all features, so that the sum of normalized feature importances is 1.
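For instance (a toy sketch; the raw numbers below are made up), normalization just divides each value by the total, and scikit-learn's feature_importances_ attribute already returns values normalized this way:

    # Impurity importances are often reported normalized so that they sum to 1.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier

    raw = np.array([0.8, 0.1, 0.05, 0.05])        # made-up unnormalized importances
    print(raw / raw.sum())                        # normalized: sums to 1

    X, y = load_iris(return_X_y=True)
    forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    print(forest.feature_importances_.sum())      # already normalized by scikit-learn (~1.0)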
1748: 7061: 4056:{\displaystyle m_{M,n}(\mathbf {x} ,\Theta _{1},\ldots ,\Theta _{M})={\frac {1}{M}}\sum _{j=1}^{M}m_{n}(\mathbf {x} ,\Theta _{j})} 4481:{\displaystyle N_{n}(\mathbf {x} ,\Theta _{j})=\sum _{i=1}^{n}\mathbf {1} _{\mathbf {X} _{i}\in A_{n}(\mathbf {x} ,\Theta _{j})}} 825: 8935:"An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization" 3724: 374: 3256:
Instead of decision trees, linear models have been proposed and evaluated as base estimators in random forests, in particular
10133: 9735: 9485: 9458: 9500: 9913:
Davies, Alex; Ghahramani, Zoubin (2014). "The Random Forest Kernel and other kernels for big data from random partitions".
1214: 883: 646: 181: 901: 9318:
Painsky A, Rosset S (2017). "Cross-Validated Variable Selection in Tree-Based Methods Improves Predictive Performance".
6921: 3639: 10229: 9646:"Using Machine Learning to Examine Impact of Type of Performance Indicator on Flexible Pavement Deterioration Modeling" 1068: 734: 709: 658: 1714:
Tree Weighted Random Forest (TWRF): Weight trees so that trees exhibiting better accuracy are assigned higher weights.
10249: 10117: 8549: 4227: 2220: 1575:
are very strong predictors for the response variable (target output), these features will be selected in many of the
782: 777: 430: 7582: 1631:
of randomly selected features to be considered at each node can be specified. Default values for this parameter are
3552: 1552:
in their bootstrap sample. The training and test error tend to level off after some number of trees have been fit.
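One way to watch this leveling-off, assuming scikit-learn, is to grow a single forest incrementally with warm_start and track the out-of-bag error as trees are added (the tree counts below are arbitrary):

    # Grow one forest incrementally and watch the out-of-bag (OOB) error stabilize.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    forest = RandomForestClassifier(n_estimators=25, warm_start=True,
                                    oob_score=True, random_state=0)

    for n_trees in (25, 50, 100, 200, 400):
        forest.set_params(n_estimators=n_trees)
        forest.fit(X, y)                          # warm_start: only the new trees are added
        print(n_trees, 1.0 - forest.oob_score_)   # OOB error typically flattens out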
440: 78: 8720: 8612: 3257: 2821: 2443: 9691:
Prinzie, A.; Van den Poel, D. (2008). "Random Forests for multiclass classification: Random MultiNomial Logit".
8114: 7819: 2743: 9596:"Tumor classification by tissue microarray profiling: random forest clustering applied to renal cell carcinoma" 6995: 939: 835: 599: 420: 10210:
Liaw, Andy & Wiener, Matthew "Classification and Regression by randomForest" R News (2002) Vol. 2/3 p. 18
8349:
While random forests often achieve higher accuracy than a single decision tree, they sacrifice the intrinsic
3471: 1529: 810: 512: 288: 7667: 9187:. Proceedings of the 21st International Conference on Artificial Neural Networks (ICANN). pp. 293–300. 8857: 8358: 6491:
Predictions given by KeRF and random forests are close if the number of points in each cell is controlled:
2426: 2390: 767: 704: 614: 592: 435: 425: 10095: 8088: 7793: 4494: 8892: 8370: 8354: 2303: 918: 830: 815: 276: 98: 9756:"A comparison of random forest regression and multiple linear regression for prediction in neuroscience" 8794: 4327: 3891: 3869: 10234: 8939: 8753: 3337: 1572: 1136: 1039: 971: 878: 805: 555: 450: 238: 171: 131: 9987: 3518: 3272:
In machine learning, kernel random forests (KeRF) establish the connection between random forests and
9042: 8394: 6498: 1389: 1094:
The report also offers the first theoretical result for random forests in the form of a bound on the
932: 538: 306: 176: 9882: 9854: 9645: 9555: 9526: 9295: 9094:
Amaratunga, D., Cabrera, J., Lee, Y.S. (2008) Enriched Random Forest. Bioinformatics, 24, 2010-2014.
1161:
times. Then, a decision tree is trained on each sample. Finally, for prediction, the results of all
1025:" idea and random selection of features, introduced first by Ho and later independently by Amit and 10244: 10203: 9017: 8871: 8699: 8591: 1730: 560: 480: 403: 321: 151: 113: 108: 68: 63: 9947: 7700: 7645: 7625: 5285: 5263: 5084: 4278: 3777: 3617: 10239: 9755: 4300: 2649: 1634: 1595:(rounded down) features are used in each split. For regression problems the inventors recommend 507: 356: 256: 83: 7860: 2513: 2436:-NN) was pointed out by Lin and Jeon in 2002. It turns out that both can be viewed as so-called 10213: 9942: 9877: 9550: 9521: 9290: 8866: 8694: 8586: 8440: 8376: 3261: 2386:. As impurity measure for samples falling in a node e.g. the following statistics can be used: 1568: 1561: 1116: 999: 979: 687: 663: 565: 326: 301: 261: 73: 8473: 7782:. Scornet proved upper bounds on the rates of consistency for centered KeRF and uniform KeRF. 7329:{\displaystyle \operatorname {P} \leq b_{n}\mid {\mathcal {D}}_{n}]\geq 1-\varepsilon _{n}/2,} 9841: 6006:
Uniform KeRF is built in the same way as uniform forest, except that predictions are made by
1170: 1148: 1072: 1022: 641: 463: 415: 271: 186: 58: 8827: 8155: 1131:
In particular, trees that are grown very deep tend to learn highly irregular patterns: they
10156: 10110:"Random Multiclass Classification: Generalizing Random Forests to Random MNL and Random NB" 9661: 8762: 8661: 8638: 8403: 7779: 5057: 3190: 2909: 2708: 2173: 2146: 1095: 1084: 1057: 570: 520: 7722: 16:
This article is about the machine learning technique. For other kinds of random tree, see
8: 9501:
https://scikit-learn.org/stable/auto_examples/inspection/plot_permutation_importance.html
9373:"Classification with correlated features: unreliability of feature ranking and solutions" 1914: 1511:{\displaystyle \sigma ={\sqrt {\frac {\sum _{b=1}^{B}(f_{b}(x')-{\hat {f}})^{2}}{B-1}}}.} 975: 673: 609: 580: 485: 311: 244: 230: 216: 191: 141: 93: 53: 10209: 10160: 10085: 9988:"Explainable decision forest: Transforming a decision forest into an interpretable tree" 8967: 8766: 1388:
This bootstrapping procedure leads to better model performance because it decreases the
10179: 10144: 10059: 10038: 10007: 9966: 9914: 9895: 9809: 9783: 9673: 9576: 9568: 9353: 9327: 9275: 9256: 9158: 9133: 9009: 8884: 8712: 8604: 8535: 8496: 8406: â€“ Algorithm that employs a degree of randomness as part of its logic or procedure 7893: 7761: 6897: 5547: 5106: 3799: 3704: 3597: 3317: 3217: 2400: 2369: 2349: 2283: 2200: 2126: 1882: 1862: 1842: 1678: 1658: 651: 575: 361: 156: 8987:"A Data Complexity Analysis of Comparative Advantages of Decision Forest Constructors" 10184: 10129: 10109: 10011: 9775: 9731: 9677: 9665: 9617: 9481: 9454: 9394: 9345: 9260: 9221: 9182: 9163: 8608: 8545: 8531: 8500: 8388: 8382: 1290:
can be made by averaging the predictions from all the individual regression trees on
1153: 967: 744: 587: 500: 296: 266: 211: 206: 161: 103: 9937:
Breiman L, Ghahramani Z (2004). "Consistency for a simple model of random forests".
9787: 9541:
Shi, T.; Horvath, S. (2006). "Unsupervised Learning with Random Forest Predictors".
9389: 9372: 9216: 9199: 8888: 8539: 1067:. This paper describes a method of building a forest of uncorrelated trees using a 10174: 10164: 10121: 9999: 9899: 9887: 9833: 9771: 9767: 9723: 9700: 9657: 9607: 9560: 9446: 9415: 9384: 9357: 9337: 9300: 9248: 9211: 9153: 9145: 9057: 9013: 9001: 8948: 8876: 8770: 8716: 8704: 8647: 8596: 8488: 8350: 3515:-valued independent random variables distributed as the independent prototype pair 2395: 1833: 1619: 1534: 1080: 772: 525: 475: 385: 369: 339: 201: 196: 146: 136: 34: 10026: 9580: 9149: 10125: 10003: 9727: 8657: 1052: 800: 604: 470: 410: 6084:{\displaystyle {\tilde {m}}_{M,n}(\mathbf {x} ,\Theta _{1},\ldots ,\Theta _{M})} 5642:{\displaystyle {\tilde {m}}_{M,n}(\mathbf {x} ,\Theta _{1},\ldots ,\Theta _{M})} 10149:
Proceedings of the National Academy of Sciences of the United States of America
9891: 9868:
Lin, Yi; Jeon, Yongho (2006). "Random forests and adaptive nearest neighbors".
9704: 9341: 9304: 8880: 3292: 2543:
by looking at the "neighborhood" of the point, formalized by a weight function
1029:
in order to construct a collection of decision trees with controlled variance.
820: 351: 88: 9939:
Statistical Department, University of California at Berkeley. Technical Report
9612: 9595: 9062: 8953: 8934: 8775: 8748: 8634:"An Overtraining-Resistant Stochastic Modeling Method for Pattern Recognition" 10223: 9669: 8652: 8633: 8527: 3277: 3276:. By slightly modifying their definition, random forests can be rewritten as 3273: 1623: 1122: 1048: 739: 668: 550: 166: 10169: 9564: 6889: 10188: 9779: 9621: 9398: 9349: 9276:"Unbiased split selection for classification trees based on the Gini index" 9252: 9225: 9167: 8845: 3866:
are independent random variables, distributed as a generic random variable
1169:
The training algorithm for random forests applies the general technique of
1026: 1010: 9450: 9005: 9829: 8744: 3461:{\displaystyle {\mathcal {D}}_{n}=\{(\mathbf {X} _{i},Y_{i})\}_{i=1}^{n}} 3288: 1132: 1099: 1064: 1006: 984: 545: 39: 17: 9572: 9520:(Technical report). Technical Report No. 1055. University of Wisconsin. 8070:{\displaystyle \mathbb {E} ^{2}\leq C_{1}n^{-1/(3+d\log 2)}(\log n)^{2}} 5564:
is the same as for centered forest, except that predictions are made by
9722:. Lecture Notes in Computer Science. Vol. 4653. pp. 349–358. 8968:
Gareth James; Daniela Witten; Trevor Hastie; Robert Tibshirani (2013).
8600: 1832:
is to fit a random forest to the data. During the fitting process the
995: 994:
The first algorithm for random decision forests was created in 1995 by
694: 390: 316: 8969: 8708: 8492: 9594:
Shi T, Seligson D, Belldegrun AS, Palotie A, Horvath S (April 2005).
9184:
Bias of importance measures for multi-valued attributes and solutions
8334:{\displaystyle \mathbb {E} ^{2}\leq Cn^{-2/(6+3d\log 2)}(\log n)^{2}} 3859:{\displaystyle \mathbf {\Theta } _{1},\ldots ,\mathbf {\Theta } _{M}} 1385:
or by taking the plurality vote in the case of classification trees.
1014: 853: 634: 9965:
Arlot S, Genuer R (2014). "Analysis of purely random forests bias".
9238: 10043: 9814: 9332: 1909:
This method of determining variable importance has some drawbacks.
1098:
which depends on the strength of the trees in the forest and their
988: 9971: 9919: 8921:
Proceedings of the Second Intl. Workshop on Multistrategy Learning
8817:
U.S. trademark registration number 3185828, registered 2006/12/19.
3291:
was the first person to notice the link between random forest and
1745:
The first step in measuring the variable importance in a data set
1376:{\displaystyle {\hat {f}}={\frac {1}{B}}\sum _{b=1}^{B}f_{b}(x')} 1063:
The proper introduction of random forests was made in a paper by
1018: 629: 9200:"Permutation importance: a corrected feature importance measure" 8680:"On the Algorithmic Implementation of Stochastic Discrimination" 1618:
cut-point for each feature under consideration (based on, e.g.,
6091:, the corresponding kernel function, or connection function is 5649:, the corresponding kernel function, or connection function is 3296: 1927: 1704:
that are informative. Some methods for accomplishing this are:
1071:
like procedure, combined with randomized node optimization and
380: 9320:
IEEE Transactions on Pattern Analysis and Machine Intelligence
8687:
IEEE Transactions on Pattern Analysis and Machine Intelligence
8481:
IEEE Transactions on Pattern Analysis and Machine Intelligence
8474:"The Random Subspace Method for Constructing Decision Forests" 1825:{\displaystyle {\mathcal {D}}_{n}=\{(X_{i},Y_{i})\}_{i=1}^{n}} 9477:
Pattern Recognition Techniques Applied to Biomedical Problems
2637:{\displaystyle {\hat {y}}=\sum _{i=1}^{n}W(x_{i},x')\,y_{i}.} 978:
and other tasks that operates by constructing a multitude of
624: 619: 346: 9808:
Scornet, Erwan (2015). "Random forests and kernel methods".
9593: 7180:{\displaystyle \operatorname {P} \geq 1-\varepsilon _{n}/2,} 3242: 1917:
and growing unbiased trees can be used to solve the problem.
1708:
Prefiltering: Eliminate features that are mostly just noise.
9690: 9197: 5103:
in the forest. If we define the connection function of the
3295:. He pointed out that random forests which are grown using 1698: 8850:"Shape quantization and recognition with randomized trees" 6486: 3767:{\displaystyle m_{n}(\mathbf {x} ,\mathbf {\Theta } _{j})} 9932: 9930: 9180: 6890:
Relation between infinite KeRF and infinite random forest
10212:(Discussion of the use of the random forest package for 9273: 9241:
Journal of Transportation Engineering, Part B: Pavements
8526: 2732:
must sum to one. Weight functions are given as follows:
2413:
the importance measure prefers high cardinality features
1110: 912:
List of datasets in computer vision and image processing
9644:
Piryonesi, S. Madeh; El-Diraby, Tamer E. (2021-02-01).
8399:
9927: 9754:
Smith, Paul F.; Ganesh, Siva; Liu, Ping (2013-10-01).
9198:
Altmann A, ToloĹźi L, Sander O, Lengauer T (May 2010).
3280:, which are more interpretable and easier to analyze. 8184: 8158: 8117: 8091: 7916: 7896: 7863: 7822: 7796: 7764: 7725: 7703: 7670: 7648: 7628: 7585: 7345: 7194: 7064: 6998: 6924: 6900: 6718: 6553: 6501: 6097: 6012: 5655: 5570: 5550: 5310: 5288: 5266: 5129: 5109: 5087: 5060: 4795: 4544: 4497: 4361: 4330: 4303: 4281: 4230: 4069: 3925: 3894: 3872: 3822: 3802: 3780: 3727: 3707: 3642: 3620: 3600: 3555: 3521: 3474: 3385: 3340: 3320: 3220: 3193: 2939: 2912: 2824: 2746: 2711: 2652: 2553: 2516: 2446: 2372: 2352: 2306: 2286: 2223: 2203: 2176: 2149: 2129: 1939: 1885: 1865: 1845: 1751: 1681: 1661: 1637: 1406: 1302: 1217:
of the training set and fits trees to these samples:
1173:, or bagging, to tree learners. Given a training set 9960: 9958: 9803: 9801: 9799: 9797: 2902:
Since a forest averages the predictions of a set of
2420: 1538:: the mean prediction error on each training sample 9040: 10145:"Classification and interaction in random forests" 9643: 9540: 8333: 8170: 8144: 8103: 8069: 7902: 7882: 7849: 7808: 7770: 7750: 7711: 7689: 7656: 7634: 7614: 7563: 7328: 7179: 7049: 6982:{\displaystyle (\varepsilon _{n}),(a_{n}),(b_{n})} 6981: 6906: 6878: 6704: 6539: 6471: 6083: 5991: 5641: 5556: 5528: 5296: 5274: 5252: 5115: 5095: 5073: 5046: 4781: 4530: 4480: 4347: 4316: 4289: 4267: 4216: 4055: 3911: 3880: 3858: 3808: 3788: 3766: 3713: 3694:{\displaystyle m(\mathbf {x} )=\operatorname {E} } 3693: 3628: 3606: 3586: 3541: 3507: 3460: 3354: 3326: 3308: 3226: 3206: 3173: 2925: 2875: 2792: 2724: 2685: 2636: 2531: 2502: 2378: 2358: 2338: 2292: 2272: 2209: 2189: 2162: 2135: 2115: 1891: 1871: 1851: 1824: 1687: 1667: 1647: 1510: 1375: 1090:Measuring variable importance through permutation. 9955: 9936: 9794: 9543:Journal of Computational and Graphical Statistics 8578:Annals of Mathematics and Artificial Intelligence 8385: â€“ Statistics and machine learning technique 1555: 1165:trees are aggregated to produce a final decision. 10221: 9912: 9906: 8839: 8837: 7522: 1610:Adding one further step of randomization yields 9870:Journal of the American Statistical Association 9822: 9138:Journal of the American Statistical Association 8673: 8671: 8627: 8625: 8563: 8561: 8522: 8520: 8518: 8516: 8514: 8512: 8510: 7785: 4268:{\displaystyle A_{n}(\mathbf {x} ,\Theta _{j})} 3701:. A random regression forest is an ensemble of 2273:{\displaystyle p_{T_{i}}(j)={\frac {n_{j}}{n}}} 1286:After training, predictions for unseen samples 1005:An extension of the algorithm was developed by 10142: 9834:"Some infinity theory for predictor ensembles" 9753: 9370: 9131: 8915:Heath, D., Kasif, S. and Salzberg, S. (1993). 8080: 7615:{\displaystyle Y=m(\mathbf {X} )+\varepsilon } 5260:, i.e. the proportion of cells shared between 4491:Thus random forest estimates satisfy, for all 2425:A relationship between random forests and the 1859:-th feature after training, the values of the 907:List of datasets for machine-learning research 10025:Vidal, Thibaut; Schiffer, Maximilian (2020). 10024: 9861: 9836:. Technical Report 579, Statistics Dept. UCB. 9518:Random forests and adaptive nearest neighbors 9473: 9317: 8834: 8739: 8737: 8735: 8733: 7642:is a centered Gaussian noise, independent of 3587:{\displaystyle \operatorname {E} <\infty } 3374: 2697:'th training point relative to the new point 2440:. These are models built from a training set 1582:Typically, for a classification problem with 940: 10031:International Conference on Machine Learning 9511: 9509: 9474:Ortiz-Posadas, Martha Refugio (2020-02-29). 9283:Computational Statistics & Data Analysis 9274:Strobl C, Boulesteix AL, Augustin T (2007). 
8668: 8622: 8558: 8507: 5934: 5904: 5898: 5868: 3438: 3403: 3303: 2480: 2447: 1928:Mean Decrease in Impurity Feature Importance 1802: 1769: 10107: 9964: 5544:The construction of Centered KeRF of level 2876:{\displaystyle W(x_{i},x')={\frac {1}{k'}}} 2503:{\displaystyle \{(x_{i},y_{i})\}_{i=1}^{n}} 23:Tree-based ensemble machine learning method 9985: 9416:"Beware Default Random Forest Importances" 8932: 8795:"Documentation for R package randomForest" 8788: 8786: 8730: 8145:{\displaystyle n/2^{k}\rightarrow \infty } 7850:{\displaystyle n/2^{k}\rightarrow \infty } 2793:{\displaystyle W(x_{i},x')={\frac {1}{k}}} 1260:Train a classification or regression tree 947: 933: 10178: 10168: 10057: 10042: 9970: 9946: 9918: 9881: 9813: 9611: 9554: 9525: 9506: 9388: 9331: 9294: 9215: 9157: 9061: 8952: 8870: 8843: 8774: 8698: 8677: 8651: 8631: 8590: 8567: 8186: 7918: 7050:{\displaystyle \operatorname {E} \geq 1,} 3501: 3348: 3243:Unsupervised learning with random forests 3157: 3044: 2620: 2280:is the fraction of samples reaching node 1740: 1545:, using only the trees that did not have 10114:Database and Expert Systems Applications 10058:Piryonesi, Sayed Madeh (November 2019). 8467: 8465: 8463: 8461: 8425: 8423: 8421: 8419: 3636:, by estimating the regression function 1699:Random forests for high-dimensional data 1695:is the number of features in the model. 1152: 10143:Denisko D, Hoffman MM (February 2018). 9867: 9828: 9807: 9717: 9515: 9438: 8971:An Introduction to Statistical Learning 8783: 8743: 6487:Relation between KeRF and random forest 3508:{\displaystyle ^{p}\times \mathbb {R} } 3267: 2906:trees with individual weight functions 1013:, who registered "Random Forests" as a 10222: 9181:Deng, H.; Runger, G.; Tuv, E. (2011). 9041:Geurts P, Ernst D, Wehenkel L (2006). 8978: 7690:{\displaystyle \sigma ^{2}<\infty } 7574: 3614:, associated with the random variable 3187:in this interpretation are the points 2170:is the number of trees in the forest, 1723: 10204:Random Forests classifier description 9749: 9747: 9639: 9637: 9635: 9633: 9631: 9410: 9408: 8458: 8416: 2701:in the same tree. For any particular 1111:Preliminaries: decision tree learning 1021:). The extension combines Breiman's " 8961: 8792: 8541:The Elements of Statistical Learning 8397: â€“ Type of statistical analysis 8104:{\displaystyle k\rightarrow \infty } 7809:{\displaystyle k\rightarrow \infty } 4531:{\displaystyle \mathbf {x} \in ^{d}} 3721:randomized regression trees. Denote 3594:. We aim at predicting the response 9442:Classification and Regression Trees 9232: 8917:k-DT: A multi-tree learning method. 5081:'s falling in the cells containing 3234:. In this way, the neighborhood of 2339:{\displaystyle \Delta i_{T_{i}}(j)} 902:Glossary of artificial intelligence 13: 10071: 10051: 9744: 9662:10.1061/(ASCE)IS.1943-555X.0000602 9628: 9405: 9371:Tolosi L, Lengauer T (July 2011). 9132:Zhu R, Zeng D, Kosorok MR (2015). 
8984: 8471: 8429: 8391: â€“ Machine learning technique 8379: â€“ Machine learning algorithm 8373: â€“ Method in machine learning 8139: 8098: 7844: 7803: 7684: 7478: 7395: 7356: 7282: 7254: 7222: 7218: 7195: 7133: 7108: 7065: 7029: 6999: 6918:Assume that there exist sequences 6676: 6588: 6495:Assume that there exist sequences 6069: 6050: 5627: 5608: 5367: 5348: 5236: 5054:which is equal to the mean of the 5027: 4916: 4852: 4833: 4759: 4721: 4592: 4573: 4464: 4384: 4348:{\displaystyle {\mathcal {D}}_{n}} 4334: 4305: 4253: 4199: 4161: 4041: 3973: 3954: 3912:{\displaystyle {\mathcal {D}}_{n}} 3898: 3881:{\displaystyle \mathbf {\Theta } } 3660: 3581: 3556: 3389: 3214:sharing the same leaf in any tree 2693:is the non-negative weight of the 2346:is the change in impurity in tree 2307: 2081: 1755: 14: 10261: 10197: 10118:Lecture Notes in Computer Science 9986:Sagi, Omer; Rokach, Lior (2020). 9650:Journal of Infrastructure Systems 8994:Pattern Analysis and Applications 3365: 3362:is a parameter of the algorithm. 3355:{\displaystyle k\in \mathbb {N} } 2421:Relationship to nearest neighbors 1839:To measure the importance of the 10078: 9693:Expert Systems with Applications 8344: 8241: 8224: 7973: 7956: 7705: 7650: 7599: 7493: 7410: 7371: 7247: 7101: 7022: 6866: 6783: 6744: 6668: 6581: 6437: 6128: 6120: 6042: 5957: 5949: 5863: 5686: 5678: 5600: 5539: 5510: 5501: 5445: 5436: 5340: 5290: 5268: 5228: 5207: 5201: 5158: 5150: 5089: 5019: 4992: 4985: 4908: 4825: 4751: 4713: 4686: 4679: 4565: 4499: 4456: 4429: 4422: 4376: 4283: 4245: 4191: 4153: 4126: 4119: 4063:. For regression trees, we have 4033: 3946: 3874: 3846: 3825: 3782: 3751: 3742: 3684: 3676: 3650: 3622: 3542:{\displaystyle (\mathbf {X} ,Y)} 3526: 3411: 1137:low bias, but very high variance 10120:. Vol. 4653. p. 349. 10018: 9979: 9760:Journal of Neuroscience Methods 9711: 9684: 9587: 9534: 9494: 9467: 9432: 9364: 9311: 9267: 9191: 9174: 9125: 9115: 9106: 9097: 9088: 9079: 9070: 9034: 8926: 8909: 8820: 6540:{\displaystyle (a_{n}),(b_{n})} 6481: 6001: 5942: 3309:Preliminaries: Centered forests 3258:multinomial logistic regression 1135:their training sets, i.e. have 9772:10.1016/j.jneumeth.2013.08.024 9516:Lin, Yi; Jeon, Yongho (2002). 
9134:"Reinforcement Learning Trees" 8811: 8322: 8309: 8304: 8280: 8249: 8245: 8237: 8228: 8220: 8200: 8190: 8136: 8095: 8058: 8045: 8040: 8019: 7981: 7977: 7969: 7960: 7952: 7932: 7922: 7841: 7800: 7739: 7726: 7603: 7595: 7497: 7489: 7471: 7418: 7414: 7406: 7388: 7375: 7367: 7347: 7293: 7260: 7257: 7243: 7230: 7201: 7144: 7111: 7097: 7071: 7035: 7032: 7018: 7005: 6976: 6963: 6957: 6944: 6938: 6925: 6870: 6862: 6844: 6791: 6787: 6779: 6761: 6748: 6740: 6720: 6591: 6577: 6534: 6521: 6515: 6502: 6457: 6444: 6401: 6386: 6330: 6315: 6132: 6116: 6078: 6038: 6020: 5977: 5964: 5690: 5674: 5636: 5596: 5578: 5520: 5497: 5455: 5432: 5376: 5336: 5318: 5245: 5224: 5162: 5146: 5036: 5015: 4925: 4904: 4861: 4821: 4803: 4768: 4747: 4730: 4709: 4601: 4561: 4519: 4506: 4473: 4452: 4393: 4372: 4262: 4241: 4208: 4187: 4170: 4149: 4050: 4029: 3982: 3942: 3761: 3738: 3688: 3666: 3654: 3646: 3575: 3562: 3536: 3522: 3488: 3475: 3434: 3406: 3149: 3125: 3041: 3017: 2946: 2852: 2828: 2774: 2750: 2680: 2656: 2617: 2593: 2560: 2523: 2476: 2450: 2438:weighted neighborhoods schemes 2333: 2327: 2247: 2241: 2107: 2101: 2078: 2072: 2044: 2038: 2029: 1951: 1945: 1942:unormalized average importance 1798: 1772: 1556:From bagging to random forests 1481: 1474: 1462: 1451: 1438: 1370: 1359: 1309: 1215:random sample with replacement 1017:in 2006 (as of 2019, owned by 322:Relevance vector machine (RVM) 1: 9390:10.1093/bioinformatics/btr300 9217:10.1093/bioinformatics/btq134 9150:10.1080/01621459.2015.1036994 8974:. Springer. pp. 316–321. 8410: 5304:, then almost surely we have 3774:the predicted value at point 1718: 1605: 1569:random subset of the features 1520:The number of samples/trees, 811:Computational learning theory 375:Expectation–maximization (EM) 10126:10.1007/978-3-540-74469-6_35 10004:10.1016/j.inffus.2020.03.013 9728:10.1007/978-3-540-74469-6_35 9043:"Extremely randomized trees" 7786:Consistency of centered KeRF 7719:is uniformly distributed on 7712:{\displaystyle \mathbf {X} } 7657:{\displaystyle \mathbf {X} } 7635:{\displaystyle \varepsilon } 5297:{\displaystyle \mathbf {z} } 5275:{\displaystyle \mathbf {x} } 5096:{\displaystyle \mathbf {x} } 4290:{\displaystyle \mathbf {x} } 3888:, independent of the sample 3789:{\displaystyle \mathbf {x} } 3629:{\displaystyle \mathbf {X} } 3283: 1105: 768:Coefficient of determination 615:Convolutional neural network 327:Support vector machine (SVM) 7: 10027:"Born-Again Tree Ensembles" 9439:Breiman, Leo (2017-10-25). 8933:Dietterich, Thomas (2000). 8570:"Stochastic Discrimination" 8364: 8081:Consistency of uniform KeRF 4317:{\displaystyle \Theta _{j}} 4297:, designed with randomness 3251: 2894:points in the same leaf as 2686:{\displaystyle W(x_{i},x')} 2430:-nearest neighbor algorithm 1648:{\displaystyle {\sqrt {p}}} 919:Outline of machine learning 816:Empirical risk minimization 10: 10266: 10108:Prinzie A, Poel D (2007). 9892:10.1198/016214505000001230 9705:10.1016/j.eswa.2007.01.029 9342:10.1109/tpami.2016.2636831 9305:10.1016/j.csda.2006.12.030 8881:10.1162/neco.1997.9.7.1545 8793:Liaw A (16 October 2012). 8544:(2nd ed.). Springer. 8152:, there exists a constant 7883:{\displaystyle C_{1}>0} 7857:, there exists a constant 6547:such that, almost surely, 5536:, which defines the KeRF. 
3375:From random forest to KeRF 2532:{\displaystyle {\hat {y}}} 1612:extremely randomized trees 1559: 1231:Sample, with replacement, 1146: 1142: 1114: 1032: 556:Feedforward neural network 307:Artificial neural networks 15: 10230:Classification algorithms 9613:10.1038/modpathol.3800322 9063:10.1007/s10994-006-6226-1 8395:Non-parametric statistics 6989:such that, almost surely 6894:When the number of trees 3379:Given a training sample 3304:Notations and definitions 2705:, the weights for points 539:Artificial neural network 10250:Computational statistics 848:Journals and conferences 795:Mathematical foundations 705:Temporal difference (TD) 561:Recurrent neural network 481:Conditional random field 404:Dimensionality reduction 152:Dimensionality reduction 114:Quantum machine learning 109:Neuromorphic engineering 69:Self-supervised learning 64:Semi-supervised learning 10170:10.1073/pnas.1800256115 9565:10.1198/106186006X94072 9445:. New York: Routledge. 8954:10.1023/A:1007607513941 8776:10.1023/A:1010933404324 8433:Random Decision Forests 7664:, with finite variance 4275:is the cell containing 3262:naive Bayes classifiers 1655:for classification and 1235:training examples from 964:random decision forests 257:Apprenticeship learning 9849:Cite journal requires 9253:10.1061/JPEODX.0000175 8653:10.1214/aos/1032181157 8377:Decision tree learning 8335: 8172: 8171:{\displaystyle C>0} 8146: 8105: 8071: 7904: 7884: 7851: 7810: 7772: 7752: 7713: 7691: 7658: 7636: 7616: 7572: 7565: 7330: 7181: 7051: 6983: 6908: 6887: 6880: 6706: 6655: 6541: 6473: 6367: 6302: 6195: 6085: 5993: 5860: 5753: 5643: 5558: 5530: 5480: 5405: 5298: 5276: 5254: 5198: 5117: 5097: 5075: 5048: 4972: 4951: 4893: 4783: 4663: 4637: 4532: 4482: 4419: 4349: 4318: 4291: 4269: 4218: 4103: 4057: 4018: 3913: 3882: 3860: 3810: 3790: 3768: 3715: 3695: 3630: 3608: 3588: 3543: 3509: 3462: 3356: 3328: 3228: 3208: 3175: 3114: 3078: 3006: 2985: 2933:, its predictions are 2927: 2877: 2794: 2726: 2687: 2638: 2589: 2533: 2510:that make predictions 2504: 2380: 2360: 2340: 2294: 2274: 2211: 2191: 2164: 2137: 2117: 2001: 1893: 1873: 1853: 1826: 1741:Permutation Importance 1689: 1675:for regression, where 1669: 1649: 1562:Random subspace method 1532:, or by observing the 1512: 1437: 1377: 1348: 1209:, bagging repeatedly ( 1166: 1117:Decision tree learning 1083:as an estimate of the 1000:random subspace method 806:Bias–variance tradeoff 688:Reinforcement learning 664:Spiking neural network 74:Reinforcement learning 9451:10.1201/9781315139470 9006:10.1007/s100440200009 8336: 8173: 8147: 8106: 8072: 7905: 7885: 7852: 7811: 7773: 7753: 7714: 7692: 7659: 7637: 7617: 7566: 7331: 7182: 7052: 6984: 6916: 6909: 6881: 6707: 6635: 6542: 6493: 6474: 6334: 6282: 6175: 6086: 5994: 5840: 5733: 5644: 5559: 5531: 5460: 5385: 5299: 5277: 5255: 5178: 5118: 5098: 5076: 5074:{\displaystyle Y_{i}} 5049: 4952: 4931: 4873: 4784: 4643: 4617: 4533: 4483: 4399: 4350: 4319: 4292: 4270: 4219: 4083: 4058: 3998: 3914: 3883: 3861: 3811: 3791: 3769: 3716: 3696: 3631: 3609: 3589: 3544: 3510: 3463: 3357: 3329: 3229: 3209: 3207:{\displaystyle x_{i}} 3176: 3094: 3058: 2986: 2965: 2928: 2926:{\displaystyle W_{j}} 2898:, and zero otherwise. 2878: 2815:, and zero otherwise. 
2795: 2740:-NN, the weights are 2727: 2725:{\displaystyle x_{i}} 2688: 2639: 2569: 2534: 2505: 2381: 2361: 2341: 2295: 2275: 2212: 2192: 2190:{\displaystyle T_{i}} 2165: 2163:{\displaystyle n_{T}} 2143:indicates a feature, 2138: 2118: 1974: 1894: 1874: 1854: 1827: 1690: 1670: 1650: 1513: 1417: 1378: 1328: 1171:bootstrap aggregating 1156: 1149:Bootstrap aggregating 642:Neural radiance field 464:Structured prediction 187:Structured prediction 59:Unsupervised learning 10206:(Leo Breiman's site) 8985:Ho, Tin Kam (2002). 8678:Kleinberg E (2000). 8639:Annals of Statistics 8632:Kleinberg E (1996). 8568:Kleinberg E (1990). 8430:Ho, Tin Kam (1995). 8404:Randomized algorithm 8182: 8156: 8115: 8089: 7914: 7894: 7861: 7820: 7794: 7762: 7751:{\displaystyle ^{d}} 7723: 7701: 7668: 7646: 7626: 7583: 7343: 7339:Then almost surely, 7192: 7062: 6996: 6922: 6898: 6716: 6712:Then almost surely, 6551: 6499: 6095: 6010: 5653: 5568: 5548: 5308: 5286: 5264: 5127: 5107: 5085: 5058: 4793: 4542: 4495: 4359: 4328: 4301: 4279: 4228: 4067: 3923: 3892: 3870: 3820: 3800: 3778: 3725: 3705: 3640: 3618: 3598: 3553: 3519: 3472: 3383: 3338: 3318: 3268:Kernel random forest 3218: 3191: 2937: 2910: 2822: 2744: 2709: 2650: 2551: 2514: 2444: 2370: 2350: 2304: 2284: 2221: 2201: 2174: 2147: 2127: 1937: 1915:partial permutations 1883: 1863: 1843: 1749: 1679: 1659: 1635: 1404: 1300: 1096:generalization error 1085:generalization error 1058:Thomas G. Dietterich 831:Statistical learning 729:Learning with humans 521:Local outlier factor 10161:2018PNAS..115.1690D 10037:. PMLR: 9743–9753. 9480:. Springer Nature. 8767:2001MachL..45....5B 8219: 7951: 7890:such that, for all 7575:Consistency results 6433: for all  6115: 5945: for all  5673: 3457: 2499: 1821: 1724:Variable importance 1528:can be found using 674:Electrochemical RAM 581:reservoir computing 312:Logistic regression 231:Supervised learning 217:Multimodal learning 192:Feature engineering 137:Generative modeling 99:Rule-based learning 94:Curriculum learning 54:Supervised learning 29:Part of a series on 9992:Information Fusion 9144:(512): 1770–1784. 8858:Neural Computation 8601:10.1007/BF01531079 8532:Tibshirani, Robert 8331: 8193: 8168: 8142: 8101: 8067: 7925: 7900: 7880: 7847: 7806: 7768: 7748: 7709: 7687: 7654: 7632: 7612: 7561: 7542: 7326: 7177: 7047: 6979: 6904: 6876: 6702: 6537: 6469: 6213: 6098: 6081: 5989: 5771: 5656: 5639: 5554: 5526: 5294: 5272: 5250: 5113: 5093: 5071: 5044: 4779: 4528: 4478: 4345: 4314: 4287: 4265: 4214: 4053: 3909: 3878: 3856: 3806: 3786: 3764: 3711: 3691: 3626: 3604: 3584: 3539: 3505: 3458: 3437: 3352: 3324: 3224: 3204: 3171: 2923: 2873: 2811:points closest to 2790: 2722: 2683: 2634: 2529: 2500: 2479: 2401:Mean squared error 2376: 2356: 2336: 2290: 2270: 2207: 2187: 2160: 2133: 2113: 2054: 1889: 1869: 1849: 1822: 1801: 1685: 1665: 1645: 1508: 1373: 1167: 242: • 157:Density estimation 10235:Ensemble learning 10135:978-3-540-74467-2 9737:978-3-540-74467-2 9487:978-3-030-38021-2 9460:978-1-315-13947-0 9326:(11): 2142–2153. 
8709:10.1109/34.857004 8493:10.1109/34.709601 8389:Gradient boosting 8383:Ensemble learning 8203: 7935: 7903:{\displaystyle n} 7771:{\displaystyle m} 7521: 7474: 7462: 7391: 6907:{\displaystyle M} 6847: 6835: 6764: 6633: 6610: 6434: 6424: 6270: 6255: 6138: 6023: 5946: 5828: 5813: 5696: 5581: 5557:{\displaystyle k} 5524: 5321: 5176: 5123:finite forest as 5116:{\displaystyle M} 4929: 4806: 4772: 4615: 4212: 3996: 3809:{\displaystyle j} 3714:{\displaystyle M} 3607:{\displaystyle Y} 3327:{\displaystyle k} 3227:{\displaystyle j} 3092: 2963: 2949: 2871: 2788: 2563: 2526: 2379:{\displaystyle j} 2359:{\displaystyle t} 2293:{\displaystyle j} 2268: 2210:{\displaystyle i} 2136:{\displaystyle x} 2036: 2010: 2002: 1972: 1943: 1892:{\displaystyle j} 1872:{\displaystyle j} 1852:{\displaystyle j} 1688:{\displaystyle p} 1668:{\displaystyle p} 1643: 1503: 1502: 1477: 1326: 1312: 1213:times) selects a 968:ensemble learning 957: 956: 762:Model diagnostics 745:Human-in-the-loop 588:Boltzmann machine 501:Anomaly detection 297:Linear regression 212:Ontology learning 207:Grammar induction 182:Semantic analysis 177:Association rules 162:Anomaly detection 104:Neuro-symbolic AI 10257: 10192: 10182: 10172: 10155:(8): 1690–1692. 10139: 10082: 10081: 10066: 10065: 10055: 10049: 10048: 10046: 10022: 10016: 10015: 9983: 9977: 9976: 9974: 9962: 9953: 9952: 9950: 9934: 9925: 9924: 9922: 9910: 9904: 9903: 9885: 9876:(474): 578–590. 9865: 9859: 9858: 9852: 9847: 9845: 9837: 9826: 9820: 9819: 9817: 9805: 9792: 9791: 9751: 9742: 9741: 9715: 9709: 9708: 9699:(3): 1721–1732. 9688: 9682: 9681: 9641: 9626: 9625: 9615: 9600:Modern Pathology 9591: 9585: 9584: 9558: 9538: 9532: 9531: 9529: 9513: 9504: 9498: 9492: 9491: 9471: 9465: 9464: 9436: 9430: 9429: 9427: 9426: 9412: 9403: 9402: 9392: 9368: 9362: 9361: 9335: 9315: 9309: 9308: 9298: 9280: 9271: 9265: 9264: 9236: 9230: 9229: 9219: 9195: 9189: 9188: 9178: 9172: 9171: 9161: 9129: 9123: 9119: 9113: 9110: 9104: 9101: 9095: 9092: 9086: 9083: 9077: 9074: 9068: 9067: 9065: 9050:Machine Learning 9047: 9038: 9032: 9031: 9029: 9028: 9022: 9016:. Archived from 8991: 8982: 8976: 8975: 8965: 8959: 8958: 8956: 8940:Machine Learning 8930: 8924: 8913: 8907: 8906: 8904: 8903: 8897: 8891:. Archived from 8874: 8865:(7): 1545–1588. 8854: 8841: 8832: 8831: 8824: 8818: 8815: 8809: 8808: 8806: 8804: 8799: 8790: 8781: 8780: 8778: 8754:Machine Learning 8749:"Random Forests" 8741: 8728: 8727: 8725: 8719:. Archived from 8702: 8684: 8675: 8666: 8665: 8655: 8646:(6): 2319–2349. 8629: 8620: 8619: 8617: 8611:. Archived from 8594: 8585:(1–4): 207–239. 

