A version of SVM for regression was proposed in 1996 by Vladimir N. Vapnik, Harris Drucker, Christopher J. C. Burges, Linda Kaufman and Alexander J. Smola. This method is called support vector regression (SVR). The model produced by support vector classification (as described above) depends only on a subset of the training data, because the cost function for building the model does not care about training points that lie beyond the margin. Analogously, the model produced by SVR depends only on a subset of the training data, because the cost function ignores any training data close to the model prediction. Another SVM version, known as the least-squares support vector machine (LS-SVM), has been proposed by Suykens and Vandewalle.
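As a hedged illustration (not part of the original method description), the following sketch uses scikit-learn's SVR, an assumed dependency; `epsilon` sets the tube within which training errors are ignored, so only points outside it become support vectors.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

# epsilon-insensitive regression: residuals smaller than epsilon are ignored
model = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(X, y)
print(model.predict([[0.5]]))   # prediction near sin(0.5)
print(len(model.support_))      # only a subset of the data are support vectors
```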
Classification of new instances for the one-versus-all case is done by a winner-takes-all strategy, in which the classifier with the highest output function assigns the class (it is important that the output functions be calibrated to produce comparable scores). For the one-versus-one approach, classification is done by a max-wins voting strategy: every classifier assigns the instance to one of its two classes, the vote for the assigned class is increased by one, and the class with the most votes determines the instance's classification.
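A minimal sketch of the two decision strategies, assuming a hypothetical dict `classifiers` that maps a class label (one-versus-all) or a label pair (one-versus-one) to a trained binary classifier with scikit-learn-style `decision_function`/`predict` methods:

```python
from collections import Counter

def one_vs_all_predict(classifiers, x):
    # winner-takes-all: the classifier with the highest output assigns the class
    return max(classifiers, key=lambda c: classifiers[c].decision_function([x])[0])

def one_vs_one_predict(classifiers, x):
    # max-wins voting: each pairwise classifier casts one vote for its predicted class
    votes = Counter(clf.predict([x])[0] for clf in classifiers.values())
    return votes.most_common(1)[0][0]
```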
\[
\begin{aligned}
\text{maximize}\;\; f(c_1 \ldots c_n) &= \sum_{i=1}^n c_i - \frac{1}{2} \sum_{i=1}^n \sum_{j=1}^n y_i c_i \left(\varphi(\mathbf{x}_i) \cdot \varphi(\mathbf{x}_j)\right) y_j c_j \\
&= \sum_{i=1}^n c_i - \frac{1}{2} \sum_{i=1}^n \sum_{j=1}^n y_i c_i \, k(\mathbf{x}_i, \mathbf{x}_j) \, y_j c_j \\
\text{subject to } & \sum_{i=1}^n c_i y_i = 0, \;\text{and}\; 0 \le c_i \le \frac{1}{2n\lambda} \;\text{for all } i.
\end{aligned}
\]
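A sketch of solving this kernelized dual with a generic constrained optimizer (scipy's SLSQP, an assumed dependency); production SVM solvers use specialized decompositions instead, but the objective, box constraints, and equality constraint carry over directly:

```python
import numpy as np
from scipy.optimize import minimize

def solve_dual(K, y, lam):
    """Maximize sum(c) - 1/2 (c*y)' K (c*y) s.t. sum(c*y)=0, 0 <= c_i <= 1/(2 n lam)."""
    n = len(y)
    neg_dual = lambda c: -(c.sum() - 0.5 * (c * y) @ K @ (c * y))
    res = minimize(neg_dual, np.zeros(n),
                   bounds=[(0.0, 1.0 / (2 * n * lam))] * n,
                   constraints=[{"type": "eq", "fun": lambda c: c @ y}])
    return res.x  # the coefficients c_i
```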
If the training data is linearly separable, we can select two parallel hyperplanes that separate the two classes of data, so that the distance between them is as large as possible. The region bounded by these two hyperplanes is called the "margin", and the maximum-margin hyperplane is the hyperplane that lies halfway between them. With a normalized or standardized dataset, these hyperplanes can be described by the equations
\[
\begin{aligned}
&\text{maximize}\;\; f(c_1 \ldots c_n) = \sum_{i=1}^n c_i - \frac{1}{2} \sum_{i=1}^n \sum_{j=1}^n y_i c_i (\mathbf{x}_i^\mathsf{T} \mathbf{x}_j) y_j c_j, \\
&\text{subject to } \sum_{i=1}^n c_i y_i = 0, \;\text{and}\; 0 \le c_i \le \frac{1}{2n\lambda} \;\text{for all } i.
\end{aligned}
\]

\[
\begin{aligned}
&\underset{\mathbf{w},\, b,\, \boldsymbol{\zeta}}{\operatorname{minimize}} && \|\mathbf{w}\|_2^2 + C \sum_{i=1}^n \zeta_i \\
&\text{subject to} && y_i(\mathbf{w}^\top \mathbf{x}_i - b) \ge 1 - \zeta_i, \quad \zeta_i \ge 0 \quad \forall i \in \{1, \dots, n\}
\end{aligned}
\]
Platt's sequential minimal optimization (SMO) algorithm breaks the problem down into two-dimensional sub-problems that are solved analytically, eliminating the need for a numerical optimization algorithm and matrix storage. It is conceptually simple, easy to implement, generally faster, and has better scaling properties for difficult SVM problems.
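In practice SMO is usually consumed through a library; for instance, scikit-learn's SVC (an assumed dependency) wraps LIBSVM, which uses an SMO-type decomposition internally. A usage sketch under that assumption:

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X, y)   # fit via the library's SMO-type solver
print(clf.n_support_)                      # support-vector count per class
```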
\[
\begin{aligned}
&\text{maximize}\;\; f(c_1 \ldots c_n) = \sum_{i=1}^n c_i - \frac{1}{2} \sum_{i=1}^n \sum_{j=1}^n y_i c_i (x_i \cdot x_j) y_j c_j, \\
&\text{subject to } \sum_{i=1}^n c_i y_i = 0, \;\text{and}\; 0 \le c_i \le \frac{1}{2n\lambda} \;\text{for all } i.
\end{aligned}
\]
Seen this way, support vector machines belong to a natural class of algorithms for statistical inference, and many of their unique features are due to the behavior of the hinge loss. This perspective can provide further insight into how and why SVMs work, and allows us to better analyze their statistical properties.
Methods based on SVM weights have been suggested as a mechanism for interpreting SVM models, and support vector machine weights have also been used to interpret SVM models in the past. Post hoc interpretation of support vector machine models, in order to identify features used by the model to make predictions, is a relatively new area of research with special significance in the biological sciences.
selected to suit the problem. The hyperplanes in the higher-dimensional space are defined as the set of points whose dot product with a vector in that space is constant, where such a set of vectors is an orthogonal (and thus minimal) set of vectors that defines a hyperplane.
The structured support-vector machine is an extension of the traditional SVM model. While the SVM model is primarily designed for binary classification, multiclass classification, and regression tasks, the structured SVM broadens its application to handle general structured output labels, for example parse trees, classification with taxonomies, sequence alignment, and many more.
is projected onto the nearest vector of coefficients that satisfies the given constraints (typically, Euclidean distances are used). The process is then repeated until a near-optimal vector of coefficients is obtained. The resulting algorithm is extremely fast in practice, although few performance guarantees have been proven.
Recent algorithms for finding the SVM classifier include sub-gradient descent and coordinate descent. Both techniques have proven to offer significant advantages over the traditional approach when dealing with large, sparse datasets: sub-gradient methods are especially efficient when there are many training examples, and coordinate descent when the dimension of the feature space is high.
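A minimal Pegasos-style sub-gradient sketch (a simplification, assuming labels in {-1, +1} and a dense feature matrix; the projection step of the published algorithm is omitted):

```python
import numpy as np

def pegasos(X, y, lam=0.01, epochs=10, seed=0):
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            t += 1
            eta = 1.0 / (lam * t)         # decaying step size
            margin = y[i] * (w @ X[i])
            w *= (1 - eta * lam)          # shrink: sub-gradient of the regularizer
            if margin < 1:                # hinge term active: margin violated
                w += eta * y[i] * X[i]
    return w
```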
\[
\begin{aligned}
&\text{minimize } \frac{1}{n} \sum_{i=1}^n \zeta_i + \lambda \|\mathbf{w}\|^2 \\
&\text{subject to } y_i\left(\mathbf{w}^\mathsf{T} \mathbf{x}_i - b\right) \ge 1 - \zeta_i \;\text{ and }\; \zeta_i \ge 0, \;\text{for all } i.
\end{aligned}
\]
in that space. For this reason, it was proposed that the original finite-dimensional space be mapped into a much higher-dimensional space, presumably making the separation easier in that space. To keep the computational load reasonable, the mappings used by SVM schemes are designed to ensure that
Preprocessing of the data (standardization) is highly recommended to enhance the accuracy of classification. There are several methods of standardization, such as min-max scaling, normalization by decimal scaling, and the Z-score; subtraction of the mean and division by the standard deviation of each feature is usually used for SVM.
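A sketch of the Z-score variant (scikit-learn's StandardScaler, if available, does the same while also remembering the training statistics for test-time use):

```python
import numpy as np

def zscore(X):
    # standardize each feature (column) to zero mean and unit variance
    return (X - X.mean(axis=0)) / X.std(axis=0)
```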
of the primal and dual problems. Instead of solving a sequence of broken-down problems, this approach directly solves the problem altogether. To avoid solving a linear system involving the large kernel matrix, a low-rank approximation to the matrix is often used in the kernel trick.
Thus, SVMs use the kernel trick to implicitly map their inputs into high-dimensional feature spaces where linear classification can be performed. Being max-margin models, SVMs are resilient to noisy data (for example, misclassified examples). SVMs can also be used for regression tasks.
\[
\begin{aligned}
&\underset{\mathbf{w},\, b}{\operatorname{minimize}} && \frac{1}{2}\|\mathbf{w}\|^2 \\
&\text{subject to} && y_i(\mathbf{w}^\top \mathbf{x}_i - b) \ge 1 \quad \forall i \in \{1, \dots, n\}
\end{aligned}
\]
, or other tasks like outlier detection. Intuitively, a good separation is achieved by the hyperplane that has the largest distance to the nearest training-data point of any class (the so-called functional margin), since in general the larger the margin, the lower the generalization error of the classifier.

that correctly classifies the data. This extends the geometric interpretation of SVM: for linear classification, the empirical risk is minimized by any function whose margins lie between the support vectors, and the simplest of these is the max-margin classifier.
can also be performed using SVMs. Experimental results show that SVMs achieve significantly higher search accuracy than traditional query refinement schemes after just three to four rounds of relevance feedback. This is also true for image segmentation systems, including those using a modified version SVM that uses the privileged approach as suggested by Vapnik.

, often requiring the evaluation of far fewer parameter combinations than grid search. The final model, which is used for testing and for classifying new data, is then trained on the whole training set using the selected parameters.
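For reference, the plain grid-search baseline can be sketched with scikit-learn's GridSearchCV (an assumed dependency) over the exponentially growing grids mentioned elsewhere in this article; with the default `refit=True` it retrains on the whole training set using the selected parameters:

```python
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

param_grid = {
    "C":     [2.0**k for k in range(-5, 16, 2)],   # 2^-5, 2^-3, ..., 2^15
    "gamma": [2.0**k for k in range(-15, 4, 2)],   # 2^-15, 2^-13, ..., 2^3
}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
# search.fit(X_train, y_train); search.best_params_ holds the chosen pair
```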
(for example, that they are generated by a finite Markov process), if the set of hypotheses being considered is small enough, the minimizer of the empirical risk will closely approximate the minimizer of the expected risk as $n$ grows large.

In this way, the sum of kernels above can be used to measure the relative nearness of each test point to the data points originating in one or the other of the sets to be discriminated. Note that the set of points
. Proceedings of the Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics: HLT-NAACL 2004. Association for Computational Linguistics. pp. 233–240.
The transformation may be nonlinear and the transformed space high-dimensional; although the classifier is a hyperplane in the transformed feature space, it may be nonlinear in the original input space.
, representing the data only through a set of pairwise similarity comparisons between the original data points using a kernel function, which transforms them into coordinates in a higher-dimensional feature space.

Florian Wenzel developed two different versions: a variational inference (VI) scheme for the Bayesian kernel support vector machine (SVM) and a stochastic version (SVI) for the linear Bayesian SVM.
lie on the correct side of the margin (note that we can add a weight to either term in the equation above). By deconstructing the hinge loss, this optimization problem can be massaged into the following:
(e.g., LIBLINEAR). LIBLINEAR has some attractive training-time properties: each convergence iteration takes time linear in the time taken to read the training data, and the iterations also have a Q-linear convergence property, making the algorithm extremely fast.
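A usage sketch assuming scikit-learn, whose LinearSVC estimator is backed by LIBLINEAR; sparse, high-dimensional data is where these linear solvers shine:

```python
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.svm import LinearSVC

X = sparse_random(1000, 5000, density=0.01, format="csr", random_state=0)
y = np.random.default_rng(0).integers(0, 2, size=1000)
clf = LinearSVC(C=1.0).fit(X, y)   # per-iteration cost scales with reading the data
```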
, between the two classes. So we choose the hyperplane so that the distance from it to the nearest data point on each side is maximized. If such a hyperplane exists, it is known as the maximum-margin hyperplane, and the linear classifier it defines is known as a maximum-margin classifier; or equivalently, the perceptron of optimal stability.
In light of the above discussion, we see that the SVM technique is equivalent to empirical risk minimization with Tikhonov regularization, where in this case the loss function is the hinge loss $\ell(y, z) = \max(0, 1 - yz)$.
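The hinge loss itself is a one-liner (a sketch; `y` is the true label in {-1, +1} and `z` the classifier score):

```python
import numpy as np

def hinge(y, z):
    # zero once the margin is respected (y*z >= 1), linear in the violation otherwise
    return np.maximum(0.0, 1.0 - y * z)
```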
\[
\begin{aligned}
& y_i(\mathbf{w} \cdot \mathbf{x}_i - b) \ge 1, \\
& y_j^\star(\mathbf{w} \cdot \mathbf{x}_j^\star - b) \ge 1,
\end{aligned}
\]
methods can be adapted, where instead of taking a step in the direction of the function's gradient, a step is taken in the direction of a vector selected from the function's sub-gradient. This approach has the advantage that, for certain implementations, the number of iterations does not scale with $n$, the number of data points.
The SVM algorithm has been widely applied in the biological and other sciences. SVMs have been used to classify proteins, with up to 90% of the compounds classified correctly.
problem into a single optimization problem, rather than decomposing it into multiple binary classification problems. See also Lee, Lin and Wahba, and Van den Burg and Groenen.
range of the true predictions. Slack variables are usually added into the above to allow for errors and to allow approximation in the case the above problem is infeasible.
9654:
12538:
12518:
12010:
11972:
11174:. Thus, in a sufficiently rich hypothesis space—or equivalently, for an appropriately chosen kernel—the SVM classifier will converge to the simplest function (in terms of
are replaced by kernels, is easily derived in the dual representation of the SVM problem. This allows the algorithm to fit the maximum-margin hyperplane in a transformed feature space.
mapped into any hyperplane can be quite convoluted as a result, allowing much more complex discrimination between sets that are not convex at all in the original space.
The SVM is only directly applicable for two-class tasks. Therefore, algorithms that reduce the multi-class task to several binary problems have to be applied; see the multi-class SVM section.
Aizerman, Mark A.; Braverman, Emmanuel M. & Rozonoer, Lev I. (1964). "Theoretical foundations of the potential function method in pattern recognition learning".
Joachims, Thorsten (1998). "Text categorization with Support Vector Machines: Learning with many relevant features". In Nédellec, Claire; Rouveirol, Céline (eds.).
Hsieh, Cho-Jui; Chang, Kai-Wei; Lin, Chih-Jen; Keerthi, S. Sathiya; Sundararajan, S. (2008-01-01). "A dual coordinate descent method for large-scale linear SVM".
For sufficiently large values of $C$, it will behave similarly to the hard-margin SVM, if the input data are linearly classifiable, but will still learn whether a classification rule is viable or not.

There are many hyperplanes that might classify the data. One reasonable choice as the best hyperplane is the one that represents the largest separation, or margin, between the two classes.
The parameters of the maximum-margin hyperplane are derived by solving the optimization. There exist several specialized algorithms for quickly solving the quadratic programming (QP) problem that arises from SVMs, mostly relying on heuristics for breaking the problem down into smaller, more manageable chunks.
The popularity of SVMs is likely due to their amenability to theoretical analysis, and their flexibility in being applied to a wide variety of tasks, including structured prediction problems.
Suppose now that we would like to learn a nonlinear classification rule which corresponds to a linear classification rule for the transformed data points $\varphi(\mathbf{x}_i)$.
lies on the correct side of the margin. For data on the wrong side of the margin, the function's value is proportional to the distance from the margin.
The support vector clustering algorithm, created by Hava Siegelmann and Vladimir Vapnik, applies the statistics of support vectors, developed in the support vector machines algorithm, to categorize unlabeled data. These data sets require unsupervised learning approaches, which attempt to find natural clustering of the data to groups and, then, to map new data according to these clusters.
Multiclass SVM aims to assign labels to instances by using support vector machines, where the labels are drawn from a finite set of several elements.
\[
\mathbf{z} \mapsto \operatorname{sgn}(\mathbf{w}^\mathsf{T} \varphi(\mathbf{z}) - b) = \operatorname{sgn}\left(\left[\sum_{i=1}^n c_i y_i \, k(\mathbf{x}_i, \mathbf{z})\right] - b\right).
\]
The special case of linear support vector machines can be solved more efficiently by the same kind of algorithms used to optimize its close cousin, logistic regression
The difference between the three lies in the choice of loss function: regularized least-squares amounts to empirical risk minimization with the square-loss, $\ell_{sq}(y, z) = (y - z)^2$; logistic regression employs the log-loss,
\[
\begin{aligned}
b = \mathbf{w}^\mathsf{T} \varphi(\mathbf{x}_i) - y_i &= \left[\sum_{k=1}^n c_k y_k \, \varphi(\mathbf{x}_k) \cdot \varphi(\mathbf{x}_i)\right] - y_i \\
&= \left[\sum_{k=1}^n c_k y_k \, k(\mathbf{x}_k, \mathbf{x}_i)\right] - y_i.
\end{aligned}
\]
Fan, Rong-En; Chang, Kai-Wei; Hsieh, Cho-Jui; Wang, Xiang-Rui; Lin, Chih-Jen (2008). "LIBLINEAR: A library for large linear classification".
Figure: Maximum-margin hyperplane and margins for an SVM trained with samples from two classes. Samples on the margin are called the support vectors.
Cuingnet, Rémi; Rosso, Charlotte; Chupin, Marie; Lehéricy, Stéphane; Dormont, Didier; Benali, Habib; Samson, Yves; Colliot, Olivier (2011).
of test examples to be classified. Formally, a transductive support vector machine is defined by the following primal optimization problem:
Shalev-Shwartz, Shai; Singer, Yoram; Srebro, Nathan; Cotter, Andrew (2010-10-16). "Pegasos: primal estimated sub-gradient solver for SVM".
Wenzel, Florian; Galy-Fajou, Theo; Deutsch, Matthäus; Kloft, Marius (2017). "Bayesian Nonlinear Support Vector Machines for Big Data".
"Analytic estimation of statistical significance maps for support vector machine based multi-variate image analysis and classification"
of pairs of input data vectors may be computed easily in terms of the variables in the original space, by defining them in terms of a kernel function $k(x, y)$ selected to suit the problem.
Gaussian radial basis function:
\[
k(\mathbf{x}_i, \mathbf{x}_j) = \exp\left(-\gamma \left\|\mathbf{x}_i - \mathbf{x}_j\right\|^2\right)
\]
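A numpy sketch evaluating this kernel as a full Gram matrix, K[i, j] = exp(-gamma ||x_i - x_j||^2):

```python
import numpy as np

def rbf_gram(X, gamma):
    sq = (X**2).sum(axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)   # pairwise squared distances
    return np.exp(-gamma * np.maximum(d2, 0.0))        # clamp tiny negatives from rounding
```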
, they give us more information than we need. In fact, they give us enough information to completely describe the distribution of $y_x$.
\[
\hat{\mathbf{w}}, b : \mathbf{x} \mapsto \operatorname{sgn}(\hat{\mathbf{w}}^\mathsf{T} \mathbf{x} - b)
\]
\[
\mathcal{D}^\star = \{\mathbf{x}_i^\star \mid \mathbf{x}_i^\star \in \mathbb{R}^p\}_{i=1}^k
\]
Whereas the original problem may be stated in a finite-dimensional space, it often happens that the sets to discriminate are not linearly separable in that space.
\[
y_x = \begin{cases} 1 & \text{with probability } p_x \\ -1 & \text{with probability } 1 - p_x \end{cases}
\]
\[
y_i(\mathbf{w}^\mathsf{T}\mathbf{x}_i - b) = 1 \iff b = \mathbf{w}^\mathsf{T}\mathbf{x}_i - y_i.
\]
An important consequence of this geometric description is that the max-margin hyperplane is completely determined by those $\mathbf{x}_i$
, as their application can significantly reduce the need for labeled training instances in both the standard inductive and transductive settings.
R. Collobert and S. Bengio (2004). Links between Perceptrons, MLPs and SVMs. Proc. Int'l Conf. on Machine Learning (ICML).
The classical approach, which involves reducing the soft-margin problem to a quadratic programming problem, is detailed below. Then, more recent approaches such as sub-gradient descent and coordinate descent will be discussed.
to maximum-margin hyperplanes. The "soft margin" incarnation, as is commonly used in software packages, was proposed by Corinna Cortes and Vapnik in 1993 and published in 1995.
, 1995, Vapnik et al., 1997) SVMs are among the most studied models, being based on the statistical learning frameworks of VC theory proposed by Vapnik and Chervonenkis.
can be rewritten as a constrained optimization problem with a differentiable objective function in the following way.
equation. We also have to prevent data points from falling into the margin, so we add the following constraint: for each $i$ either
(where the parameters are connected via probability distributions). This extended view allows the application of Bayesian techniques to SVMs, such as flexible feature modeling, automatic hyperparameter tuning, and predictive uncertainty quantification.
The effectiveness of SVM depends on the selection of the kernel, the kernel's parameters, and the soft margin parameter $C$.
\[
y_i(\mathbf{w}^\mathsf{T}\mathbf{x}_i - b) \ge 1, \quad \text{for all } 1 \le i \le n.
\]
\[
\hat{f} = \mathop{\mathrm{arg\,min}}_{f \in \mathcal{H}} \hat{\varepsilon}(f) + \mathcal{R}(f).
\]
In order for the minimization problem to have a well-defined solution, we have to place constraints on the set $\mathcal{H}$ of hypotheses being considered.
\[
k(\mathbf{x}_i, \mathbf{x}_j) = \varphi(\mathbf{x}_i) \cdot \varphi(\mathbf{x}_j)
\]
\[
\mathbf{w} \cdot \varphi(\mathbf{x}) = \sum_i \alpha_i y_i \, k(\mathbf{x}_i, \mathbf{x})
\]
\[
k(\mathbf{x}_i, \mathbf{x}_j) = \varphi(\mathbf{x}_i) \cdot \varphi(\mathbf{x}_j)
\]
\[
f^*(x) = \begin{cases} 1 & \text{if } p_x \ge 1/2 \\ -1 & \text{otherwise} \end{cases}
\]
13753:(Technical report). Department of Computer Science and Information Engineering, National Taiwan University.
Suppose some given data points each belong to one of two classes, and the goal is to decide which class a new data point will be in.
Drucker, Harris; Burges, Christ. C.; Kaufman, Linda; Smola, Alexander J.; and Vapnik, Vladimir N. (1997); "
\[
k(\mathbf{x}_i, \mathbf{x}_j) = \tanh(\kappa \, \mathbf{x}_i \cdot \mathbf{x}_j + c)
\]
A. Maity (2016). "Supervised Classification of RADARSAT-2 Polarimetric Data for Different Land Features".
Meyer, David; Leisch, Friedrich; Hornik, Kurt (September 2003). "The support vector machine under test".
"Spatial regularization of SVM for the detection of diffusion alterations associated with stroke outcome"
Transductive support vector machines extend SVMs in that they can also treat partially labeled data in semi-supervised learning
Rosasco, Lorenzo; De Vito, Ernesto; Caponnetto, Andrea; Piana, Michele; Verri, Alessandro (2004-05-01).
Ben-Hur, Asa; Horn, David; Siegelmann, Hava; Vapnik, Vladimir N. (2001). "Support vector clustering".
14711:. DIMACS Series in Discrete Mathematics and Theoretical Computer Science. Vol. 70. pp. 13–20.
"Spatial-Taxon Information Granules as Used in Iterative Fuzzy-Decision-Making for Image Segmentation"
We focus on the soft-margin classifier since, as noted above, choosing a sufficiently small value for $\lambda$ yields the hard-margin classifier for linearly classifiable input data.
14156:. Proceedings of the 1999 International Conference on Machine Learning (ICML 1999). pp. 200–209.
12847:. Lecture Notes in Computer Science. Vol. 1327. Berlin, Heidelberg: Springer. pp. 261–271.
14720:. Lecture Notes in Computer Science. Vol. 1398. Berlin, Heidelberg: Springer. p. 137-142.
\[
\zeta_i = \max\left(0, 1 - y_i(\mathbf{w}^\mathsf{T}\mathbf{x}_i - b)\right)
\]
, and the parameters with the best cross-validation accuracy are picked. Alternatively, recent work in Bayesian optimization can be used to select $C$ and $\gamma$
\[
k(\mathbf{x}_i, \mathbf{x}_j) = (\mathbf{x}_i \cdot \mathbf{x}_j + r)^d
\]
"Using SVM weight-based methods to identify causally relevant and non-causally relevant variables"
(as is the case for SVM), a particularly effective technique is to consider only those hypotheses
\[
k(\mathbf{x}_i, \mathbf{x}_j) = (\mathbf{x}_i \cdot \mathbf{x}_j)^d
\]
\[
\mathbf{x} \mapsto \operatorname{sgn}(\mathbf{w}^\mathsf{T}\mathbf{x} - b)
\]
It is not clear that SVMs have better predictive performance than other linear models, such as logistic regression and linear regression.
The difference between the hinge loss and these other loss functions is best stated in terms of target functions: the function that minimizes expected risk for a given pair of random variables.
(originally proposed by Aizerman et al.) to maximum-margin hyperplanes. The kernel trick, where
\[
\hat{\varepsilon}(f) = \frac{1}{n} \sum_{k=1}^n \ell(y_k, f(X_k)).
\]
\[
\mathbf{w}^\mathsf{T}\mathbf{x}_i - b \le -1, \quad \text{if } y_i = -1.
\]
Computing the (soft-margin) SVM classifier amounts to minimizing an expression of the form
Press, William H.; Teukolsky, Saul A.; Vetterling, William T.; Flannery, Brian P. (2007).
The original maximum-margin hyperplane algorithm proposed by Vapnik in 1963 constructed a linear classifier.
\[
\max\left(0, 1 - y_i(\mathbf{w}^\mathsf{T}\mathbf{x}_i - b)\right).
\]
\[
\mathbf{w}^\mathsf{T}\mathbf{x}_i - b \ge 1, \quad \text{if } y_i = 1,
\]
These constraints state that each data point must lie on the correct side of the margin.
vector. We want to find the "maximum-margin hyperplane" that divides the group of points $\mathbf{x}_i$ for which $y_i = 1$ from the group of points for which $y_i = -1$, which is defined so that the distance between the hyperplane and the nearest point $\mathbf{x}_i$ from either group is maximized.
A comparison of the SVM to other classifiers has been made by Meyer, Leisch and Hornik.
can be solved for using quadratic programming, as before. Again, we can find some index $i$ such that
14601:"Predicting and explaining behavioral data with structured feature space decomposition"
Lee, Yoonkyung; Lin, Yi; Wahba, Grace (2004). "Multicategory Support Vector Machines".
Vapnik, Vladimir N.: Invited Speaker. IPMU Information Processing and Management 2014.
Sequential Minimal Optimization: A Fast Algorithm for Training Support Vector Machines
Fan, Rong-En; Chang, Kai-Wei; Hsieh, Cho-Jui; Wang, Xiang-Rui; Lin, Chih-Jen (2008).
is a free parameter that serves as a threshold: all predictions have to be within an $\varepsilon$ range of the true predictions.
Building binary classifiers that distinguish between one of the labels and the rest (one-versus-all) or between every pair of classes (one-versus-one).
\[
y_i(\mathbf{w}^\mathsf{T}\mathbf{x}_i - b) \ge 1 - \zeta_i.
\]
or set of hyperplanes in a high or infinite-dimensional space, which can be used for
"CNN based common approach to handwritten character recognition of multiple scripts"
Proceedings of the fifth annual workshop on Computational learning theory – COLT '92
Awad, Mariette; Khanna, Rahul (2015). "Support Vector Machines for Classification".
. In Gerstner, Wulfram; Germond, Alain; Hasler, Martin; Nicoud, Jean-Daniel (eds.).
Transductive support vector machines were introduced by Vladimir N. Vapnik in 1998.
Uncalibrated class membership probabilities: SVM stems from Vapnik's theory, which avoids estimating probabilities on finite data.
In addition to performing linear classification, SVMs can efficiently perform a non-linear classification using what is called the kernel trick.
An Introduction to Support Vector Machines and other kernel-based learning methods
14364:”Scalable Approximate Inference for the Bayesian Nonlinear Support Vector Machine”
The Elements of Statistical Learning : Data Mining, Inference, and Prediction
For the square-loss, the target function is the conditional expectation function, $f_{sq}(x) = \mathbb{E}\left[y_x \mid x\right]$;
It is noteworthy that working in a higher-dimensional feature space increases the generalization error of support vector machines, although given enough samples the algorithm still performs well.
determines the trade-off between increasing the margin size and ensuring that the $\mathbf{x}_i$ lie on the correct side of the margin.
Allen Zhu, Zeyuan; Chen, Weizhu; Wang, Gang; Zhu, Chenguang; Chen, Zheng (2009).
13083:. Lecture Notes in Computer Science. Vol. 1398. Springer. pp. 137–142.
\[
\mathbf{w} = \sum_{i=1}^n c_i y_i \varphi(\mathbf{x}_i),
\]
On the other hand, one can check that the target function for the hinge loss is exactly $f^*$.
\[
\operatorname{sgn}(f_{sq}) = \operatorname{sgn}(f_{\log}) = f^*
\]
determines the offset of the hyperplane from the origin along the normal vector $\mathbf{w}$.
will be in. In the case of support vector machines, a data point is viewed as a $p$-dimensional vector (a list of $p$ numbers), and we want to know whether we can separate such points with a $(p-1)$-dimensional hyperplane.
Proceedings of the 25th international conference on Machine learning - ICML '08
2015 13th International Conference on Document Analysis and Recognition (ICDAR)
14012:"On the Algorithmic Implementation of Multiclass Kernel-based Vector Machines"
https://www.cs.cornell.edu/people/tj/publications/tsochantaridis_etal_04a.pdf
\[
\mathbf{w} = \sum_i \alpha_i y_i \varphi(\mathbf{x}_i)
\]
Transductive Inference for Text Classification using Support Vector Machines
\[
\mathcal{R}(f) = \lambda_k \lVert f \rVert_{\mathcal{H}}
\]
lies on the boundary of the margin in the transformed space, and then solve
problem. Since the dual maximization problem is a quadratic function of the $c_i$ subject to linear constraints, it is efficiently solvable by quadratic programming algorithms.
\[
\textstyle \sum_i \alpha_i k(x_i, x) = \text{constant}.
\]
SVM light is a collection of software tools for learning and classification using SVM.
James, Gareth; Witten, Daniela; Hastie, Trevor; Tibshirani, Robert (2013).
Statnikov, Alexander; Hardin, Douglas; & Aliferis, Constantin; (2006);
The soft-margin support vector machine described above is an example of an empirical risk minimization (ERM) algorithm for the hinge loss.
(anything on or below this boundary is of the other class, with label −1).
\[
(\mathbf{x}_1, y_1), \ldots, (\mathbf{x}_n, y_n),
\]
1561:, each term in the sum measures the degree of closeness of the test point
that occur in the data base. With this choice of a hyperplane, the points
12739:, a probabilistic sparse-kernel model identical in functional form to SVM
To extend SVM to cases in which the data are not linearly separable, the hinge loss function is helpful:
14308:. Lecture Notes in Computer Science. Vol. 10534. pp. 307–322.
13945:"Solving Multiclass Learning Problems via Error-Correcting Output Codes"
A special property is that they simultaneously minimize the empirical classification error and maximize the geometric margin; hence they are also known as maximum margin classifiers.

While both of these target functions yield the correct classifier, as $\operatorname{sgn}(f_{sq}) = \operatorname{sgn}(f_{\log}) = f^*$,
Warning: most of the literature on the subject defines the bias so that $\mathbf{w}^\mathsf{T}\mathbf{x} + b = 0$.
Kernel SVMs are available in many machine-learning toolkits, including
\[
\mathbf{w} = \sum_{i=1}^n c_i y_i \mathbf{x}_i.
\]
An Introduction to Statistical Learning : with Applications in R
A common choice is a Gaussian kernel, which has a single parameter $\gamma$.
\[
f(\mathbf{w}, b) = \left[\frac{1}{n} \sum_{i=1}^n \max\left(0, 1 - y_i(\mathbf{w}^\mathsf{T}\mathbf{x}_i - b)\right)\right] + \lambda \|\mathbf{w}\|^2.
\]
, so to maximize the distance between the planes we want to minimize $\|\mathbf{w}\|$.
Boser, Bernhard E.; Guyon, Isabelle M.; Vapnik, Vladimir N. (1992).
Recently, a scalable version of the Bayesian SVM was developed by Florian Wenzel
Crammer and Singer proposed a multiclass SVM method which casts the multiclass classification
Typically, each combination of parameter choices is checked using cross-validation
hyperplanes can be chosen to be linear combinations with parameters $\alpha_i$ of images of feature vectors $x_i$
From this perspective, SVM is closely related to other fundamental classification algorithms such as regularized least-squares and logistic regression.
for classification can again be computed by the kernel trick, i.e. $\mathbf{w} \cdot \varphi(\mathbf{x}) = \sum_i \alpha_i y_i \, k(\mathbf{x}_i, \mathbf{x})$.
"1.4. Support Vector Machines — scikit-learn 0.20.2 documentation"
The general kernel SVMs can also be solved more efficiently using sub-gradient descent (e.g. P-packSVM), especially when parallelization is allowed.
\[
|y_i - \langle w, x_i \rangle - b| \le \varepsilon
\]
\[
\lambda \in \{2^{-5}, 2^{-3}, \dots, 2^{13}, 2^{15}\}
\]
(anything on or above this boundary is of one class, with label 1)
14564:"Standardization and Its Effects on K-Means Clustering Algorithm"
Florian Wenzel; Matthäus Deutsch; Théo Galy-Fajou; Marius Kloft;
Figure: Support vector regression (prediction) with different thresholds $\varepsilon$. As $\varepsilon$ increases, the prediction becomes less sensitive to errors.
\[
\gamma \in \{2^{-15}, 2^{-13}, \dots, 2^{1}, 2^{3}\}
\]
Under certain assumptions about the sequence of random variables $X_k, y_k$
are either 1 or −1, each indicating the class to which the point $\mathbf{x}_i$ belongs.
LIBLINEAR is a library for large linear classification, including some SVMs.
14568:
Research Journal of Applied Sciences, Engineering and Technology
14121:
Van den Burg, Gerrit J. J. & Groenen, Patrick J. F. (2016).
9000:
In supervised learning, one is given a set of training examples
3971:
suggested a way to create nonlinear classifiers by applying the
1710:
suggested a way to create nonlinear classifiers by applying the
1411:
that are mapped into the hyperplane are defined by the relation
14894:
14882:
12685:
12653:
12649:
12548:
In 2011 it was shown by Polson and Scott that the SVM admits a Bayesian interpretation through the technique of data augmentation. In this approach the SVM is viewed as a graphical model
\[
f_{\log}(x) = \ln\left(p_x / (1 - p_x)\right)
\]
9326:. We would then like to choose a hypothesis that minimizes the
9245:. A "good" approximation is usually defined with the help of a
6344:
can be written as a linear combination of the support vectors.
Advances in Neural Information Processing Systems 9, NIPS 1996
Potential drawbacks of the SVM include the following aspects:
13776:"Which Is the Best Multiclass SVM Method? An Empirical Study"
Hsu, Chih-Wei; Chang, Chih-Chung & Lin, Chih-Jen (2003).
Maitra, D. S.; Bhattacharya, U.; Parui, S. K. (August 2015).
Geometrically, the distance between these two hyperplanes is $\tfrac{2}{\|\mathbf{w}\|}$
Shalev-Shwartz, Shai; Singer, Yoram; Srebro, Nathan (2007).
14377:"Interior-Point Methods for Massive Support Vector Machines"
\[
\mathbf{w}^\mathsf{T}\mathbf{x}_i - b
\]
The dominant approach for doing so is to reduce the single multiclass problem into multiple binary classification
\[
\mathbf{w}^\mathsf{T}\mathbf{x} - b = -1
\]
\[
\mathbf{w}^\mathsf{T}\mathbf{x} + b = 0.
\]
\[
\mathbf{w}^\mathsf{T}\mathbf{x} - b = 0,
\]
We can put this together to get the optimization problem:
\[
\mathbf{w}^\mathsf{T}\mathbf{x} - b = 1
\]
14123:"GenSVM: A Generalized Multiclass Support Vector Machine"
algorithms for the SVM work directly with the expression
of the above problem, one obtains the simplified problem
Theodoridis, Sergios; Koutroumbas, Konstantinos (2009).
Parameters of a solved model are difficult to interpret.
can be some measure of the complexity of the hypothesis
Thus we can rewrite the optimization problem as follows
that solve this problem determine the final classifier,
means that the implementer is less likely to experience
13151:. Studies in Big Data. Vol. 10. pp. 285–318.
In most cases, we don't know the joint distribution of
SVMs can be used to solve various real-world problems:
14753:"Applications of Support Vector Machines in Chemistry"
P-packSVM: Parallel Primal grAdient desCent Kernel SVM
14490:"LIBLINEAR: A library for large linear classification"
Shallow Semantic Parsing using Support Vector Machines
13054:(3rd ed.). New York: Cambridge University Press.
Regularization perspectives on support vector machines
\[
\lVert \mathbf{w} \rVert^2 + C \left[\frac{1}{n} \sum_{i=1}^n \max\left(0, 1 - y_i(\mathbf{w}^\mathsf{T}\mathbf{x}_i - b)\right)\right],
\]
Pegasos: Primal Estimated sub-GrAdient SOlver for SVM
Machine Learning and Knowledge Discovery in Databases
13514:. Advances in Neural Information Processing Systems.
12993:"A training algorithm for optimal margin classifiers"
problems. Common methods for such reduction include:
More formally, a support vector machine constructs a
\[
\mathbf{w}, b, \mathbf{y}^\star
\]
\[
\ell(y, z) = \max\left(0, 1 - yz\right).
\]
14043:Lee, Yoonkyung; Lin, Yi & Wahba, Grace (2001).
12975:(Second ed.). New York: Springer. p. 134.
\[
\left[\frac{1}{n} \sum_{i=1}^n \max\left(0, 1 - y_i(\mathbf{w}^\mathsf{T}\mathbf{x}_i - b)\right)\right] + \lambda \|\mathbf{w}\|^2.
\]
\[
\lVert f \rVert_{\mathcal{H}} < k
\]
\[
\left[\frac{1}{n} \sum_{i=1}^n \max\left(0, 1 - y_i(\mathbf{w}^\mathsf{T}\mathbf{x}_i - b)\right)\right] + \lambda \|\mathbf{w}\|^2.
\]
1231:and the linear classifier it defines is known as a
A Practical Guide to Support Vector Classification
Numerical Recipes: The Art of Scientific Computing
\[
\tfrac{1}{2}\|\mathbf{w}\|^2
\]
For the logistic loss, it is the logit function,
\[
\ell_{\log}(y, z) = \ln(1 + e^{-yz}).
\]
\[
\varepsilon(f) = \mathbb{E}\left[\ell(y_{n+1}, f(X_{n+1}))\right].
\]
algorithms for the SVM work from the dual problem
Figure: A training example of SVM with kernel given by $\varphi((a, b)) = (a, b, a^2 + b^2)$.
The goal of the optimization then is to minimize:
14815:Schölkopf, Bernhard; Smola, Alexander J. (2002).
13923:Advances in Neural Information Processing Systems
13908:"Large margin DAGs for multiclass classification"
are obtained by solving the optimization problem
Least squares support vector machine classifiers
11214:. They can also be considered a special case of
is chosen to minimize the following expression:
that lie nearest to it (explained below). These
is not necessarily a unit vector. The parameter
Cristianini, Nello; Shawe-Taylor, John (2000).
14273:"Data Augmentation for Support Vector Machines"
14194:Suykens, Johan A. K.; Vandewalle, Joos P. L.; "
14078:Journal of the American Statistical Association
13511:Dimensionality dependent PAC-Bayes margin bound
, enabling the application of Bayesian SVMs to big data.
\[
0 < c_i < (2n\lambda)^{-1}
\]
lies on the margin's boundary. It follows that
\[
0 < c_i < (2n\lambda)^{-1}
\]
14699:
14647:"Support Vector Machines: Hype or Hallelujah?"
14271:Polson, Nicholas G.; Scott, Steven L. (2011).
14211:
13943:Dietterich, Thomas G.; Bakiri, Ghulum (1995).
13114:Pradhan, Sameer S.; et al. (2 May 2004).
11210:and can be interpreted as an extension of the
8977:
5343:is the smallest nonnegative number satisfying
3354:-th target (i.e., in this case, 1 or −1), and
2481:{\displaystyle {\tfrac {2}{\|\mathbf {w} \|}}}
2245:{\displaystyle {\tfrac {b}{\|\mathbf {w} \|}}}
14834:Steinwart, Ingo; Christmann, Andreas (2008).
14645:Bennett, Kristin P.; Campbell, Colin (2000).
14009:
13465:{\displaystyle {\frac {2}{\|\mathbf {w} \|}}}
12904:
10048:Recall that the (soft-margin) SVM classifier
8913:. Then, the resulting vector of coefficients
944:
14702:"Support Vector Machines for Classification"
14562:Mohamad, Ismail; Usman, Dauda (2013-09-01).
14561:
14375:Ferris, Michael C.; Munson, Todd S. (2002).
14374:
14212:Smola, Alex J.; Schölkopf, Bernhard (2004).
13596:. New York, NY, USA: ACM. pp. 408–415.
13456:
13448:
13225:"Training Invariant Support Vector Machines"
12986:
12984:
12982:
12775:
12764:
12481:
12462:
12368:
12349:
12303:
12296:
12201:
12186:
11915:
11906:
11809:
11753:
11539:
11475:
11449:
11385:
10273:
10264:
10040:, so that simpler hypotheses are preferred.
9847:
9840:
9774:
9767:
8832:
8812:
8321:
8312:
6232:lies on the correct side of the margin, and
5507:
5498:
5200:
5180:
5082:
5073:
3900:
3882:
3723:
3714:
3466:
3457:
2990:
2972:
2888:
2879:
2503:
2495:
2471:
2463:
2235:
2227:
14270:
14075:
14042:
13952:Journal of Artificial Intelligence Research
13773:
13769:
13767:
13743:
12603:-like iterations to find a solution of the
12210:{\displaystyle y_{j}^{\star }\in \{-1,1\}.}
\[
f_{sq}(x) = \mathbb{E}\left[y_x \mid x\right]
\]
10664:. In the classification setting, we have:
9868:, and solving the new optimization problem
8966:{\displaystyle (c_{1}',\,\ldots ,\,c_{n}')}
6686:{\displaystyle \varphi (\mathbf {x} _{i}).}
3410:This function is zero if the constraint in
3138:{\displaystyle \operatorname {sgn}(\cdot )}
1718:and Vapnik in 1993 and published in 1995.
1694:The original SVM algorithm was invented by
14010:Crammer, Koby & Singer, Yoram (2001).
13842:Hsu, Chih-Wei & Lin, Chih-Jen (2002).
13774:Duan, Kai-Bo; Keerthi, S. Sathiya (2005).
12875:
8771:
7641:{\displaystyle \varphi (\mathbf {x} _{i})}
7474:
6461:
6457:
5992:
4636:{\displaystyle \varphi (\mathbf {x} _{i})}
3700:
3693:
2859:
1080:approaches, which attempt to find natural
951:
937:
14626:
14616:
14579:
14395:
14313:
14288:
14232:
14214:"A tutorial on support vector regression"
14089:
13963:
13794:
13660:
13601:
13552:
13519:
13337:
13243:
13207:
13088:
13004:
12979:
12885:
12815:
12805:
12493:{\displaystyle \langle w,x_{i}\rangle +b}
11798:
10908:
10574:
10455:{\displaystyle \ell _{sq}(y,z)=(y-z)^{2}}
10043:
9635:
9465:
9357:
8946:
8939:
8906:{\displaystyle \partial f/\partial c_{i}}
8828:
8821:
8728:
8474:
8473:
7431:
6979:
6978:
6693:Moreover, we are given a kernel function
5949:
5687:
5686:
5623:
5603:
5597:
5196:
5189:
2671:
2585:
14800:. New York: Springer. pp. 337–372.
14750:
14715:
14700:Fradkin, Dmitriy; Muchnik, Ilya (2006).
14202:, vol. 9, no. 3, Jun. 1999, pp. 293–300.
13764:
13201:
13139:
13078:
12315:{\displaystyle {\tfrac {1}{2}}\|w\|^{2}}
12273:Training the original SVR means solving
12241:
11703:. Here, in addition to the training set
11330:with exponentially growing sequences of
8838:{\displaystyle i\in \{1,\,\ldots ,\,n\}}
6619:
5206:{\displaystyle i\in \{1,\,\ldots ,\,n\}}
4428:{\displaystyle \gamma =1/(2\sigma ^{2})}
3946:
1725:
1273:
1106:
14045:"Multicategory Support Vector Machines"
13835:
13507:
13222:
13113:
13048:"Section 16.5. Support Vector Machines"
12500:is the prediction for that sample, and
12428:is a training sample with target value
11206:SVMs belong to a family of generalized
8152:
4750:is also in the transformed space, with
4605:The kernel is related to the transform
1123:separates them with the maximal margin.
14:
14920:
13841:
13149:Granular Computing and Decision-Making
12838:
11239:
10228:
10112:
8269:
7994:
7675:
6476:
6424:
5845:
5549:
5371:
5274:
5030:
3558:
3369:
3276:
3083:
2745:
2638:
2555:
2409:
2354:
2292:
2125:
1666:Classification of satellite data like
1119:does, but only with a small margin. H
14707:. In Abello, J.; Carmode, G. (eds.).
14433:
12845:Artificial Neural Networks — ICANN'97
10766:The optimal classifier is therefore:
9676:grows large. This approach is called
8446:
6396:on the margin's boundary and solving
2516:. The distance is computed using the
2187:to the hyperplane. This is much like
1652:are based on support vector machines.
1581:to the corresponding data base point
1006:with colleagues (Boser et al., 1992,
LIBSVM is a popular library of SVM learners.
14545:Journal of Machine Learning Research
14498:Journal of Machine Learning Research
14146:
14130:Journal of Machine Learning Research
14019:Journal of Machine Learning Research
13851:IEEE Transactions on Neural Networks
13789:. Vol. 3541. pp. 278–285.
13312:Gaonkar, B.; Davatzikos, C. (2013).
12913:Journal of Machine Learning Research
12622:; this class of algorithms includes
11691:Transductive support vector machines
11664:) or between every pair of classes (
11614:Requires full labeling of input data
8995:
4943:
3942:
2714:
2183:is the (not necessarily normalized)
2088:can be written as the set of points
1022:proposed by Vapnik (1982, 1995) and
12455:. The inner product plus intercept
11625:
10552:
9801:. This is equivalent to imposing a
9712:of hypotheses being considered. If
6839:in the transformed space satisfies
1734:We are given a training dataset of
1045:tasks, where the objective becomes
24:
14760:Reviews in Computational Chemistry
14638:
14243:10.1023/B:STCO.0000035301.49549.88
14179:Support Vector Regression Machines
13645:"Are Loss Functions All the Same?"
11739:
11727:, the learner is also given a set
11712:
11183:
10462:; logistic regression employs the
9996:
9949:
9916:
9899:
9896:
9893:
9852:
9813:
9779:
9721:
9697:
8890:
8879:
6817:We know the classification vector
3873:
3807:
2963:
2930:
1702:in 1964. In 1992, Bernhard Boser,
25:
14949:
14872:
12583:
12222:
11640:
10013:{\displaystyle {\mathcal {R}}(f)}
9485:{\displaystyle X_{n+1},\,y_{n+1}}
9073:{\displaystyle y_{1}\ldots y_{n}}
9033:{\displaystyle X_{1}\ldots X_{n}}
8143:
6367:, can be recovered by finding an
4122:, this becomes the linear kernel.
3416:is satisfied, in other words, if
1642:text and hypertext categorization
14857:(4th ed.). Academic Press.
14709:Discrete Methods in Epidemiology
14532:from the original on 2014-04-07.
14477:from the original on 2013-12-15.
14450:from the original on 2015-07-02.
14423:from the original on 2008-12-04.
14260:from the original on 2012-01-31.
14065:from the original on 2013-06-17.
14052:Computing Science and Statistics
14032:from the original on 2015-08-29.
13999:from the original on 2013-05-09.
13932:from the original on 2012-06-16.
13760:from the original on 2013-06-25.
13528:from the original on 2015-04-02.
13452:
13435:"Why is the SVM margin equal to
13330:10.1016/j.neuroimage.2013.03.066
13068:from the original on 2011-08-11.
12110:
12101:
12049:
12040:
11910:
11867:
11852:
11778:
11758:
10268:
10235:
10222:
10119:
10100:
10079:
10059:
8872:is adjusted in the direction of
8377:
8316:
8277:
8263:
8176:
8105:
8091:
8007:
7988:
7970:
7910:
7895:
7800:
7776:
7689:
7669:
7625:
7335:
7320:
7160:
7136:
6905:
6849:
6825:
6791:
6767:
6743:
6728:
6667:
6563:{\displaystyle y_{i}^{-1}=y_{i}}
6484:
6470:
6432:
6418:
6389:{\displaystyle \mathbf {x} _{i}}
6376:
6330:
6315:{\displaystyle \mathbf {x} _{i}}
6302:
6225:{\displaystyle \mathbf {x} _{i}}
6212:
6145:
6095:
5853:
5834:
5557:
5543:
5502:
5379:
5365:
5282:
5268:
5077:
5038:
5024:
4916:
4902:
4854:
4840:
4803:
4758:
4725:
4721:
4697:
4673:
4658:
4620:
4520:
4505:
4479:
4475:
4464:
4460:
4323:
4308:
4267:
4252:
4194:
4179:
4158:
4143:
4066:
4051:
4030:
4015:
3814:
3802:
3718:
3686:
3658:{\displaystyle \mathbf {x} _{i}}
3645:
3566:
3552:
3461:
3438:{\displaystyle \mathbf {x} _{i}}
3425:
3377:
3363:
3284:
3270:
3203:{\displaystyle \mathbf {x} _{i}}
3190:
3174:{\displaystyle \mathbf {x} _{i}}
3161:
3090:
3077:
3059:
3017:
2937:
2925:
2883:
2852:
2753:
2739:
2646:
2632:
2563:
2549:
2518:distance from a point to a plane
2509:{\displaystyle \|\mathbf {w} \|}
2499:
2467:
2416:
2403:
2361:
2348:
2299:
2286:
2260:
2231:
2199:
2169:
2132:
2119:
2096:
2074:{\displaystyle \mathbf {x} _{i}}
2061:
1976:{\displaystyle \mathbf {x} _{i}}
1963:
1923:{\displaystyle \mathbf {x} _{i}}
1910:
1894:{\displaystyle \mathbf {x} _{i}}
1881:
1806:
1766:
1115:does not separate the classes. H
986:models with associated learning
14628:10.1140/epjds/s13688-019-0201-0
14588:
14555:
14536:
14509:
14481:
14454:
14427:
14368:
14356:
14297:
14264:
14205:
14188:
14171:
14160:
14140:
14114:
14069:
14036:
14003:
13936:
13925:. MIT Press. pp. 547–553.
13891:
13737:
13710:
13701:
13636:
13585:
13532:
13501:
13482:
13427:
13410:
13354:
13305:
13260:
13216:
13195:
13133:
13124:
13107:
12937:from the original on 2017-11-08
12743:Sequential minimal optimization
12613:sequential minimal optimization
12543:
11699:by following the principles of
11226:; hence they are also known as
8845:, iteratively, the coefficient
6609:
4385:. Sometimes parametrized using
3872:
3855:
2962:
2781:
1673:Hand-written characters can be
1631:
1241:perceptron of optimal stability
1161:-dimensional vector (a list of
14685:. Cambridge University Press.
13508:Jin, Chi; Wang, Liwei (2012).
13072:
13039:
12948:
12923:
12869:
12832:
12595:Another approach is to use an
12378:
12332:
12131:
12097:
12065:
12036:
11720:{\displaystyle {\mathcal {D}}}
11619:class membership probabilities
11191:{\displaystyle {\mathcal {R}}}
11088:
11075:
11063:
11047:
11016:
10995:
10963:
10957:
10901:
10895:
10791:
10785:
10638:conditional on the event that
10534:
10509:
10497:
10485:
10443:
10430:
10424:
10412:
10324:
10312:
10245:
10217:
10129:
10104:
10092:
10083:
10063:
10007:
10001:
9960:
9954:
9941:
9935:
9929:
9883:
9824:
9818:
9729:{\displaystyle {\mathcal {H}}}
9705:{\displaystyle {\mathcal {H}}}
9597:
9594:
9581:
9562:
9522:
9516:
9510:
9416:
9413:
9394:
9369:
9350:
9344:
9286:, which characterizes how bad
9273:
9261:
9199:
9180:
8960:
8920:
8642:
8616:
8504:
8478:
8293:
8258:
8186:
8172:
8109:
8086:
8020:
8011:
8003:
7983:
7974:
7920:
7890:
7810:
7795:
7786:
7771:
7699:
7684:
7635:
7620:
7588:
7575:
7345:
7315:
7173:
7170:
7155:
7146:
7131:
7125:
7009:
6983:
6915:
6900:
6801:
6786:
6777:
6762:
6753:
6723:
6677:
6662:
6458:
6448:
6413:
6271:
6258:
5863:
5829:
5717:
5691:
5395:
5360:
5298:
5263:
5054:
5019:
4920:
4897:
4858:
4850:
4813:
4798:
4731:
4716:
4707:
4692:
4683:
4653:
4630:
4615:
4536:
4497:
4485:
4455:
4422:
4406:
4334:
4302:
4277:
4247:
4211:
4174:
4168:
4138:
4077:
4046:
4040:
4010:
3830:
3797:
3582:
3547:
3300:
3265:
3217:
3132:
3126:
3100:
3072:
3063:
2953:
2920:
2769:
2734:
2327:
1829:
1801:
1789:
1761:
1508:
1496:
1461:
1442:
1316:
1304:
1200:
1188:
1089:structured prediction problems
13731:10.1016/S0925-2312(03)00431-4
13491:Automation and Remote Control
12758:
12605:Karush–Kuhn–Tucker conditions
12237:
11678:Error-correcting output codes
11635:
11201:
9649:{\displaystyle X_{k},\,y_{k}}
9212:is a "good" approximation of
3993:Some common kernels include:
1721:
1102:
14819:. Cambridge, MA: MIT Press.
14595:Fennell, Peter; Zuo, Zhiya;
14384:SIAM Journal on Optimization
14332:10.1007/978-3-319-71249-9_19
13157:10.1007/978-3-319-16829-6_12
12839:Vapnik, Vladimir N. (1997).
12533:{\displaystyle \varepsilon }
12513:{\displaystyle \varepsilon }
12005:{\displaystyle j=1,\dots ,k}
11967:{\displaystyle i=1,\dots ,n}
9684:Regularization and stability
9678:empirical risk minimization,
8384:{\displaystyle \mathbf {w} }
6832:{\displaystyle \mathbf {w} }
6337:{\displaystyle \mathbf {w} }
4937:Computing the SVM classifier
4568:{\displaystyle \kappa >0}
4378:{\displaystyle \gamma >0}
3024:{\displaystyle \mathbf {w} }
2267:{\displaystyle \mathbf {w} }
2206:{\displaystyle \mathbf {w} }
2176:{\displaystyle \mathbf {w} }
2103:{\displaystyle \mathbf {x} }
7:
13783:Multiple Classifier Systems
13382:10.1016/j.media.2011.05.007
12887:10.1007/978-1-4302-5990-9_3
12878:Efficient Learning Machines
12841:"The Support Vector method"
12702:In situ adaptive tabulation
12695:
8984:empirical risk minimization
8978:Empirical risk minimization
6599:{\displaystyle y_{i}=\pm 1}
5160:
5142:
5110:
3412:
2821:
1648:settings. Some methods for
1350:{\displaystyle \alpha _{i}}
1263:of the classifier. A lower
923:Outline of machine learning
Applications

SVMs can be used to solve various real-world problems:

- SVMs are helpful in text and hypertext categorization, as their application can significantly reduce the need for labeled training instances in both the standard inductive and transductive settings. Some methods for shallow semantic parsing are based on support vector machines.
- Classification of images can also be performed using SVMs. Experimental results show that SVMs achieve significantly higher search accuracy than traditional query refinement schemes after just a few rounds of relevance feedback. This is also true of image segmentation systems.
- Classification of satellite data like SAR data using supervised SVM.
- Hand-written characters can be recognized using SVM.
- The SVM algorithm has been widely applied in the biological and other sciences. Permutation tests based on SVM weights have been suggested as a mechanism for interpretation of SVM models.
History

The original SVM algorithm was invented by Vladimir N. Vapnik and Alexey Ya. Chervonenkis in 1964. In 1992, Bernhard Boser, Isabelle Guyon and Vladimir Vapnik suggested a way to create nonlinear classifiers by applying the kernel trick to maximum-margin hyperplanes. The "soft margin" incarnation, as is commonly used in software packages, was proposed by Corinna Cortes and Vapnik in 1993 and published in 1995.
Linear SVM

We are given a training dataset of n points of the form (x_1, y_1), …, (x_n, y_n), where the y_i are either 1 or −1, each indicating the class to which the point x_i belongs. Each x_i is a p-dimensional real vector. We want to find the "maximum-margin hyperplane" that divides the group of points for which y_i = 1 from the group of points for which y_i = −1, which is defined so that the distance between the hyperplane and the nearest point x_i from either group is maximized.

Any hyperplane can be written as the set of points x satisfying w^T x − b = 0, where w is the (not necessarily normalized) normal vector to the hyperplane. This is much like Hesse normal form, except that w is not necessarily a unit vector. The parameter b/‖w‖ determines the offset of the hyperplane from the origin along the normal vector w.

Hard margin

Geometrically, the distance between the two margin hyperplanes is 2/‖w‖, so to maximize this distance we want to minimize ‖w‖; the distance is computed using the distance from a point to a plane equation. We also have to prevent data points from falling into the margin, so we add the constraint: for each i, either w^T x_i − b ≥ 1 (if y_i = 1) or w^T x_i − b ≤ −1 (if y_i = −1). These constraints state that each data point must lie on the correct side of the margin. This can be rewritten as y_i(w^T x_i − b) ≥ 1 for all 1 ≤ i ≤ n. Putting this together, we get the optimization problem:

{\displaystyle {\underset {\mathbf {w} ,\;b}{\operatorname {minimize} }}\;\|\mathbf {w} \|^{2}\quad {\text{subject to }}y_{i}(\mathbf {w} ^{\mathsf {T}}\mathbf {x} _{i}-b)\geq 1\;{\text{ for }}i=1,\ldots ,n.}

The w and b that solve this problem determine the classifier, x ↦ sgn(w^T x − b). An important consequence of this geometric description is that the maximum-margin hyperplane is completely determined by those x_i that lie nearest to it; these x_i are called support vectors.

Soft margin

To extend SVM to cases in which the data are not linearly separable, the hinge loss function is helpful: max(0, 1 − y_i(w^T x_i − b)). Note that y_i is the i-th target (that is, 1 or −1) and w^T x_i − b is the i-th output. This function is zero if the hard-margin constraint is satisfied, in other words, if x_i lies on the correct side of the margin. For data on the wrong side of the margin, the function's value is proportional to the distance from the margin. The goal of the optimization then is to minimize

{\displaystyle \lambda \|\mathbf {w} \|^{2}+{\frac {1}{n}}\sum _{i=1}^{n}\max \left(0,1-y_{i}(\mathbf {w} ^{\mathsf {T}}\mathbf {x} _{i}-b)\right),}

where the parameter λ > 0 determines the trade-off between increasing the margin size and ensuring that each x_i lies on the correct side of the margin. An equivalent parametrization uses C ≈ 1/(2nλ). Thus, for large values of C, the classifier will behave similarly to the hard-margin SVM if the input data are linearly classifiable, but it will still learn a viable classification rule if not.
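To make the soft-margin objective concrete, here is a minimal sketch in Python with NumPy (the names X, y, w, b and the helper soft_margin_objective are illustrative, not from the original article) that evaluates the regularized objective for a candidate pair (w, b):

    import numpy as np

    def soft_margin_objective(w, b, X, y, lam):
        """Evaluate lam*||w||^2 + (1/n) * sum_i max(0, 1 - y_i (w.x_i - b)).

        X: (n, p) array of data points; y: (n,) array of +/-1 labels.
        """
        margins = y * (X @ w - b)               # y_i (w^T x_i - b) for each point
        hinge = np.maximum(0.0, 1.0 - margins)  # zero for points on the correct side
        return lam * np.dot(w, w) + hinge.mean()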
Nonlinear kernels

The original maximum-margin hyperplane algorithm constructs a linear classifier. However, in 1992, Bernhard Boser, Isabelle Guyon and Vladimir Vapnik suggested a way to create nonlinear classifiers by applying the kernel trick to maximum-margin hyperplanes. With the kernel trick, dot products are replaced by kernels, so the algorithm fits the maximum-margin hyperplane in a transformed feature space. The transformation may be nonlinear and the transformed space high-dimensional; although the classifier is a hyperplane in the transformed feature space, it may be nonlinear in the original input space. It is noteworthy that working in a higher-dimensional feature space increases the generalization error of support vector machines, although given enough samples the algorithm still performs well.

Some common kernels include:

- Polynomial (homogeneous): k(x_i, x_j) = (x_i · x_j)^d. Particularly, when d = 1, this becomes the linear kernel.
- Polynomial (inhomogeneous): k(x_i, x_j) = (x_i · x_j + r)^d.
- Gaussian radial basis function: k(x_i, x_j) = exp(−γ‖x_i − x_j‖²) for γ > 0. Sometimes parametrized using γ = 1/(2σ²).
- Sigmoid function (hyperbolic tangent): k(x_i, x_j) = tanh(κ x_i · x_j + c) for some (not every) κ > 0 and c < 0.

The kernel is related to the transform φ(x_i) by the equation k(x_i, x_j) = φ(x_i) · φ(x_j). The value w is also in the transformed space, with w = Σ_i α_i y_i φ(x_i). Dot products with w for classification can again be computed by the kernel trick, i.e. w · φ(x) = Σ_i α_i y_i k(x_i, x).
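The kernels in the list above are plain functions of a pair of input vectors, so they are easy to state in code. The following Python/NumPy sketch (function names illustrative) implements the four kernels with the parameters d, r, γ, κ, c as given above:

    import numpy as np

    def poly_homogeneous(xi, xj, d=3):
        return np.dot(xi, xj) ** d            # (x_i . x_j)^d; d = 1 gives the linear kernel

    def poly_inhomogeneous(xi, xj, d=3, r=1.0):
        return (np.dot(xi, xj) + r) ** d      # (x_i . x_j + r)^d

    def gaussian_rbf(xi, xj, gamma=0.5):      # gamma > 0, e.g. gamma = 1/(2 sigma^2)
        diff = xi - xj
        return np.exp(-gamma * np.dot(diff, diff))

    def sigmoid_kernel(xi, xj, kappa=0.01, c=-1.0):  # kappa > 0, c < 0
        return np.tanh(kappa * np.dot(xi, xj) + c)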
Computing the SVM classifier

Computing the (soft-margin) SVM classifier amounts to minimizing the regularized hinge-loss expression given above. We focus on the soft-margin classifier since, as noted above, choosing a sufficiently small value for λ yields the hard-margin classifier for linearly classifiable input data.

Primal

Minimizing the expression above can be rewritten as a constrained optimization problem with a differentiable objective function in the following way. For each i ∈ {1, …, n} we introduce a variable ζ_i = max(0, 1 − y_i(w^T x_i − b)). Note that ζ_i is the smallest nonnegative number satisfying y_i(w^T x_i − b) ≥ 1 − ζ_i. Thus we can rewrite the optimization problem as follows:

{\displaystyle {\text{minimize }}{\frac {1}{n}}\sum _{i=1}^{n}\zeta _{i}+\lambda \|\mathbf {w} \|^{2}\quad {\text{subject to }}y_{i}(\mathbf {w} ^{\mathsf {T}}\mathbf {x} _{i}-b)\geq 1-\zeta _{i}\,{\text{ and }}\,\zeta _{i}\geq 0,\,{\text{for all }}i.}

This is called the primal problem.

Dual

By solving for the Lagrangian dual of the above problem, one obtains the simplified dual problem stated earlier in this article, which is a quadratic programming problem. Here, the variables c_i are defined such that w = Σ_{i=1}^{n} c_i y_i x_i. Moreover, c_i = 0 exactly when x_i lies on the correct side of the margin, and 0 < c_i < (2nλ)^{-1} when x_i lies on the margin's boundary. It follows that w can be written as a linear combination of the support vectors. The offset, b, can be recovered by finding an x_i on the margin's boundary and solving y_i(w^T x_i − b) = 1, that is, b = w^T x_i − y_i (note that y_i^{-1} = y_i since y_i = ±1).

Kernel trick

Suppose now that we would like to learn a nonlinear classification rule which corresponds to a linear classification rule for the transformed data points φ(x_i), and that we are given a kernel function k which satisfies k(x_i, x_j) = φ(x_i) · φ(x_j). The classification vector w in the transformed space satisfies w = Σ_{i=1}^{n} c_i y_i φ(x_i), where the c_i are obtained by solving the kernelized dual problem stated earlier, again by quadratic programming. Thanks to this representation, the map φ never needs to be formed explicitly: a new point z is classified by computing sgn(Σ_i c_i y_i k(x_i, z) − b), and the offset is recovered from some x_i on the margin's boundary via b = Σ_j c_j y_j k(x_j, x_i) − y_i.
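As an illustration of the relations w = Σ c_i y_i x_i and b = w^T x_i − y_i, here is a small Python/NumPy sketch; the array c of dual coefficients is assumed to come from some QP solver, which is not shown, and the tolerance 1e-8 is an arbitrary choice:

    import numpy as np

    def recover_w_and_b(c, X, y, n_lambda):
        """Rebuild the primal solution from dual coefficients c (shape (n,)).

        n_lambda = n * lambda. A point with 0 < c_i < 1/(2 n lambda) lies on the
        margin's boundary, so it can be used to recover the offset b.
        Assumes at least one such margin point exists.
        """
        w = (c * y) @ X                        # w = sum_i c_i y_i x_i
        upper = 1.0 / (2.0 * n_lambda)         # box constraint from the dual
        on_margin = np.where((c > 1e-8) & (c < upper - 1e-8))[0]
        i = on_margin[0]                       # any margin point will do
        b = X[i] @ w - y[i]                    # from y_i (w^T x_i - b) = 1
        return w, b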
Modern methods

Recent algorithms for finding the SVM classifier include sub-gradient descent and coordinate descent. Both techniques have proven to offer significant advantages over the traditional approach when dealing with large, sparse datasets: sub-gradient methods are especially efficient when there are many training examples, and coordinate descent when the dimension of the feature space is high.

Sub-gradient descent

Sub-gradient descent algorithms for the SVM work directly with the expression

{\displaystyle f(\mathbf {w} ,b)=\lambda \|\mathbf {w} \|^{2}+{\frac {1}{n}}\sum _{i=1}^{n}\max \left(0,1-y_{i}(\mathbf {w} ^{\mathsf {T}}\mathbf {x} _{i}-b)\right).}

Note that f is a convex function of w and b. As such, traditional gradient descent (or SGD) methods can be adapted, where instead of taking a step in the direction of the function's gradient, a step is taken in the direction of a vector selected from the function's sub-gradient. This approach has the advantage that, for certain implementations, the number of iterations does not scale with n, the number of data points.
12817:10.1007/BF00994018
12753:Winnow (algorithm)
12628:coordinate descent
12530:
12510:
12490:
12445:
12418:
12388:
12312:
12294:
12264:Vladimir N. Vapnik
12256:
12207:
12168:
12147:
12145:
12108:
12082:
12002:
11964:
11924:
11877:
11829:
11808:
11776:
11756:
11717:
11650:multiclass problem
11591:
11570:
11542:
11452:
11361:
11340:
11316:
11296:
11275:
11254:
11230:margin classifiers
11208:linear classifiers
11188:
11164:
11131:
11104:
11024:
10930:
10863:
10858:
10754:
10749:
10654:
10628:
10608:
10578:
10559:target functions -
10540:
10452:
10364:
10285:
10132:
10030:
10010:
9966:
9922:
9858:
9791:
9750:
9726:
9702:
9666:
9646:
9603:
9482:
9427:
9316:
9296:
9276:
9235:
9202:
9160:
9136:
9103:
9070:
9030:
8963:
8947:
8923:
8903:
8862:
8835:
8787:
8785:
8452:Coordinate descent
8447:Coordinate descent
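A sketch of one such sweep for the linear-kernel dual stated earlier (Python/NumPy; the closed-form single-coordinate step and the clipping to the box [0, 1/(2nλ)] are the standard ingredients, but handling of the equality constraint Σ c_i y_i = 0, i.e. the bias term, is omitted here for brevity):

    import numpy as np

    def dual_coordinate_pass(c, X, y, lam):
        """One sweep of coordinate-wise updates on the dual objective
        f(c) = sum_i c_i - 1/2 sum_ij y_i c_i (x_i . x_j) y_j c_j,
        keeping each c_i inside the box [0, 1/(2 n lambda)].
        """
        n = len(y)
        upper = 1.0 / (2.0 * n * lam)
        w = (c * y) @ X                        # maintain w = sum_i c_i y_i x_i
        for i in range(n):
            grad = 1.0 - y[i] * (X[i] @ w)     # df/dc_i
            q = X[i] @ X[i]                    # curvature of f in coordinate i
            if q > 0:
                old = c[i]
                c[i] = np.clip(old + grad / q, 0.0, upper)
                w += (c[i] - old) * y[i] * X[i]
        return c, w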
Empirical risk minimization

The soft-margin support vector machine described above is an example of an empirical risk minimization (ERM) algorithm for the hinge loss. In what follows we make this connection explicit.

Risk minimization

In supervised learning, one is given a set of training examples X_1, …, X_n with labels y_1, …, y_n, and wishes to predict y_{n+1} given X_{n+1}. To do so one forms a hypothesis, f, such that f(X_{n+1}) is a "good" approximation of y_{n+1}. A "good" approximation is usually defined with the help of a loss function, ℓ(y, z), which characterizes how bad z is as a prediction of y. We would then like to choose a hypothesis that minimizes the expected risk ε(f) = E[ℓ(y_{n+1}, f(X_{n+1}))]. In most cases, we don't know the joint distribution of X_{n+1}, y_{n+1} outright. In these cases, a common strategy is to choose the hypothesis that minimizes the empirical risk:

{\displaystyle {\hat {\varepsilon }}(f)={\frac {1}{n}}\sum _{k=1}^{n}\ell (y_{k},f(X_{k})).}

Under certain assumptions about the sequence of random variables X_k, y_k (for example, that they are generated by a finite Markov process), if the set of hypotheses being considered is small enough, the minimizer of the empirical risk will closely approximate the minimizer of the expected risk as n grows large. This approach is called empirical risk minimization, or ERM.

Regularization and stability

In order for the minimization problem to have a well-defined solution, we have to place constraints on the set H of hypotheses being considered. If H is a normed space (as is the case for SVM), a particularly effective technique is to consider only those hypotheses f for which ‖f‖_H < k. This is equivalent to imposing a regularization penalty R(f) = λ_k ‖f‖_H, and solving the new optimization problem f̂ = argmin_{f ∈ H} ε̂(f) + R(f). This approach is called Tikhonov regularization. More generally, R(f) can be some measure of the complexity of the hypothesis f, so that simpler hypotheses are preferred.
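A sketch of the empirical-risk computation (Python/NumPy; f is any hypothesis mapping an input to a prediction, and loss is any vectorized loss such as those discussed below; the names are illustrative):

    import numpy as np

    def empirical_risk(f, loss, X, y):
        """(1/n) * sum_k loss(y_k, f(X_k)) over the training sample."""
        preds = np.array([f(x) for x in X])
        return float(np.mean(loss(y, preds)))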
SVM and the hinge loss

Recall that the (soft-margin) SVM classifier (ŵ, b̂) is chosen to minimize λ‖w‖² + (1/n) Σ_i max(0, 1 − y_i(w^T x_i − b)). In light of the above discussion, we see that the SVM technique is equivalent to empirical risk minimization with Tikhonov regularization, where in this case the loss function is the hinge loss ℓ(y, z) = max(0, 1 − yz). From this perspective, SVM is closely related to other fundamental classification algorithms such as regularized least-squares and logistic regression. The difference between the three lies in the choice of loss function: regularized least-squares amounts to empirical risk minimization with the square loss, ℓ_sq(y, z) = (y − z)²; logistic regression employs the log loss, ℓ_log(y, z) = ln(1 + e^{−yz}).

Target functions

The difference between the hinge loss and these other loss functions is best stated in terms of target functions - the function that minimizes expected risk for a given pair of random variables X, y. In particular, let y_x denote y conditional on the event that X = x. In the classification setting, we have y_x = 1 with probability p_x and y_x = −1 with probability 1 − p_x. The optimal classifier is therefore f*(x) = 1 if p_x ≥ 1/2 and −1 otherwise. For the square loss, the target function is the conditional expectation function, f_sq(x) = E[y_x]; for the logistic loss, it is the logit function, f_log(x) = ln(p_x / (1 − p_x)). While both of these target functions yield the correct classifier, in the sense that sgn(f_sq) = sgn(f_log) = f*, they give us more information than we need; in fact, they give us enough information to completely describe the distribution of y_x. On the other hand, one can check that the target function for the hinge loss is exactly f*. Thus, in a sufficiently rich hypothesis space - or equivalently, for an appropriately chosen kernel - the SVM classifier will converge to the simplest function (in terms of R) that correctly classifies the data. This extends the geometric interpretation of SVM: for linear classification, the empirical risk is minimized by any function whose margins lie between the support vectors, and the simplest of these is the max-margin classifier.
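The three loss functions compared above are one-liners, so a small Python/NumPy sketch (names illustrative) makes the comparison concrete:

    import numpy as np

    def hinge_loss(y, z):
        return np.maximum(0.0, 1.0 - y * z)   # target function is the classifier f* itself

    def square_loss(y, z):
        return (y - z) ** 2                   # target function is E[y | X = x]

    def log_loss(y, z):
        return np.log1p(np.exp(-y * z))       # target function is the logit ln(p/(1-p))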
Properties

SVMs belong to a family of generalized linear classifiers and can be interpreted as an extension of the perceptron. They can also be considered a special case of Tikhonov regularization. A special property is that they simultaneously minimize the empirical classification error and maximize the geometric margin; hence they are also known as maximum margin classifiers.
Parameter selection

The effectiveness of SVM depends on the selection of the kernel, the kernel's parameters, and the soft margin parameter λ. A common choice is a Gaussian kernel, which has a single parameter γ. The best combination of λ and γ is often selected by a grid search with exponentially growing sequences of λ and γ, for example, λ ∈ {2^{−5}, 2^{−3}, …, 2^{13}, 2^{15}}; γ ∈ {2^{−15}, 2^{−13}, …, 2^{1}, 2^{3}}. Typically, each combination of parameter choices is checked using cross validation, and the parameters with the best cross-validation accuracy are picked. Alternatively, recent work in Bayesian optimization can be used to select λ and γ, often requiring the evaluation of far fewer parameter combinations than grid search. The final model, which is used for testing and for classifying new data, is then trained on the whole training set using the selected parameters.
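As an illustration of such a grid search, here is a sketch using scikit-learn's SVC, which exposes the soft-margin trade-off as C rather than λ; the particular grids and the 5-fold split are arbitrary choices for the example:

    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    # Exponentially growing grids, as recommended above.
    param_grid = {
        "C": [2.0 ** k for k in range(-5, 16, 2)],
        "gamma": [2.0 ** k for k in range(-15, 4, 2)],
    }
    search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
    # search.fit(X_train, y_train)  # X_train, y_train: the labeled training set
    # print(search.best_params_)    # best (C, gamma) by cross-validation accuracy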
Issues

Potential drawbacks of the SVM include the following aspects:

- Requires full labeling of input data.
- Uncalibrated class membership probabilities: SVM stems from Vapnik's theory, which avoids estimating probabilities on finite data.
- The SVM is only directly applicable for two-class tasks; therefore, algorithms that reduce the multi-class task to several binary problems have to be applied (see the Multiclass SVM section below).
- Parameters of a solved model are difficult to interpret.
Extensions

Support vector clustering (SVC)

SVC is a similar method that also builds on kernel functions but is appropriate for unsupervised learning.

Multiclass SVM

Multiclass SVM aims to assign labels to instances by using support vector machines, where the labels are drawn from a finite set of several elements. The dominant approach for doing so is to reduce the single multiclass problem into multiple binary classification problems. Common methods for such reduction include (see the sketch after this list):

- Building binary classifiers that distinguish between one of the labels and the rest (one-versus-all) or between every pair of classes (one-versus-one).
- Directed acyclic graph SVM (DAGSVM).
- Error-correcting output codes.

Crammer and Singer proposed a multiclass SVM method which casts the multiclass classification problem into a single optimization problem, rather than decomposing it into multiple binary classification problems.
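A minimal one-versus-all reduction, sketched in Python with scikit-learn's LinearSVC standing in for any two-class SVM trainer with a real-valued decision function (the helper names are illustrative):

    import numpy as np
    from sklearn.svm import LinearSVC

    def one_versus_all_fit(X, y_labels):
        """Train one binary SVM per class: class k versus the rest."""
        classifiers = {}
        for k in np.unique(y_labels):
            yk = np.where(y_labels == k, 1, -1)   # relabel: +1 for class k, -1 for rest
            classifiers[k] = LinearSVC().fit(X, yk)
        return classifiers

    def one_versus_all_predict(classifiers, X):
        """Assign each point to the class whose classifier outputs the highest score."""
        keys = list(classifiers)
        scores = np.column_stack([classifiers[k].decision_function(X) for k in keys])
        return np.asarray(keys)[np.argmax(scores, axis=1)]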
Transductive support vector machines

Transductive support vector machines extend SVMs in that they can also treat partially labeled data in semi-supervised learning by following the principles of transduction. Here, in addition to the training set D, the learner is also given a set D* = {x*_j | j = 1, …, k} of test examples to be classified. Formally, a transductive support vector machine is defined by the following primal optimization problem:

Minimize (in w, b, y*):

{\displaystyle {\frac {1}{2}}\|\mathbf {w} \|^{2}}

subject to (for any i = 1, …, n and any j = 1, …, k):

{\displaystyle y_{i}(\mathbf {w} \cdot \mathbf {x} _{i}-b)\geq 1,\quad y_{j}^{\star }(\mathbf {w} \cdot \mathbf {x} _{j}^{\star }-b)\geq 1,\quad y_{j}^{\star }\in \{-1,1\}.}

Transductive support vector machines were introduced by Vladimir N. Vapnik in 1998.

Structured SVM

Structured support vector machines extend the traditional SVM model, which is primarily designed for binary classification, multiclass classification and regression tasks, to handle general structured output labels, for example parse trees, classification with taxonomies, sequence alignment and many more.
Regression

A version of SVM for regression, support vector regression (SVR), was proposed in 1996, as noted in the history above. Another SVM version, the least-squares support vector machine (LS-SVM), has been proposed by Suykens and Vandewalle. Training the original SVR means solving

{\displaystyle {\text{minimize }}{\tfrac {1}{2}}\|w\|^{2}\quad {\text{subject to }}|y_{i}-\langle w,x_{i}\rangle -b|\leq \varepsilon ,}

where x_i is a training sample with target value y_i. The inner product plus intercept ⟨w, x_i⟩ + b is the prediction for that sample, and ε is a free parameter that serves as a threshold: all predictions have to be within an ε range of the true predictions. Slack variables are usually added into the above to allow for errors and to approximate in the case the above problem is infeasible.
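The role of ε as a tolerance band can be shown with an ε-insensitive loss sketch (Python/NumPy; this is the soft, slack-variable view of the constraint above, with names chosen for the example):

    import numpy as np

    def epsilon_insensitive_loss(y_true, y_pred, eps=0.1):
        """Zero whenever the prediction is within eps of the target,
        otherwise the distance beyond the eps band (the slack)."""
        return np.maximum(0.0, np.abs(y_true - y_pred) - eps)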
Bayesian SVM

In 2011 it was shown by Polson and Scott that the SVM admits a Bayesian interpretation through the technique of data augmentation. In this approach the SVM is viewed as a graphical model, where the parameters are connected via probability distributions. This extended view allows the application of Bayesian techniques to SVMs, such as flexible feature modeling, automatic hyperparameter tuning, and predictive uncertainty quantification. Recently, a scalable version of the Bayesian SVM was developed by Florian Wenzel, enabling the application of Bayesian SVMs to big data.

Implementation

The parameters of the maximum-margin hyperplane are derived by solving the optimization. There exist several specialized algorithms for quickly solving the quadratic programming (QP) problem that arises from SVMs, mostly relying on heuristics for breaking the problem down into smaller, more manageable chunks.

Another approach is to use an interior-point method that uses Newton-like iterations to find a solution of the Karush–Kuhn–Tucker conditions of the primal and dual problems. Instead of solving a sequence of broken-down problems, this approach directly solves the problem altogether. To avoid solving a linear system involving the large kernel matrix, a low-rank approximation to the matrix is often used in the kernel trick. Another common method is Platt's sequential minimal optimization (SMO) algorithm, whose properties are described earlier in this article.

The special case of linear support vector machines can be solved more efficiently by the same kind of algorithms used to optimize its close cousin, logistic regression; this class of algorithms includes sub-gradient descent (e.g., PEGASOS) and coordinate descent (e.g., LIBLINEAR). LIBLINEAR has some attractive training-time properties: each convergence iteration takes time linear in the time taken to read the training data, and the iterations also have a Q-linear convergence property, making the algorithm extremely fast. The general kernel SVMs can also be solved more efficiently using sub-gradient descent (e.g. P-packSVM), especially when parallelization is allowed.

Kernel SVMs are available in many machine-learning toolkits, including LIBSVM, MATLAB, SVMlight, kernlab, scikit-learn, Shogun, Weka, Shark, JKernelMachines, OpenCV and others.
See also

- In situ adaptive tabulation
- Kernel machines
- Fisher kernel
- Platt scaling
- Polynomial kernel
- Predictive analytics
- Relevance vector machine (RVM)
- Sequential minimal optimization
- Space mapping
- Winnow (algorithm)
8780:
8777:
8774:for all
8765:
8762:
8759:
8755:
8750:
8745:
8741:
8737:
8734:
8725:
8722:
8719:
8714:
8710:
8704:
8700:
8694:
8689:
8686:
8683:
8679:
8665:
8660:
8656:
8650:
8646:
8637:
8633:
8629:
8624:
8620:
8611:
8607:
8601:
8597:
8591:
8586:
8583:
8580:
8576:
8570:
8565:
8562:
8559:
8555:
8549:
8546:
8541:
8536:
8532:
8526:
8521:
8518:
8515:
8511:
8507:
8499:
8495:
8491:
8486:
8482:
8475:
8455:
8453:
8444:
8430:
8422:
8418:
8414:
8398:
8368:
8352:
8343:
8330:
8325:
8309:
8306:
8302:
8297:
8290:
8287:
8282:
8253:
8249:
8245:
8242:
8239:
8236:
8232:
8223:
8218:
8215:
8212:
8208:
8202:
8199:
8193:
8189:
8183:
8180:
8169:
8161:
8159:
8150:
8141:
8128:
8124:
8120:
8117:
8113:
8101:
8096:
8083:
8078:
8074:
8068:
8064:
8058:
8053:
8050:
8047:
8043:
8038:
8033:
8029:
8026:
8023:
8017:
8014:
8000:
7980:
7977:
7961:
7958:
7941:
7936:
7932:
7928:
7924:
7915:
7905:
7900:
7887:
7882:
7878:
7872:
7868:
7862:
7857:
7854:
7851:
7847:
7842:
7838:
7836:
7826:
7822:
7818:
7814:
7805:
7792:
7789:
7781:
7768:
7763:
7759:
7753:
7749:
7743:
7738:
7735:
7732:
7728:
7723:
7719:
7717:
7710:
7706:
7702:
7694:
7681:
7664:
7661:
7649:
7630:
7617:
7595:
7592:
7584:
7581:
7578:
7572:
7567:
7563:
7559:
7556:
7536:
7514:
7510:
7500:
7483:
7480:
7477:for all
7468:
7465:
7462:
7458:
7453:
7448:
7444:
7440:
7437:
7428:
7425:
7422:
7420:
7413:
7409:
7403:
7399:
7393:
7388:
7385:
7382:
7378:
7363:
7359:
7353:
7349:
7340:
7330:
7325:
7312:
7307:
7303:
7297:
7293:
7287:
7282:
7279:
7276:
7272:
7266:
7261:
7258:
7255:
7251:
7245:
7242:
7237:
7232:
7228:
7222:
7217:
7214:
7211:
7207:
7203:
7201:
7191:
7187:
7181:
7177:
7165:
7152:
7149:
7141:
7128:
7120:
7116:
7110:
7106:
7100:
7095:
7092:
7089:
7085:
7079:
7074:
7071:
7068:
7064:
7058:
7055:
7050:
7045:
7041:
7035:
7030:
7027:
7024:
7020:
7016:
7014:
7004:
7000:
6996:
6991:
6987:
6980:
6961:
6945:
6941:
6931:
6918:
6910:
6897:
6892:
6888:
6882:
6878:
6872:
6867:
6864:
6861:
6857:
6853:
6840:
6815:
6796:
6783:
6780:
6772:
6759:
6756:
6748:
6738:
6733:
6720:
6700:
6680:
6672:
6659:
6647:
6643:
6639:
6635:
6631:
6627:
6622:
6617:
6616:Kernel method
6607:
6593:
6590:
6587:
6582:
6578:
6555:
6551:
6547:
6542:
6539:
6534:
6530:
6520:
6507:
6502:
6498:
6494:
6489:
6465:
6462:
6454:
6451:
6445:
6442:
6437:
6408:
6404:
6381:
6354:
6345:
6307:
6278:
6275:
6267:
6264:
6261:
6255:
6250:
6246:
6242:
6239:
6217:
6203:exactly when
6190:
6187:
6182:
6178:
6168:
6155:
6150:
6138:
6134:
6128:
6124:
6118:
6113:
6110:
6107:
6103:
6099:
6086:
6070:
6066:
6056:
6054:
6036:
6032:
6023:
6018:
6001:
5998:
5995:for all
5986:
5983:
5980:
5976:
5971:
5966:
5962:
5958:
5955:
5946:
5943:
5940:
5935:
5931:
5925:
5921:
5915:
5910:
5907:
5904:
5900:
5886:
5881:
5877:
5871:
5867:
5858:
5839:
5824:
5820:
5814:
5810:
5804:
5799:
5796:
5793:
5789:
5783:
5778:
5775:
5772:
5768:
5762:
5759:
5754:
5749:
5745:
5739:
5734:
5731:
5728:
5724:
5720:
5712:
5708:
5704:
5699:
5695:
5688:
5668:
5666:
5656:
5654:
5649:
5632:
5629:
5626:for all
5620:
5617:
5614:
5609:
5605:
5592:
5588:
5584:
5581:
5578:
5574:
5570:
5567:
5562:
5537:
5531:
5527:
5511:
5495:
5492:
5487:
5483:
5477:
5472:
5469:
5466:
5462:
5456:
5453:
5433:
5430:
5417:
5412:
5408:
5404:
5401:
5398:
5392:
5389:
5384:
5355:
5351:
5328:
5324:
5302:
5295:
5292:
5287:
5258:
5254:
5250:
5247:
5244:
5241:
5237:
5230:
5225:
5221:
5197:
5193:
5190:
5186:
5183:
5177:
5174:
5165:
5163:
5162:
5151:
5149:
5145:
5144:
5127:
5114:
5107:
5105:
5091:
5086:
5070:
5067:
5063:
5058:
5051:
5048:
5043:
5014:
5010:
5006:
5003:
5000:
4997:
4993:
4984:
4979:
4976:
4973:
4969:
4963:
4960:
4954:
4946:
4945:
4942:
4934:
4912:
4907:
4894:
4889:
4885:
4879:
4875:
4869:
4865:
4861:
4847:
4844:
4830:
4808:
4795:
4790:
4786:
4780:
4776:
4770:
4766:
4762:
4748:
4713:
4710:
4702:
4689:
4686:
4678:
4668:
4663:
4650:
4625:
4612:
4588:
4585:
4582:
4562:
4559:
4556:
4533:
4530:
4525:
4515:
4510:
4500:
4494:
4491:
4488:
4470:
4452:
4444:
4440:
4437:
4417:
4413:
4409:
4402:
4398:
4395:
4392:
4372:
4369:
4366:
4345:
4339:
4328:
4318:
4313:
4297:
4294:
4290:
4286:
4283:
4280:
4272:
4262:
4257:
4244:
4236:
4232:
4215:
4207:
4204:
4199:
4189:
4184:
4171:
4163:
4153:
4148:
4135:
4127:
4124:
4109:
4106:
4103:
4081:
4071:
4061:
4056:
4043:
4035:
4025:
4020:
4007:
3999:
3996:
3995:
3994:
3991:
3989:
3984:
3982:
3981:feature space
3978:
3974:
3970:
3966:
3962:
3958:
3949:
3940:
3926:
3917:
3897:
3894:
3891:
3888:
3885:
3879:
3876:
3869:
3866:
3861:
3857:
3852:
3847:
3843:
3839:
3836:
3833:
3827:
3824:
3819:
3792:
3788:
3769:
3765:
3759:
3754:
3751:
3748:
3744:
3740:
3737:
3732:
3727:
3712:
3702:
3697:
3694:
3690:
3681:
3666:
3650:
3623:
3620:
3617:
3608:
3595:
3591:
3586:
3579:
3576:
3571:
3542:
3538:
3534:
3531:
3528:
3525:
3521:
3512:
3507:
3504:
3501:
3497:
3491:
3488:
3482:
3478:
3475:
3470:
3449:
3446:
3430:
3415:
3414:
3408:
3406:
3390:
3387:
3382:
3353:
3335:
3331:
3321:
3308:
3304:
3297:
3294:
3289:
3260:
3256:
3252:
3249:
3246:
3243:
3239:
3227:
3226:
3215:
3213:
3195:
3166:
3150:
3148:
3147:sign function
3129:
3123:
3120:
3097:
3094:
3069:
3066:
3038:
3007:
2987:
2984:
2981:
2978:
2975:
2969:
2966:
2959:
2956:
2950:
2947:
2942:
2915:
2911:
2892:
2874:
2871:
2860:
2856:
2847:
2832:
2825:
2818:
2816:
2802:
2799:
2796:
2793:
2790:
2787:
2778:
2775:
2772:
2766:
2763:
2758:
2729:
2725:
2717:
2716:
2713:
2710:
2696:
2693:
2690:
2685:
2681:
2672:
2668:
2665:
2662:
2659:
2656:
2651:
2610:
2607:
2604:
2599:
2595:
2586:
2582:
2579:
2576:
2573:
2568:
2527:
2519:
2459:
2432:
2429:
2426:
2423:
2420:
2394:
2393:
2392:
2374:
2371:
2368:
2365:
2339:
2338:
2337:
2335:
2325:
2312:
2309:
2306:
2303:
2275:
2223:
2190:
2186:
2185:normal vector
2148:
2145:
2142:
2139:
2136:
2087:
2082:
2066:
2039:
2036:
2033:
2028:
2024:
2003:
2000:
1995:
1991:
1968:
1953:
1950:-dimensional
1937:
1915:
1886:
1857:
1853:
1832:
1824:
1820:
1816:
1811:
1798:
1795:
1792:
1784:
1780:
1776:
1771:
1741:
1728:
1719:
1717:
1713:
1709:
1705:
1701:
1697:
1683:
1679:
1676:
1672:
1669:
1665:
1662:
1657:
1654:
1651:
1647:
1643:
1639:
1638:
1637:
1629:
1615:
1593:
1589:
1568:
1548:
1528:
1505:
1502:
1499:
1493:
1486:Note that if
1472:
1464:
1458:
1455:
1450:
1446:
1439:
1434:
1430:
1424:
1420:
1410:
1409:feature space
1394:
1372:
1368:
1360:
1357:of images of
1342:
1338:
1313:
1310:
1307:
1301:
1294:
1290:
1285:
1276:
1272:
1270:
1266:
1262:
1258:
1254:
1250:
1245:
1243:
1242:
1237:
1236:
1230:
1229:
1224:
1220:
1216:
1213:-dimensional
1197:
1194:
1191:
1168:
1148:
1140:
1137:
1133:
1129:
1109:
1100:
1098:
1094:
1090:
1085:
1083:
1079:
1075:
1071:
1066:
1052:
1044:
1040:
1039:feature space
1036:
1032:
1027:
1025:
1021:
1017:
1013:
1009:
1005:
1001:
997:
993:
989:
985:
982:
978:
974:
970:
966:
954:
949:
947:
942:
940:
935:
934:
932:
931:
924:
921:
917:
914:
913:
912:
909:
907:
904:
903:
897:
896:
889:
886:
884:
881:
879:
876:
874:
871:
869:
866:
864:
861:
859:
856:
855:
849:
848:
841:
838:
836:
833:
831:
828:
826:
823:
821:
818:
816:
813:
811:
808:
806:
803:
802:
796:
795:
788:
785:
783:
780:
778:
775:
773:
770:
769:
763:
762:
755:
752:
750:
747:
745:
744:Crowdsourcing
742:
740:
737:
736:
730:
729:
720:
717:
716:
715:
712:
710:
707:
705:
702:
700:
697:
696:
693:
688:
687:
679:
676:
674:
673:Memtransistor
671:
669:
666:
664:
661:
657:
654:
653:
652:
649:
647:
644:
640:
637:
635:
632:
630:
627:
625:
622:
621:
620:
617:
615:
612:
610:
607:
605:
602:
598:
595:
594:
593:
590:
586:
583:
581:
578:
576:
573:
571:
568:
567:
566:
563:
561:
558:
556:
555:Deep learning
553:
551:
548:
547:
544:
539:
538:
531:
528:
526:
523:
521:
519:
515:
513:
510:
509:
506:
501:
500:
491:
490:Hidden Markov
488:
486:
483:
481:
478:
477:
476:
473:
472:
469:
464:
463:
456:
453:
451:
448:
446:
443:
441:
438:
436:
433:
431:
428:
426:
423:
421:
418:
416:
413:
412:
409:
404:
403:
396:
393:
391:
388:
386:
382:
380:
377:
375:
372:
370:
368:
364:
362:
359:
357:
354:
352:
349:
348:
345:
340:
339:
332:
329:
327:
324:
322:
319:
317:
314:
312:
309:
307:
304:
302:
299:
297:
295:
291:
287:
286:Random forest
284:
282:
279:
277:
274:
273:
272:
269:
267:
264:
262:
259:
258:
251:
250:
245:
244:
236:
230:
229:
222:
219:
217:
214:
212:
209:
207:
204:
202:
199:
197:
194:
192:
189:
187:
184:
182:
179:
177:
174:
172:
171:Data cleaning
169:
167:
164:
162:
159:
157:
154:
152:
149:
147:
144:
142:
139:
137:
134:
133:
127:
126:
119:
116:
114:
111:
109:
106:
104:
101:
99:
96:
94:
91:
89:
86:
84:
83:Meta-learning
81:
79:
76:
74:
71:
69:
66:
64:
61:
59:
56:
55:
49:
48:
45:
40:
37:
36:
32:
31:
19:
14854:
14835:
14816:
14797:
14763:
14759:
14717:
14708:
14682:
14657:
14653:
14608:
14604:
14590:
14571:
14567:
14557:
14548:
14544:
14538:
14518:
14511:
14505:: 1871–1874.
14502:
14496:
14483:
14463:
14456:
14436:
14429:
14387:
14383:
14370:
14358:
14305:
14299:
14280:
14276:
14266:
14224:
14220:
14207:
14199:
14190:
14182:
14173:
14162:
14149:
14142:
14136:(224): 1–42.
14133:
14129:
14116:
14081:
14077:
14071:
14055:
14051:
14038:
14022:
14018:
14005:
13955:
13951:
13938:
13922:
13893:
13882:. Retrieved
13875:the original
13854:
13850:
13837:
13826:. Retrieved
13819:the original
13782:
13746:
13739:
13722:
13718:
13712:
13703:
13652:
13648:
13638:
13593:
13587:
13544:
13540:
13534:
13510:
13503:
13494:
13490:
13484:
13475:
13429:
13421:
13412:
13401:. Retrieved
13394:the original
13373:
13369:
13356:
13321:
13317:
13307:
13272:
13262:
13235:
13231:
13218:
13197:
13186:. Retrieved
13179:the original
13148:
13135:
13126:
13116:
13109:
13080:
13074:
13051:
13041:
12996:
12968:
12950:
12939:. Retrieved
12925:
12916:
12912:
12906:
12877:
12871:
12844:
12834:
12797:
12791:
12690:
12688:and others.
12666:scikit-learn
12660:, SVMlight,
12647:
12645:is allowed.
12636:
12617:
12610:
12594:
12587:
12568:tuning, and
12547:
12544:Bayesian SVM
12400:
12272:
12257:
12251:
12247:
12232:
12218:
12161:
12158:
12014:
11935:
11889:
11843:
11840:
11729:
11701:transduction
11694:
11682:
11674:SVM (DAGSVM)
11665:
11661:
11647:
11644:
11609:
11581:
11351:
11265:
11243:
11235:
11227:
11223:
11219:
11205:
11145:
11143:
10874:
10768:
10765:
10666:
10590:
10558:
10556:
10468:
10375:
10302:
10296:
10144:
10047:
9986:
9979:
9977:
9870:
9802:
9738:normed space
9687:
9677:
9614:
9497:
9493:
9438:
9334:
9327:
9246:
9170:, such that
9040:with labels
8999:
8987:
8981:
8798:
8456:
8450:
8421:sub-gradient
8344:
8162:
8156:
8147:
7962:
7959:
7650:
7501:
6962:
6932:
6841:
6816:
6651:
6645:
6641:
6637:
6633:
6629:
6625:
6610:Kernel trick
6521:
6347:The offset,
6346:
6169:
6087:
6057:
6055:algorithms.
6021:
6019:
5669:
5662:
5652:
5650:
5434:
5431:
5316:. Note that
5166:
5159:
5157:
5141:
5119:
5108:
4940:
4828:
4746:
4744:. The value
4604:
3992:
3985:
3977:dot products
3973:kernel trick
3954:
3918:
3667:
3609:
3450:
3447:
3411:
3409:
3407:-th output.
3404:
3351:
3322:
3223:
3221:
3211:
3151:
3008:
2833:
2830:
2819:
2711:
2448:
2390:
2331:
2276:
2083:
1733:
1712:kernel trick
1693:
1646:transductive
1635:
1632:Applications
1289:dot products
1281:
1246:
1239:
1232:
1226:
1135:
1126:
1086:
1067:
1065:-sensitive.
1035:kernel trick
1028:
1024:Chervonenkis
976:
972:
968:
962:
830:PAC learning
517:
366:
361:Hierarchical
330:
293:
247:
241:
14766:: 291–400.
14660:(2): 1–13.
14283:(1): 1–23.
14025:: 265–292.
13958:: 263–286.
13547:(1): 3–30.
13324:: 270–283.
13238:: 161–190.
12324:subject to
11328:grid search
10390:square-loss
6933:where, the
6522:(Note that
5158:Minimizing
3218:Soft-margin
3210:are called
2328:Hard-margin
2110:satisfying
1269:overfitting
714:Multi-agent
651:Transformer
550:Autoencoder
306:Naive Bayes
44:data mining
14922:Categories
14912:JavaScript
14618:1810.09841
14315:1707.05532
13965:cs/9501101
13884:2018-01-08
13828:2019-07-18
13497:: 821–837.
13403:2018-01-08
13318:NeuroImage
13209:1608.00501
13188:2018-01-08
12941:2017-11-08
12919:: 125–137.
12759:References
12599:that uses
12260:regression
12238:Regression
11636:Extensions
11212:perceptron
11202:Properties
10299:hinge loss
9760:for which
9148:hypothesis
8989:hinge loss
8345:Note that
7610:, so that
7549:such that
6170:Moreover,
4126:Polynomial
3781:subject to
3323:Note that
3225:hinge loss
2904:subject to
2086:hyperplane
1983:for which
1845:where the
1722:Linear SVM
1677:using SVM.
1675:recognized
1257:regression
1249:hyperplane
1215:hyperplane
1139:data point
1103:Motivation
1043:regression
988:algorithms
984:max-margin
981:supervised
699:Q-learning
597:Restricted
395:Mean shift
344:Clustering
321:Perceptron
249:regression
151:Clustering
146:Regression
14895:SVM light
14889:liblinear
14674:207753020
14392:CiteSeerX
14229:CiteSeerX
14086:CiteSeerX
13791:CiteSeerX
13679:0899-7667
13657:CiteSeerX
13598:CiteSeerX
13571:0025-5610
13549:CiteSeerX
13516:CiteSeerX
13457:‖
13449:‖
13033:207165665
13001:CiteSeerX
12826:206787478
12802:CiteSeerX
12528:ε
12508:ε
12482:⟩
12463:⟨
12386:ε
12383:≤
12372:−
12369:⟩
12350:⟨
12347:−
12304:‖
12297:‖
12277:minimize
12190:−
12184:∈
12179:⋆
12135:≥
12126:−
12121:⋆
12106:⋅
12093:⋆
12069:≥
12060:−
12045:⋅
11994:…
11956:…
11916:‖
11907:‖
11873:⋆
11794:∈
11789:⋆
11774:∣
11769:⋆
11746:⋆
11589:γ
11568:λ
11511:…
11500:−
11484:−
11473:∈
11470:γ
11421:…
11410:−
11394:−
11383:∈
11380:λ
11359:γ
11338:λ
11314:γ
11294:λ
11273:γ
11252:λ
11160:∗
11100:∗
11073:
11045:
11003:−
10973:
10853:otherwise
10844:−
10826:≥
10781:∗
10735:−
10719:−
10524:−
10507:
10477:ℓ
10437:−
10401:ℓ
10348:−
10310:ℓ
10274:‖
10265:‖
10262:λ
10240:−
10205:−
10168:∑
10124:−
10105:^
10090:
10084:↦
10064:^
9930:^
9927:ε
9912:∈
9884:^
9848:‖
9841:‖
9832:λ
9775:‖
9768:‖
9560:ℓ
9540:∑
9511:^
9508:ε
9367:ℓ
9342:ε
9259:ℓ
9058:…
9018:…
8941:…
8891:∂
8880:∂
8823:…
8810:∈
8799:For each
8766:λ
8751:≤
8738:≤
8731:and
8680:∑
8630:⋅
8577:∑
8556:∑
8542:−
8512:∑
8492:…
8322:‖
8313:‖
8310:λ
8288:−
8246:−
8209:∑
8118:−
8044:∑
8030:
8015:−
8001:φ
7981:
7975:↦
7960:Finally,
7929:−
7848:∑
7819:−
7793:φ
7790:⋅
7769:φ
7729:∑
7703:−
7682:φ
7618:φ
7593:−
7585:λ
7469:λ
7454:≤
7441:≤
7434:and
7379:∑
7273:∑
7252:∑
7238:−
7208:∑
7153:φ
7150:⋅
7129:φ
7086:∑
7065:∑
7051:−
7021:∑
6997:…
6898:φ
6858:∑
6784:φ
6781:⋅
6760:φ
6660:φ
6591:±
6540:−
6495:−
6459:⟺
6443:−
6276:−
6268:λ
6104:∑
5987:λ
5972:≤
5959:≤
5952:and
5901:∑
5790:∑
5769:∑
5755:−
5725:∑
5705:…
5655:problem.
5615:≥
5606:ζ
5589:ζ
5585:−
5579:≥
5568:−
5508:‖
5499:‖
5496:λ
5484:ζ
5463:∑
5409:ζ
5405:−
5399:≥
5390:−
5325:ζ
5293:−
5251:−
5222:ζ
5191:…
5178:∈
5167:For each
5128:λ
5083:‖
5074:‖
5071:λ
5049:−
5007:−
4970:∑
4876:α
4866:∑
4848:φ
4845:⋅
4796:φ
4777:α
4767:∑
4714:φ
4711:⋅
4690:φ
4613:φ
4557:κ
4516:⋅
4501:κ
4495:
4414:σ
4393:γ
4367:γ
4319:−
4298:γ
4295:−
4287:
4233:Gaussian
4190:⋅
4062:⋅
3892:…
3880:∈
3874:∀
3867:≥
3858:ζ
3844:ζ
3840:−
3834:≥
3825:−
3808:⊤
3766:ζ
3745:∑
3724:‖
3715:‖
3703:ζ
3577:−
3535:−
3498:∑
3467:‖
3458:‖
3388:−
3295:−
3253:−
3130:⋅
3124:
3095:−
3070:
3064:↦
2982:…
2970:∈
2964:∀
2957:≥
2948:−
2931:⊤
2889:‖
2880:‖
2797:≤
2791:≤
2773:≥
2764:−
2694:−
2666:−
2663:≤
2657:−
2580:≥
2574:−
2504:‖
2496:‖
2472:‖
2464:‖
2430:−
2421:−
2366:−
2236:‖
2228:‖
2137:−
2037:−
1796:…
1431:α
1421:∑
1339:α
1195:−
1053:ϵ
1026:(1974).
1020:VC theory
858:ECML PKDD
840:VC theory
787:ROC curve
719:Self-play
639:DeepDream
480:Bayes net
271:Ensembles
52:Paradigms
14904:Archived
14527:Archived
14525:. ICDM.
14472:Archived
14470:. ICML.
14445:Archived
14443:. NIPS.
14418:Archived
14414:13563302
14255:Archived
14060:Archived
14027:Archived
13994:Archived
13990:47109072
13927:Archived
13921:(eds.).
13906:(2000).
13871:18244442
13755:Archived
13695:11845688
13687:15070510
13579:53306004
13526:Archived
13390:21752695
13348:23583748
13299:25739012
13066:Archived
12966:(2008).
12935:Archived
12783:(1995).
12696:See also
12578:big data
12562:Bayesian
12550:Bayesian
11974:and any
11628:section.
11228:maximum
10812:if
10464:log-loss
10380:such as
9680:or ERM.
8957:′
8933:′
8470:maximize
6975:maximize
5683:maximize
4335:‖
4303:‖
3682:minimize
3113:, where
2848:minimize
1469:constant
1233:maximum-
281:Boosting
130:Problems
14744:2427083
14350:4018290
14320:Bibcode
14108:7066611
13970:Bibcode
13630:7880266
13424:, 1, 4.
13339:3767485
13175:4154772
12662:kernlab
11146:exactly
10618:denote
3403:is the
3350:is the
3145:is the
2540:either
1690:History
1407:in the
975:, also
863:NeurIPS
680:(ECRAM)
634:AlexNet
276:Bagging
14883:LIBSVM
14879:libsvm
14861:
14842:
14823:
14804:
14778:
14742:
14732:
14689:
14672:
14412:
14394:
14348:
14338:
14249:
14231:
14181:", in
14106:
14088:
13988:
13869:
13811:
13793:
13693:
13685:
13677:
13659:
13628:
13618:
13600:
13577:
13569:
13551:
13518:
13388:
13346:
13336:
13297:
13287:
13252:
13173:
13163:
13097:
13058:
13031:
13021:
13003:
12894:
12859:
12824:
12804:
12686:OpenCV
12670:Shogun
12654:MATLAB
12650:LIBSVM
12601:Newton
12401:where
11606:Issues
9113:given
6632:)) = (
6570:since
5653:primal
5154:Primal
2161:where
1223:margin
1016:Vapnik
1012:Cortes
979:) are
656:Vision
512:RANSAC
390:OPTICS
385:DBSCAN
369:-means
176:AutoML
14794:(PDF)
14756:(PDF)
14740:S2CID
14705:(PDF)
14670:S2CID
14650:(PDF)
14613:arXiv
14530:(PDF)
14523:(PDF)
14493:(PDF)
14475:(PDF)
14468:(PDF)
14448:(PDF)
14441:(PDF)
14421:(PDF)
14410:S2CID
14380:(PDF)
14346:S2CID
14310:arXiv
14258:(PDF)
14251:15475
14247:S2CID
14217:(PDF)
14154:(PDF)
14126:(PDF)
14104:S2CID
14063:(PDF)
14048:(PDF)
14030:(PDF)
14015:(PDF)
13997:(PDF)
13986:S2CID
13960:arXiv
13948:(PDF)
13930:(PDF)
13913:. In
13911:(PDF)
13878:(PDF)
13847:(PDF)
13822:(PDF)
13779:(PDF)
13758:(PDF)
13751:(PDF)
13691:S2CID
13626:S2CID
13575:S2CID
13397:(PDF)
13366:(PDF)
13295:S2CID
13254:85843
13250:S2CID
13228:(PDF)
13204:arXiv
13182:(PDF)
13171:S2CID
13145:(PDF)
13029:S2CID
12973:(PDF)
12822:S2CID
12788:(PDF)
12678:Shark
12250:. As
9736:is a
8365:is a
6293:when
5146:to a
1930:is a
1008:Guyon
878:IJCAI
704:SARSA
663:Mamba
629:LeNet
624:U-Net
450:t-SNE
374:Fuzzy
351:BIRCH
14859:ISBN
14840:ISBN
14821:ISBN
14802:ISBN
14776:ISBN
14730:ISBN
14687:ISBN
14336:ISBN
13867:PMID
13809:ISBN
13787:LNCS
13683:PMID
13675:ISSN
13616:ISBN
13567:ISSN
13422:Sign
13386:PMID
13344:PMID
13285:ISBN
13161:ISBN
13095:ISBN
13056:ISBN
13019:ISBN
12892:ISBN
12857:ISBN
12674:Weka
12159:and
11580:and
11350:and
11306:and
10384:and
9786:<
8415:(or
8391:and
7573:<
7560:<
6256:<
6243:<
6022:dual
5659:Dual
4586:<
4575:and
4560:>
4492:tanh
4370:>
4359:for
3967:and
3621:>
3031:and
3009:The
2391:and
2084:Any
1952:real
1706:and
1698:and
1095:and
1072:and
1014:and
994:and
973:SVMs
888:JMLR
873:ICLR
868:ICML
754:RLHF
570:LSTM
356:CURE
42:and
14768:doi
14722:doi
14662:doi
14623:doi
14576:doi
14402:doi
14328:doi
14285:doi
14239:doi
14198:",
14096:doi
13978:doi
13859:doi
13801:doi
13727:doi
13667:doi
13608:doi
13559:doi
13545:127
13378:doi
13334:PMC
13326:doi
13277:doi
13240:doi
13153:doi
13085:doi
13011:doi
12882:doi
12849:doi
12812:doi
12658:SAS
11084:log
11070:sgn
11042:sgn
10953:log
10481:log
10392:,
10331:max
10188:max
10087:sgn
9905:min
8417:SGD
8369:of
8229:max
8027:sgn
7978:sgn
6606:.)
5234:max
5161:(2)
5143:(2)
4990:max
4445:):
4284:exp
3518:max
3413:(1)
3236:max
3121:sgn
3067:sgn
2623:or
1668:SAR
1136:new
1002:by
963:In
614:SOM
604:GAN
580:ESN
575:GRU
520:-NN
455:SDL
445:PGD
440:PCA
435:NMF
430:LDA
425:ICA
420:CCA
296:-NN
14924::
14881:,
14796:.
14774:.
14764:23
14762:.
14758:.
14738:.
14728:.
14668:.
14656:.
14652:.
14621:.
14611:.
14607:.
14603:.
14570:.
14566:.
14547:.
14501:.
14495:.
14416:.
14408:.
14400:.
14388:13
14386:.
14382:.
14344:.
14334:.
14326:.
14318:.
14279:.
14275:.
14253:.
14245:.
14237:.
14225:14
14223:.
14219:.
14134:17
14132:.
14128:.
14102:.
14094:.
14082:99
14080:.
14058:.
14056:33
14054:.
14050:.
14021:.
14017:.
13992:.
13984:.
13976:.
13968:.
13954:.
13950:.
13902:;
13865:.
13855:13
13853:.
13849:.
13807:.
13799:.
13785:.
13781:.
13766:^
13723:55
13721:.
13689:.
13681:.
13673:.
13665:.
13653:16
13651:.
13647:.
13624:.
13614:.
13606:.
13573:.
13565:.
13557:.
13543:.
13524:.
13495:25
13493:.
13474:.
13420:,
13384:.
13374:15
13372:.
13368:.
13342:.
13332:.
13322:78
13320:.
13316:.
13293:.
13283:.
13271:.
13248:.
13236:46
13234:.
13230:.
13169:.
13159:.
13147:.
13093:.
13064:.
13050:.
13027:.
13017:.
13009:.
12995:.
12981:^
12962:;
12958:;
12933:.
12915:.
12890:.
12855:.
12820:.
12810:.
12798:20
12796:.
12790:.
12779:;
12766:^
12684:,
12680:,
12676:,
12672:,
12668:,
12664:,
12656:,
12652:,
12012:)
11887:)
11503:13
11487:15
11462:;
11445:15
11432:13
11233:.
11141:.
10970:ln
10588:.
10504:ln
10466:,
9150:,
6814:.
6644:+
6640:,
6636:,
6628:,
4933:.
4237::
4000::
3963:,
3214:.
3149:.
2697:1.
2313:0.
2274:.
1271:.
1255:,
1244:.
1099:.
967:,
883:ML
14867:.
14848:.
14829:.
14810:.
14784:.
14770::
14746:.
14724::
14695:.
14676:.
14664::
14658:2
14631:.
14625::
14615::
14609:8
14584:.
14578::
14572:6
14549:9
14503:9
14404::
14352:.
14330::
14322::
14312::
14293:.
14287::
14281:6
14241::
14110:.
14098::
14023:2
13980::
13972::
13962::
13956:2
13887:.
13861::
13831:.
13803::
13733:.
13729::
13697:.
13669::
13632:.
13610::
13581:.
13561::
13472:"
13453:w
13445:2
13406:.
13380::
13350:.
13328::
13301:.
13279::
13256:.
13242::
13212:.
13206::
13191:.
13155::
13103:.
13087::
13035:.
13013::
12944:.
12917:2
12900:.
12884::
12865:.
12851::
12828:.
12814::
12488:b
12485:+
12477:i
12473:x
12469:,
12466:w
12441:i
12437:y
12414:i
12410:x
12379:|
12375:b
12364:i
12360:x
12356:,
12353:w
12342:i
12338:y
12333:|
12308:2
12300:w
12291:2
12288:1
12252:ε
12248:ε
12205:.
12202:}
12199:1
12196:,
12193:1
12187:{
12174:j
12170:y
12141:,
12138:1
12132:)
12129:b
12116:j
12111:x
12102:w
12098:(
12088:j
12084:y
12075:,
12072:1
12066:)
12063:b
12055:i
12050:x
12041:w
12037:(
12032:i
12028:y
12000:k
11997:,
11991:,
11988:1
11985:=
11982:j
11962:n
11959:,
11953:,
11950:1
11947:=
11944:i
11920:2
11911:w
11902:2
11899:1
11868:y
11863:,
11860:b
11857:,
11853:w
11825:k
11820:1
11817:=
11814:i
11810:}
11804:p
11799:R
11784:i
11779:x
11764:i
11759:x
11754:{
11751:=
11740:D
11713:D
11540:}
11535:3
11531:2
11527:,
11522:1
11518:2
11514:,
11508:,
11496:2
11492:,
11480:2
11476:{
11450:}
11441:2
11437:,
11428:2
11424:,
11418:,
11413:3
11406:2
11402:,
11397:5
11390:2
11386:{
11184:R
11156:f
11127:x
11123:y
11096:f
11092:=
11089:)
11080:f
11076:(
11067:=
11064:)
11059:q
11056:s
11052:f
11048:(
11021:)
11017:)
11011:x
11007:p
11000:1
10996:(
10992:/
10986:x
10982:p
10977:(
10967:=
10964:)
10961:x
10958:(
10949:f
10927:]
10922:x
10918:y
10914:[
10909:E
10905:=
10902:)
10899:x
10896:(
10891:q
10888:s
10884:f
10847:1
10837:2
10833:/
10829:1
10821:x
10817:p
10806:1
10800:{
10795:=
10792:)
10789:x
10786:(
10777:f
10743:x
10739:p
10732:1
10722:1
10710:x
10706:p
10695:1
10689:{
10684:=
10679:x
10675:y
10652:x
10649:=
10646:X
10626:y
10604:x
10600:y
10576:y
10572:,
10569:X
10538:.
10535:)
10530:z
10527:y
10520:e
10516:+
10513:1
10510:(
10501:=
10498:)
10495:z
10492:,
10489:y
10486:(
10448:2
10444:)
10440:z
10434:y
10431:(
10428:=
10425:)
10422:z
10419:,
10416:y
10413:(
10408:q
10405:s
10362:.
10358:)
10354:z
10351:y
10345:1
10342:,
10339:0
10335:(
10328:=
10325:)
10322:z
10319:,
10316:y
10313:(
10283:.
10278:2
10269:w
10259:+
10255:]
10250:)
10246:)
10243:b
10236:x
10229:T
10223:w
10218:(
10213:i
10209:y
10202:1
10199:,
10196:0
10192:(
10183:n
10178:1
10175:=
10172:i
10162:n
10159:1
10153:[
10130:)
10127:b
10120:x
10113:T
10101:w
10093:(
10080:x
10076::
10073:b
10070:,
10060:w
10028:f
10008:)
10005:f
10002:(
9997:R
9983:.
9964:.
9961:)
9958:f
9955:(
9950:R
9945:+
9942:)
9939:f
9936:(
9917:H
9909:f
9900:g
9897:r
9894:a
9890:=
9881:f
9853:H
9844:f
9836:k
9828:=
9825:)
9822:f
9819:(
9814:R
9789:k
9780:H
9771:f
9748:f
9722:H
9698:H
9664:n
9642:k
9638:y
9633:,
9628:k
9624:X
9601:.
9598:)
9595:)
9590:k
9586:X
9582:(
9579:f
9576:,
9571:k
9567:y
9563:(
9555:n
9550:1
9547:=
9544:k
9534:n
9531:1
9526:=
9523:)
9520:f
9517:(
9478:1
9475:+
9472:n
9468:y
9463:,
9458:1
9455:+
9452:n
9448:X
9425:.
9421:]
9417:)
9414:)
9409:1
9406:+
9403:n
9399:X
9395:(
9392:f
9389:,
9384:1
9381:+
9378:n
9374:y
9370:(
9363:[
9358:E
9354:=
9351:)
9348:f
9345:(
9331::
9314:y
9294:z
9274:)
9271:z
9268:,
9265:y
9262:(
9250:,
9231:1
9228:+
9225:n
9221:y
9200:)
9195:1
9192:+
9189:n
9185:X
9181:(
9178:f
9158:f
9132:1
9129:+
9126:n
9122:X
9099:1
9096:+
9093:n
9089:y
9066:n
9062:y
9053:1
9049:y
9026:n
9022:X
9013:1
9009:X
8961:)
8953:n
8949:c
8944:,
8937:,
8929:1
8925:c
8921:(
8899:i
8895:c
8887:/
8883:f
8858:i
8854:c
8833:}
8830:n
8826:,
8819:,
8816:1
8813:{
8807:i
8781:.
8778:i
8763:n
8760:2
8756:1
8746:i
8742:c
8735:0
8726:,
8723:0
8720:=
8715:i
8711:y
8705:i
8701:c
8695:n
8690:1
8687:=
8684:i
8666:,
8661:j
8657:c
8651:j
8647:y
8643:)
8638:j
8634:x
8625:i
8621:x
8617:(
8612:i
8608:c
8602:i
8598:y
8592:n
8587:1
8584:=
8581:j
8571:n
8566:1
8563:=
8560:i
8550:2
8547:1
8537:i
8533:c
8527:n
8522:1
8519:=
8516:i
8508:=
8505:)
8500:n
8496:c
8487:1
8483:c
8479:(
8476:f
8431:n
8399:b
8378:w
8353:f
8331:.
8326:2
8317:w
8307:+
8303:]
8298:)
8294:)
8291:b
8283:i
8278:x
8270:T
8264:w
8259:(
8254:i
8250:y
8243:1
8240:,
8237:0
8233:(
8224:n
8219:1
8216:=
8213:i
8203:n
8200:1
8194:[
8190:=
8187:)
8184:b
8181:,
8177:w
8173:(
8170:f
8129:.
8125:)
8121:b
8114:]
8110:)
8106:z
8102:,
8097:i
8092:x
8087:(
8084:k
8079:i
8075:y
8069:i
8065:c
8059:n
8054:1
8051:=
8048:i
8039:[
8034:(
8024:=
8021:)
8018:b
8012:)
8008:z
8004:(
7995:T
7989:w
7984:(
7971:z
7942:.
7937:i
7933:y
7925:]
7921:)
7916:i
7911:x
7906:,
7901:j
7896:x
7891:(
7888:k
7883:j
7879:y
7873:j
7869:c
7863:n
7858:1
7855:=
7852:j
7843:[
7839:=
7827:i
7823:y
7815:]
7811:)
7806:i
7801:x
7796:(
7787:)
7782:j
7777:x
7772:(
7764:j
7760:y
7754:j
7750:c
7744:n
7739:1
7736:=
7733:j
7724:[
7720:=
7711:i
7707:y
7700:)
7695:i
7690:x
7685:(
7676:T
7670:w
7665:=
7662:b
7636:)
7631:i
7626:x
7621:(
7596:1
7589:)
7582:n
7579:2
7576:(
7568:i
7564:c
7557:0
7537:i
7515:i
7511:c
7484:.
7481:i
7466:n
7463:2
7459:1
7449:i
7445:c
7438:0
7429:,
7426:0
7423:=
7414:i
7410:y
7404:i
7400:c
7394:n
7389:1
7386:=
7383:i
7364:j
7360:c
7354:j
7350:y
7346:)
7341:j
7336:x
7331:,
7326:i
7321:x
7316:(
7313:k
7308:i
7304:c
7298:i
7294:y
7288:n
7283:1
7280:=
7277:j
7267:n
7262:1
7259:=
7256:i
7246:2
7243:1
7233:i
7229:c
7223:n
7218:1
7215:=
7212:i
7204:=
7192:j
7188:c
7182:j
7178:y
7174:)
7171:)
7166:j
7161:x
7156:(
7147:)
7142:i
7137:x
7132:(
7126:(
7121:i
7117:c
7111:i
7107:y
7101:n
7096:1
7093:=
7090:j
7080:n
7075:1
7072:=
7069:i
7059:2
7056:1
7046:i
7042:c
7036:n
7031:1
7028:=
7025:i
7017:=
7010:)
7005:n
7001:c
6992:1
6988:c
6984:(
6981:f
6946:i
6942:c
6919:,
6916:)
6911:i
6906:x
6901:(
6893:i
6889:y
6883:i
6879:c
6873:n
6868:1
6865:=
6862:i
6854:=
6850:w
6826:w
6802:)
6797:j
6792:x
6787:(
6778:)
6773:i
6768:x
6763:(
6757:=
6754:)
6749:j
6744:x
6739:,
6734:i
6729:x
6724:(
6721:k
6701:k
6681:.
6678:)
6673:i
6668:x
6663:(
6648:)
6646:b
6642:a
6638:b
6634:a
6630:b
6626:a
6594:1
6588:=
6583:i
6579:y
6556:i
6552:y
6548:=
6543:1
6535:i
6531:y
6508:.
6503:i
6499:y
6490:i
6485:x
6477:T
6471:w
6466:=
6463:b
6455:1
6452:=
6449:)
6446:b
6438:i
6433:x
6425:T
6419:w
6414:(
6409:i
6405:y
6382:i
6377:x
6355:b
6331:w
6308:i
6303:x
6279:1
6272:)
6265:n
6262:2
6259:(
6251:i
6247:c
6240:0
6218:i
6213:x
6191:0
6188:=
6183:i
6179:c
6156:.
6151:i
6146:x
6139:i
6135:y
6129:i
6125:c
6119:n
6114:1
6111:=
6108:i
6100:=
6096:w
6071:i
6067:c
6037:i
6033:c
6002:.
5999:i
5984:n
5981:2
5977:1
5967:i
5963:c
5956:0
5947:,
5944:0
5941:=
5936:i
5932:y
5926:i
5922:c
5916:n
5911:1
5908:=
5905:i
5887:,
5882:j
5878:c
5872:j
5868:y
5864:)
5859:j
5854:x
5846:T
5840:i
5835:x
5830:(
5825:i
5821:c
5815:i
5811:y
5805:n
5800:1
5797:=
5794:j
5784:n
5779:1
5776:=
5773:i
5763:2
5760:1
5750:i
5746:c
5740:n
5735:1
5732:=
5729:i
5721:=
5718:)
5713:n
5709:c
5700:1
5696:c
5692:(
5689:f
5633:.
5630:i
5621:,
5618:0
5610:i
5593:i
5582:1
5575:)
5571:b
5563:i
5558:x
5550:T
5544:w
5538:(
5532:i
5528:y
5512:2
5503:w
5493:+
5488:i
5478:n
5473:1
5470:=
5467:i
5457:n
5454:1
5418:.
5413:i
5402:1
5396:)
5393:b
5385:i
5380:x
5372:T
5366:w
5361:(
5356:i
5352:y
5329:i
5303:)
5299:)
5296:b
5288:i
5283:x
5275:T
5269:w
5264:(
5259:i
5255:y
5248:1
5245:,
5242:0
5238:(
5231:=
5226:i
5201:}
5198:n
5194:,
5187:,
5184:1
5181:{
5175:i
5113:)
5111:2
5109:(
5092:.
5087:2
5078:w
5068:+
5064:]
5059:)
5055:)
5052:b
5044:i
5039:x
5031:T
5025:w
5020:(
5015:i
5011:y
5004:1
5001:,
4998:0
4994:(
4985:n
4980:1
4977:=
4974:i
4964:n
4961:1
4955:[
4921:)
4917:x
4913:,
4908:i
4903:x
4898:(
4895:k
4890:i
4886:y
4880:i
4870:i
4862:=
4859:)
4855:x
4851:(
4841:w
4829:w
4814:)
4809:i
4804:x
4799:(
4791:i
4787:y
4781:i
4771:i
4763:=
4759:w
4747:w
4732:)
4726:j
4722:x
4717:(
4708:)
4703:i
4698:x
4693:(
4687:=
4684:)
4679:j
4674:x
4669:,
4664:i
4659:x
4654:(
4651:k
4631:)
4626:i
4621:x
4616:(
4601:.
4589:0
4583:c
4563:0
4537:)
4534:c
4531:+
4526:j
4521:x
4511:i
4506:x
4498:(
4489:=
4486:)
4480:j
4476:x
4471:,
4465:i
4461:x
4456:(
4453:k
4441:(
4435:.
4423:)
4418:2
4410:2
4407:(
4403:/
4399:1
4396:=
4373:0
4346:)
4340:2
4329:j
4324:x
4314:i
4309:x
4291:(
4281:=
4278:)
4273:j
4268:x
4263:,
4258:i
4253:x
4248:(
4245:k
4230:.
4216:d
4212:)
4208:r
4205:+
4200:j
4195:x
4185:i
4180:x
4175:(
4172:=
4169:)
4164:j
4159:x
4154:,
4149:i
4144:x
4139:(
4136:k
4110:1
4107:=
4104:d
4082:d
4078:)
4072:j
4067:x
4057:i
4052:x
4047:(
4044:=
4041:)
4036:j
4031:x
4026:,
4021:i
4016:x
4011:(
4008:k
3927:C
3901:}
3898:n
3895:,
3889:,
3886:1
3883:{
3877:i
3870:0
3862:i
3853:,
3848:i
3837:1
3831:)
3828:b
3820:i
3815:x
3803:w
3798:(
3793:i
3789:y
3770:i
3760:n
3755:1
3752:=
3749:i
3741:C
3738:+
3733:2
3728:2
3719:w
3698:,
3695:b
3691:,
3687:w
3651:i
3646:x
3624:0
3618:C
3596:,
3592:]
3587:)
3583:)
3580:b
3572:i
3567:x
3559:T
3553:w
3548:(
3543:i
3539:y
3532:1
3529:,
3526:0
3522:(
3513:n
3508:1
3505:=
3502:i
3492:n
3489:1
3483:[
3479:C
3476:+
3471:2
3462:w
3431:i
3426:x
3405:i
3391:b
3383:i
3378:x
3370:T
3364:w
3352:i
3336:i
3332:y
3309:.
3305:)
3301:)
3298:b
3290:i
3285:x
3277:T
3271:w
3266:(
3261:i
3257:y
3250:1
3247:,
3244:0
3240:(
3196:i
3191:x
3167:i
3162:x
3133:)
3127:(
3101:)
3098:b
3091:x
3084:T
3078:w
3073:(
3060:x
3039:b
3018:w
2991:}
2988:n
2985:,
2979:,
2976:1
2973:{
2967:i
2960:1
2954:)
2951:b
2943:i
2938:x
2926:w
2921:(
2916:i
2912:y
2893:2
2884:w
2875:2
2872:1
2861:b
2857:,
2853:w
2824:)
2822:1
2820:(
2803:.
2800:n
2794:i
2788:1
2779:,
2776:1
2770:)
2767:b
2759:i
2754:x
2746:T
2740:w
2735:(
2730:i
2726:y
2691:=
2686:i
2682:y
2673:,
2669:1
2660:b
2652:i
2647:x
2639:T
2633:w
2611:,
2608:1
2605:=
2600:i
2596:y
2587:,
2583:1
2577:b
2569:i
2564:x
2556:T
2550:w
2528:i
2500:w
2468:w
2460:2
2433:1
2427:=
2424:b
2417:x
2410:T
2404:w
2375:1
2372:=
2369:b
2362:x
2355:T
2349:w
2310:=
2307:b
2304:+
2300:x
2293:T
2287:w
2261:w
2232:w
2224:b
2200:w
2170:w
2149:,
2146:0
2143:=
2140:b
2133:x
2126:T
2120:w
2097:x
2067:i
2062:x
2040:1
2034:=
2029:i
2025:y
2004:1
2001:=
1996:i
1992:y
1969:i
1964:x
1938:p
1916:i
1911:x
1887:i
1882:x
1858:i
1854:y
1833:,
1830:)
1825:n
1821:y
1817:,
1812:n
1807:x
1802:(
1799:,
1793:,
1790:)
1785:1
1781:y
1777:,
1772:1
1767:x
1762:(
1742:n
1616:x
1594:i
1590:x
1569:x
1549:x
1529:y
1509:)
1506:y
1503:,
1500:x
1497:(
1494:k
1473:.
1465:=
1462:)
1459:x
1456:,
1451:i
1447:x
1443:(
1440:k
1435:i
1425:i
1395:x
1373:i
1369:x
1343:i
1317:)
1314:y
1311:,
1308:x
1305:(
1302:k
1201:)
1198:1
1192:p
1189:(
1169:p
1149:p
1121:3
1117:2
1113:1
1111:H
971:(
952:e
945:t
938:v
518:k
367:k
294:k
252:)
240:(
20:)