
Stability (learning theory)


and it can be assessed in algorithms that have hypothesis spaces with unbounded or undefined VC-dimension, such as nearest neighbor. A stable learning algorithm is one for which the learned function does not change much when the training set is slightly modified, for instance by leaving out an example. A measure of leave-one-out error is used in a Cross-Validation Leave-One-Out (CVloo) procedure to evaluate a learning algorithm's stability with respect to the loss function. As such, stability analysis is the application of sensitivity analysis to machine learning.
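The CVloo measurement described above can be sketched in a few lines: retrain with each example held out in turn and record the loss on the held-out point. The 1-nearest-neighbour learner and the toy scalar data set below are illustrative assumptions, not taken from the article.

```python
# Sketch of a leave-one-out (CVloo) error measurement. The 1-NN learner and
# the toy data are assumptions made for illustration.

def nn_predict(train, x):
    """1-nearest-neighbour prediction using absolute distance on scalars."""
    return min(train, key=lambda z: abs(z[0] - x))[1]

def leave_one_out_losses(S):
    """0-1 loss on each held-out example, retraining on the remaining m-1."""
    losses = []
    for i in range(len(S)):
        held_x, held_y = S[i]
        rest = S[:i] + S[i + 1:]
        losses.append(0 if nn_predict(rest, held_x) == held_y else 1)
    return losses

S = [(0.0, "a"), (0.1, "a"), (0.2, "a"), (5.0, "b"), (5.1, "b"), (5.2, "b")]
loo_error = sum(leave_one_out_losses(S)) / len(S)
print(f"leave-one-out error: {loo_error}")
```

On this well-separated toy set every held-out point is still classified by a close neighbour of the same label, so the leave-one-out error is zero.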
of functions being learned. However, these results could not be applied to algorithms with hypothesis spaces of unbounded VC-dimension. Put another way, these results could not be applied when the information being learned had a complexity that was too large to measure. Some of the simplest machine
of the alphabet, using 1000 examples of handwritten letters and their labels ("A" to "Z") as a training set. One way to modify this training set is to leave out an example, so that only 999 examples of handwritten letters and their labels are available. A stable learning algorithm would produce a
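One rough way to visustrate this idea numerically: train a simple learner on the full training set and on the set with one example left out, and compare the two learners' predictions. The nearest-centroid learner and the synthetic two-class data below are assumptions made for the sketch, not part of the article.

```python
import random

# Compare a learner trained on all m examples with one trained on m-1
# (one example left out). A stable learner's predictions barely change.
# The nearest-centroid learner and synthetic data are illustrative assumptions.

def train_centroids(data):
    """Return per-label feature centroids (a nearest-centroid classifier)."""
    sums, counts = {}, {}
    for x, y in data:
        sums.setdefault(y, [0.0] * len(x))
        counts[y] = counts.get(y, 0) + 1
        sums[y] = [s + v for s, v in zip(sums[y], x)]
    return {y: [s / counts[y] for s in sums[y]] for y in sums}

def predict(centroids, x):
    dist = lambda c: sum((a - b) ** 2 for a, b in zip(c, x))
    return min(centroids, key=lambda y: dist(centroids[y]))

random.seed(0)
S = [([random.gauss(y, 1.0), random.gauss(-y, 1.0)], y)
     for y in (0, 3) for _ in range(500)]

full = train_centroids(S)       # trained on all 1000 examples
loo = train_centroids(S[1:])    # trained with one example left out

test = [([random.gauss(y, 1.0), random.gauss(-y, 1.0)], y)
        for y in (0, 3) for _ in range(100)]
agree = sum(predict(full, x) == predict(loo, x) for x, _ in test) / len(test)
print(f"prediction agreement after leaving one example out: {agree:.3f}")
```

Removing one of 1000 examples shifts each class centroid only slightly, so the two classifiers agree on essentially every test point.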
and showed that it is a) sufficient for generalization in bounded loss classes, and b) necessary and sufficient for consistency (and thus generalization) of ERM algorithms for certain loss functions such as the square loss, the absolute value and the binary classification loss.

2010 - Shalev Shwartz et al. noticed problems with the original results of Vapnik due to the complex relations between hypothesis space and loss class. They discuss stability notions that capture different loss classes and different types of learning, supervised and unsupervised.
This is an important result for the foundations of learning theory, because it shows that two previously unrelated properties of an algorithm, stability and consistency, are equivalent for ERM (and certain loss functions). The generalization bound is given in the article.
output is changed with small perturbations to its inputs. A stable learning algorithm is one for which the prediction does not change much when the training data is modified slightly. For instance, consider a machine learning algorithm that is being trained to
of a learning algorithm and showed that it implies low generalization error. Uniform hypothesis stability, however, is a strong condition that does not apply to large classes of algorithms, including ERM algorithms with a hypothesis space of only two functions.
S. Mukherjee, P. Niyogi, T. Poggio, and R. M. Rifkin. Learning theory: stability is sufficient for generalization and necessary and sufficient for consistency of empirical risk minimization. Adv. Comput. Math., 25(1-3):161–193, 2006.
Uniform Stability is a strong condition which is not met by all algorithms but is, surprisingly, met by the large and important class of Regularization algorithms. The generalization bound is given in the article.
learning algorithms—for instance, for regression—have hypothesis spaces with unbounded VC-dimension. Another example is language learning algorithms that can produce sentences of arbitrary length.
Elisseeff, A., Pontil, M., Leave-one-out Error and Stability of Learning Algorithms with Applications, NATO Science Series Sub Series III: Computer and Systems Sciences, 2003, Vol. 190, pages 111–130.
and is an alternative method for obtaining generalization bounds. The stability of an algorithm is a property of the learning process, rather than a direct property of the hypothesis space
Furthermore, they took an initial step in establishing the relationship between stability and consistency in ERM algorithms in the Probably Approximately Correct (PAC) setting.
For ERM algorithms specifically (say for the square loss), Leave-one-out cross-validation (CVloo) Stability is both necessary and sufficient for consistency and generalization.
Shalev Shwartz, S., Shamir, O., Srebro, N., Sridharan, K., Learnability, Stability and Uniform Convergence, Journal of Machine Learning Research, 11(Oct):2635-2670, 2010.
in physics and engineering, as it is a property of the learning process rather than the type of information being learned. The study of stability gained importance in
For symmetric learning algorithms with bounded loss, if the algorithm has Uniform Stability with the probabilistic definition above, then the algorithm generalizes.
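As a numerical illustration (not a proof) of this kind of result, one can estimate an empirical uniform-stability constant β for a Tikhonov-regularized learner by retraining with each example removed and measuring the largest change in the loss. The ridge-regression setup, synthetic data, seed and regularization value below are all assumptions made for the sketch; the change is expected to shrink roughly like 1/m.

```python
import numpy as np

# Empirical estimate of a uniform-stability constant beta for ridge
# regression (a Tikhonov-regularized learner) under the square loss.
# Synthetic data, seed and lambda are assumptions for this sketch.

rng = np.random.default_rng(0)
LAM = 1.0
W_TRUE = np.array([1.0, -2.0, 0.5])

def make_data(m):
    X = rng.normal(size=(m, 3))
    y = X @ W_TRUE + 0.1 * rng.normal(size=m)
    return X, y

def ridge(X, y, lam=LAM):
    # Tikhonov-regularized least squares: (X^T X + lam I)^{-1} X^T y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def sup_loss_change(m):
    """Max over i (and over sample points z, as a proxy for the sup over Z)
    of |V(f_S, z) - V(f_{S^{|i}}, z)| for the square loss."""
    X, y = make_data(m)
    w_full = ridge(X, y)
    worst = 0.0
    for i in range(m):
        keep = np.arange(m) != i
        w_loo = ridge(X[keep], y[keep])
        delta = np.abs((X @ w_full - y) ** 2 - (X @ w_loo - y) ** 2)
        worst = max(worst, float(delta.max()))
    return worst

beta_50, beta_400 = sup_loss_change(50), sup_loss_change(400)
print(beta_50, beta_400)  # the estimate shrinks as the sample size grows
```

Evaluating the loss change only at the sample points underestimates the true supremum, but it is enough to see the qualitative behaviour: with more training data, removing any single example matters less.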
or perform accurately on new examples after being trained on a finite number of them. In the 1990s, milestones were reached in obtaining generalization bounds for supervised learning algorithms.

2016 - Moritz Hardt et al. proved stability of gradient descent under certain assumptions on the hypothesis and on the number of times each instance is used to update the model.

2004 - Poggio et al. proved a general relationship between stability and ERM consistency. They proposed a statistical form of leave-one-out stability, which they called CVEEEloo stability.

We assume L is symmetric with respect to S, i.e. it does not depend on the order of the elements in the training set. Furthermore, we assume that all functions are measurable and all sets are countable.
We define several terms related to learning algorithms and their training sets, so that we can then define stability in multiple ways and present theorems from the field.
(plus logarithmic factors) from the true error. The result was later extended to almost-ERM algorithms with function classes that do not have unique minimizers.

\forall i \in \{1,...,m\},\ \mathbb{P}_S\{|I - \tfrac{1}{m}\sum_{i=1}^{m} V(f_{S^{|i}}, z_i)| \leq \beta_{EL}^{m}\} \geq 1 - \delta_{EL}^{m}
Andre Elisseeff, Theodoros Evgeniou, Massimiliano Pontil, Stability of Randomized Learning Algorithms, Journal of Machine Learning Research 6, 55–79, 2010
2002 - Kutin and Niyogi extended Bousquet and Elisseeff's results by providing generalization bounds for several weaker forms of stability, which they called almost-everywhere stability.

Support Vector Machine (SVM) classification with a bounded kernel, where the regularizer is a norm in a Reproducing Kernel Hilbert Space. A large regularization constant C leads to good stability.
S. Kutin and P. Niyogi, Almost-everywhere algorithmic stability and generalization error, Technical Report TR-2002-03, University of Chicago (2002).
\forall S \in Z^m,\ \forall i \in \{1,...,m\},\ \mathbb{P}_S\{\sup_{z \in Z} |V(f_S, z) - V(f_{S^{|i}}, z)| \leq \beta\} \geq 1 - \delta
Poggio, T., Rifkin, R., Mukherjee, S. and Niyogi, P., "Learning Theory: general conditions for predictivity", Nature, Vol. 428, 419-422, 2004
1979 - Devroye and Wagner observed that the leave-one-out behavior of an algorithm is related to its sensitivity to small changes in the sample.
This is a list of algorithms that have been shown to be stable, and the article where the associated generalization bounds are provided.
Neither condition alone is sufficient for generalization. However, both together ensure generalization (while the converse is not true).
3721:
L. Devroye and Wagner, Distribution-free performance bounds for potential function rules, IEEE Trans. Inf. Theory 25(5) (1979) 601–604.
2953:{\displaystyle \forall i\in \{1,...,m\},\mathbb {P} _{S}\{|V(f_{S},z_{i})-V(f_{S^{|i}},z_{i})|\leq \beta _{CV}\}\geq 1-\delta _{CV}} 95:
properties of empirical quantities to their means. This technique was used to obtain generalization bounds for the large class of empirical risk minimization (ERM) algorithms.
Moritz Hardt, Benjamin Recht, Yoram Singer, Train faster, generalize better: Stability of stochastic gradient descent, ICML 2016.
Elisseeff, A. A study about algorithmic stability and their relation to generalization performances. Technical report. (2000)
S. Rakhlin, S. Mukherjee, and T. Poggio. Stability results in learning theory. Analysis and Applications, 3(4):397–419, 2005
Rifkin, R. Everything Old is New Again: A fresh look at historical approaches in machine learning. Ph.D. Thesis, MIT, 2002
M. Kearns and D. Ron, Algorithmic stability and sanity-check bounds for leave-one-out cross-validation, Neural Comput. 11(6) (1999) 1427–1453.
for ERM binary classification algorithms, is that for any target function and input distribution, any hypothesis space
All learning algorithms with Tikhonov regularization satisfy the Uniform Stability criterion and are, thus, generalizable.
S. Kutin and P. Niyogi. Almost-everywhere algorithmic stability and generalization error. In Proc. of UAI 18, 2002.
Given a training set S of size m, we will build, for all i = 1,...,m, modified training sets as follows:
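The two modified training sets used throughout the stability definitions can be sketched as plain list operations: S^{|i} removes the i-th example, and S^i replaces it with a fresh example z'. The placeholder examples below are assumptions made for illustration.

```python
# S^{|i}: the i-th example removed.  S^i: the i-th example replaced by z'.
# Placeholder examples are illustrative assumptions.

S = [(f"x{j}", f"y{j}") for j in range(1, 6)]  # z_1, ..., z_m with m = 5

def leave_one_out(S, i):
    """S^{|i}: S with the i-th example (1-indexed) removed."""
    return S[:i - 1] + S[i:]

def replace_one(S, i, z_new):
    """S^i: S with the i-th example (1-indexed) replaced by z_new."""
    return S[:i - 1] + [z_new] + S[i:]

print(leave_one_out(S, 3))               # m-1 examples, z_3 missing
print(replace_one(S, 3, ("x'", "y'")))   # m examples, z_3 swapped for z'
```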
96: 64: 48: 2696: 41: 3417: 3116: 1082: 3872: 3382: 3081: 3820: 3752:
O. Bousquet and A. Elisseeff. Stability and generalization. J. Mach. Learn. Res., 2:499–526, 2002.
has point-wise hypothesis stability β with respect to the loss function V if the following holds:

\forall i \in \{1,...,m\},\ \mathbb{E}_S[|V(f_{S^{|i}}, z_i) - V(f_S, z_i)|] \leq \beta.
training examples, the algorithm is consistent and will produce a training error that is at most O(\sqrt{d/n})
has hypothesis stability β with respect to the loss function V if the following holds:

\forall i \in \{1,...,m\},\ \mathbb{E}_{S,z}[|V(f_S, z) - V(f_{S^{|i}}, z)|] \leq \beta.
has uniform stability β with respect to the loss function V if the following holds:

\forall S \in Z^m,\ \forall i \in \{1,...,m\},\ \sup_{z \in Z} |V(f_S, z) - V(f_{S^{|i}}, z)| \leq \beta.
Leave-one-out cross-validation (CVloo) Stability and Expected-leave-one-out error (Eloo_err) Stability
has CVloo stability β with respect to the loss function V if the following holds:

\forall i \in \{1,...,m\},\ \mathbb{P}_S\{|V(f_S, z_i) - V(f_{S^{|i}}, z_i)| \leq \beta_{CV}\} \geq 1 - \delta_{CV}.
has error stability β with respect to the loss function V if the following holds:

\forall S \in Z^m,\ \forall i \in \{1,...,m\},\ |\mathbb{E}_z[V(f_S, z)] - \mathbb{E}_z[V(f_{S^{|i}}, z)]| \leq \beta.
For symmetric learning algorithms with bounded loss, if the algorithm has
738:{\displaystyle S=\{z_{1}=(x_{1},\ y_{1})\ ,..,\ z_{m}=(x_{m},\ y_{m})\}} 2023:{\displaystyle \forall i\in \ \{1,...,m\},\mathbb {E} _{S}\leq \beta .} 1819:{\displaystyle \forall i\in \{1,...,m\},\mathbb {E} _{S,z}\leq \beta .} 3843:
V.N. Vapnik. The Nature of Statistical Learning Theory. Springer, 1995
63:. It was shown that for large classes of learning algorithms, notably 362:- In a landmark paper, Bousquet and Elisseeff proposed the notion of 253: 1466:{\displaystyle S^{|i}=\{z_{1},...,\ z_{i-1},\ z_{i+1},...,\ z_{m}\}} 3731: 2732: 67:
algorithms, certain types of stability ensure good generalization.
47:
Stability can be studied for many types of learning problems, from
119:
in such a way as to minimize the empirical error on a training set
Vapnik, V., Statistical Learning Theory. Wiley, New York, 1998
606:
The training set from which an algorithm learns is defined as

S = \{z_1 = (x_1, y_1), ..., z_m = (x_m, y_m)\}
are in the same space of the training examples. The functions
3551:) Stability as defined above, then the algorithm generalizes. 411:
A machine learning algorithm, also known as a learning map
3617:
The minimum relative entropy algorithm for classification.
44:
with both the 1000-element and 999-element training sets.
583:
are selected from a hypothesis space of functions called
59:
in the 2000s when it was shown to have a connection with generalization.

Early 1900s - Stability in learning theory was earliest described in terms of continuity of the learning map L, traced to Andrey Nikolayevich Tikhonov.

1999 - Kearns and Ron discovered a connection between finite VC-dimension and stability.

2002 - In a landmark paper, Bousquet and Elisseeff proposed the notion of uniform hypothesis stability

Rosasco, L. and Poggio, T., Stability of Tikhonov Regularization, 2009.
1377:, 1372:1 1368:z 1364:{ 1361:= 1356:i 1352:| 1347:S 1317:) 1314:z 1311:, 1308:f 1305:( 1302:V 1297:z 1292:E 1287:= 1284:] 1281:f 1278:[ 1275:I 1255:f 1232:) 1227:i 1223:z 1219:, 1216:f 1213:( 1210:V 1202:n 1199:1 1194:= 1191:] 1188:f 1185:[ 1180:S 1176:I 1155:f 1132:) 1129:y 1126:, 1123:) 1120:x 1117:( 1114:f 1111:( 1108:V 1105:= 1102:) 1099:z 1096:, 1093:f 1090:( 1087:V 1067:) 1064:y 1061:, 1058:x 1055:( 1052:= 1049:z 1029:f 1009:V 986:S 966:L 946:Y 926:X 904:S 900:f 879:S 859:H 837:m 833:Z 812:L 787:Y 781:X 778:= 775:Z 755:m 733:} 730:) 725:m 721:y 714:, 709:m 705:x 701:( 698:= 693:m 689:z 682:, 679:. 676:. 673:, 667:) 662:1 658:y 651:, 646:1 642:x 638:( 635:= 630:1 626:z 622:{ 619:= 616:S 591:H 571:f 551:Y 531:X 511:Y 491:X 471:f 451:) 448:y 445:, 442:x 439:( 419:L 344:. 328:L 292:H 264:H 236:) 230:n 227:d 221:( 217:O 197:n 177:d 154:H 127:S 107:H
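The leave-one-out notion of stability can also be probed numerically. Below is a minimal sketch (not from the article; `fit_ridge`, `pointwise_loo_stability`, and the synthetic data are illustrative assumptions) that estimates the point-wise hypothesis stability of regularized least squares, one of the algorithms known to be stable, by comparing the squared loss at each training point before and after leaving that point out:

```python
import numpy as np

def fit_ridge(X, y, lam):
    """Closed-form ridge regression: w = (X'X + lam*m*I)^{-1} X'y."""
    m, d = X.shape
    return np.linalg.solve(X.T @ X + lam * m * np.eye(d), X.T @ y)

def pointwise_loo_stability(X, y, lam):
    """Empirical point-wise hypothesis stability: the mean over i of
    |V(f_S, z_i) - V(f_{S^{|i}}, z_i)| with the squared loss."""
    m = X.shape[0]
    w_full = fit_ridge(X, y, lam)                 # f_S, trained on all of S
    diffs = []
    for i in range(m):
        keep = np.arange(m) != i                  # S^{|i}: S minus example i
        w_loo = fit_ridge(X[keep], y[keep], lam)  # f_{S^{|i}}
        diffs.append(abs((X[i] @ w_full - y[i]) ** 2
                         - (X[i] @ w_loo - y[i]) ** 2))
    return float(np.mean(diffs))

rng = np.random.default_rng(0)
w_true = rng.normal(size=5)

def make_data(m):
    """Synthetic linear data with small Gaussian noise."""
    X = rng.normal(size=(m, 5))
    return X, X @ w_true + 0.1 * rng.normal(size=m)

# The empirical beta should shrink roughly like O(1/m) as m grows.
beta_50 = pointwise_loo_stability(*make_data(50), lam=0.1)
beta_500 = pointwise_loo_stability(*make_data(500), lam=0.1)
print(beta_50, beta_500)
```

The shrinking estimate as the sample grows matches the β = O(1/m) rate that characterizes a stable algorithm.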


References

- Bousquet, Olivier; Elisseeff, André. "Stability and Generalization". Journal of Machine Learning Research, 2 (2002). ISSN 1533-7928.
