Online machine learning

In computer science, online machine learning is a method of machine learning in which data becomes available in a sequential order and is used to update the best predictor for future data at each step, as opposed to batch learning techniques, which generate the best predictor by learning on the entire training data set at once. Online learning is a common technique in areas of machine learning where it is computationally infeasible to train over the entire dataset, requiring out-of-core algorithms. It is also used in situations where the algorithm must dynamically adapt to new patterns in the data, or when the data itself is generated as a function of time, e.g., stock price prediction. Online learning algorithms may be prone to catastrophic interference, a problem that can be addressed by incremental learning approaches.

Introduction

In the setting of supervised learning, a function $f : X \to Y$ is to be learned, where $X$ is thought of as a space of inputs and $Y$ as a space of outputs, that predicts well on instances drawn from a joint probability distribution $p(x,y)$ on $X \times Y$. In reality, the learner never knows the true distribution $p(x,y)$ over instances. Instead, the learner usually has access to a training set of examples $(x_1, y_1), \ldots, (x_n, y_n)$. In this setting, the loss function is given as $V : Y \times Y \to \mathbb{R}$, such that $V(f(x), y)$ measures the difference between the predicted value $f(x)$ and the true value $y$. The ideal goal is to select a function $f \in \mathcal{H}$, where $\mathcal{H}$ is a space of functions called a hypothesis space, so that some notion of total loss is minimized. Depending on the type of model (statistical or adversarial), one can devise different notions of loss, which lead to different learning algorithms.

Statistical view of online learning

In statistical learning models, the training samples $(x_i, y_i)$ are assumed to have been drawn from the true distribution $p(x,y)$, and the objective is to minimize the expected "risk"

$$I[f] = \mathbb{E}[V(f(x), y)] = \int V(f(x), y)\, dp(x, y)\ .$$

A common paradigm in this situation is to estimate a function $\hat{f}$ through empirical risk minimization or regularized empirical risk minimization (usually Tikhonov regularization). The choice of loss function here gives rise to several well-known learning algorithms such as regularized least squares and support vector machines. A purely online model in this category would learn based on just the new input $(x_{t+1}, y_{t+1})$, the current best predictor $f_t$, and some extra stored information (which is usually expected to have storage requirements independent of training data size). For many formulations, for example nonlinear kernel methods, true online learning is not possible, though a form of hybrid online learning with recursive algorithms can be used, where $f_{t+1}$ is permitted to depend on $f_t$ and all previous data points $(x_1, y_1), \ldots, (x_t, y_t)$. In this case, the space requirements are no longer guaranteed to be constant, since all previous data points must be stored, but the solution may take less time to compute with the addition of a new data point than with batch learning techniques.

A common strategy to overcome the above issues is to learn using mini-batches, which process a small batch of $b \geq 1$ data points at a time; this can be considered pseudo-online learning for $b$ much smaller than the total number of training points. Mini-batch techniques are used with repeated passes over the training data to obtain optimized out-of-core versions of machine learning algorithms, for example stochastic gradient descent. When combined with backpropagation, this is currently the de facto training method for training artificial neural networks.

Example: linear least squares

The simple example of linear least squares is used here to explain a variety of ideas in online learning. The ideas are general enough to be applied to other settings, for example to other convex loss functions.

Batch learning

Consider the setting of supervised learning with $f$ being a linear function to be learned:

$$f(x_j) = \langle w, x_j \rangle = w \cdot x_j$$

where $x_j \in \mathbb{R}^d$ is a vector of inputs (data points) and $w \in \mathbb{R}^d$ is a linear filter vector. The goal is to compute the filter vector $w$. To this end, a square loss function

$$V(f(x_j), y_j) = (f(x_j) - y_j)^2 = (\langle w, x_j \rangle - y_j)^2$$

is used to compute the vector $w$ that minimizes the empirical loss

$$I_n[w] = \sum_{j=1}^{n} V(\langle w, x_j \rangle, y_j) = \sum_{j=1}^{n} (x_j^\mathsf{T} w - y_j)^2 ,$$

where $y_j \in \mathbb{R}$.

Let $X$ be the $i \times d$ data matrix and $y \in \mathbb{R}^i$ the column vector of target values after the arrival of the first $i$ data points. Assuming that the covariance matrix $\Sigma_i = X^\mathsf{T} X$ is invertible (otherwise it is preferential to proceed in a similar fashion with Tikhonov regularization), the best solution $f^*(x) = \langle w^*, x \rangle$ to the linear least squares problem is given by

$$w^* = (X^\mathsf{T} X)^{-1} X^\mathsf{T} y = \Sigma_i^{-1} \sum_{j=1}^{i} x_j y_j .$$

Now, calculating the covariance matrix $\Sigma_i = \sum_{j=1}^{i} x_j x_j^\mathsf{T}$ takes time $O(id^2)$, inverting the $d \times d$ matrix takes time $O(d^3)$, and the remaining multiplication takes time $O(d^2)$, giving a total time of $O(id^2 + d^3)$. When there are $n$ total points in the dataset, recomputing the solution after the arrival of every datapoint $i = 1, \ldots, n$ with this naive approach has a total complexity of $O(n^2 d^2 + n d^3)$. Note that when the matrix $\Sigma_i$ is stored, updating it at each step only requires adding $x_{i+1} x_{i+1}^\mathsf{T}$, which takes $O(d^2)$ time, reducing the total time to $O(nd^2 + nd^3) = O(nd^3)$, but at the cost of an additional $O(d^2)$ of storage space for $\Sigma_i$.

Online learning: recursive least squares

The recursive least squares (RLS) algorithm considers an online approach to the least squares problem. It can be shown that by initialising $w_0 = 0 \in \mathbb{R}^d$ and $\Gamma_0 = I \in \mathbb{R}^{d \times d}$, the solution of the linear least squares problem given in the previous section can be computed by the following iteration:

$$\Gamma_i = \Gamma_{i-1} - \frac{\Gamma_{i-1} x_i x_i^\mathsf{T} \Gamma_{i-1}}{1 + x_i^\mathsf{T} \Gamma_{i-1} x_i}$$

$$w_i = w_{i-1} - \Gamma_i x_i \left( x_i^\mathsf{T} w_{i-1} - y_i \right)$$

The above iteration algorithm can be proved by induction on $i$. The proof also shows that $\Gamma_i = \Sigma_i^{-1}$. One can also look at RLS in the context of adaptive filters (see RLS).

The complexity for $n$ steps of this algorithm is $O(nd^2)$, which is an order of magnitude faster than the corresponding batch learning complexity. The storage requirement at every step $i$ is the matrix $\Gamma_i$, which is constant at $O(d^2)$. For the case when $\Sigma_i$ is not invertible, consider the regularised version of the problem loss function $\sum_{j=1}^{n} \left( x_j^\mathsf{T} w - y_j \right)^2 + \lambda \left\| w \right\|_2^2$. Then, it is easy to show that the same algorithm works with $\Gamma_0 = (I + \lambda I)^{-1}$, and the iterations proceed to give $\Gamma_i = (\Sigma_i + \lambda I)^{-1}$.
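A minimal NumPy sketch may help make the recursion concrete; the function name, array layout, and the optional regularisation parameter `lam` are illustrative choices, not part of the original presentation.

```python
import numpy as np

def recursive_least_squares(X, y, lam=0.0):
    """Sketch of the RLS iteration above: each step costs O(d^2).

    X: (n, d) array of inputs; y: (n,) array of targets.
    lam > 0 gives the Tikhonov-regularised variant with
    Gamma_0 = (I + lam*I)^{-1}; lam = 0 recovers Gamma_0 = I.
    """
    n, d = X.shape
    w = np.zeros(d)
    Gamma = np.eye(d) / (1.0 + lam)  # (I + lam*I)^{-1} is diagonal
    for i in range(n):
        x_i, y_i = X[i], y[i]
        Gx = Gamma @ x_i  # uses Gamma_{i-1}
        # Gamma_i = Gamma_{i-1} - (Gamma x x^T Gamma) / (1 + x^T Gamma x)
        Gamma -= np.outer(Gx, Gx) / (1.0 + x_i @ Gx)
        # w_i = w_{i-1} - Gamma_i x_i (x_i^T w_{i-1} - y_i)
        w -= (Gamma @ x_i) * (x_i @ w - y_i)
    return w
```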
Stochastic gradient descent

When the recursive least squares update

$$w_i = w_{i-1} - \Gamma_i x_i \left( x_i^\mathsf{T} w_{i-1} - y_i \right)$$

is replaced by

$$w_i = w_{i-1} - \gamma_i x_i \left( x_i^\mathsf{T} w_{i-1} - y_i \right) = w_{i-1} - \gamma_i \nabla V(\langle w_{i-1}, x_i \rangle, y_i)$$

(that is, $\Gamma_i \in \mathbb{R}^{d \times d}$ is replaced by a scalar step size $\gamma_i \in \mathbb{R}$), this becomes the stochastic gradient descent algorithm. In this case, the complexity for $n$ steps of this algorithm reduces to $O(nd)$. The storage requirement at every step $i$ is constant at $O(d)$.

However, the step size $\gamma_i$ needs to be chosen carefully to solve the expected risk minimization problem, as detailed above. By choosing a decaying step size $\gamma_i \approx \frac{1}{\sqrt{i}}$, one can prove the convergence of the average iterate $\overline{w}_n = \frac{1}{n} \sum_{i=1}^{n} w_i$. This setting is a special case of stochastic optimization, a well known problem in optimization.
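The following sketch, with illustrative names, applies this update to a stream of examples, using the decaying step size $\gamma_i \approx 1/\sqrt{i}$ and returning the averaged iterate discussed above.

```python
import numpy as np

def sgd_least_squares(stream, d):
    """Sketch of SGD for the square loss: O(d) work per example.

    `stream` yields (x_i, y_i) pairs one at a time; w_bar is the
    averaged iterate whose convergence the decaying step size
    guarantees.
    """
    w = np.zeros(d)
    w_bar = np.zeros(d)
    for i, (x_i, y_i) in enumerate(stream, start=1):
        gamma_i = 1.0 / np.sqrt(i)
        # w_i = w_{i-1} - gamma_i x_i (x_i^T w_{i-1} - y_i)
        w = w - gamma_i * x_i * (x_i @ w - y_i)
        w_bar += (w - w_bar) / i  # running average of the iterates
    return w_bar
```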
Incremental stochastic gradient descent

In practice, one can perform multiple stochastic gradient passes (also called cycles or epochs) over the data. The algorithm thus obtained is called the incremental gradient method and corresponds to the iteration

$$w_i = w_{i-1} - \gamma_i \nabla V(\langle w_{i-1}, x_{t_i} \rangle, y_{t_i})$$

The main difference with the stochastic gradient method is that here a sequence $t_i$ is chosen to decide which training point is visited in the $i$-th step. Such a sequence can be stochastic or deterministic. The number of iterations is then decoupled from the number of points (each point can be considered more than once). The incremental gradient method can be shown to provide a minimizer to the empirical risk. Incremental techniques can be advantageous when considering objective functions made up of a sum of many terms, e.g. an empirical error corresponding to a very large dataset.

Kernel methods

Kernels can be used to extend the above algorithms to non-parametric models (or models whose parameters form an infinite dimensional space). The corresponding procedure will no longer be truly online and instead involves storing all the data points, but it is still faster than the brute force method. This discussion is restricted to the case of the square loss, though it can be extended to any convex loss. It can be shown by an easy induction that if $X_i$ is the data matrix and $w_i$ is the output after $i$ steps of the SGD algorithm, then

$$w_i = X_i^\mathsf{T} c_i ,$$

where $c_i = ((c_i)_1, (c_i)_2, \ldots, (c_i)_i) \in \mathbb{R}^i$ and the sequence $c_i$ satisfies the recursion $c_0 = 0$,

$$(c_i)_j = (c_{i-1})_j, \quad j = 1, 2, \ldots, i-1$$

and

$$(c_i)_i = \gamma_i \Big( y_i - \sum_{j=1}^{i-1} (c_{i-1})_j \langle x_j, x_i \rangle \Big)$$

Notice that here $\langle x_j, x_i \rangle$ is just the standard kernel on $\mathbb{R}^d$, and the predictor is of the form

$$f_i(x) = \langle w_{i-1}, x \rangle = \sum_{j=1}^{i-1} (c_{i-1})_j \langle x_j, x \rangle .$$

Now, if a general kernel $K$ is introduced instead and the predictor is taken to be

$$f_i(x) = \sum_{j=1}^{i-1} (c_{i-1})_j K(x_j, x) ,$$

then the same proof will also show that the predictor minimising the least squares loss is obtained by changing the above recursion to

$$(c_i)_i = \gamma_i \Big( y_i - \sum_{j=1}^{i-1} (c_{i-1})_j K(x_j, x_i) \Big)$$

The above expression requires storing all the data for updating $c_i$. The total time complexity for the recursion when evaluating for the $n$-th datapoint is $O(n^2 dk)$, where $k$ is the cost of evaluating the kernel on a single pair of points. Thus, the use of the kernel has allowed the movement from a finite dimensional parameter space $w_i \in \mathbb{R}^d$ to a possibly infinite dimensional feature space represented by a kernel $K$, by instead performing the recursion on the space of parameters $c_i \in \mathbb{R}^i$, whose dimension is the same as the size of the training dataset. In general, this is a consequence of the representer theorem.
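A sketch of the kernelised recursion, assuming a user-supplied `kernel` function; the names are illustrative. Each new point appends one coefficient, and prediction sums kernel evaluations against all stored points, which is why every data point must be kept.

```python
import numpy as np

def online_kernel_least_squares(X, y, kernel):
    """Sketch of the kernel SGD recursion for the square loss.

    The coefficient vector grows by one entry per example,
    exactly as the recursion in the text requires.
    """
    coeffs, stored = [], []

    def predict(x):
        # f_i(x) = sum_j (c_{i-1})_j K(x_j, x)
        return sum(c_j * kernel(x_j, x) for c_j, x_j in zip(coeffs, stored))

    for i, (x_i, y_i) in enumerate(zip(X, y), start=1):
        gamma_i = 1.0 / np.sqrt(i)
        # (c_i)_i = gamma_i * (y_i - sum_j (c_{i-1})_j K(x_j, x_i))
        coeffs.append(gamma_i * (y_i - predict(x_i)))
        stored.append(x_i)
    return stored, coeffs

# Example kernel: a Gaussian kernel on R^d
gaussian = lambda a, b: np.exp(-np.sum((a - b) ** 2))
```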
Online convex optimization

Online convex optimization (OCO) is a general framework for decision making which leverages convex optimization to allow for efficient algorithms. The framework is that of repeated game playing, as follows:

For $t = 1, 2, \ldots, T$:
1. The learner receives input $x_t$.
2. The learner outputs $w_t$ from a fixed convex set $S$.
3. Nature sends back a convex loss function $v_t : S \to \mathbb{R}$.
4. The learner suffers loss $v_t(w_t)$ and updates its model.

The goal is to minimize regret, or the difference between the cumulative loss and the loss of the best fixed point $u \in S$ in hindsight. As an example, consider the case of online least squares linear regression. Here, the weight vectors come from the convex set $S = \mathbb{R}^d$, and nature sends back the convex loss function $v_t(w) = (\langle w, x_t \rangle - y_t)^2$. Note here that $y_t$ is implicitly sent with $v_t$.

Some online prediction problems, however, cannot fit in the framework of OCO. For example, in online classification, the prediction domain and the loss functions are not convex. In such scenarios, two simple techniques for convexification are used: randomisation and surrogate loss functions.

Some simple online convex optimisation algorithms are described below.

Follow the leader (FTL)

The simplest learning rule to try is to select (at the current step) the hypothesis that has the least loss over all past rounds. This algorithm is called Follow the leader, and round $t$ is simply given by:

$$w_t = \operatorname{arg\,min}_{w \in S} \sum_{i=1}^{t-1} v_i(w)$$

This method can thus be looked at as a greedy algorithm. For the case of online quadratic optimization (where the loss function is $v_t(w) = \left\| w - x_t \right\|_2^2$), one can show a regret bound that grows as $\log(T)$. However, similar bounds cannot be obtained for the FTL algorithm for other important families of models like online linear optimization. To do so, one modifies FTL by adding regularisation.
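For the online quadratic case, the leader has a closed form: minimising $\sum_{i<t} \|w - x_i\|_2^2$ gives the mean of the points seen so far. The sketch below exploits this; the names and the convention $w_1 = 0$ (the empty sum places no constraint) are illustrative assumptions.

```python
import numpy as np

def ftl_online_quadratic(points):
    """Sketch of FTL for v_t(w) = ||w - x_t||^2 over S = R^d.

    The leader arg min_w sum_{i<t} v_i(w) is the running mean
    of x_1, ..., x_{t-1}; before any data arrives we play 0.
    """
    total = None
    plays = []
    for t, x_t in enumerate(points, start=1):
        if total is None:
            total = np.zeros_like(x_t, dtype=float)
        plays.append(total / max(t - 1, 1))  # w_t = mean of past points
        total = total + x_t
    return plays
```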
Follow the regularised leader (FTRL)

This is a natural modification of FTL that is used to stabilise the FTL solutions and obtain better regret bounds. A regularisation function $R : S \to \mathbb{R}$ is chosen, and learning is performed in round $t$ as follows:

$$w_t = \operatorname{arg\,min}_{w \in S} \sum_{i=1}^{t-1} v_i(w) + R(w)$$

As a special example, consider the case of online linear optimisation, i.e. where nature sends back loss functions of the form $v_t(w) = \langle w, z_t \rangle$. Also, let $S = \mathbb{R}^d$. Suppose the regularisation function $R(w) = \frac{1}{2\eta} \left\| w \right\|_2^2$ is chosen for some positive number $\eta$. Then, one can show that the regret minimising iteration becomes

$$w_{t+1} = -\eta \sum_{i=1}^{t} z_i = w_t - \eta z_t$$

Note that this can be rewritten as $w_{t+1} = w_t - \eta \nabla v_t(w_t)$, which looks exactly like online gradient descent.

If $S$ is instead some convex subspace of $\mathbb{R}^d$, $S$ would need to be projected onto, leading to the modified update rule

$$w_{t+1} = \Pi_S \Big( -\eta \sum_{i=1}^{t} z_i \Big) = \Pi_S(\eta \theta_{t+1})$$

with $\theta_{t+1} = -\sum_{i=1}^{t} z_i$. This algorithm is known as lazy projection, as the vector $\theta_{t+1}$ accumulates the gradients. It is also known as Nesterov's dual averaging algorithm. In this scenario of linear loss functions and quadratic regularisation, the regret is bounded by $O(\sqrt{T})$, and thus the average regret goes to $0$, as desired.
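A sketch of lazy projection, with `project` standing in for $\Pi_S$ (an assumed callable; any Euclidean projection onto $S$ would do). For $S = \mathbb{R}^d$, `project` is the identity and the iterates coincide with online gradient descent.

```python
import numpy as np

def lazy_projection(grads, eta, project):
    """Sketch of the lazily projected gradient iteration for
    linear losses v_t(w) = <w, z_t>.

    theta accumulates the negated gradients; each play is
    w_{t+1} = Pi_S(eta * theta_{t+1}).
    """
    theta = None
    plays = []
    for z_t in grads:
        theta = -z_t if theta is None else theta - z_t
        plays.append(project(eta * theta))
    return plays

# For S the unit Euclidean ball, a valid projection is:
def project_unit_ball(w):
    norm = np.linalg.norm(w)
    return w if norm <= 1.0 else w / norm
```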
Online subgradient descent (OSD)

The above proved a regret bound for linear loss functions $v_t(w) = \langle w, z_t \rangle$. To generalise the algorithm to any convex loss function, the subgradient $\partial v_t(w_t)$ of $v_t$ is used as a linear approximation to $v_t$ near $w_t$, leading to the online subgradient descent algorithm:

Initialise the parameter $\eta$ and $w_1 = 0$.
For $t = 1, 2, \ldots, T$:
1. Predict using $w_t$; receive $f_t$ from nature.
2. Choose $z_t \in \partial v_t(w_t)$.
3. If $S = \mathbb{R}^d$, update as $w_{t+1} = w_t - \eta z_t$.
4. If $S \subset \mathbb{R}^d$, project the cumulative gradients onto $S$, i.e. $w_{t+1} = \Pi_S(\eta \theta_{t+1})$ with $\theta_{t+1} = \theta_t - z_t$.

One can use the OSD algorithm to derive $O(\sqrt{T})$ regret bounds for the online version of SVMs for classification, which use the hinge loss $v_t(w) = \max\{0, 1 - y_t(w \cdot x_t)\}$.
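A sketch of OSD with the hinge loss over $S = \mathbb{R}^d$, i.e. an online, unregularised SVM update; the names and the fixed step size are illustrative.

```python
import numpy as np

def online_svm_osd(stream, d, eta=0.1):
    """Sketch of online subgradient descent with the hinge loss
    v_t(w) = max(0, 1 - y_t * <w, x_t>), labels y_t in {-1, +1}.
    """
    w = np.zeros(d)
    for x_t, y_t in stream:
        if y_t * (w @ x_t) < 1.0:
            z_t = -y_t * x_t       # a subgradient of v_t at w_t
            w = w - eta * z_t      # w_{t+1} = w_t - eta * z_t
        # otherwise the loss is zero and 0 is a valid subgradient
    return w
```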
Other algorithms

Quadratically regularised FTRL algorithms lead to lazily projected gradient algorithms, as described above. To use the above for arbitrary convex functions and regularisers, one uses online mirror descent. The optimal regularization in hindsight can be derived for linear loss functions; this leads to the AdaGrad algorithm. For the Euclidean regularisation, one can show a regret bound of $O(\sqrt{T})$, which can be improved further to $O(\log T)$ for strongly convex and exp-concave loss functions.

Continual learning

Continual learning means constantly improving the learned model by processing continuous streams of information. Continual learning capabilities are essential for software systems and autonomous agents interacting in an ever changing real world. However, continual learning is a challenge for machine learning and neural network models, since the continual acquisition of incrementally available information from non-stationary data distributions generally leads to catastrophic forgetting.

Interpretations of online learning

The paradigm of online learning has different interpretations depending on the choice of the learning model, each of which has distinct implications about the predictive quality of the sequence of functions $f_1, f_2, \ldots, f_n$. The prototypical stochastic gradient descent algorithm is used for this discussion. As noted above, its recursion is given by

$$w_t = w_{t-1} - \gamma_t \nabla V(\langle w_{t-1}, x_t \rangle, y_t)$$

The first interpretation considers the stochastic gradient descent method as applied to the problem of minimizing the expected risk $I[w]$ defined above. Indeed, in the case of an infinite stream of data, since the examples $(x_1, y_1), (x_2, y_2), \ldots$ are assumed to be drawn i.i.d. from the distribution $p(x,y)$, the gradients of $V(\cdot, \cdot)$ in the above iteration are an i.i.d. sample of stochastic estimates of the gradient of the expected risk $I[w]$, and therefore one can apply complexity results for the stochastic gradient descent method to bound the deviation $I[w_t] - I[w^*]$, where $w^*$ is the minimizer of $I[w]$. This interpretation is also valid in the case of a finite training set; although with multiple passes through the data the gradients are no longer independent, complexity results can still be obtained in special cases.

The second interpretation applies to the case of a finite training set and considers the SGD algorithm as an instance of the incremental gradient descent method. In this case, one instead looks at the empirical risk:

$$I_n[w] = \frac{1}{n} \sum_{i=1}^{n} V(\langle w, x_i \rangle, y_i)\ .$$

Since the gradients of $V(\cdot, \cdot)$ in the incremental gradient descent iterations are also stochastic estimates of the gradient of $I_n[w]$, this interpretation is also related to the stochastic gradient descent method, but applied to minimizing the empirical risk as opposed to the expected risk. Since this interpretation concerns the empirical risk and not the expected risk, multiple passes through the data are readily allowed and actually lead to tighter bounds on the deviations $I_n[w_t] - I_n[w_n^*]$, where $w_n^*$ is the minimizer of $I_n[w]$.
Implementations

- Vowpal Wabbit: Open-source, fast, out-of-core online learning system, notable for supporting a number of machine learning reductions, importance weighting, and a selection of different loss functions and optimisation algorithms. It uses the hashing trick for bounding the size of the set of features independent of the amount of training data.
- scikit-learn: Provides out-of-core implementations of algorithms for (a minimal usage sketch follows this list):
  - Classification: Perceptron, SGD classifier, Naive bayes classifier.
  - Regression: SGD Regressor, Passive Aggressive regressor.
  - Clustering: Mini-batch k-means.
  - Feature extraction: Mini-batch dictionary learning, Incremental PCA.
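In scikit-learn, the out-of-core estimators listed above expose online updating via `partial_fit`, which updates the model one mini-batch at a time. A minimal sketch, with synthetic data standing in for a real stream:

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
clf = SGDClassifier(loss="hinge")   # linear SVM trained online
classes = np.array([0, 1])          # all labels must be declared up front

for _ in range(100):                # e.g. 100 mini-batches of 32 points
    X = rng.normal(size=(32, 5))
    y = (X[:, 0] > 0).astype(int)   # synthetic stand-in for streamed labels
    clf.partial_fit(X, y, classes=classes)
```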
10050:
10046:
10044:
10041:
10040:
10023:
10018:
10012:
10009:
10008:
9988:
9983:
9970:
9966:
9954:
9950:
9941:
9937:
9935:
9932:
9931:
9905:
9901:
9899:
9896:
9895:
9864:
9861:
9860:
9834:
9830:
9818:
9814:
9793:
9782:
9768:
9750:
9746:
9744:
9741:
9740:
9711:
9708:
9707:
9690:
9686:
9684:
9681:
9680:
9660:
9656:
9638:
9634:
9626:
9623:
9622:
9597:
9594:
9593:
9562:
9559:
9558:
9527:
9524:
9523:
9497:
9493:
9484:
9480:
9465:
9461:
9452:
9448:
9443:
9440:
9439:
9414:
9411:
9410:
9384:
9380:
9368:
9364:
9349:
9345:
9327:
9323:
9308:
9304:
9295:
9291:
9289:
9286:
9285:
9268:
9264:
9249:
9245:
9236:
9232:
9230:
9227:
9226:
9222:
9206:
9170:
9167:
9166:
9145:
9137:
9134:
9133:
9122:
9095:
9091:
9076:
9072:
9036:
9032:
9030:
9027:
9026:
8998:
8990:
8987:
8986:
8966:
8962:
8953:
8949:
8934:
8930:
8912:
8908:
8896:
8892:
8877:
8873:
8871:
8868:
8867:
8851:
8848:
8847:
8830:
8825:
8824:
8816:
8813:
8812:
8793:
8789:
8777:
8773:
8758:
8754:
8752:
8749:
8748:
8731:
8726:
8725:
8717:
8714:
8713:
8691:
8687:
8678:
8674:
8662:
8658:
8656:
8653:
8652:
8632:
8628:
8626:
8623:
8622:
8605:
8601:
8599:
8596:
8595:
8546:
8543:
8542:
8517:
8513:
8505:
8502:
8501:
8481:
8477:
8475:
8472:
8471:
8454:
8450:
8448:
8445:
8444:
8427:
8423:
8421:
8418:
8417:
8397:
8393:
8384:
8380:
8375:
8372:
8371:
8348:
8344:
8317:
8313:
8311:
8308:
8307:
8304:
8298:
8289:
8268:
8260:
8257:
8256:
8233:
8229:
8227:
8224:
8223:
8197:
8193:
8181:
8177:
8165:
8161:
8155:
8144:
8125:
8121:
8106:
8102:
8100:
8097:
8096:
8092:
8075:
8070:
8069:
8067:
8064:
8063:
8059:
8036:
8032:
8023:
8019:
8004:
8000:
7985:
7981:
7979:
7976:
7975:
7958:
7954:
7942:
7938:
7929:
7925:
7919:
7908:
7883:
7879:
7877:
7874:
7873:
7857:
7854:
7853:
7836:
7831:
7820:
7809:
7804:
7787:
7784:
7783:
7766:
7761:
7760:
7752:
7749:
7748:
7728:
7724:
7697:
7693:
7691:
7688:
7687:
7646:
7642:
7630:
7619:
7600:
7578:
7577:
7567:
7563:
7561:
7558:
7557:
7553:
7537:
7523:
7520:
7519:
7516:
7483:
7480:
7479:
7462:
7457:
7446:
7442:
7435:
7431:
7412:
7408:
7406:
7403:
7402:
7372:
7368:
7356:
7345:
7326:
7304:
7303:
7293:
7289:
7287:
7284:
7283:
7267:
7264:
7263:
7260:
7245:convexification
7223:
7219:
7217:
7214:
7213:
7196:
7192:
7190:
7187:
7186:
7169:
7165:
7159:
7155:
7143:
7139:
7109:
7105:
7103:
7100:
7099:
7082:
7077:
7076:
7068:
7065:
7064:
7042:
7039:
7038:
7010:
7006:
6997:
6993:
6991:
6988:
6987:
6968:
6953:
6949:
6947:
6944:
6943:
6925:
6922:
6921:
6904:
6900:
6898:
6895:
6894:
6875:
6871:
6869:
6866:
6865:
6816:
6813:
6812:
6802:
6775:
6770:
6769:
6760:
6756:
6753:
6750:
6749:
6733:
6730:
6729:
6711:
6706:
6705:
6696:
6692:
6689:
6686:
6685:
6669:
6666:
6665:
6639:
6635:
6627:
6624:
6623:
6607:
6604:
6603:
6586:
6582:
6580:
6577:
6576:
6559:
6558:
6549:
6545:
6536:
6532:
6520:
6516:
6504:
6500:
6485:
6474:
6461:
6457:
6451:
6450:
6444:
6440:
6431:
6427:
6421:
6417:
6412:
6409:
6408:
6382:
6378:
6366:
6362:
6350:
6346:
6331:
6320:
6298:
6294:
6292:
6289:
6288:
6272:
6269:
6268:
6266:
6236:
6232:
6223:
6219:
6207:
6203:
6188:
6177:
6149:
6145:
6124:
6120:
6118:
6115:
6114:
6097:
6092:
6091:
6089:
6086:
6085:
6065:
6061:
6052:
6048:
6043:
6040:
6039:
6022:
6021:
6012:
6008:
5999:
5995:
5986:
5982:
5970:
5966:
5951:
5940:
5927:
5923:
5917:
5916:
5910:
5906:
5897:
5893:
5887:
5883:
5878:
5875:
5874:
5815:
5811:
5799:
5795:
5783:
5779:
5773:
5769:
5764:
5761:
5760:
5738:
5734:
5732:
5729:
5728:
5711:
5707:
5705:
5702:
5701:
5684:
5679:
5678:
5666:
5662:
5656:
5652:
5628:
5624:
5618:
5614:
5602:
5598:
5592:
5588:
5573:
5569:
5567:
5564:
5563:
5546:
5542:
5535:
5534:
5529:
5516:
5512:
5510:
5507:
5506:
5486:
5483:
5482:
5465:
5461:
5459:
5456:
5455:
5438:
5434:
5432:
5429:
5428:
5424:
5418:
5397:
5394:
5393:
5376:
5372:
5370:
5367:
5366:
5344:
5340:
5339:
5335:
5321:
5317:
5316:
5312:
5297:
5293:
5275:
5271:
5256:
5252:
5243:
5239:
5237:
5234:
5233:
5229:
5203:
5199:
5193:
5182:
5168:
5159:
5149:
5148:
5146:
5143:
5142:
5116:
5107:
5103:
5101:
5098:
5097:
5080:
5076:
5074:
5071:
5070:
5042:
5039:
5038:
5022:
5019:
5018:
4990:
4987:
4986:
4970:
4967:
4966:
4950:
4941:
4937:
4935:
4932:
4931:
4908:
4903:
4902:
4893:
4889:
4887:
4884:
4883:
4863:
4859:
4847:
4843:
4828:
4824:
4806:
4802:
4787:
4783:
4769:
4765:
4750:
4746:
4739:
4738:
4733:
4728:
4724:
4718:
4714:
4708:
4704:
4689:
4685:
4676:
4672:
4670:
4667:
4666:
4665:is replaced by
4644:
4640:
4625:
4621:
4614:
4613:
4608:
4603:
4599:
4593:
4589:
4583:
4579:
4564:
4560:
4551:
4547:
4545:
4542:
4541:
4538:
4532:
4507:
4503:
4488:
4484:
4472:
4468:
4466:
4463:
4462:
4442:
4438:
4414:
4410:
4408:
4405:
4404:
4387:
4382:
4371:
4358:
4347:
4343:
4330:
4329:
4324:
4319:
4315:
4314:
4308:
4297:
4291:
4288:
4287:
4270:
4266:
4264:
4261:
4260:
4240:
4236:
4228:
4225:
4224:
4207:
4203:
4201:
4198:
4197:
4181:
4178:
4177:
4157:
4153:
4142:
4139:
4138:
4122:
4119:
4118:
4091:
4086:
4073:
4069:
4067:
4064:
4063:
4047:
4044:
4043:
4021:
4017:
4002:
3998:
3991:
3990:
3985:
3980:
3976:
3970:
3966:
3960:
3956:
3941:
3937:
3928:
3924:
3922:
3919:
3918:
3899:
3895:
3883:
3879:
3872:
3871:
3866:
3855:
3842:
3838:
3831:
3830:
3825:
3815:
3811:
3799:
3795:
3794:
3792:
3777:
3773:
3764:
3760:
3758:
3755:
3754:
3730:
3725:
3724:
3709:
3705:
3702:
3699:
3698:
3680:
3675:
3674:
3659:
3655:
3652:
3649:
3648:
3645:
3623:
3619:
3617:
3614:
3613:
3593:
3589:
3581:
3578:
3577:
3557:
3553:
3532:
3528:
3516:
3512:
3501:
3498:
3497:
3477:
3473:
3465:
3462:
3461:
3443:
3442:
3431:
3415:
3411:
3409:
3406:
3405:
3388:
3384:
3382:
3379:
3378:
3358:
3354:
3342:
3338:
3332:
3328:
3320:
3317:
3316:
3282:
3279:
3278:
3262:
3259:
3258:
3238:
3234:
3225:
3221:
3210:
3207:
3206:
3186:
3182:
3174:
3171:
3170:
3150:
3146:
3138:
3135:
3134:
3112:
3109:
3108:
3088:
3084:
3073:
3070:
3069:
3051:
3050:
3045:
3035:
3031:
3025:
3014:
3001:
2997:
2995:
2992:
2991:
2969:
2965:
2959:
2955:
2949:
2938:
2925:
2920:
2903:
2902:
2898:
2889:
2885:
2875:
2874:
2870:
2858:
2854:
2852:
2849:
2848:
2822:
2818:
2797:
2793:
2791:
2788:
2787:
2766:
2765:
2761:
2752:
2748:
2746:
2743:
2742:
2726:
2723:
2722:
2705:
2700:
2699:
2691:
2688:
2687:
2665:
2662:
2661:
2645:
2642:
2641:
2620:
2611:
2607:
2605:
2602:
2601:
2584:
2580:
2574:
2570:
2557:
2556:
2551:
2538:
2527:
2511:
2507:
2495:
2491:
2470:
2459:
2437:
2433:
2431:
2428:
2427:
2411:
2408:
2407:
2390:
2386:
2380:
2376:
2364:
2360:
2339:
2335:
2329:
2325:
2313:
2309:
2288:
2284:
2272:
2268:
2254:
2251:
2250:
2234:
2231:
2230:
2213:
2208:
2207:
2199:
2196:
2195:
2178:
2173:
2172:
2163:
2159:
2157:
2154:
2153:
2136:
2132:
2114:
2110:
2089:
2085:
2077:
2074:
2073:
2057:
2054:
2053:
2050:
2041:
2035:
2023:backpropagation
1998:
1995:
1994:
1972:
1969:
1968:
1945:
1941:
1932:
1928:
1907:
1903:
1894:
1890:
1885:
1882:
1881:
1864:
1860:
1858:
1855:
1854:
1831:
1827:
1825:
1822:
1821:
1800:
1796:
1794:
1791:
1790:
1764:
1760:
1745:
1741:
1736:
1733:
1732:
1694:
1693:
1691:
1688:
1687:
1577:
1560:
1557:
1556:
1525:
1522:
1521:
1501:
1497:
1488:
1484:
1479:
1476:
1475:
1472:
1449:
1448:
1446:
1443:
1442:
1425:
1424:
1416:
1413:
1412:
1396:
1393:
1392:
1367:
1364:
1363:
1323:
1320:
1319:
1303:
1283:
1280:
1279:
1255:
1251:
1242:
1238:
1217:
1213:
1204:
1200:
1195:
1192:
1191:
1156:
1153:
1152:
1130:
1127:
1126:
1095:
1092:
1091:
1072:
1069:
1068:
1052:
1049:
1048:
1020:
1017:
1016:
1009:
978:is a method of
964:
935:
934:
908:
900:
899:
860:
852:
851:
812:Kernel machines
807:
799:
798:
774:
766:
765:
746:Active learning
741:
733:
732:
701:
691:
690:
616:Diffusion model
552:
542:
541:
514:
504:
503:
477:
467:
466:
422:Factor analysis
417:
407:
406:
390:
353:
343:
342:
263:
262:
246:
245:
244:
233:
232:
138:
130:
129:
95:Online learning
60:
48:
35:
32:
23:
22:
15:
12:
11:
5:
Example: linear least squares

Consider the setting of supervised learning with f a linear function to be learned:

    f(x_j) = \langle w, x_j \rangle = w \cdot x_j

where x_j \in R^d is a data point and w \in R^d is the weight vector to be computed. The square loss

    V(f(x_j), y_j) = (f(x_j) - y_j)^2 = (\langle w, x_j \rangle - y_j)^2

is used, and w is chosen to minimise the empirical loss

    I_n[w] = \sum_{j=1}^{n} V(\langle w, x_j \rangle, y_j) = \sum_{j=1}^{n} (x_j^T w - y_j)^2,

where y_j \in R.
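As a concrete rendering of this objective, the sketch below evaluates I_n[w] on synthetic data; NumPy is assumed, and the sizes and noise level are arbitrary illustrative choices.

    import numpy as np

    # Empirical least-squares risk I_n[w] = sum_j (x_j^T w - y_j)^2 on
    # synthetic data (n, d and the noise scale are arbitrary choices).
    rng = np.random.default_rng(0)
    n, d = 100, 5
    X = rng.normal(size=(n, d))                            # rows are the x_j
    y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

    def empirical_risk(w):
        residuals = X @ w - y
        return float(residuals @ residuals)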
Batch learning

Consider the setting of batch learning, where all the data is available in advance and the least squares solution is recomputed after the arrival of every data point. Let X \in R^{i \times d} be the matrix whose rows are the first i data points, and let y \in R^i be the vector of the first i target values. Assuming that the covariance matrix \Sigma_i = X^T X is invertible (otherwise one proceeds with Tikhonov regularization, as below), the best solution f^*(x) = \langle w^*, x \rangle to the linear least squares problem is given by

    w^* = (X^T X)^{-1} X^T y = \Sigma_i^{-1} \sum_{j=1}^{i} x_j y_j.

Calculating the covariance matrix \Sigma_i = \sum_{j=1}^{i} x_j x_j^T takes time O(i d^2), inverting the d × d matrix takes time O(d^3), and the remaining multiplications take time O(d^2), giving a total time of O(i d^2 + d^3). With n total points in the dataset, recomputing the solution after the arrival of every data point i = 1, ..., n gives the naive approach a total complexity of O(n^2 d^2 + n d^3). If the matrix \Sigma_i is stored, updating it at each step requires only adding x_{i+1} x_{i+1}^T, which takes O(d^2) time; this reduces the total time to O(n d^2 + n d^3) = O(n d^3), at the cost of an additional O(d^2) of storage to store \Sigma_i.
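The saving from storing \Sigma_i can be seen in the following sketch, which recomputes the least squares solution after every arrival while updating the covariance matrix incrementally; NumPy is assumed, and the small ridge term is an added assumption purely so the early, possibly singular, systems remain solvable.

    import numpy as np

    # Batch least squares with an incrementally updated covariance matrix:
    # after each new point, Sigma_i is updated in O(d^2) instead of being
    # recomputed from scratch; each solve still costs O(d^3).
    def batch_with_running_covariance(X, y, ridge=1e-8):  # ridge: numerical safeguard (assumption)
        n, d = X.shape
        Sigma = np.zeros((d, d))           # Sigma_i = sum_j x_j x_j^T
        b = np.zeros(d)                    # sum_j x_j y_j
        solutions = []
        for i in range(n):
            Sigma += np.outer(X[i], X[i])  # O(d^2) update
            b += y[i] * X[i]
            solutions.append(np.linalg.solve(Sigma + ridge * np.eye(d), b))
        return solutions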
Online learning: recursive least squares

The recursive least squares (RLS) algorithm considers an online approach to the least squares problem. It can be shown that by initialising w_0 = 0 \in R^d and \Gamma_0 = I \in R^{d \times d}, the solution of the linear least squares problem of the previous section can be computed by the iteration

    \Gamma_i = \Gamma_{i-1} - \frac{\Gamma_{i-1} x_i x_i^T \Gamma_{i-1}}{1 + x_i^T \Gamma_{i-1} x_i}

    w_i = w_{i-1} - \Gamma_i x_i (x_i^T w_{i-1} - y_i).

The iteration can be proved by induction on i; the proof also shows that \Gamma_i = \Sigma_i^{-1}. RLS can also be viewed in the context of adaptive filters.

The complexity for n steps of this algorithm is O(n d^2), an order of magnitude faster than the corresponding batch learning complexity. The storage requirement at every step i is the matrix \Gamma_i, which is constant at O(d^2). For the case when \Sigma_i is not invertible, consider the regularised version of the problem, with loss function \sum_{j=1}^{n} (x_j^T w - y_j)^2 + \lambda \|w\|_2^2. Then it is easy to show that the same algorithm works with \Gamma_0 = (I + \lambda I)^{-1}, and the iterations proceed to give \Gamma_i = (\Sigma_i + \lambda I)^{-1}.
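A minimal sketch of the RLS recursion, assuming NumPy; lam is the regularisation parameter λ, with lam = 0 recovering the unregularised iteration.

    import numpy as np

    # Recursive least squares, following the iteration above.
    def rls(X, y, lam=0.0):
        n, d = X.shape
        w = np.zeros(d)
        Gamma = np.eye(d) / (1.0 + lam)                 # Gamma_0 = (I + lam*I)^{-1}
        for i in range(n):
            x = X[i]
            Gx = Gamma @ x
            Gamma -= np.outer(Gx, Gx) / (1.0 + x @ Gx)  # O(d^2) update of Gamma_i
            w -= Gamma @ x * (x @ w - y[i])             # w_i update
        # After step i, Gamma = (Sigma_i + lam*I)^{-1} and w is the
        # (regularised) least squares solution on the points seen so far.
        return w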
Stochastic gradient descent

Main article: Stochastic gradient descent

When the recursive least squares step is replaced by

    w_i = w_{i-1} - \gamma_i x_i (x_i^T w_{i-1} - y_i) = w_{i-1} - \gamma_i \nabla V(\langle w_{i-1}, x_i \rangle, y_i),

i.e. the matrix \Gamma_i \in R^{d \times d} is replaced by a step size \gamma_i \in R, this becomes the stochastic gradient descent algorithm. In this case, the complexity for n steps reduces to O(n d), and the storage requirement at every step i is constant at O(d).

However, the step size \gamma_i needs to be chosen carefully to solve the expected risk minimization problem, as detailed in the section on interpretations below. By choosing a decaying step size \gamma_i \approx \frac{1}{\sqrt{i}}, one can prove the convergence of the average iterate \bar{w}_n = \frac{1}{n} \sum_{i=1}^{n} w_i. This setting is a special case of stochastic optimization, a well-known problem in optimization.
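A sketch of this update, assuming NumPy; the step-size constant c is an arbitrary illustrative choice, and the averaged iterate is returned.

    import numpy as np

    # SGD for online least squares with decaying steps gamma_i = c/sqrt(i).
    def sgd_least_squares(X, y, c=0.1):
        n, d = X.shape
        w = np.zeros(d)
        w_avg = np.zeros(d)
        for i in range(1, n + 1):
            x, target = X[i - 1], y[i - 1]
            gamma = c / np.sqrt(i)
            w = w - gamma * x * (x @ w - target)  # O(d) per step
            w_avg += (w - w_avg) / i              # running average of the iterates
        return w_avg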
Incremental stochastic gradient descent

The incremental gradient iteration is

    w_i = w_{i-1} - \gamma_i \nabla V(\langle w_{i-1}, x_{t_i} \rangle, y_{t_i}).
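A sketch of the incremental method on a finite training set, assuming NumPy; here the visiting sequence t_i is the stochastic choice of a fresh random shuffle in each epoch, and the epoch count and step constant are illustrative assumptions.

    import numpy as np

    # Incremental gradient method: repeated passes (epochs) over a fixed
    # training set, with t_i given by a fresh shuffle in each epoch.
    def incremental_gradient(X, y, epochs=10, c=0.1, seed=0):
        n, d = X.shape
        w = np.zeros(d)
        rng = np.random.default_rng(seed)
        i = 0
        for _ in range(epochs):
            for t in rng.permutation(n):  # the sequence t_i for this epoch
                i += 1
                gamma = c / np.sqrt(i)
                w = w - gamma * X[t] * (X[t] @ w - y[t])
        return w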
Kernel methods

See also: Kernel method

This discussion is restricted to the case of the square loss, though it can be extended to any convex loss. Consider kernelised least squares SGD. If X_i denotes the data matrix after i steps and w_i the output of the SGD algorithm after i steps, then

    w_i = X_i^T c_i,

where c_i = ((c_i)_1, (c_i)_2, ..., (c_i)_i) \in R^i, and the sequence c_i satisfies the recursion c_0 = 0,

    (c_i)_j = (c_{i-1})_j, \quad j = 1, 2, ..., i - 1,
    (c_i)_i = \gamma_i \Big( y_i - \sum_{j=1}^{i-1} (c_{i-1})_j \langle x_j, x_i \rangle \Big).

Notice that here \langle x_j, x_i \rangle is just the standard inner-product kernel on R^d, and the predictor is of the form

    f_i(x) = \langle w_{i-1}, x \rangle = \sum_{j=1}^{i-1} (c_{i-1})_j \langle x_j, x \rangle.

Now, if a general kernel K is introduced instead and the predictor is

    f_i(x) = \sum_{j=1}^{i-1} (c_{i-1})_j K(x_j, x),

then the same proof shows that the predictor minimising the least squares loss is obtained by changing the above recursion to

    (c_i)_i = \gamma_i \Big( y_i - \sum_{j=1}^{i-1} (c_{i-1})_j K(x_j, x_i) \Big).

The above expression requires storing all the data for updating c_i. The total time complexity for the recursion when evaluating for the n-th data point is O(n^2 d k), where k is the cost of evaluating the kernel on a single pair of points. The use of the kernel has thus allowed the movement from a finite-dimensional parameter space w_i \in R^d to a possibly infinite-dimensional feature space represented by the kernel K, by instead performing the recursion on the space of coefficients c_i \in R^i, whose dimension is the same as the size of the training set. In general, this is a consequence of the representer theorem.
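A sketch of the coefficient recursion with a general kernel, assuming NumPy; the Gaussian kernel and the step constant are illustrative choices, and the growing lists make the O(n) storage explicit.

    import numpy as np

    # Kernelised least squares SGD: the model is the coefficient vector c
    # over previously seen points, updated by the recursion above.
    def gaussian_kernel(a, b, sigma=1.0):  # illustrative kernel choice
        return np.exp(-np.sum((a - b) ** 2) / (2.0 * sigma ** 2))

    def kernel_sgd(X, y, kernel=gaussian_kernel, c0=0.1):
        points, coeffs = [], []
        for i, (x, target) in enumerate(zip(X, y), start=1):
            # f_i(x_i): prediction using all previously stored points
            pred = sum(c * kernel(p, x) for c, p in zip(coeffs, points))
            gamma = c0 / np.sqrt(i)
            coeffs.append(gamma * (target - pred))  # (c_i)_i
            points.append(x)                        # storage grows with i
        return points, coeffs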
Online convex optimization

Online convex optimization (OCO) is a general framework for decision making which leverages convex optimization to allow for efficient algorithms. The framework is that of repeated game playing as follows:

For t = 1, 2, ..., T:
- The learner receives an input x_t.
- The learner outputs w_t from a fixed convex set S.
- Nature sends back a convex loss function v_t : S \to R.
- The learner suffers the loss v_t(w_t) and updates its model.

The goal is to minimise the regret, i.e. the difference between the learner's cumulative loss and the cumulative loss of the best fixed point u \in S in hindsight. As an example, consider online least squares linear regression. Here the weight vectors come from the convex set S = R^d, and nature sends back the convex loss function

    v_t(w) = (\langle w, x_t \rangle - y_t)^2.

Note that y_t is implicitly sent with v_t. When the prediction domain or the loss functions are not convex, convexification can be achieved through randomisation or through surrogate loss functions. Some simple online convex optimisation algorithms are described next.

Follow the leader (FTL)

The simplest learning rule to try is to select, at the current step, the hypothesis that has the least loss over all past rounds. This algorithm is called Follow the leader, and round t is simply given by

    w_t = \arg\min_{w \in S} \sum_{i=1}^{t-1} v_i(w).

This method can thus be viewed as a greedy algorithm. For the case of online quadratic optimisation (where the loss function is v_t(w) = \|w - x_t\|_2^2), one can show a regret bound that grows as log(T). However, comparable bounds cannot be obtained for FTL on other important families of models, such as online linear optimisation; to obtain them, one modifies FTL by adding regularisation, as in the next subsection.
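For online quadratic optimisation, the FTL iterate has a closed form: the minimiser of the cumulative past loss is the running mean of the observed points. A minimal sketch, assuming NumPy (w_1 = 0 is an arbitrary choice, since the empty sum puts no constraint on the first round):

    import numpy as np

    # Follow the leader for v_t(w) = ||w - x_t||_2^2: the argmin over the
    # past losses is the mean of x_1, ..., x_{t-1}.
    def ftl_quadratic(xs):
        d = len(xs[0])
        cum = np.zeros(d)
        ws = []
        for t, x in enumerate(xs, start=1):
            ws.append(cum / (t - 1) if t > 1 else np.zeros(d))
            cum += x
        return ws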
Follow the regularised leader (FTRL)

This is a natural modification of FTL, used to stabilise the FTL solutions and obtain better regret bounds. A regularisation function R : S \to R is chosen, and learning is performed in round t as follows:

    w_t = \arg\min_{w \in S} \sum_{i=1}^{t-1} v_i(w) + R(w).

As a special example, consider online linear optimisation, i.e. where nature sends back loss functions of the form v_t(w) = \langle w, z_t \rangle. Also, let S = R^d, and suppose the regularisation function R(w) = \frac{1}{2\eta} \|w\|_2^2 is chosen for some positive number \eta. Then one can show that the regret-minimising iteration becomes

    w_{t+1} = -\eta \sum_{i=1}^{t} z_i = w_t - \eta z_t,

which is exactly online gradient descent for linear losses.

If S is instead some convex subspace of R^d, the iterate must be projected onto S, leading to the modified update rule

    w_{t+1} = \Pi_S(-\eta \theta_{t+1}), \qquad \theta_{t+1} = \theta_t + z_t.

This algorithm is known as lazy projection, as the vector \theta accumulates the gradients; it is also known as Nesterov's dual averaging algorithm. In this scenario of linear loss functions and quadratic regularisation, the regret is bounded by O(\sqrt{T}), and thus the average regret goes to 0, as desired.
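A sketch of lazy projection for linear losses, assuming NumPy; the choice of S as the Euclidean unit ball (with its simple rescaling projection) and the value of eta are illustrative assumptions.

    import numpy as np

    # FTRL with quadratic regularisation for linear losses v_t(w) = <w, z_t>:
    # theta accumulates the gradients and w_t is a projected scaling of it.
    def project_l2_ball(v, radius=1.0):  # Pi_S for S = {w : ||w|| <= radius}
        norm = np.linalg.norm(v)
        return v if norm <= radius else v * (radius / norm)

    def ftrl_linear(zs, eta=0.1):
        theta = np.zeros(len(zs[0]))
        ws = []
        for z in zs:
            ws.append(project_l2_ball(-eta * theta))  # w_t
            theta += z                                # theta_{t+1} = theta_t + z_t
        return ws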
Online subgradient descent (OSD)

The above proved a regret bound for linear loss functions v_t(w) = \langle w, z_t \rangle. To generalise the algorithm to any convex loss function, the subgradient \partial v_t(w_t) is used as a linear approximation to v_t near w_t, leading to the online subgradient descent algorithm:

Initialise parameter \eta and w_1 = 0.
For t = 1, 2, ..., T:
- Predict using w_t; receive f_t from nature.
- Choose z_t \in \partial v_t(w_t).
- If S = R^d, update as w_{t+1} = w_t - \eta z_t.
- If S \subset R^d, project the cumulative gradients onto S, i.e. w_{t+1} = \Pi_S(\eta \theta_{t+1}), \theta_{t+1} = \theta_t + z_t.

One can use the OSD algorithm to derive O(\sqrt{T}) regret bounds for the online version of SVMs for classification, which use the hinge loss

    v_t(w) = \max\{0, 1 - y_t (w \cdot x_t)\}.
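A sketch of OSD with the hinge loss (an online SVM-style update), assuming NumPy, with S = R^d so that no projection is needed; the step size eta is an illustrative constant.

    import numpy as np

    # Online subgradient descent with the hinge loss
    # v_t(w) = max(0, 1 - y_t * <w, x_t>), labels y_t in {-1, +1}.
    def osd_hinge(X, y, eta=0.1):
        n, d = X.shape
        w = np.zeros(d)
        for t in range(n):
            margin = y[t] * (w @ X[t])
            z = -y[t] * X[t] if margin < 1 else np.zeros(d)  # z_t in dv_t(w_t)
            w = w - eta * z                                  # w_{t+1}
        return w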
Other algorithms

Quadratically regularised FTRL algorithms lead to the lazily projected gradient algorithms described above. To handle arbitrary convex functions and regularisers, one uses online mirror descent. The optimal regularisation in hindsight can be derived for linear loss functions; this leads to the AdaGrad algorithm. For the Euclidean regularisation one can show a regret bound of O(\sqrt{T}), which can be improved further to O(log T) for strongly convex and exp-concave loss functions.

Continual learning

In neural network models, the continual acquisition of incrementally available information from non-stationary data distributions generally leads to catastrophic forgetting.
Interpretations of online learning

The discussion here concerns the sequence of learned functions f_1, f_2, ..., f_n, and uses the prototypical stochastic gradient descent algorithm, whose recursion is

    w_t = w_{t-1} - \gamma_t \nabla V(\langle w_{t-1}, x_t \rangle, y_t).

The first interpretation considers this method as applied to the problem of minimising the expected risk I[w] defined above. Indeed, in the case of an infinite stream of data, since the examples (x_1, y_1), (x_2, y_2), ... are assumed to be drawn i.i.d. from the distribution p(x, y), the sequence of gradients of V(\cdot, \cdot) in the iteration above is an i.i.d. sample of stochastic estimates of the gradient of the expected risk I[w]; one can therefore apply complexity results for the stochastic gradient descent method to bound the deviation I[w_t] - I[w^*], where w^* is the minimiser of I[w].

For the second interpretation, the empirical risk is

    I_n[w] = \frac{1}{n} \sum_{i=1}^{n} V(\langle w, x_i \rangle, y_i).

The gradients of V(\cdot, \cdot) in the incremental gradient descent iterations are likewise stochastic estimates of the gradient of I_n[w], and the relevant deviation here is I_n[w_t] - I_n[w_n^*], where w_n^* is the minimiser of I_n[w].
Implementations

- Vowpal Wabbit: an open-source, fast, out-of-core online learning system, notable for supporting a number of machine learning reductions, importance weighting, and a selection of different loss functions and optimisation algorithms. It uses the hashing trick to bound the size of the feature set independently of the amount of training data.
- scikit-learn: provides out-of-core implementations of algorithms for classification (Perceptron, SGD classifier, naive Bayes), regression (SGD regressor, passive-aggressive regressor), clustering (mini-batch k-means) and feature extraction (mini-batch dictionary learning, incremental PCA).
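For example, scikit-learn's SGDClassifier exposes incremental learning through its partial_fit method; in the minimal sketch below, the synthetic mini-batches are a stand-in for chunks streamed from disk, and all sizes are arbitrary illustrative choices.

    import numpy as np
    from sklearn.linear_model import SGDClassifier

    rng = np.random.default_rng(0)
    clf = SGDClassifier(loss="hinge")  # hinge loss: linear SVM-style updates
    classes = np.array([0, 1])         # all classes must be declared up front
    for _ in range(100):               # 100 synthetic mini-batches
        X_batch = rng.normal(size=(32, 5))
        y_batch = (X_batch[:, 0] > 0).astype(int)
        clf.partial_fit(X_batch, y_batch, classes=classes)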
See also

- Lazy learning
- Perceptron
References

- Bottou, Léon (1998). "Online Algorithms and Stochastic Approximations". Online Learning and Neural Networks. Cambridge University Press. ISBN 978-0-521-65263-6.
- Hazan, Elad (2015). Introduction to Online Convex Optimization (PDF). Foundations and Trends in Optimization.
- Kushner, Harold J.; Yin, G. George (2003). Stochastic Approximation and Recursive Algorithms and Applications (2nd ed.). Springer. ISBN 0-387-00894-2. (1st ed. 1997, ISBN 0-387-94916-X.)
- Parisi, German I.; Kemker, Ronald; Part, Jose L.; Kanan, Christopher; Wermter, Stefan (2019). "Continual lifelong learning with neural networks: A review". Neural Networks. 113: 54–71. arXiv:1802.07569. doi:10.1016/j.neunet.2019.01.012. ISSN 0893-6080.