Iterative method

In computational mathematics, an iterative method is a mathematical procedure that uses an initial value to generate a sequence of improving approximate solutions for a class of problems, in which the i-th approximation (called an "iterate") is derived from the previous ones. A specific implementation with termination criteria for a given iterative method, like gradient descent, hill climbing, Newton's method, or quasi-Newton methods like BFGS, is an algorithm of the iterative method or a method of successive approximation. An iterative method is called convergent if the corresponding sequence converges for given initial approximations. A mathematically rigorous convergence analysis of an iterative method is usually performed; however, heuristic-based iterative methods are also common.

In contrast, direct methods attempt to solve the problem by a finite sequence of operations. In the absence of rounding errors, direct methods would deliver an exact solution (for example, solving a linear system of equations Ax = b by Gaussian elimination). Iterative methods are often the only choice for nonlinear equations. However, iterative methods are often useful even for linear problems involving many variables (sometimes on the order of millions), where direct methods would be prohibitively expensive (and in some cases impossible) even with the best available computing power.

Attractive fixed points

If an equation can be put into the form f(x) = x, and a solution x is an attractive fixed point of the function f, then one may begin with a point x_1 in the basin of attraction of x, define x_{n+1} = f(x_n) for n ≄ 1, and the sequence {x_n} for n ≄ 1 will converge to the solution x. Here x_n is the n-th approximation or iteration of x, and x_{n+1} is the next or (n+1)-th iteration of x. Alternately, superscripts in parentheses are often used in numerical methods, so as not to interfere with subscripts with other meanings (for example, x^{(n+1)} = f(x^{(n)})). If the function f is continuously differentiable, a sufficient condition for convergence is that the spectral radius of the derivative is strictly bounded by one in a neighborhood of the fixed point. If this condition holds at the fixed point, then a sufficiently small neighborhood (basin of attraction) must exist.
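A minimal Python sketch of the fixed-point iteration x_{n+1} = f(x_n) described above; the choice of f (here cos, which is a contraction near its fixed point), the starting point, and the tolerance are illustrative assumptions, not part of the original text.

```python
import math

def fixed_point_iteration(f, x0, tol=1e-12, max_iter=200):
    """Iterate x_{n+1} = f(x_n) until successive iterates differ by less than tol."""
    x = x0
    for n in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next, n + 1
        x = x_next
    return x, max_iter

# Example: starting inside the basin of attraction of the fixed point of cos(x).
solution, iterations = fixed_point_iteration(math.cos, x0=1.0)
print(solution, iterations)  # ~0.7390851..., reached after several dozen iterations
```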
Linear systems

In the case of a system of linear equations, the two main classes of iterative methods are the stationary iterative methods and the more general Krylov subspace methods.

Stationary iterative methods

Introduction

Stationary iterative methods solve a linear system with an operator approximating the original one; and based on a measurement of the error in the result (the residual), form a "correction equation" for which this process is repeated. While these methods are simple to derive, implement, and analyze, convergence is only guaranteed for a limited class of matrices.

Definition

An iterative method is defined by

x^{k+1} := Κ(x^k),  k ≄ 0

and, for a given linear system Ax = b with exact solution x^*, the error by

e^k := x^k − x^*,  k ≄ 0.

An iterative method is called linear if there exists a matrix C ∈ R^{n×n} such that

e^{k+1} = C e^k  for all k ≄ 0,

and this matrix is called the iteration matrix. An iterative method with a given iteration matrix C is called convergent if the following holds:

lim_{k→∞} C^k = 0.

An important theorem states that for a given iterative method and its iteration matrix C, it is convergent if and only if its spectral radius ρ(C) is smaller than unity, that is,

ρ(C) < 1.

The basic iterative methods work by splitting the matrix A into A = M − N, where the matrix M should be easily invertible. The iterative methods are now defined as

M x^{k+1} = N x^k + b,  k ≄ 0.

From this it follows that the iteration matrix is given by

C = I − M^{−1} A = M^{−1} N.
Examples

Basic examples of stationary iterative methods use a splitting of the matrix A such as

A = D + L + U,  D := diag((a_{ii})_i),

where D is only the diagonal part of A, L is the strict lower triangular part of A and, respectively, U is the strict upper triangular part of A.

Richardson method: M := (1/ω) I  (ω ≠ 0)
Jacobi method: M := D
Damped Jacobi method: M := (1/ω) D  (ω ≠ 0)
Gauss–Seidel method: M := D + L
Successive over-relaxation method (SOR): M := (1/ω) D + L  (ω ≠ 0)
Symmetric successive over-relaxation (SSOR): M := (1/(ω(2 − ω))) (D + ωL) D^{−1} (D + ωU)  (ω ∉ {0, 2})

Linear stationary iterative methods are also called relaxation methods.
Krylov subspace methods

Krylov subspace methods work by forming a basis of the sequence of successive matrix powers times the initial residual (the Krylov sequence). The approximations to the solution are then formed by minimizing the residual over the subspace formed. The prototypical method in this class is the conjugate gradient method (CG), which assumes that the system matrix A is symmetric positive-definite. For symmetric (and possibly indefinite) A one works with the minimal residual method (MINRES). In the case of non-symmetric matrices, methods such as the generalized minimal residual method (GMRES) and the biconjugate gradient method (BiCG) have been derived.

Convergence of Krylov subspace methods

Since these methods form a basis, it is evident that the method converges in N iterations, where N is the system size. However, in the presence of rounding errors this statement does not hold; moreover, in practice N can be very large, and the iterative process reaches sufficient accuracy already far earlier. The analysis of these methods is hard, depending on a complicated function of the spectrum of the operator.
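A minimal sketch of the conjugate gradient method described above, assuming the matrix is symmetric positive-definite and using no preconditioning; the stopping tolerance and the small test system are illustrative choices.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    """Plain CG for a symmetric positive-definite matrix A."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p   # new A-conjugate search direction
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))  # approx [0.0909, 0.6364]
```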
Preconditioners

The approximating operator that appears in stationary iterative methods can also be incorporated in Krylov subspace methods such as GMRES (alternatively, preconditioned Krylov methods can be considered as accelerations of stationary iterative methods), where they become transformations of the original operator to a presumably better conditioned one. The construction of preconditioners is a large research area.

History

JamshÄ«d al-KāshÄ« used iterative methods to calculate the sine of 1° and π in The Treatise of Chord and Sine to high precision. An early iterative method for solving a linear system appeared in a letter of Gauss to a student of his. He proposed solving a 4-by-4 system of equations by repeatedly solving the component in which the residual was the largest.

The theory of stationary iterative methods was solidly established with the work of D.M. Young starting in the 1950s. The conjugate gradient method was also invented in the 1950s, with independent developments by Cornelius Lanczos, Magnus Hestenes and Eduard Stiefel, but its nature and applicability were misunderstood at the time. Only in the 1970s was it realized that conjugacy based methods work very well for partial differential equations, especially the elliptic type.

See also

Closed-form expression
Iterative refinement
Kaczmarz method
Non-linear least squares
Numerical analysis
Root-finding algorithm

Mathematical methods relating to successive approximation include:

Babylonian method, for finding square roots of numbers
Fixed-point iteration
Means of finding zeros of functions: Halley's method, Newton's method
Differential-equation matters: Picard–Lindelöf theorem, on existence of solutions of differential equations; Runge–Kutta methods, for numerical solution of differential equations

References

Amritkar, Amit; de Sturler, Eric; ƚwirydowicz, Katarzyna; Tafti, Danesh; Ahuja, Kapil (2015). "Recycling Krylov subspaces for CFD applications and a new hybrid recycling solver". Journal of Computational Physics. 303: 222. arXiv:1501.03358. doi:10.1016/j.jcp.2015.09.040.
"Babylonian mathematics". December 1, 2000.
Day, Mahlon (November 2, 1960). Fixed-point theorems for compact convex sets.

Further reading

Templates for the Solution of Linear Systems
Y. Saad: Iterative Methods for Sparse Linear Systems, 1st edition, PWS 1996
