Singleton (global governance)

In futurology, a singleton is a hypothetical world order in which there is a single decision-making agency at the highest level, capable of exerting effective control over its domain and of permanently preventing both internal and external threats to its supremacy. The term was first defined by Nick Bostrom.

Overview

According to Nick Bostrom, a singleton is an abstract concept that could be implemented in various ways: a singleton could be a democracy, a tyranny, a single dominant AI, a strong set of global norms that include effective provisions for their own enforcement, or even an alien overlord. Its defining characteristic is simply that it is some form of agency that can solve all major global coordination problems. It may, but need not, resemble any familiar form of human governance.

Bostrom argues that a superintelligence could form a singleton. Technologies for surveillance and mind control could also facilitate the creation of a singleton.

A singleton has both potential risks and potential benefits. Notably, a suitable singleton could solve world coordination problems that would not otherwise be solvable, opening up otherwise unavailable developmental trajectories for civilization. For example, Ben Goertzel, an AGI researcher, suggests humans may instead decide to create an "AI Nanny" with "mildly superhuman intelligence and surveillance powers", to protect the human race from existential risks like nanotechnology and to delay the development of other (unfriendly) artificial intelligences until and unless the safety issues are solved. A singleton could also set "very strict limitations on its own exercise of power (e.g. punctiliously confining itself to ensuring that certain treaty-specified international rules—or libertarian principles—are respected)". Furthermore, Bostrom suggests that a singleton could hold Darwinian evolutionary pressures in check, preventing agents interested only in reproduction from coming to dominate.

Yet Bostrom also regards the possibility of a stable, repressive, totalitarian global regime as a serious existential risk. The very stability of a singleton makes the installation of a bad singleton especially catastrophic, since the consequences can never be undone. Bryan Caplan writes that "perhaps an eternity of totalitarianism would be worse than extinction".

Similarly, Hans Morgenthau stressed that the mechanical development of weapons, transportation, and communication makes "the conquest of the world technically possible, and they make it technically possible to keep the world in that conquered state". The lack of such technology was the reason why great ancient empires, though vast, failed to complete the universal conquest of their world and to perpetuate it. Now, however, this is possible. Technology undoes both geographic and climatic barriers: "Today no technological obstacle stands in the way of a world-wide empire", as "modern technology makes it possible to extend the control of mind and action to every corner of the globe regardless of geography and season."

See also

AI takeover
Existential risk
Friendly artificial intelligence
Superintelligence
Superpower

References

Barrat, James (October 2013). Our Final Invention: Artificial Intelligence and the End of the Human Era. Macmillan. ISBN 978-0312622374.
Bostrom, Nick (2002). "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards". Journal of Evolution and Technology 9(1).
Bostrom, Nick (2004). "The Future of Human Evolution". In Charles Tandy (ed.), Death and Anti-Death: Two Hundred Years After Kant, Fifty Years After Turing. Palo Alto, California: Ria University Press, pp. 339-371.
Bostrom, Nick (2006). "What is a Singleton?". Linguistic and Philosophical Investigations 5(2): 48-54.
Bostrom, Nick (2014). Superintelligence: Paths, Dangers, Strategies (1st ed.). Oxford: Oxford University Press. ISBN 978-0-19-967811-2.
Caplan, Bryan (2008). "The totalitarian threat". In Bostrom & Cirkovic (eds.), Global Catastrophic Risks. Oxford University Press, pp. 504-519. ISBN 9780198570509.
Dvorsky, George (11 June 2013). "7 Totally Unexpected Outcomes That Could Follow the Singularity". io9. Retrieved 3 February 2016.
Goertzel, Ben (2012). "Should Humanity Build a Global AI Nanny to Delay the Singularity Until It's Better Understood?". Journal of Consciousness Studies 19(1-2): 1-2.
Haggstrom, Olle (2016). Here Be Dragons: Science, Technology and the Future of Humanity. Oxford University Press. ISBN 9780198723547.
Könneker, Carsten (19 November 2015). "Fukushima der künstlichen Intelligenz" [The Fukushima of artificial intelligence]. Spektrum. Retrieved 3 February 2016.
Miller, James D. (6 September 2011). "The Singleton Solution". hplusmagazine.com. Retrieved 3 February 2016.
Morgenthau, Hans (1967). Politics Among Nations: The Struggle for Power and Peace (4th ed.). New York: Alfred A. Knopf, pp. 358-365.
O'Mathúna, Dónal (2009). Nanoethics: Big Ethical Issues with Small Technology. A&C Black. p. 185. ISBN 9781847063953.
Thiel, Thomas (21 December 2014). "Die Superintelligenz ist gar nicht super" [The superintelligence is not super at all]. Frankfurter Allgemeine Zeitung. Retrieved 3 February 2016.


