
Natural user interface


Not to be confused with natural language user interface.

In computing, a natural user interface (NUI), or natural interface, is a user interface that is effectively invisible, and remains invisible as the user continuously learns increasingly complex interactions. The word "natural" is used because most computer interfaces use artificial control devices whose operation has to be learned. Examples include voice assistants, such as Alexa and Siri, touch and multitouch interactions on today's mobile phones and tablets, but also touch interfaces invisibly integrated into the textiles of furniture.

An NUI relies on a user being able to quickly transition from novice to expert. While the interface requires learning, that learning is eased through design which gives the user the feeling that they are instantly and continuously successful. Thus, "natural" refers to a goal in the user experience – that the interaction comes naturally while interacting with the technology, rather than that the interface itself is natural. This is contrasted with the idea of an intuitive interface, referring to one that can be used without previous learning.

Several design strategies have been proposed which have met this goal to varying degrees of success. One strategy is the use of a "reality user interface" ("RUI"), also known as "reality-based interface" (RBI) methods. One example of an RUI strategy is to use a wearable computer to render real-world objects "clickable", i.e. so that the wearer can click on any everyday object so as to make it function as a hyperlink, thus merging cyberspace and the real world. Because the term "natural" is evocative of the "natural world", RBIs are often confused for NUIs, when in fact they are merely one means of achieving them.

One example of a strategy for designing an NUI not based in RBI is the strict limiting of functionality and customization, so that users have very little to learn in the operation of a device. Provided that the default capabilities match the user's goals, the interface is effortless to use. This is an overarching design strategy in Apple's iOS. Because this design is coincident with a direct-touch display, non-designers commonly misattribute the effortlessness of interacting with the device to that multi-touch display, and not to the design of the software where it actually resides.

History

In the 1990s, Steve Mann developed a number of user-interface strategies using natural interaction with the real world as an alternative to a command-line interface (CLI) or graphical user interface (GUI). Mann referred to this work as "natural user interfaces", "Direct User Interfaces", and "metaphor-free computing". Mann's EyeTap technology typically embodies an example of a natural user interface. Mann's use of the word "natural" refers both to action that comes naturally to human users and to the use of nature itself, i.e. physics (natural philosophy) and the natural environment. A good example of an NUI in both these senses is the hydraulophone, especially when it is used as an input device, in which touching a natural element (water) becomes a way of inputting data. More generally, a class of musical instruments called "physiphones", so named from the Greek words "physika", "physikos" (nature) and "phone" (sound), have also been proposed as "nature-based user interfaces".

In 2006, Christian Moore established an open research community with the goal of expanding discussion and development related to NUI technologies. In a 2008 conference presentation, "Predicting the Past," August de los Reyes, a Principal User Experience Director of Surface Computing at Microsoft, described the NUI as the next evolutionary phase following the shift from the CLI to the GUI. This too is an over-simplification, since NUIs necessarily include visual elements – and thus, graphical user interfaces. A more accurate description of the concept is a transition from WIMP to NUI.

In the CLI, users had to learn an artificial means of input, the keyboard, and a series of codified inputs with a limited range of responses, where the syntax of those commands was strict.

Then, when the mouse enabled the GUI, users could more easily learn the mouse movements and actions, and were able to explore the interface much more. The GUI relied on metaphors for interacting with on-screen content or objects. The 'desktop' and 'drag', for example, were metaphors for a visual interface that ultimately was translated back into the strict codified language of the computer.

An example of the misunderstanding of the term NUI was demonstrated at the Consumer Electronics Show in 2010: "Now a new wave of products is poised to bring natural user interfaces, as these methods of controlling electronics devices are called, to an even broader audience."

In 2010, Microsoft's Bill Buxton reiterated the importance of the NUI within Microsoft Corporation with a video discussing technologies which could be used in creating a NUI, and its future potential.

In 2010, Daniel Wigdor and Dennis Wixon provided an operationalization of building natural user interfaces in their book Brave NUI World. In it, they carefully distinguish between natural user interfaces, the technologies used to achieve them, and reality-based UI.

Examples of interfaces commonly referred to as NUI

Multi-touch

When Bill Buxton was asked about the iPhone's interface, he responded, "Multi-touch technologies have a long history. To put it in perspective, the original work undertaken by my team was done in 1984, the same year that the first Macintosh computer was released, and we were not the first."

Multi-touch is a technology which could enable a natural user interface. However, most UI toolkits used to construct interfaces executed with such technology are traditional GUIs.
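To make the gesture-interpretation step concrete, a minimal sketch (not code from any system described here; function names are illustrative) of how a toolkit might reduce a two-finger pinch to a zoom factor, by comparing the distance between the touch points at the start of the gesture with their current distance:

```python
import math

def touch_distance(p1, p2):
    """Euclidean distance between two touch points given as (x, y) tuples."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def pinch_scale(start_touches, current_touches):
    """Zoom factor implied by a two-finger pinch:
    > 1.0 means the fingers spread apart, < 1.0 means they pinched inward."""
    d_start = touch_distance(*start_touches)
    d_now = touch_distance(*current_touches)
    if d_start == 0:
        return 1.0  # degenerate start; treat as no zoom
    return d_now / d_start

# Two fingers move from 100 px apart to 200 px apart -> scale 2.0
scale = pinch_scale([(0, 0), (100, 0)], [(0, 0), (200, 0)])
```

Real multi-touch frameworks layer gesture recognizers like this over a stream of raw touch events; the point of the sketch is that "pinch" is an interpretation imposed by software, not something the hardware reports directly.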
Perceptive Pixel

One example is the work done by Jefferson Han on multi-touch interfaces. In a demonstration at TED in 2006, he showed a variety of means of interacting with on-screen content using both direct manipulations and gestures. For example, to shape an on-screen glutinous mass, Han literally 'pinches' and prods and pokes it with his fingers. In a GUI for a design application, by contrast, a user would use the metaphor of 'tools' to do this: for example, selecting a prod tool, or selecting two parts of the mass to which they then wanted to apply a 'pinch' action. Han showed that user interaction could be much more intuitive by doing away with the interaction devices that we are used to and replacing them with a screen capable of detecting a much wider range of human actions and gestures. Of course, this allows only for a very limited set of interactions which map neatly onto physical manipulation (RBI). Extending the capabilities of the software beyond physical actions requires significantly more design work.

Microsoft PixelSense

Microsoft PixelSense takes similar ideas on how users interact with content, but adds the ability for the device to optically recognize objects placed on top of it. In this way, users can trigger actions on the computer through the same gestures and motions as Han's touchscreen allowed, but objects also become part of the control mechanisms. So, for example, when you place a wine glass on the table, the computer recognizes it as such and displays content associated with that wine glass. Placing a wine glass on a table maps well onto actions taken with wine glasses and other tables, and thus maps well onto reality-based interfaces. It could therefore be seen as an entrée to a NUI experience.

3D Immersive Touch

"3D Immersive Touch" is defined as the direct manipulation of 3D virtual environment objects using single- or multi-touch surface hardware in multi-user 3D virtual environments. The term was coined in 2007 to describe and define the 3D natural user interface learning principles associated with Edusim. The Immersive Touch natural user interface now appears to be taking on a broader focus and meaning with the wider adoption of surface- and touch-driven hardware such as the iPhone, iPod Touch, iPad, and a growing list of other devices. Apple also seems to have taken a keen interest in "Immersive Touch" 3D natural user interfaces in recent years. This work builds atop the broad academic base that has studied 3D manipulation in virtual reality environments.

Xbox Kinect

Kinect is a motion sensing input device by Microsoft for the Xbox 360 video game console and Windows PCs that uses spatial gestures for interaction instead of a game controller. According to Microsoft's page, Kinect is designed for "a revolutionary new way to play: no controller required." Again, because Kinect allows the sensing of the physical world, it shows potential for RBI designs, and thus potentially also for NUI.

See also

Edusim
Eye tracking
Kinetic user interface
Organic user interface
Post-WIMP
Scratch input
Spatial navigation
Tangible user interface
Touch user interface
Virtual assistant

References

Brauner, Philipp; van Heek, Julia; Ziefle, Martina; Hamdan, Nur Al-huda; Borchers, Jan (2017-10-17). "Interactive FUrniTURE". Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces. Brighton, United Kingdom: ACM. pp. 151–160. doi:10.1145/3132272.3134128. ISBN 978-1-4503-4691-7. S2CID 10774834.
Buxton, Bill (2010-01-06). "CES 2010: NUI with Bill Buxton". Microsoft Research.
Buxton, Bill. "Multi-Touch Systems that I Have Known and Loved". Bill Buxton.
de los Reyes, August (2008-09-25). "Predicting the Past". Web Directions South 2008. Sydney Convention Centre: Web Directions.
Mann, Steve (1998). "Reconfigured Self as Basis for Humanistic Intelligence". Closing Keynote Address, ATEC '98: Proceedings of the Annual Conference on USENIX Annual Technical Conference, New Orleans, June 15–19, 1998. Berkeley, CA: USENIX Association.
Mann, Steve (2001). Intelligent Image Processing. John Wiley and Sons.
Mann, Steve (2007). "Natural Interfaces for Musical Expression". NIME 2007.
Moore, Christian (2006-07-15). "New Community Open". NUI Group Community.
Wingfield, Nick (2010-01-05). "Body in Motion: CES to Showcase Touch Gizmos". Wall Street Journal.
"Xbox.com Project Natal". Archived from the original on 2009-07-09. Retrieved 2009-08-02.

External links

What is NUI? (Microsoft Surface blog): http://blogs.msdn.com/surface/archive/2009/02/25/what-is-nui.aspx
The book Brave NUI World from the creators of Microsoft Surface's NUI: https://www.amazon.com/Brave-NUI-World-Designing-Interfaces/dp/0123822319/ref=sr_1_1?ie=UTF8&qid=1329478543&sr=8-1


Text is available under the Creative Commons Attribution-ShareAlike License. Additional terms may apply.
