A Natural User Interface (NUI) is a reality-based user interface that allows the user to interact with a device by tapping, gestures, or voice. Examples of Natural User Interfaces are all around us in daily life: touch screens respond to finger movements, Google Glass responds to spoken language. These interfaces can be seen as a major post-2010 trend in the evolution of computing. The Command Line Interface (CLI) was the first way of interacting with a computer; the Text-based User Interface (TUI) was a somewhat more sophisticated and colorful successor; today we mainly use the Graphical User Interface (GUI). The Natural User Interface is usually regarded as the next step in this evolution of user interfaces.
A sarcastic, modified graphical representation of human evolution often circulates on social networks: humans began their journey as four-footed animals, gradually stood upright, and then became a kind of four-footed animal again, hunched in front of a computer and typing on a keyboard. The joke has a point: there are real risks associated with prolonged computer use under the current ways of input and interaction with computing devices.
Basics of Natural User Interface (NUI)
The development of the touch screen has changed the old pattern of using graphical user interfaces (GUI). Where input devices such as a keyboard or a mouse were previously needed for interaction, a finger touch now suffices. From the Apple iPhone to ATMs, devices everywhere use this direct form of operation.
Since touching and manipulating virtual objects works in much the same way as manipulating real objects, users find it easy to transfer activities from everyday life to the digital system. Actions on virtual objects parallel actions in the real, everyday environment, so existing knowledge structures can be applied without much knowledge of the GUI. The evolution from input devices such as the mouse towards multi-touch systems brings the real and the virtual world closer together. Objects are no longer manipulated through commands to the computer but are taken directly into the fingertips. This approach is called 'Reality-Based Interaction' (RBI) and serves as the basis for the design of multi-touch applications.
Various predetermined interactions, so-called 'patterns', such as moving and rotating images or scrolling through information, allow users to act directly on the equipment and the software interface, as sketched below.
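As an illustration, here is a minimal Python sketch of one such pattern, the familiar two-finger rotate-and-pinch gesture, reduced to simple geometry. The function name and data layout are illustrative assumptions, not taken from any particular framework.

```python
import math

# Minimal sketch of a two-finger gesture "pattern": given the previous and
# current positions of two touch points, derive the rotation and scaling a
# multi-touch interface would apply to an on-screen object.
# All names here are illustrative, not from any real framework.

def two_finger_transform(prev, curr):
    """prev and curr are pairs of (x, y) touch coordinates."""
    (p1, p2), (c1, c2) = prev, curr

    # Vector between the two fingers before and after the movement.
    pv = (p2[0] - p1[0], p2[1] - p1[1])
    cv = (c2[0] - c1[0], c2[1] - c1[1])

    # Rotation: change in the angle of the inter-finger vector.
    rotation = math.degrees(math.atan2(cv[1], cv[0]) - math.atan2(pv[1], pv[0]))

    # Scale: change in the distance between the fingers (pinch/spread).
    scale = math.hypot(*cv) / math.hypot(*pv)

    return rotation, scale

# Example: the fingers rotate 90 degrees around each other and spread apart.
print(two_finger_transform(((0, 0), (100, 0)), ((0, 0), (0, 200))))
# -> (90.0, 2.0)
```

Real systems layer noise filtering and gesture disambiguation on top of this kind of geometry, but the core of the pattern is no more than the comparison shown here.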
The Natural User Interface thus allows people a much more natural handling of interactions and extends the previously limited, artificial technical interfaces.
Technology and Development of Natural User Interface (NUI)
The first attempts to develop such input devices date back to the 1940s and 1950s. Between 1945 and 1948, the Canadian scientist Hugh Le Caine developed the first voltage-controlled synthesizer, which featured touch-sensitive keys. From the mid-1960s to 1971, various touchscreen technologies were developed, among others by IBM and the University of Illinois. PLATO IV, a touchscreen terminal from 1972, worked with a precursor of today's popular infrared technology.
Multi-touch technologies have a long history. To put it in perspective, the original work undertaken by my team was done in 1984, the same year that the first Macintosh computer was released, and we were not the first.
– Bill Buxton, on the iPhone’s interface
In 1982, Nimish Mehta of the University of Toronto developed the first multi-touch system. Called the 'Flexible Machine Interface', it allowed the user to draw simple graphics by pressing on the screen with a finger.
In 1990, the 'Sensor Cube', developed in collaboration with NASA, arrived as the successor to the 'Sensor Frame' built at Carnegie Mellon University in 1985; an optical system, it was able to detect the angle of the finger on the touch screen. Skipping many intermediate inventions, in 2007 Apple presented the most recognized example of a multi-touch device, the iPhone. In the same year, Microsoft introduced the interactive multi-touch table MS Surface.
Regardless of the technology used to register a touch event, all systems use three different components as the basis of their hardware: sensors, comparators, and actuators. Sensors register changes of the system; their sensitivity and range determine the usable interactions of a multi-touch screen. Comparators perform a state comparison: the state of the system after the interaction is compared with the state of the system before the interaction, and the comparator determines the effect of the interaction performed. This result is passed to the actuators and carried out as an actual action. Comparators and actuators are realized in software. The sketch below illustrates this pipeline.
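To make this division concrete, here is a minimal sketch of the sensor / comparator / actuator pipeline, assuming a toy single-touch system; all class and method names are hypothetical and not drawn from any real driver or framework.

```python
# Illustrative sketch of the sensor / comparator / actuator split described
# above, for a toy single-touch system; all names here are hypothetical.

class Sensor:
    """Registers raw changes of the system (here: a touch coordinate or None)."""
    def __init__(self):
        self.reading = None

    def register(self, touch):
        self.reading = touch  # e.g. (x, y), or None when no finger is present


class Comparator:
    """Compares the state after an interaction with the state before it."""
    def __init__(self):
        self.previous = None

    def compare(self, current):
        if self.previous is None and current is not None:
            event = "touch_down"
        elif self.previous is not None and current is None:
            event = "touch_up"
        elif self.previous != current:
            event = "drag"
        else:
            event = "idle"
        self.previous = current
        return event


class Actuator:
    """Turns the comparator's verdict into an actual action in software."""
    def act(self, event):
        print(f"performing action for: {event}")


sensor, comparator, actuator = Sensor(), Comparator(), Actuator()
for touch in [(10, 10), (12, 14), None]:  # finger lands, drags, lifts
    sensor.register(touch)
    actuator.act(comparator.compare(sensor.reading))
# -> touch_down, drag, touch_up
```

In real hardware the sensor is the only physical part of this chain; the comparison and the resulting action, as the text notes, appear in the form of software.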