Saturday, October 15, 2011

The evolution of computer input models

Keyboard era
Typewriters were created in the 19th century, and at that time the QWERTY layout began to define keyboard layouts, as it still largely does to this day. QWERTY was designed to prevent jams in those early typewriters, so it is an inefficient layout and relatively hard to learn.

The keyboard was an important technology even before the invention of computers, and when computers were invented, keyboards played an important role in defining their input models.

At first, computers couldn't even accept input from keyboards directly, so you had to use a keyboard to separately create punched cards to feed into the computer. This was known as the batch input model.

The next development was the CLI input model, in which computers directly accepted keyboard input and responded to it in various ways, first with printers and later with CRT displays like the one on the IBM 5100. CLI computers were generally used for programming in BASIC or some other programming language.

The first challenge to the keyboard as an input device came from indirect pointing devices like the mouse; however, the mouse alone is far from a challenge to the dominance of the keyboard. The PARC input model, based upon the mouse, was introduced to the world with the Apple Macintosh and subsequently came to dominate the computing world for nearly thirty years. Most PARC-based computers used a desktop environment with windows, menus, toolbars, and icons.

Post-keyboard era
The "tablet revolution" was partially launched by the introduction of the Apple iPad in April 2010, is based upon multitouch and voice input. These input methods allow you to avoid using hard keyboard using at least three separate means:

1) The use of a virtual keyboard or optical character recognition to input text through the touch screen.

2) The use of speech recognition to input text through the microphone.

These two methods allow you to eliminate the use of hard keyboards in most cases. One effect of this is that laptops are now obsolete, since they carry hard keyboards which add extra weight and which aren't ergonomic either. I have given away all of my laptops so that I can use tablets instead.

Now I require that all the keyboards I use are ergonomic keyboards with a Dvorak layout, which significantly reduces the strain on my fingers and helps prevent repetitive strain injury, a condition that affects many heavy computer users from the keyboard age. Now that we have tablets, which are no longer dependent upon hard keyboards, there are two directions in which I think we should head:

1) Transition from the old WIMP GUI to a multitouch ZUI and effectively do away with all sorts of popups, including menus and dialog windows, which will have the added effect of moving us towards eliminating the distinction between web applications and desktop applications.

2) Make commands rather than applications central to the user interface, using NLP algorithms and command packages. Commands could be accessed using speech recognition. One important command should be undo, so that the system can forgive mistakes (see the sketch below).

These two ideas were first proposed by design expert Jef Raskin; however, he didn't yet realise that they are best implemented on tablet systems, with multitouch input to the ZUI and speech input to the commands. I look forward to seeing these ideas implemented in the future.
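To make the command-centric idea a little more concrete, here is a minimal sketch of how a command processor with undo might be wired up. Everything in it (the CommandProcessor class, the "insert hello" phrase, and the toy document) is a hypothetical illustration rather than a description of any existing system; a real implementation would map spoken or typed phrases to commands with NLP rather than exact string matching.

```python
# Hypothetical sketch: commands, not applications, at the centre,
# with undo so the system can forgive mistakes.

class CommandProcessor:
    def __init__(self):
        self.registry = {}   # phrase -> (do, undo) pair
        self.history = []    # undo actions for executed commands, most recent last

    def register(self, phrase, do, undo):
        """Install a command under a spoken or typed phrase."""
        self.registry[phrase] = (do, undo)

    def execute(self, phrase):
        """Run the command bound to the phrase and remember how to undo it."""
        do, undo = self.registry[phrase]
        do()
        self.history.append(undo)

    def undo(self):
        """Reverse the most recently executed command."""
        if self.history:
            self.history.pop()()

# Usage with a toy document
document = []
processor = CommandProcessor()
processor.register(
    "insert hello",
    do=lambda: document.append("hello"),
    undo=lambda: document.pop(),
)
processor.execute("insert hello")
print(document)   # ['hello']
processor.undo()
print(document)   # []
```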
