Hypertext as a practical method for balancing the Hierarchy and Histio-logy of Knowledge
Andreas Goppold, Postf. 2060, D-89010 Ulm, Germany
Tel. ++49 731 921 6931, Fax (Goppold): ++49 731 501-999
email: [email protected]
URL: http://www.uni-ulm.de/uni/intgruppen/memosys/
Abstract
We take our theoretical concepts from the companion paper: "Balanced Φ- (Phi-) Trees". The practical requirements of professional knowledge work can be served by the available technology of hypertext. If implemented correctly, hypertext allows us to balance optimally the complementary principles of hierarchy and Histio-logy (to be distinguished from the medical term Histology). The presently available implementations, like HTML editors and browsers, still fall somewhat short of the requirements of such applications. This is due to the heavy commercial drive behind the industry, which is obviously more bent on serving naive customers who browse aimlessly through commercial offerings, and on inducing them to buy things. Professional knowledge work necessitates a kind of hypertext navigation that was not on the mind of the marketing department of Netscape when they coined the product name "Navigator". The paper will specify some of the temporal requirements for efficient hypertext navigation and will give some implementation examples. Time is the most essential (and most consistently forgotten) factor of hypertext navigation. The currently popular WIMP (Windows, Icons, Mouse, Pointing) GUI interfaces are aimed at the naive user / browser customer base, who are induced to buy new computers with every new release of Windows-XYZ because the systems overload the performance of the old hardware. Since KO department budgets are often not as richly endowed as those of glitzy AI research laboratories, a solution that runs well on vintage machines is desirable. Professional hypertext navigation means that an entirely different user interface model needs to be constructed for the non-browsing, high-power, expert knowledge workers, whose most precious resource is their lifetime.
Practical Background
The value of computer systems will not be determined by how well they can be used in the applications they were designed for, but how easily they can be fit to cases that were never thought of.
(Alan Kay, Scientific American/Spektrum, Oct. 1984)
The present paper builds on a practical background of work that the author undertook between 1984 and 1994: the design and development of the Leibniz Programming system, abbreviated LPL. This work is described in Goppold (1994); the further development of the LPL concept, the Symbolator, is described in Goppold (xxxx). http://www.uni-ulm.de/uni/intgruppen/memosys/symbol05.htm#Heading40
For the present paper, the salient points can be summed up as follows: between 1984 and 1985, LPL was created as a hypertext-based programming environment, with a programming language, LPL, that integrated the hypertext principle into its very design. While the first widely usable hypertext implementation for PC-type computers became available somewhat later with Hypercard on the Apple Macintosh, LPL remains (to the knowledge of the author) the only attempt ever undertaken to integrate the hypertext principle into the programming system itself. Out of this effort arose a self-contained system with about 10,000 routines, in 100,000 lines of code, in about 6 megabytes of source, which would fill about 2000 pages of printed listing. The listing of the function names alone fills a book of 200 pages.
Since the project was mostly carried out as a one-man effort, it aspires to the record as (one of) the largest self-contained, stand-alone software systems ever created in this way. Such a project obviously has no commercial chance in a software world dominated by market forces like Microsoft, and the project was mothballed sometime around 1994. But in retrospect, it became apparent that this system represented a typical application of the above quotation by Alan Kay: while it was originally developed as a programming system, LPL turned out to be a prototype solution for a technical ars memoriae (Yates 1966). To memorize not only the names, but also the interface parameters and the computational behavior of 10,000 routines, is a memory act comparable to, or even beyond, the mnemonic requirements of learning the Chinese character set. Function libraries of this size can otherwise only be produced and maintained by huge organizations like Microsoft, and are vastly beyond the capacity of a single person. Thus, the effort to create and maintain such an immense algorithmic repository over 10 years required extensive exploitation of the "necessity is the mother of invention" principle.
Time factors and the computer industry
At ISKO 1998, Kim Veltman (1998) presented a vision of a technologically supported facility for managing the surging data floods and, hopefully, staying abreast of them. He presented a fairly optimistic view, but the proposed solution will certainly be very cost-intensive, and thus will tend to be limited (at least initially) to the "fortunate 500" of this world. The present global problem situation is aggravated by non-linear side effects of the complexity explosion, as evidenced by the year-2000 problem. Due to such side effects, there is a good chance that essential civilizational foundations could crumble faster than our technological ability to compensate can keep up. Humanity is in a race against time to solve its fundamental problems.
We now focus attention on time factors and the computer industry; further material is in Goppold (1996b), (1997a), (1999e), (1999f). The current generation of user interfaces (GUIs) is predominantly oriented towards the visual and spatial section of the human cognitive spectrum. Time factors are rarely dealt with explicitly in present human-computer interaction research. A notable exception is Tognazzini (1993), who explicitly links the factor of time to "magic" and details the working methods of stage magicians as "manipulations of time" (p. 359). It is instructive to note an apparent theoretical neglect of time factors by computer science (Halang 1992). If we observe industrial systems development over the last 15 years, we get the impression that negative time factors are a prime marketing strategy of the PC industry: all new systems seem to be purposefully designed to be so slow as to be practically useless unless they are run on the latest and most powerful hardware on the market. A closer examination of industry system design policy shows the enormous attraction of visual GUI design and a comparative neglect of time factors. One typical effect of present GUI mouse access is that it slows the user down by about a factor of ten, compared to the very rapid keyboard input of hotkeys (or command line sequences). Of course, to be that fast, the user has to have memorized all the command key sequences and must be a touch-typist. This was typically the secret of "wizard" Unix programmers who knew the shortcuts of their command line interfaces down to every nook and cranny. With the complexity of today's menu interfaces, it is impossible to learn them all (especially when every vendor uses different hotkeys for equivalent functions, or, as Microsoft does, creates a new assignment with every new release). Thus the financial attraction of the mass market of "non-computer-nerd" users that was opened with the Macintosh has turned into a retarding standard, rigidly adhered to even where the design limitations of this 1984 machine have long since been surpassed by present hardware power (Common 1993). We can view this as an outdated Kuhnian (1962) paradigm, whose stability is determined not by what is technically possible, or rationally advisable, but by social standards of the "least common denominator" and "no experiments, please" flavor. The huge mass market causes a tremendous inertia, and no industrial player wants to play the guinea pig trying out new ideas and approaches, least of all the largest one, who seems to have opted for the technically most inferior solution. Thus, there has been little progress beyond the basic design decisions of the Macintosh (Businessweek (www), Landauer (1995), Norman (1998)).
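As a rough illustration of the factor-of-ten claim, one can use the keystroke-level model of Card, Moran, and Newell; the operator timings below are their standard published figures (keystroke K ≈ 0.2 s for a good typist, homing H ≈ 0.4 s, pointing P ≈ 1.1 s), not measurements from the LPL work. A hotkey command costs about two keystrokes, while the equivalent mouse action requires homing the hand to the mouse, pointing twice (menu, then item), clicking each time, and homing back:

$$ t_{\text{hotkey}} \approx 2K \approx 0.4\ \text{s} $$
$$ t_{\text{mouse}} \approx H + (P + K) + (P + K) + H \approx 0.4 + 1.3 + 1.3 + 0.4 \approx 3.4\ \text{s} $$

The resulting ratio of roughly 8:1 is of the same order as the factor of ten cited above.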
Time optimization factors were of prime importance in the earlier minicomputer generation. Again in the Kuhnian sense, there was a complete reversal of paradigms between minicomputers, early microcomputers, and the present generation of Macintosh-style PCs. In the older paradigm of minis, design constraints were imposed by the coupling of a fast hard disk with relatively small computing power (PDP type), which forced the systems designers to painstakingly optimize system performance around that combination, resulting in such unrepeated feats of temporal efficiency as APL and MUMPS. These were undoubtedly the most powerful programming languages ever invented by man, and just in terms of pure efficiency, present systems are a big step backwards. But these systems were also cryptic and unforgiving, hard to train for, and hard to maintain, so there were good reasons for the paradigm switch. And, of course, it generates more business volume when millions of users are catered to, rather than a few thousand.
The most serious problem created by the present GUI-dominated software is the loss of program scripting facilities. All pre-GUI Unix systems were designed to be scripted through i/o revectoring (redirection and piping). This appears to be much more difficult with the GUI event loop, or with the current OO programming style. The loss of scripting facilities systematically disadvantages the power user community.
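What "scripting through i/o revectoring" means in practice can be shown in a few lines. The following is a minimal sketch (the file names and the pipeline are hypothetical examples, not part of any particular system): a filter that reads from stdin and writes to stdout, and can therefore be composed freely with other tools.

```python
#!/usr/bin/env python3
# Minimal i/o-revectoring filter: reads stdin, writes stdout, so it can
# be chained in a pipeline, e.g.:
#   cat report.txt | python3 upcase_headings.py | lpr
import sys

def main() -> None:
    for line in sys.stdin:
        # Treat lines starting with '#' as headings and upcase them;
        # pass everything else through unchanged.
        if line.startswith("#"):
            sys.stdout.write(line.upper())
        else:
            sys.stdout.write(line)

if __name__ == "__main__":
    main()
```

No GUI event loop is involved, which is exactly why such a program remains scriptable.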
WIMP RSI
A most insidious problem of present GUIs can be called the WIMP RSI. It is the nervous stress factor involved in the point-and-click orgies of current WIMP mazes, caused by the featuritis that is the current rage of the SW industry, which forces vendors to increase the depth and complexity of menus with every new release, always relocating menu positions to completely different places in the tree. One will never find provisions for users of the old software versions to get back the old menu layout schemes to which they were accustomed and trained. This imposes on expert users a stress factor of a totally different type of RSI than that affecting heavy clerical keyboard users.
A touch-typist memorizes the positions of the keys and can hit them blindly, with no visual feedback. This is not only faster than WIMPing but causes much less memory and attention load. This possibility is destroyed by the WIMP GUI. There is a constantly recurring need to take the eyes and the attention off the parts of the screen where the work information resides (we may call this the work focus area), then search for the control areas, fiddling and fingering around on the table until one has finally found the mouse, and then engaging in a pinball-wizard-like game of trying to fine-position the cursor on those minute control areas euphemistically labeled scroll bars, activation buttons, and what-not. This necessity to constantly shift visual attention and visual focus is the most problematic aspect of the WIMP UI technology. This permanent attention interrupt turns WIMP buttons into fiendish hurdles in a brutal, nerve-consuming race. WIMPing around interrupts the flow of concentration about as much as if one were forced to take an ice-cold shower every time one wants to use the scroll bar. But it is more dangerous than that: real injury is caused to those nervous centers that are needed for vital concentration on the work, and it negatively affects the nervous energy for creative thinking and reasoning, the most important potential of the expert information worker. It is therefore extremely hard to account for in clinical tests. Nervous damage shows only very indirectly, through such symptoms as cumulative exhaustion (computerese: burn-out), and through its psycho-social after-effects.
Time factors in database design: The Balanced B-Tree method
The balanced B-tree database method was developed on the early minicomputer systems, which posed the design constraints of very small fast RAM (32-128 KBytes on PDP-type machines) coupled with a Winchester hard disk about 100 times slower; MUMPS, probably the most efficient B-tree system that was ever designed, is the result. The crucial design consideration was that, within the extremely stringent RAM constraints, the B-tree index had to be maintained so as to minimize search accesses to the disk, each of which would slow down system performance by a factor of 100.
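The arithmetic behind this design constraint is simple to sketch (the branching factor and record count below are illustrative figures, not MUMPS specifications). A B-tree node holding up to m keys yields a tree of height at most ⌈log_m N⌉ for N records, and the height bounds the number of disk accesses per search:

$$ h \;\leq\; \lceil \log_m N \rceil, \qquad m = 100,\ N = 10^6 \;\Rightarrow\; h \leq 3 $$

If the root and the second-level nodes can be held in the small RAM, a search touches the disk only once; preserving exactly this balance within 32-128 KBytes was the crucial design act.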
Bottlenecks and limitations of present technology
Internet Bandwidth bottlenecks
The balanced Φ- (Phi-) tree principle of hypertext needs to take a different, but equally stringent, set of time constraints into account. We notice that the vast amounts of data accessible on the WWW are squeezed through several bottlenecks before they reach the user. The WWW bandwidth available at a typical German university hovers around the level of a third-world country like Nigeria: about 300 bytes/sec. While this is bound to improve in time, it will always be about a factor of 100 slower than at a typical US ivy-league university. (Community colleges in the US are probably worse off than in Germany.) Therefore, the implementations that are workable for the top-end US clientele are just not feasible in Germany, and for that matter, in 99% of the rest of the world, like Russia, South America, and Africa.
User Interface technology bottlenecks
Even if data transmission bandwidth improves significantly, there are more bottlenecks: while the power of computer technology has on the whole improved by factors of a thousand in the last 30 years, one factor has remained about the same in that period: the CRT display area. Because of aspect ratio problems, a typical 17" monitor can display maybe two 80*25-character windows concurrently, and it is still not even able to display one full DIN A4 page as it appears on the printer. And with user input, the situation has even become worse in the last 30 years since, as was pointed out above, WIMP mouse access is about a factor of 10 slower than touch-typist keyboard input. Since software cannot be operated through the keyboard any more, even the most expert touch-typist power user is slowed down to the snail's pace of the mouse-clicking idiot.
Then there is the bottleneck of the basic human reading speed of about 50 char/sec, which will be the hardest to overcome, unless we find entirely new symbolization methods that make a radical departure from the previous 5000-year epoch dominated by the three R's: readin', 'ritin', 'rithmetic.
The disparity between computer processing power and user interaction power is rising rapidly, to the extent that these bottlenecks will increasingly pare down the performance of the whole system.
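Putting the figures of this section together gives a rough picture of the bottleneck ladder, using only the numbers already cited (the per-window character count is simply 80 * 25 = 2000):

$$ t_{\text{transmit}} = \frac{2000\ \text{chars}}{300\ \text{chars/s}} \approx 7\ \text{s}, \qquad t_{\text{read}} = \frac{2000\ \text{chars}}{50\ \text{chars/s}} = 40\ \text{s} $$

Filling a single 80*25 window thus costs seconds over a typical German university link and most of a minute of reader time, while the processor that produced it sits essentially idle: this is the rising disparity in quantitative form.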
WWW-Browser bottlenecks
Further material in: Goppold (1996a), (1996b), (1997a), (1998)
http://www.uni-ulm.de/uni/intgruppen/memosys/diskur04.htm
Typical browsers like the Netscape and Microsoft products impose the typical GUI bottlenecks listed above, and their parameterizing and scripting options are as rudimentary or non-existent as in the rest of the industry-standard software. For professional users, there are even more egregious "it's not a bug, it's a feature" design faults:
Compared to a text editing system like Microsoft WinWord, there is no keyboard cursor control facility, and therefore no visual display of the exact area to which a hypertext jump is being made. Also, it is not possible to position the cursor by keyboard on the next hyper-jump area. This can only be done by positioning the mouse pointer there and clicking.
While the scroll-up/down and page-up/down facility of the keyboard is very fast and easy to use, the mouse-activated scroll bars are extremely slow and cumbersome. This makes reading large WWW texts almost impossible, since one has to switch permanently between mouse and keyboard. This again has led to the completely nonsensical design rule that WWW pages should not contain more text than fits in one window, which completely cancels out the WWW for serious knowledge work. With WWW frames this situation becomes even worse.
There is no folding / outline facility as there is in Microsoft WinWord, even though the HTML headline format would allow this easily (see the sketch following this enumeration). Because of this, one needs special WWW pages which contain only the headlines to allow pre-selection of material. This adds clutter, inflexibility in design, unnecessary data transfers, and loss of overview. The result is the proverbial "lost-in-hyperspace" syndrome, or, as Robert Cailliau of CERN expresses it, a "World Wide Spaghetti bowl", and as Ted Nelson puts it, the "balkanisation" of the WWW.
Typical browsers make it extremely inconvenient to save and organize selected WWW information in private local repositories, which is obviously not in the interest of a supplier-driven information industry that would like to cater to a completely passive and naive consumer audience. Such features are therefore not likely to be found in any of the "all for free" offers of the big players.
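To substantiate the claim that the HTML headline format would allow folding easily, here is a minimal sketch (the claim is the author's, the code is illustrative and not a feature of any existing browser) that derives an outline from the h1-h6 tags of a page. A real folding facility would only have to hide and show the material between the headings.

```python
# Derive an outline from HTML headline tags (h1-h6), using only the
# Python standard library. Assumes well-formed, static HTML.
from html.parser import HTMLParser

class OutlineParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.level = 0        # heading level currently open, 0 = none
        self.outline = []     # list of [level, heading text] pairs

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1] in "123456":
            self.level = int(tag[1])
            self.outline.append([self.level, ""])

    def handle_endtag(self, tag):
        if len(tag) == 2 and tag[0] == "h" and tag[1] in "123456":
            self.level = 0

    def handle_data(self, data):
        if self.level and self.outline:
            self.outline[-1][1] += data

parser = OutlineParser()
parser.feed("<h1>Hypertext</h1><p>...</p><h2>Time factors</h2><p>...</p>")
for level, text in parser.outline:
    print("  " * (level - 1) + text.strip())
```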
The result is that the WWW and current browser technology obliterate almost all the potential that hypertext would offer for professional knowledge work. Of course, more professional hypertext systems are available, like Maurer (WWW), but their cost and infrastructure requirements impose other constraints.
Neuronal Resonance: the lost secret of the craft traditions
We come back to the "magic" mentioned in connection with Tognazzini's article. His article points to a kind of time efficiency that cannot be regained within a predominantly visually / spatially oriented framework once control of the time factor has been given up. But for the "Augmentation of Human Intellect" (Engelbart (www)), the time factor seems to be crucial. Unfortunately, the "magic effect" is also hard to verify and rationalize by academic standards. This may be a reason why Engelbart has spent a lifetime churning out ideas of which the largest portion still remains to be recognized, let alone put into wide usage. In the literature, there is a body of work around "Flow" (Csikszentmihalyi (1990), Karn (1997: 64)). Flow is a somewhat loose term for a hard-to-define intellect-augmentation effect that can occur when expert work is able to proceed in uninterrupted sequences of cumulative efficiency. Here the time factor is critical, since there is a connection to the human attention span and the capacity of short-term memory (Pöppel 1978-1995). (The best known of these phenomena is the 'flicker fusion effect' utilized in movie projection.) A maximum time lag of about 100 msec in user-machine interaction cycles seems imperative. Of course, these augmentation effects are attained mainly when a high level of user training and expertise is already present. Ten years of experience with the LPL system led to a conclusion that corroborates Csikszentmihalyi. In view of present neurological knowledge, this effect can be called neuronal resonance (Goppold 1999d). The action of the human neuronal system is strongly influenced by temporal structures, the neuronal resonances, and any interaction with a technical device has to take these resonances into account. In pre-industrial times, when all machinery was human-powered, the optimization of these interdependencies was the secret knowledge of all the craft traditions of humanity; only with the rise of the machine age has this fallen into oblivion. All craft tools, but especially the weapons, were masterpieces of setting up neuronal resonances (Bernard (1985), Breidbach (1993-1997), Brock (1994), Bücher (1924), Goppold (1999d), (1999e)). This was a "kind of knowledge" that the craft traditions maintained as tacit or latent knowledge, i.e. something that was transmitted non-verbally, through many long and arduous years of apprenticeship, without any formal instruction as to what it was that was being learned. Those people who did not get the "knack" were simply weeded out of the system. Since crafts were rather low on the scale of social prestige, and academic learning was of a rather bookish and sedentary (chair potato) type, this "kind of knowledge" was rarely acknowledged in the academic system. Thus, when the academic engineering sciences began to absorb the craft knowledge in the age of book printing, these factors vanished from view, and they are therefore nowadays almost lost in the academic pantheon of knowledge.
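The 100 msec figure is the one hard quantitative constraint in this section, and it is at least easy to test for. The following sketch is mine, not from the LPL sources, and the handler is a hypothetical stand-in for whatever work an interface does per keystroke; it shows the kind of budget check a resonance-aware interface would run on each interaction cycle.

```python
# Check one user-machine interaction cycle against the ~100 msec ceiling
# required for the neuronal resonance effects discussed above.
import time

BUDGET = 0.100  # seconds

def handle_keystroke(text: str) -> str:
    # Hypothetical per-keystroke work, e.g. re-rendering a hypertext node.
    return text.upper()

start = time.perf_counter()
handle_keystroke("sample node text")
elapsed = time.perf_counter() - start
verdict = "within" if elapsed <= BUDGET else "over"
print(f"cycle took {elapsed * 1000:.1f} ms ({verdict} the 100 ms budget)")
```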
The Hierarchy and Histio-logy of Hypertext
The potential of hypertext can only be utilized when advances are made in the neuronal resonance potential of the new technology to offset its disadvantages. Many of the problems listed above can be corrected with better user interface design and better provisions for power users. The basic limitations of the display technology are harder to overcome, since display technology faces more stringent industrial production constraints, which cannot be overcome by the same kind of silicon miniaturization that has proceeded continuously over the last 30 years. An example of these difficulties is the LCD screen production bottleneck. Therefore, better ways have to be found to amplify the temporal domain. It is, for example, possible to visually scan pages at a rate of about 2000 char/sec with the same techniques as speed reading, if the information design is such that the eye can select out important markers. But as everyone will understand, the present WIMP / browser bottlenecks do not allow such information system designs.
Hypertext of the WWW flavor is mainly a technology of histio-logy (or association, to use a better-known name). The aspect of hierarchy is less well served by this technology; there is a much better solution in the Microsoft WinWord outline editing facility. But again, this is only a matter of incorporating this aspect into a structured editing facility which can deliver HTML-WWW structured material, or another, more suitable format once a new standard is found.
Time and the Ecology of Pragmatic Knowledge
Temporal aspects are vital for any kind of pragmatic application of knowledge. Let us call Pragmatic Knowledge (abbrev. PK) (Handlungswissen) that kind of knowledge which is necessary in any given situation to fulfil a task. Dahlberg (1993: 214): "Information is Knowledge in Action". Since real-life tasks are always constrained by time limitations, PK is time-bound. Any PK fact not found in time (for a problem to be solved) might as well not exist in the universe of knowledge. PK has these aspects:
1) the kind and conditions of the task to be accomplished: task knowledge.
2) the means by which it is to be accomplished: instrument knowledge.
3) the possible, expectable, and unexpectable consequences of action: consequence knowledge.
Action is only possible in the present moment. It consists of the manipulation / transformation of material objects and / or mental constructs under application of PK and under consideration of possible consequences. Freedom of action depends on the relative availability of PK. The acquisition of PK itself incurs a cost factor, which must be balanced against the cost of failure of action due to insufficient PK.
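This balance can be given an illustrative decision-theoretic form; the symbols are introduced here only to make the trade-off explicit, they are not the author's notation. Acquiring an additional piece of PK is worthwhile when

$$ C_{\text{acquire}} + C_{\text{search}}(t) \;<\; p_{\text{fail}} \cdot C_{\text{fail}} $$

where C_acquire is the cost of acquiring the knowledge, C_search(t) its time-dependent retrieval cost (a fact not found in time has, in effect, infinite search cost), p_fail the probability that the action fails for lack of this PK, and C_fail the cost of that failure.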
Desiderata
Information technology must make full use of hitherto unused human faculties to overcome the fundamental 50-char/sec "sonic barrier" of human data processing. Present technologies are still backward-oriented, towards the 5000-year history of alphanumeric processing, and need to incorporate radically new knowledge processing designs and representations. The knowledge systems of humanity themselves must be radically re-designed to make use of such hitherto unimagined facilities. An example of such new facilities is given by Lennon (1994, 1995).
The LPL hypertext converter
In its present implementation, the LPL system provides a conversion facility from WinWord-type structured texts to HTML which has great freedom in automatically generating hypertext links. This offers a flexible and relatively low-cost alternative to full-fledged HTML database systems like Hyperwave, while at the same time offering a more suitable data maintenance scheme than HTML, which is, at best, a backwards standard that impedes further progress.
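The following is a minimal sketch of the conversion idea only, not the LPL converter itself; the leading-asterisk outline convention and all names in it are hypothetical. Outline-structured text is turned into HTML with automatically generated anchors and a hyperlinked table of contents, which is the kind of automatic link generation referred to above.

```python
# Convert outline-structured text (heading depth marked by leading '*')
# into HTML with auto-generated anchors and a linked table of contents.
import html

def convert(lines):
    toc, body = [], []
    for n, line in enumerate(lines):
        if line.startswith("*"):
            level = len(line) - len(line.lstrip("*"))
            title = html.escape(line.lstrip("* ").strip())
            anchor = f"sec{n}"
            toc.append(f'{"  " * (level - 1)}<a href="#{anchor}">{title}</a><br>')
            body.append(f'<h{level} id="{anchor}">{title}</h{level}>')
        else:
            body.append(f"<p>{html.escape(line)}</p>")
    return "\n".join(toc + body)

print(convert(["* Hypertext", "Some text.", "** Time factors", "More text."]))
```

Because every heading receives a stable anchor, any other document can link directly into the structure, which is the data-maintenance advantage claimed above.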
Bibliography
The bibliography for this paper is situated in the companion paper:
Goppold, A.: Balanced Phi-Trees: The Hierarchy and Histio-logy of Noo-logy, ISKO '99, Hamburg, 23.-25.9.1999 (1999b)
http://www.uni-ulm.de/uni/intgruppen/memosys/isko1.htm