The Magical Place of Literary Memory™: Xanadu

What I thought would be called Xanadu is called the World Wide Web and works differently, but has the same penetration (Nelson, 1999, interview).

It was a vision in a dream. A computer filing system which would store and deliver the great body of human literature, in all its historical versions and with all its messy interconnections, acknowledging authorship, ownership, quotation and linkage. Like the web, but much better: no links would ever be broken, no documents would ever be lost, copyright and ownership would be scrupulously preserved. This vision is actually older than the web, and aspects of it are older than personal computing: it belongs to hypertext pioneer Theodore Holm Nelson, who dubbed the project Xanadu in 1967.

The name comes from the famous poem by Samuel Taylor Coleridge, Kubla Khan. In his tale of the poem’s origin, Coleridge claimed to have woken from a laudanum-laced reverie with ‘two or three hundred’ lines of poetry in his head. He had noted down but a few lines when he was interrupted by a visitor, and when he returned to his work later he found that the memories had blurred irretrievably. His mythical landscape, this vision of Xanadu, had passed away ‘like the images on the surface of a stream into which a stone had been cast’ (Coleridge, cited in Nelson 1987, hereafter DM, p. 142). Like Nelson, Coleridge feared the despotism of the senses and the confusion of senseless memory; he feared memory loss. In an introduction to the poem cited by Nelson (DM), Coleridge writes of his muddled visions:

Then all the charm
Is broken – all that phantom world so fair,
Vanishes and a thousand circlets spread,
And each mis-shape the other
(Coleridge, cited in DM, p. 142).

The first story Nelson told Wired reporter Gary Wolf about Xanadu was also based on a vision of disturbed water. To Nelson, the ‘swirling currents under his grandfather’s boat represent the chaotic transformation of all relationships and the irrevocable decay associated with the flow of time’ (Wolf 1995, p. 12). Xanadu was meant to organise this chaos, to channel this temporal flow, at the same time preserving all the ‘true interconnections’ which held it together. But unlike Coleridge, Nelson believed (and still does believe) that the ‘intertwingled’ nature of human thought is its greatest asset.

Like the early hypertext pioneers Vannevar Bush and Douglas Engelbart [1], Nelson has a theory about the inheritance and transmission of human knowledge. The knowledge that we pass on to each other as human beings is itself a vast, intertwingled network (DM, p. 156), an accumulation of different disciplines and fields of knowledge, ‘a corpus of all fields’ (Nelson 1965, p. 145). Most importantly, this ‘corpus of all fields’ is constantly shifting and changing; like biological life, it is evolving. It is a ‘bundle of relationships subject to all kinds of twists, inversions, involutions and rearrangement: these changes are frequent but unpredictable’ (Nelson 1965, p. 145). If we wish to gain control of this giant corpus, if we wish to preserve human knowledge, then we need to understand how it works and preserve its structure. We need to maintain the interconnections, the original paths or trails through ideas. This is important because:

thoughts and minds themselves, of course, do not last …”Knowledge,” then – and indeed most of our civilization and what remains of those previous – is a vasty [sic] cross-tangle of ideas and evidential materials, not a pyramid of truth. So that preserving its structure, and improving its accessibility, is important to us all (DM, p. 157).

Xanadu was originally proposed as a vast digital network to house this great corpus of ideas and its interconnections, facilitated by a special linking system. The linking system would be based on ‘the fluidity of thought – not just its crystallised and static form, which, like water’s, is hard and cold and goes nowhere’ (Nelson 1992, 1/13). He wanted this system to stretch around the planet, embracing all our stray ideas, all our stray works of literature and scholarship, all the information that would otherwise be lost to us. Xanadu is a case study in Derrida’s will to totality. It would be a mini-universe of words which remember where they have been and where they might yet be: the ‘Pleasure Dome of the creative writer’ (DM, p. 141).

The story of Xanadu is the greatest image of potentiality in the evolution of hypertext [2]. Nelson invented a new vocabulary to describe his vision, much of which has become integrated into contemporary hypermedia theory and practice – for instance, the words ‘hypertext’ and ‘hypermedia’. As he put it in our interview, ‘I think I’ve put more words in the dictionary than Lewis Carroll. Every significant change in an idea means a new term’ (1999). Nelson came up with many significant changes, and consequently many new terms (some of which will be explained presently). He also recruited or inspired some of the most visionary programmers and developers in the history of computing, many of whom went on to develop the first hypertext products. His writings and presentations concerning the ‘digital repository scheme for worldwide electronic publishing’ (Nelson 1992, 3/2) have been plundered by theorists and practitioners the world over. It was Nelson’s vision which inspired me to write a Ph.D. on hypertext and the aporia of memory. Media opinion, however, is divided over Nelson: ‘[b]oon or boondoggle, nobody is quite sure’, as The Economist puts it (cited in Nelson 1992, preface).

I will be exploring the evolution of the Xanadu design and the ideas behind it in more depth presently. For now, I wish to emphasise this mythical dimension to Xanadu. As a concept, it has been under development for over forty years, and it is only in the last five years that Nelson has released a beta version resembling the vision. Like Bush’s Memex, Xanadu has become the stuff of legend within the hypertext community, largely due to the inspired writings of its creator and the lack of a real-world prototype. But unlike Bush’s Memex, there have been numerous attempts to create the design exactly as Nelson described it – none of which have realised this colossal vision. Like a spectre of the future, all we have of Xanadu is its erotic simulacrum, its ideals, its ideas – and some tantalising shells of code. To Nelson’s dismay, and unfairly, it has consequently been hailed as ‘the longest-running vaporware project in the history of computing’ (Wolf 1995, p. 1).

To complicate matters, the code behind Xanadu has been in a state of perpetual rewrite for decades. This technical evanescence, combined with its mythical status, makes it a very difficult subject to write about. I have chosen to divide this article into two main parts: the first is the evolution of the idea, which is quite straightforward; the second is an explanation of the Xanadu system itself, its technical design, the criticism of this design, and the attempts to build it. A simple explanation of the Xanadu system for non-experts is long overdue, as anyone who has tried to navigate the white papers will attest. This ‘technical’ section was written after interviewing Nelson in Japan over a two-day period, and the interesting part of this section is how far it departs from the first; it is not that Xanadu has failed as a vision (it has captured the imagination of a whole generation of developers, for a start), but that the vision has failed to realise itself qua technical artefact.

Yet Xanadu refuses to die (its logo is, appropriately enough, The Eternal Flaming X™). Paisley and Butler (cited in Smith 1991, p. 262) have noted that ‘[s]cientists and technologists are guided by “images of potentiality” – the untested theories, unanswered questions and unbuilt devices that they view as their agenda for five years, ten years, and longer’. Though Nelson is often accused of handwaving and lucid dreaming, Xanadu has nonetheless become an inherited vision.


Nelson at Keio University, Japan, 1999 (image: Belinda Barnet).

Ideas and their interconnections: the evolution of the idea

People ask me why I carry a stapler. The answer is to attach pieces of paper to each other. Such an archaic concept. Photographers carry cameras, gunfighters carry guns: I CONNECT THINGS! … the problem with paper is that every sentence wants to break out and slither in some other direction, but the confines of the page forbid us to. So that if the page could sprout wings, or sprout tunnels off to the side, the parenthesis, instead of having to stop after some point, could go on forever (Nelson 1999, interview).

Nelson wears a strap across his shoulder with pens, scissors, sticky notes and sticky tape attached to it. He has been wearing it since the mid-1960s. The belt is filled with tools to connect things with, ammunition against a world of paper. Like Bush, Nelson is painfully aware that ideas are easily lost in conventional indexing systems, that they are disconnected from each other, and that ‘serious writing or research’ demands connecting ideas together (Nelson 1999, interview). Frustrated by the lack of a global, real-world system that might do this for him, and ‘outraged’ by the confines of paper (Nelson 1998, pp. 1–2), he feels the need to do this manually.

Nelson (1965) first published the term ‘hypertext’ in his paper, ‘A File Structure for the Complex, the Changing and the Indeterminate’, where he describes a type of computer-supported writing system that would allow for branching, linking and responding text. This system would connect things automatically: no need for scissors and glue. It would be composed of either ‘written or pictorial material’ and its defining feature is that the information would be ‘interconnected in such a complex way that it could not be conveniently presented or represented on paper’ (Nelson 1965, p. 96). Also in the mid-sixties, Nelson coined the terms ‘hypermedia’ and ‘hyperfilm’ – terms which employed the same ideas behind hypertext and were meant to encompass image as well as text.

Contrary to modern use of the word ‘hypertext’ to refer to networked text (e.g. Delaney & Landow 1994, p. 7; Stefik 1997, p. 21):

Nelson always meant hypermedia when he said hypertext, it’s one of the things that people get wrong about Nelson. They think that they’ve invented hypermedia and he only invented hypertext. He meant ‘text’ in the sense of corpus, not text in the sense of characters. I know this for a fact because we’ve talked about it many times (van Dam 1999, interview).

As might be evident by now, Nelson tends towards the universal rather than the particular. Distinguishing between image, text and sound is pointless in a digital environment. This ‘new form of computer-supported writing’, Nelson wrote in 1965, will organise and represent ‘all our complex informational arrangements’ (Nelson 1965, p. 144). It will contain whatever we want it to contain, stem the loss of great ideas, and preserve all the interconnections. But most importantly, it will preserve all the different stages in the evolution of an idea. These stages are usually lost or discarded in codex filing systems.

The physical universe is not all that decays. So do abstractions and categories. Human ideas, science, scholarship and language are constantly collapsing and unfolding… I believe that such a system as the ELF [the early hypertext system he proposed in this paper] ties in better than anything previously used with the actual processes by which thought is progressively organised (Nelson 1965, p. 97).

Like Vannevar Bush before him, Nelson seeks a weapon against loss – and in particular, information loss. We are what we can remember – and we remember best when information is appropriately organised. Just as Bush argued in the early 1930s, Nelson believes that traditional methods of archiving, storage and retrieval are inadequate to deal with complex data. All paper-based methods impose connective restrictions which mask the true structure of ideas. The benefit of a global hypertext system would be ‘psychological, not technical’ (Nelson 1965, p. 145), and its creation is of the utmost importance. Like Bush and Engelbart, Nelson feels a great sense of urgency about this. We are ‘trapped’ in the wrong paradigm – codex culture – and this is not the way the mind works (Nelson 1992, p. 321). We need a new paradigm, and we need it soon. ‘We will have to stop reading from paper. We can no longer find what we need in an increasingly fragmented world of information’ (Nelson 1967, p. 23).

Nelson despaired of ever finding an indexing and writing system which could organise the associations his mind produced, until he discovered the computer.

I was continually trying different systems for organizing ideas. File cards… were clearly hopeless. I tried index tabbing, needlesort cards, making multiple carbons and cutting them up. None of these solved the basic problem: an idea needed to be in several places at once… but then, in graduate school, I took a computer course (Nelson 1992, 1/24).

In a 1998 paper, Nelson also remembers seeing ‘a picture in Datamation of a map-this was 1960-of a map on a computer screen. Holy smoke! This was going to replace the printed word’ (Nelson 1998, p. 2). Like Engelbart, he remembers the idea of screen-based computing as the manifest destiny of writing. But unlike Engelbart, Nelson was not an engineer; he was ‘just a computer fan, computer fanatic if you will’ (Nelson 1998, p. 303) who had been following this technology closely for many years.

In 1960, Nelson announced his term project: a writing system for the IBM 7090, the only computer at Harvard at the time, housed in a big, air-conditioned room at the Smithsonian Laboratory. In the 1960s, computers were viewed as number crunchers, ‘possessed only by huge organizations to be used for corporate tasks or intricate scientific calculations’ (Nelson 1965, p. 135). The idea that expensive processing time might be wasted on pictures and writing, of all things, was deemed crazy by the engineering community. As Professor Andries van Dam found six years later when he tried to create the first hypertext system on commercial equipment at Brown University, processing time on university mainframes was carefully meted out to physicists and engineers to ‘solve serious problems’, and the attitude was ‘if you want to write papers you can just damn well use a typewriter’ (van Dam 1999, interview).

Nelson ignored this. He proposed a machine-language program to store documents in the computer, change them on-screen with various editorial operations, and print them out. In the 1990s, he is quick to point out, we call this word processing (Nelson 1999, interview). Included in this design were facilities to compare previous or alternative versions of the same document, and extensive historical backtrack. Computers should be able to support the history of our ideas, and comparison between them. A writer should be able to cast her eye back over previous versions of the same document, discern the evolution of her ideas and recover sections if necessary. This is why computers were better than typewriters, better than paper – they could accommodate historical backtrack; they had an unlimited memory. The memory of paper is limited.

Interestingly, over the past eight years a deluge of hypermedia systems claiming to provide historical backtrack has appeared in university computer labs. These systems also claim that the problem with the computer world is that it doesn’t accommodate the dimension of time, because it is caught in the ‘paradigm drag’ of paper:

One more operation that isn’t possible in the world of paper but might be useful if you could get it is ‘time travel’-restoring some particular context from the past… [o]ur candidate for replacing the desktop is Lifestreams. [In Lifestreams], here is your computer environment circa 2010: every document you’ve ever created or received stretches before you in a time-ordered stream, reaching from right now back to the day you were born. You can sit down and watch new documents arrive: they’re plunked at the head of the stream (Gelernter, cited in Steinberg 1997).

Oddly, Gelernter contrasts this architecture to Xanadu, which he claims is just about ‘documents organized in relation to other documents by means of links’ (Gelernter, cited in Steinberg 1997). Contra Gelernter, Xanadu has always been about cycling through time, treating documents as evolving versions. Lifestreams (Yale University) and Linda (Scientific Computing Associates) are just two contemporary systems which embody this technique (both are now commercial applications). As I will explain over the next few pages, many of Nelson’s early ideas have recently been built – often without recourse or reference to Nelson’s work in the area. The dream of a perfect, universal archive and publishing system does not belong to Nelson. He has, however, managed to articulate it in a particularly infectious fashion, tailored to the digital era.

The second part of Nelson’s design took shape in the early 1960s, when there was ‘a lot of talk around Cambridge… about Computer-Assisted Instruction, for which there was a lot of money’ (Nelson 1992, 1/26). He designed what he called ‘the thousand theories program’, an explorable CAI program which would let you study different theories and subjects by taking different trajectories through a network of information. The basic idea was of many separate, modularised paragraphs, each with many branching choices: writing as a tree diagram, not a single line or sequence. Ideas, Nelson maintained, are inherently diagrammatic, not syllogistic. We think in terms of objects, and connections between them (Nelson 1999, interview). Which brings to mind Quintilian’s teachings on our ‘natural affinity’ for remembering places, objects and the connections between them (Yates 1997, p. 37). Nelson believed that computers should be able to support human thought in this way.

This led to a third design, which was drafted while Nelson was teaching sociology at Vassar College in 1965. This design combined the two original parts – historical backtrack and ‘non-sequential’ reading and writing. He thought about the architecture of the system, and decided to have sequences of information which could be linked together sideways. As with his first design, this would all occur on a computer screen, visually, in real-time. He called this system ‘Zippered Lists’. Zippered lists permitted linking between documents: like the teeth in a zipper, items in one sequence could become part of another. Versions of a document could be intercompared, an item could be an important heading in one sequence and a trivial point in another, and all items could be written or retrieved in a nonsequential fashion. Links could be made between large sections, small sections or single paragraphs. Links could be made between different versions of the same thing. Most importantly, however, chronological stages and sections in a document could be intercompared: writers could recall and trace the evolution of an idea.
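To make the idea concrete, a minimal sketch of a zippered list follows, in Python and with purely hypothetical names (this is my illustration of the principle, not Nelson’s specification): items are stored once, and sequences merely reference them, so the same item can sit in several documents or versions at once and be intercompared by reference rather than by copy.

```python
# A minimal sketch of 'zippered lists': items are stored once and
# sequences merely reference them, so one item can belong to several
# documents (or versions) at the same time. Names are illustrative only.

class Item:
    def __init__(self, text):
        self.text = text

class Sequence:
    """An ordered list of references to shared items."""
    def __init__(self, items=None):
        self.items = list(items or [])

    def link_sideways(self, other):
        """Return the items this sequence shares with another ('zipper teeth')."""
        return [item for item in self.items if item in other.items]

# One idea, present in two document versions at once
intro = Item("Ideas are inherently diagrammatic, not syllogistic.")
draft_v1 = Sequence([intro, Item("First supporting point.")])
draft_v2 = Sequence([intro, Item("Revised supporting point."), Item("New conclusion.")])

shared = draft_v1.link_sideways(draft_v2)
print(len(shared))           # 1 -- the intro appears in both versions
print(shared[0] is intro)    # True: not a copy, a reference
```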

Again, the important thing about this design was that it exploited the storage and display facilities of a computer: this kind of writing simply ‘could not be done on paper’ (Nelson 1965, p. 98). You did not need to retype the entire thing every time you changed your mind. Paper was so clumsy by comparison, so limited in its storage capacity (Nelson 1965, p. 98; Nelson seems to forget here that the design existed almost entirely on paper). This system was called the ELF, or Evolutionary List File.

Nelson’s next job was at a publishing house. It was here that he chose to rename the evolved design Xanadu, for its connotations in literary circles.

As the mysterious palace in Coleridge’s poem “Kubla Khan” – a great poem which he claimed to have mostly forgotten before he could write it down – Xanadu seemed the perfect name for a magical place of literary memory (Nelson 1992, 1/30).

Nelson responds angrily to suggestions that Xanadu is an impossible dream, but in selecting this name, he at least admits the irony (Rosenzweig). Kubla Khan is the Romantic era’s most famous unfinished poem; had Coleridge not been disturbed by an anonymous visitor, it could have been an epic masterpiece. Orson Welles (one of Nelson’s heroes) used the word Xanadu for Citizen Kane‘s extravagant, uncompleted mansion (Rosenzweig). Like the Xanadu system, the tantalising potential, or perhaps the loss, of this literary work permeates its memory.

In 1967, Nelson was told that Professor Andries van Dam was going to build a Hypertext Editing System (HES) at Brown University. The objective was to ‘explore this hypertext concept’ (van Dam 1988, p. 889). It was to run in 128k of memory on an IBM System/360, IBM’s new family of general-purpose computers. Nelson went up there at his own expense to consult in its development, but found the experience quite frustrating (Nelson 1999, interview). He argued for the elaboration of HES’s hypertext features, but believes that the team were writing the system for a paper-based world. As he puts it, HES emphasised ‘paper printout and formatting’ (Nelson 1992, 1/31) to the neglect of on-line reading and writing. Nelson wanted pure hypertext, which could not be printed. Paper, in fact, was the enemy. To Nelson’s dismay, the HES team built what was effectively a word processor with linking facilities, modelled on the page as a central metaphor. As Nelson sees it:

this very, very clearly led to today’s web… they left out transclusion, left out multiple windows… this trivialisation became people’s notion of hypertext (Nelson 1999, interview).

That HES engendered the web, or even inspired its design, is debatable. Tim Berners-Lee acknowledges having seen Dynatext, one of van Dam’s later electronic writing technologies, but says he did not transfer this technical design to HTML (Berners-Lee 1999, p. 27). Nelson, however, believes that HES demonstrated a technical paradigm to the world at a higher level: what hypertext looks like and what it can do. ‘People [saw] this, and they [thought] oh that’s what hypertext is’ (Nelson 1999, interview). Like Engelbart, Nelson believes the technical system moves in paradigms, and that the current era is bound to paper as a central metaphor. We need to be forced from our collective tricycles. ‘I deal with new paradigms’ (Nelson 1999, interview).

Regardless of the departure from Nelson’s vision, the HES project was a great success, and was effectively the first on-screen, visual text facility that beginners could use (Engelbart’s NLS was still under development at this stage). But the project led to a falling-out between van Dam and Nelson, who describes Brown University’s next project, the File Retrieval and Editing System (FRESS), as an attempt to ‘write Nelson out altogether’ (Nelson 1999, interview). Nelson is painfully aware of the departure contemporary hypertext systems have taken from his vision, and can at times be bitter about this. He sees it as ‘a paradigm issue, a political issue, a religious issue’ (Nelson 1999, interview).

HES did not fulfil Nelson’s vision. Not to be deterred, however, he looked around for work in the computer field. He desperately wanted to be a ‘computer person’, but came up against the dominant paradigm at the time: computers are calculating machines. Nelson is a self-described liberal arts type rather than an engineering type – a dichotomy he deplores as it kept him ‘away from computing for so long’ (Nelson 1987, p. 25; with characteristic wit, he calls the two sides the ‘technoids’ and the ‘fluffies’ respectively). As Engelbart discovered in the late 1960s, if you tried to talk about the structure of ideas, the engineering community dismissed it as ‘babble about the human side of things’.

By the early 1970s, having offended anyone in the academy who could help him (Rheingold 1985, p. 302), Nelson found a few like-minded friends and programmers and attempted to write the Xanadu software himself. One of the most important collaborators was Roger Gregory, then a science fiction fan working in a used-computer store in Ann Arbor, Michigan. Gregory had technical skills Nelson lacked: training in computer programming, and an ability to make machines work. Gregory began to write shells of Xanadu code with Nelson in the mid-1970s, as the design matured. And he is still working on it – he coordinates Udanax, Xanadu’s open source project (www.udanax.com).

The idea which had started it all (Nelson’s first design back in 1960) was a system to accommodate historical backtracking: to retrieve and re-use any prior version of the same document. The next step was to expand this capability to handle entire document versions and to show the user on-screen which parts of these versions were the same and which were different. Nelson called this ‘intercomparison’. But by itself, this is just an inflated word-processing program. The idea got hyperdimensional when Nelson began to incorporate zippered lists and links into the new design.

Nelson coined the term ‘hypertext’ in 1962, and first published it in 1965 (CyberArcheology Project; see also Bardini 2000, p. 39). ‘Hyper’ means ‘exceeding… over, beyond, above’ (Oxford Concise Dictionary). The link, as Nelson saw it, would instantly take the user from one place to another, and that place would contain the excess, the overflow, the past or future of the previous idea. Unlike writing on paper, the link would consequently allow for a natural sequence of ideas, like thought itself. Links as Nelson saw them were deeply tied to sequence: ideas are only meaningful in relation to where they have been and where they might yet be. Navigation is pointless unless we remember where we have been. Like the Aristotelian emphasis on ‘original’ sequence, Nelson believed we should preserve the natural structure of ideas.

Ted did not just say ‘branch, link, make arbitrary associations’. He tried very early on to impose some discipline on linking (van Dam 1988, p. 889).

The link is central to the concept of hypertext, and according to Nelson, it is this structure which makes writing in hypertext different to writing on paper. Engelbart gives equal credit to Nelson for discovering the link: they were both working on similar ideas at the same time, but Engelbart claims he had the facilities and funding to build a machine that explored those ideas (Engelbart 1999, interview). As an engineer, Engelbart was more concerned with constructing the tool system than thinking about the form such a system might take, more aware of how ideas will change qua technical artefact. This is why, Nelson admits, Engelbart had ‘more basis for conversation with the computer mainstream’ than he did (Nelson 1995, p. 33). The computing community has always wanted deliverables; theorising is something humanists do. At the Fall Joint Computer Conference in 1968, Engelbart actually showed the world what text on computers would look like, and how links could facilitate such a system. Due to this work, Engelbart is ‘finally getting the credit he deserves’ (van Dam 1999, interview), but discussion of Xanadu still positions it left of centre.

Nelson, in turn, has always been inspired by Engelbart’s work. He dedicated his book on Xanadu, Literary Machines, to this ‘visionary of The Augmentation of Human Intellect… and (what this book is largely about) THE TEXT LINK’ (Nelson 1992). In our interview, he singled Engelbart out as his main inspiration in the early years. As he sees it, they shared an original vision: to construct a digital network which utilised the computer not as a calculating machine, but as a machine to boost human thought. Or as Nelson puts it, a ‘thinkertoy’. Nelson transferred some features from NLS to the Xanadu design in the late 1960s – in particular, the concept of using a mouse to activate links.

Now, I visited Doug in the Spring of 1967 because we talked about my coming out to work for him. Two things on that trip. Well several, one was -I loved the guy of course, everybody does, and I saw the mouse. Up til then I thought it would be light pens, well obviously the mouse thing worked better (Nelson 1998, p. 2).

The mouse was not the only thing that obviously worked [3]. Some of the proposed features of Xanadu were embodied in NLS by 1968. The ability of the user to link, revise and window documents in real-time across the screen was a strong similarity, as was the idea that a system might be able to preserve and track changes. As Engelbart wrote in 1963 of the ‘hypothetical writing machine’ which was to become NLS:

[T]rial drafts can rapidly be composed from rearranged excerpts of old drafts, together with new words or passages which you insert by typing. Your first draft may represent a free outpouring of thoughts in any order, with the inspection of foregoing thoughts continuously stimulating new considerations and ideas to be entered. If the tangle of thoughts represented by the draft becomes too complex, you can compile a reordered draft quickly (Engelbart 1963, pp. 6–7).

Screen-based computing and historical backtrack: this paragraph reads like an excerpt from Nelson’s term project proposal in 1960. The concept of linking in NLS was also similar to Nelson’s vision, but this is an idea which Nelson claims was:

… by no means new. To go back only as far as 1945, Vannevar Bush, in his famous article “As We May Think” describes a system of this type … Bush stressed his file’s ability to store related materials in associative trails, lists or chains of documents stored together (Nelson 1965, p. 135).

By 1970, in both NLS and in the proposed Xanadu system, the user would point a mouse at links in the document and bring the appended reference to the screen. Information was stored and retrieved based on a networked structure (for Engelbart, this structure was a public, semantic web; for Nelson it would be more ‘unrestricted’ and personal). Nelson built this concept around the academic reference and incorporated the ‘mouse’ from NLS. As I explained earlier, Engelbart claims the idea of associative links and trails came from Memex. In both Xanadu and NLS, a return button would bring the user back to the point in the original text where the link symbol had appeared (in NLS, links could also be controlled by a keyboard command).

There is, however, a major difference between Xanadu and NLS: NLS was designed to boost work groups and make them smarter; it literally evolved around the technical activities of a group of engineers. This, perhaps, is why it emphasised keyboard commands, workflow and journaling. Xanadu was intended, like Bush’s Memex, to be a very personalised machine: more precisely, to empower the individual. Xanadu will ‘free the individual from group obtuseness and impediment’ so they can ‘follow their interests or current line of thought in a way heretofore considered impossible’ (Nelson 1992, 0/3). To Nelson, links were not just part of an augmentation toolbox. They were the essence of a revolution – an ideological revolution. Literature need no longer be linear. We don’t have to read the same books, in the same order. We don’t have to learn at someone else’s pace, and in someone else’s direction. Hypertext furnishes the individual with choices; ‘YOU GET THE PART YOU WANT WHEN YOU ASK FOR IT’ (Nelson 1992, p. 2/16).

This aspect of Nelson’s vision cannot be ignored. Xanadu was (and still is) personal in the most libertarian, 1960s Californian sense. Links, Nelson maintains, furnish the individual with choices, with the right to choose. Nelson engaged in a rhetoric of liberation about hypertext well before George Landow and Jay David Bolter had discovered the text link; he pioneered technological utopianism in the digital era. Unfortunately, these excited presentations, filled with individualistic, egalitarian ‘hokum’ (van Dam 1999, interview) or, more kindly, ‘passionate rhetoric’ (Snyder 1996, p. 27), have not helped the engineering world to take his design seriously.

This writing system, like the computer itself, is ‘FOR PERSONAL FREEDOM, AND AGAINST RESTRICTION AND COERCION’ (Nelson 1987, p. 2). As Nelson sees it, everybody should be able to create what they want and put it on the system, from bad ‘zines and pamphlets to Great Novels, and everybody should be able to quote or cite another document. When someone explores this information, their trails, their personal experiences of it should be preserved in all their uniqueness. Hypertext is more than just a technological shift, claims Nelson: it is an ideological overhaul of the way we read, write and learn literature. The way we think. COMPUTER POWER TO THE PEOPLE! Nelson has a habit of writing in capitals when he talks about computers and personal freedom.

A system with links and historical backtrack needs only an economic basis to become a publishing system. Nelson believes that hypertext should not only be an ‘archive for the world’ but a ‘universal instantaneous publishing system’ (Nelson 1992, 2/4). Every document must have an author who owns it and is the only one who may change it. Materials should be re-usable by anyone, with credit and payment going to the original author. In other words, the concept of authorship, ownership and quotation as we know it can and should be scrupulously preserved – and soon. We are witnessing the ‘slow death’ of literacy in contemporary multimedia; the institutions which protect it and ensure its integrity are dying (Nelson 1999, interview). Xanadu is ‘an intensive pursuit of “lost” or “forgotten” values’ in this regard (Moulthrop 1991, p. 695), an epic of recovery. And although it has failed to ‘recover’ copyright or ownership from the clutches of contemporary multimedia, Nelson’s early writings certainly forecast a problem here. In ‘the digital world … [c]opyright law is totally out of date. It will probably have to break down completely before it is corrected’ (Negroponte 1995, p. 58).

As a forecaster in the field of computing, Nelson has been remarkably successful – at forecasting (Rheingold 1985, p. 299). In his 1965 paper, at a time when computers were viewed as esoteric calculating machines attended by men in white cloaks, Nelson claimed that computers would eventually ‘do the dirty work of personal file and text handling’ (Nelson 1965, p. 85). In 1967, inspired by NLS, he predicted a networked structure of information that would ‘be read from an illuminated screen; the cathode-ray display; it will respond or branch upon actions by the user. It will be a succession of displays that come and go according to his actions’ (Nelson 1967, p. 195). Needless to say, interactive multimedia has happened. But it is not quite the ‘world-spanning’, permanent archive Nelson intended (Wardrip-Fruin 2003, p. 441).

One aspect of the design which the web has vindicated, however, is Nelson’s vision for the commercial provision of network access. In 1980, Nelson pitched the idea of ‘Xanadu Information Franchises’ in his book, Dream Machines. These would be local computer stations where data shoppers could access material from a global information system, distributed over the telephone. They would be everywhere: the McDonalds stands of cyberspace. He even included a sketch of the interior, complete with a snack bar and jingles. It was (perhaps…) a silly idea, but it certainly predicted the domestic penetration of ISPs and data portals in what would eventually become the web. Sceptical readers, Stuart Moulthrop writes, might see in this vision of Xanadu yet another domain of the postmodern theme park. ‘Gentle readers, welcome to Literacyland!’ (Moulthrop 1991, p. 695). Xanadu has always been part technical design, part utopian business plan.

Nelson’s business ventures, however, have yet to meet with success. As one might assume from the previous example of Xanchises™, this could be ‘because Ted packages ideas in so much… P.T. Barnum salesmanship that people distrust it’ (van Dam 1999, interview). It could also be because Xanadu has always been more urban legend than demonstrable system. There is a difference between vision and technical artefact (or successful business).

Although Nelson seems to figure in every major history of computing and multimedia (e.g. Segaller 1998; Montfort & Wardrip-Fruin 2003; Packer & Jordan 2001; Ceruzzi 1998; Rheingold 1992; Abbate, Shurkin & Bardini 2000) as the man who dreamed up hypertext, he is not a known quantity outside of the digerati. The people he appears beside in such histories – Doug Engelbart, Bob Metcalfe, Tim Berners-Lee – have attained worldwide recognition (and, in the case of Metcalfe, wealth) and have directly influenced the course of computing. Nelson’s influence is more indirect; all we have of Xanadu is its erotic simulacrum, its ideals, its great potential.

You know, [Nelson] gets his name in the newspapers, but there isn’t a company that has him as a serious consultant. He’s not going to influence IBM, he’s not going to influence Compaq or Intel or any of the people who really could make a difference. [Although] I think his ideas are really being absorbed by the world over time (van Dam 1999, interview).

Visions take longer to influence the engineering world than prototypes. Nelson’s ideas have been absorbed over time, but ‘not due to his direct advocacy’ (van Dam 1999, interview). In the next section, I will be looking more closely at his major project, Xanadu, and at the design of this system – a task more difficult than it seems. Until quite recently, the design itself was shrouded in secrecy (Wardrip-Fruin 2003, p. 441), a fact which certainly did not help its promotion to an engineering culture which emphasises code, software, deliverables, ‘things that are concrete’ (van Dam 1999, interview). Nelson is also reluctant to list the technical aspects of Xanadu, as it detracts from explaining the paradigm shift that Xanadu represents (‘You see, as soon as you start listing things, you’re out of understanding the paradigm and into understanding features’, Nelson 1999). It is always difficult to extricate technical design from Nelson’s philosophy: the two are mutually constitutive.

The Xanadu System

My principal long-term concern is the exact representation of human thought, especially that thought put into words and writing… to maintain a continuing evolutionary base of material and to track the changes in it (Nelson 1997).

Nelson is proposing an entirely new ‘computer religion’. This religion attempts to model an information system on the structure of thought and the creative process behind writing, ‘if we can figure out what that is’ (Nelson 1992, 2/5). One thing that we do know, however, is that the nature of thought is change. Consequently, maintains Nelson, a system which is true to thought should be able to retrieve and track changes regardless of the current state of the document. It would accommodate change. Historical backtrack and intercomparison imply control over time itself, the ability to transcend change. These two concepts were integral to Nelson’s first writing project back in the 1960s. With the addition of the third principle – linking – in the late 1960s, the Xanadu design began to take shape. I will now elaborate on the technical design of Xanadu as a proposed universal publishing system and archive.

In our interview, Nelson hit upon the sentence he had spent years looking for to explain the design in a nutshell.

Xanadu is a system for registered and owned content with thin document shells, re-usable by reference, connectable and intercomparable sideways over a vast address space (Nelson 1999, interview).

I will unpack this in three parts. Firstly, the principle that documents should be ‘re-usable by reference’: ideas evolve bit by bit, and these bits should not be stored redundantly. For Nelson, the computer offers the opportunity to track these pieces in memory, to track their use or citation. Secondly, the idea that these bits might be identified not by where they are, but by what they are, ‘over a vast address space’ navigable by links. These concepts ground Nelson’s more recent idea of transclusion, which is at the heart of Xanadu’s most innovative commercial feature: transcopyright, a ‘system for registered and owned content’. I will then discuss the relationship between these ideas and the web, and, perhaps more deeply, why the web has ‘ignored’ them.

In most computer applications, it is necessary to make and keep several copies of the same document. This is because we wish to assure the safety of recent work, or because we wish to keep track of former states of the same work. It is also because we wish to have the same work in different places – hence mirror sites on the web. Nelson believes that these purposes have been mistakenly fused into the single practice of making multiple copies. As he puts it, if you are repeatedly making small changes and then storing the unchanged material, there is redundancy (Nelson 1992, 2/13).

He proposes a storage system which takes care of backtrack automatically, filing just the changes, chronologically. The part you want is retrieved and integrated on the fly, and the ‘document’ becomes a series of alternative versions. This means the user can scroll through time as well as space in her work, tracing the evolution of ideas and comparing versions. In essence, the Xanadu system holds modular pieces of a document, not whole documents in multiple places. These parts are re-usable in the same document, but most importantly, they are re-usable by reference outside of the document (a concept Nelson calls ‘transclusion’). Xanadu is about controlling and tracking the location of these pieces, in the same document or in other documents. Liberatory rhetoric aside, this fact cannot be ignored; the system architecture of Xanadu is about the ubiquitous control and tracking of information.
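A minimal sketch of this storage idea, in Python and with invented names (an illustration of the principle, not Nelson’s actual design), might look like this: content is written once into an append-only store, and each version of a document is simply an ordered list of references into that store, so keeping every past version costs almost nothing.

```python
# A sketch of 'filing just the changes, chronologically': content pieces
# are written once, and a document version is an ordered list of
# references to those pieces. Names and structure are illustrative only.

store = []                      # append-only pool of content pieces

def write(text):
    store.append(text)
    return len(store) - 1       # reference ('address') of the new piece

def render(version):
    """Assemble a version on the fly from the pieces it refers to."""
    return "".join(store[ref] for ref in version)

v1 = [write("Xanadu stores "), write("whole documents.")]
v2 = [v1[0], write("only the changes, chronologically.")]   # re-uses the first piece

print(render(v1))           # Xanadu stores whole documents.
print(render(v2))           # Xanadu stores only the changes, chronologically.
print(set(v1) & set(v2))    # {0}: the shared piece is never stored twice
```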

Importantly, document versions are navigable via a parallel – not an embedded – linking structure. The Xanadu system must allow the user to create any type of link between anything they want to link. There are many different breeds of link, for instance one-to-many or calculated links, which are largely unsupported on systems like the web. Links in Xanadu must be bivisible and bifollowable (capable of being seen and followed from the destination document as well as the originating document). By contrast, hyperlinks on the web are univisible and unidirectional. This is because, if you wish to create a bidirectional link with embedded markup, you must insert anchor tags in the destination document as well as the referring document – and if that destination document belongs to someone else, this is impracticable. Also due to the embedded nature of markup, linking to someone else’s URL dumps you at the top of the page – not at the particular quote or image you were referring to (unless the destination document has included ‘anchor’ tags for you). A linking structure should be separated from the document structure, maintains Nelson.
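The contrast can be sketched in a few lines of Python (again an illustration with hypothetical names, not the Xanadu code): if links are records held outside the documents they connect, naming both endpoints, then they can be discovered and followed from either end without editing either document.

```python
# A sketch of links held *outside* the documents they connect. Because a
# link is a separate record naming both endpoints, it is visible and
# followable from either side, and neither document needs anchor tags.

links = []   # the parallel link structure

def make_link(src_doc, src_span, dst_doc, dst_span):
    links.append({"from": (src_doc, src_span), "to": (dst_doc, dst_span)})

def links_touching(doc):
    """All links visible from this document, whichever end it is on."""
    return [l for l in links if l["from"][0] == doc or l["to"][0] == doc]

make_link("my-essay", (120, 180), "kubla-khan", (0, 54))

print(links_touching("my-essay"))    # visible from the citing document
print(links_touching("kubla-khan"))  # equally visible from the cited one
```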

On the web, users also face the problem of ‘updating’ links when the destination document is moved or changed. This is because web documents are currently identified by where they are, not what they are. If the URL of a document changes, all referring links become dead ends because the URL points to a single position on a particular computer (the server), not the information itself. In the proposed Xanadu system, links of any type would attach themselves not to a positional, geographical address but to specific characters – Nelson calls this a ‘span’: modular pieces of a document. Hence, because objects are identified not by location but by what they are across a vast address space, links would update automatically as their position changed. No dead ends. No outdated links or 404s. Every document would be uniquely identified character for character, wherever it is. This would be the perfect archive: its trails would never fade.
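A rough sketch of this addressing principle, with assumed names throughout, might look like the following: each piece of content receives a permanent identifier when it is first written, and a link holds that identifier rather than a location, so the text can move or be re-hosted without breaking anything.

```python
# A sketch of addressing content by what it is rather than where it is.
# Links hold permanent identifiers, not URLs, so moving an instance of
# the text cannot break them. Names here are purely illustrative.

import itertools

_next_id = itertools.count()
content = {}        # permanent id -> text (the 'vast address space')
locations = {}      # permanent id -> where an instance currently lives

def publish(text, host):
    pid = next(_next_id)
    content[pid] = text
    locations[pid] = host
    return pid

def follow(link_target):
    """Resolve a link by identity, then look up wherever it lives now."""
    return content[link_target], locations[link_target]

quote = publish("A thousand circlets spread", host="server-a.example")
link_target = quote                      # the link stores the identity, not a URL

locations[quote] = "server-b.example"    # the document moves...
print(follow(link_target))               # ...and the link still resolves
```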

This is not to say that only one copy of a document may exist on the Xanadu server, but that there would be numerous ‘instances’ (Nelson 1995) throughout the network and on the user’s machine. These disparate bits or instances would be collected into a single virtual object – and the way to do this, Nelson maintains, is to identify bits not by where they are, but by what they are.

Which brings us to a common criticism of Xanadu. Nelson’s project is often described as infeasible by academics and journalists alike, sometimes for social and political reasons (for example Wolf 1995; Bolter 1991, p. 103). However, the most widely-read piece on Nelson, Gary Wolf’s feature article in Wired, condemned the project outright as technically infeasible. ‘[T]he colleagues were trying to build a universal library on machines that could barely manage to edit and search a book’s worth of text… they always had too much data to move in and out of memory’, he says, referring to the 128k machines that Nelson’s programmers were working with in the early 1970s. This comment, and the tone of the article, inspired Nelson to write a bitter essay in response. Wolf’s criticism was wrong.

The idea behind Xanadu is to point at the data and recreate it as a virtual object, rather than moving the entire thing in and out of memory – or, for that matter, storing the ‘corpus of all fields’ on your hard drive. Why store thousands of copies of a document all over the world, when the part you want can be securely identified and retrieved with a link? (Nelson 1995, p. 2). Wolf’s criticism missed the essence of Xanadu: everything is finely controlled and tracked in pieces, and these pieces can exist anywhere, in any number of places. It is the design for a distributed archive.

Integral to this idea of pointing at bits of a document rather than storing multiple copies of it in memory is the concept of transclusion. Transclusion is a term introduced by Nelson to describe the process of including something by reference rather than by copying. Whenever an author wishes to quote, he or she uses transclusion to ‘virtually include’ the passage in his or her own document. As Nelson is fond of saying, all this means is making and maintaining connections between things that are the same (Nelson 1995), or ‘deep connectivity’ as the Udanax community terms it. Remote instances remain part of the same virtual object, wherever they are. This concept underpins Nelson’s most famous commercial feature: transcopyright.

The on-line copyright problem may be resolvable by a simple, sweeping permission method. This proposed system, which anyone may use, allows broad re-use of materials in exchange for automatic tracking of ownership. Payment goes to the original publisher and credit to the original author (Nelson 1995).

In other words, transcopyright reframes the question, ‘how do we prevent infringement of copyright?’ as ‘how can we allow re-use?’. As with all copyright, permission is needed to republish. Copyright holders choosing to publish on a Xanadu system supporting bivisible and bifollowable links automatically grant permission for others to transclude their material, provided it is purchased by the recipient. Necessarily, a mechanism must be put in place to permit the system to charge for instances, a micropayment system which provides a bridge to the original from each instance. Critically, this bridge should never break; links should not be outdated. At the same time, the bridge must leave no trace of who bought the pieces, as this would make reading political. As such, Xanadu would require a micropayment system parallel to the docuverse. This is Nelson’s economic infrastructure.
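The following sketch, in Python with invented names and figures, illustrates the flow described above rather than any published Xanadu specification: transclusion stores a pointer rather than a copy, and delivering the pointed-to bytes to a reader triggers a micropayment to the original publisher without recording who the reader was.

```python
# A sketch of the transcopyright idea: transcluding a span is always
# permitted, but delivering its bytes triggers a micropayment to the
# original publisher. No record of the reader is kept. Illustrative only.

ledger = {}          # publisher -> accumulated micropayments

documents = {
    "kubla-khan": {"publisher": "original-press",
                   "text": "In Xanadu did Kubla Khan...",
                   "price": 0.002},
}

def transclude(doc_id, start, end):
    """Quoting by reference: store a pointer, never a copy."""
    return {"doc": doc_id, "span": (start, end)}

def deliver(transclusion):
    """Resolve the pointer for a reader, paying the original publisher."""
    doc = documents[transclusion["doc"]]
    ledger[doc["publisher"]] = ledger.get(doc["publisher"], 0) + doc["price"]
    start, end = transclusion["span"]
    return doc["text"][start:end]        # nothing about the reader is recorded

quotation = transclude("kubla-khan", 0, 13)
print(deliver(quotation))                # 'In Xanadu did'
print(ledger)                            # {'original-press': 0.002}
```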

The web has actually evolved to incorporate numerous micropayment systems – but they neither work according to Nelson’s system of ‘transclusion’ nor acknowledge Xanadu in any way. The W3C even has a markup system for Web Micropayment on a pay-per-view basis (see W3C Working Draft 15 March 1999). This is based on HTML, however: it is not ‘parallel’ to the docuverse. Links can still be outdated, bridges broken, names and numbers lost.

Transcopyright and bifollowable or bivisible linking standards have not been incorporated into the web either. As Nelson puts it:

…the web is a universal, world-wide, anarchic publishing system. It completely vindicated my 35 years of saying that a universal, worldwide, anarchic publishing system would be of enormous human benefit. There is no question of whether it is of benefit. It just does it all wrong, that’s all. (Nelson 1999, interview).

Nelson’s concept of hypertext influenced Tim Berners-Lee, who appropriated Nelson’s term to describe his language (HTML, or Hypertext Markup Language, which describes how graphics and text should be displayed when they are delivered across the web). But as Nelson (1999, interview) puts it, ‘HTML is like one-tenth of what I could do. It is a parody. I like and respect Tim Berners-Lee [but] he fulfilled his objective. He didn’t fulfil mine.’ Although I won’t have the space here to go into the evolution of HTML [4], it should be noted that Berners-Lee shared Nelson’s ‘deep connectionist’ philosophy, and his desire to organise information associatively.

A computer typically keeps things in rigid hierarchies … whereas the human mind has the ability to link random bits of data. When I smell coffee, strong and stale, I find myself in a small room over a corner coffeehouse in Oxford. My brain makes a link, and instantly transports me there (Berners-Lee 1999, p. 3).

From milk to white, from white to air, from air to damp: this is association as Aristotle formulated it two thousand years ago (De memoria et reminiscentia, pp. 8–16, cited in Yates 1997). Berners-Lee also believed that

… a piece of information is really defined only by what it is connected to, and how it’s related. There really is little else to meaning. The structure is everything … all that we know, all that we are, comes from the way our neurons are connected … my desire [is] to represent the connective aspect of information (Berners-Lee 1999, p. 13).

However, while Berners-Lee was met with scepticism and passivity, Nelson – with his energetic and eccentric presentations and business plans – received entirely disparaging responses (Segaller 1998, p. 288).

But Nelson slogs on, ideals held high above the mud. And he is not alone: the Xanadu vision lives on in the minds (and evolving code shells) of hundreds of computer professionals around the world. The extended Xanadu community is even wider, and has spawned several organisations devoted to the pursuit of a ‘solution’ to the ‘problem’ of contemporary information storage and publishing based on Nelson’s vision. In 1999, Udanax (run by Roger Gregory) released pieces of the elusive source code for Xanadu, Udanax Green™ and Udanax Gold™ (udanax.com). Xanadu Australia, another offshoot, is run by Andrew Pam, Nelson’s ‘most promising disciple’ (Mitchell). The vision refuses to die.

The Xanadu Australia formal problem definition is: We need a way for people to store information not as individual “files” but as a connected literature. It must be possible to create, access and manipulate this literature of richly formatted and connected information cheaply, reliably and securely from anywhere in the world. Documents must remain accessible indefinitely, safe from any kind of loss, damage, modification, censorship or removal except by the owner (Xanadu Australia Home page text, November 2003).

Xanadu is an ‘epic of recovery’ (Moulthrop 1991, p. 695) for the digital era, and it has entered into the imagination of a generation of developers. Unlike Bush’s Memex, people keep trying to build the thing as it was first designed. This fact alone is evidence of its impact: technical white papers are not known for their shelf-life, but Xanadu’s have thrived for over 40 years.

Acknowledgements

Noah Wardrip-Fruin from Brown University provided useful comments, suggestions and criticisms of this article. I would also like to thank Andries van Dam and Ted Nelson for their time.

Bibliography

Barnet, Belinda 1998. ‘Reconfiguring Hypermedia as a Machine: Capitalism, periodic tables and a mad optometrist’ Frame: The Journal of New Media Art, issue 2. http://trace.ntu.ac.uk/frame2/articles/barnet.htm
Bardini, Thierry 2000, Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing, Stanford University Press.
Berners-Lee, Tim 1999, Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web by its Inventor, HarperCollins, New York.
Bush, Vannevar 1933, ‘The Inscrutable “Thirties”’, reprinted in Nyce & Kahn (eds) 1991.
Bush, Vannevar 1945, ‘As We May Think’, The Atlantic Monthly, Vol. 176, No. 1, pp. 641–649.
Bush, Vannevar 1970, Pieces of the Action, William Morrow, New York.
Delaney, Paul & Landow, George P. (eds) 1994, Hypermedia and Literary Studies, MIT Press, Cambridge MA.
Engelbart, Douglas 1963, ‘A Conceptual Framework for the Augmentation of Man’s Intellect’, in P.W. Howerton & D.C. Weeks (eds.), Vistas in Information Handling, Spartan, Washington DC, 1–29.
Engelbart, Douglas 1988, ‘The Augmented Knowledge Workshop’, in Adele Goldberg (ed.), A History of Personal Workstations, ACM Press/Addison-Wesley, New York, pp. 185–249.
Engelbart, Douglas 1999. Interview with the author.
Moulthrop, Stuart 1991, ‘You Say You Want a Revolution? Hypertext and the Laws of Media’, reprinted in Wardrip-Fruin & Montfort (eds) 2003, 692–704.
Negroponte, Nicholas 1995, Being Digital, Hodder & Stoughton, Rydalmere NSW.
Nelson, Ted 1965, ‘A File Structure for the Complex, the Changing and the Indeterminate’, in Proceedings of the ACM 20th National Conference, ACM Press, New York.
Nelson, Ted 1980, ‘Replacing the Printed Word: a Complete Literary System’, in S.H. Lavington (ed.), Information Processing 80, North-Holland Publishing Company, IFIP.
Nelson, Ted 1987, Computer Lib/Dream Machines, Microsoft Press, Redmond WA.
Nelson, Ted 1991, ‘As We Will Think’, in Nyce & Kahn (eds) 1991, 245–260.
Nelson, Ted 1992, Literary Machines, Mindful Press, Sausalito CA.
Nelson, Ted 1995, ‘Transcopyright: Pre-permission for Virtual Publishing’, http://xanadu.com.au/ted/transcopyright/transcopy.html
Nelson, Ted 1997, ‘Embedded Markup Considered Harmful’, XML.com: XML from the Inside Out, http://www.xml.com/pub/a/w3j/s3.nelson.html
Nelson, Ted 1998, ‘The Unfinished Revolution and Xanadu’, paper presented at Engelbart’s Unfinished Revolution, Stanford University, 9 December, http://www.cs.brown.edu/memex/ACM_HypertextTestbed/papers/64.html
Nelson, Ted 1999, Interview with the author.
Nyce, James & Kahn, Paul (eds) 1991, From Memex to Hypertext: Vannevar Bush and the Mind’s Machine, Academic Press, London.
Segaller, Stephen 1998, Nerds: A Brief History of the Internet, TV Books, New York.
Smith, Linda C. 1991, ‘Memex as an Image of Potentiality Revisited’, in Nyce & Kahn (eds) 1991, 261–286.
Snyder, Ilana 1996, Hypertext: The Electronic Labyrinth, Melbourne University Press.
Steinberg, Steve S. 1997, ‘Lifestreams’, Wired, Vol. 5, No. 2, February, http://www.wired.com/wired/archive/5.02/fflifestreams_pr.html
van Dam, Andries 1988, ‘Hypertext ’87 Keynote Address’, Communications of the ACM, Vol. 31, No. 7, July, pp. 887–895.
van Dam, Andries 1999, Interview with the author.
Wardrip-Fruin, Noah & Montfort, Nick (eds) 2003, The New Media Reader, The MIT Press, Cambridge MA.
Wolf, Gary 1995, ‘The Curse of Xanadu’, Wired, Vol. 3, No. 6, June, http://www.wired.com/wired/archive/3.06/xanadu.html

Endnotes

[1] Vannevar Bush was an American engineer most famous for his work on the design of ‘Memex’, an analogue machine composed of gears and levers that used microfilm technology to store articles in associative ‘trails’. Although the machine was never built (it existed entirely on paper), it is often seen as a precursor to modern hypertext systems. The design was first published in 1945 in The Atlantic Monthly, but Bush’s autobiography, Pieces of the Action, and also his essay ‘Memex Revisited’ tell us that he started work on the design in the early thirties (1967, 197; 1970, 130).
Douglas Engelbart, who was inspired by Bush’s work on the Memex, created the world’s first working hypertext system in the late 1960s at the Stanford Research Institute – the oN-Line System (NLS). This was also the first project to implement video conferencing and the Windows-Interactive Menus-Pointing Device (WIMP) interface. As part of this project he invented the computer mouse, another innovation he is famous for.
[2] It may also be argued that Bush’s Memex was the greatest image of potentiality in information science. Linda C. Smith undertook a comprehensive citation context analysis of literary, technical and scientific articles produced after the 1945 publication of Bush’s article in The Atlantic Monthly, ‘As We May Think’, a work which urges scientists to turn to the task of making more accessible the growing store of knowledge and proposes a prototypical machine for organising and managing this: Memex. She found that the great body of American authors writing on Bush from a historical perspective over the last 45 years maintain that his article was the starting point of modern information science (Smith 1991, p. 265).
[3] Engelbart invented the mouse after a series of experiments at SRI. These focussed on the way people select and connect objects together across a computer screen. A section of the lab was given over to “screen selection device” experiments, and different technologies were tested for speed and flexibility (1988, 195). Light pens, which were based on the Memex stylus, knee or head-mounted selection devices, joysticks and the mouse were all trialled. The mouse won, and Nelson picked the idea up from the NLS prototype.
[4] It should also be noted that the evolution of HTML itself was neither planned nor controlled by Berners-Lee. HTML was created to organise the text documents of a single organisation – CERN. It is now the lingua franca of a global hypertext system, the web, and its uses have proliferated beyond ‘what any of us truly envisioned you could have’ (van Dam 1999, interview with the author). It has adapted and evolved, and it has both incorporated and engendered new functions and new material technologies in the process. Private companies like Netscape had as much to do with this success as Berners-Lee, and it might be argued that the private sector has more influence over the way hypertext is currently implemented on the web than the W3C. For more information on the history of HTML and the web, see Berners-Lee’s own account (1999).

About the Author

Belinda Barnet


Belinda Barnet is Lecturer in Media and Communications at Swinburne University of Technology. Her work has appeared in journals including Continuum, Convergence, Fibreculture Journal, The American Book Review, Media/Culture (M/C) and CTHEORY.