Performances Of Writing In The Age Of Digital Transliteration

Abstract (in English): 



This paper addresses and attempts a reconfiguration of the theory and practice of writing (and/or language art) in networked and programmable media (hereafter, in this abstract, abbreviated as "npm").

A number of problems provoke the paper's arguments, problems and concerns which arise from the current practice of language art in npm:

- the problematic (non-)engagement of language artists (poets) with text-making in npm

- the (non-)engagement of visual, cinematic, audio-visual artists (especially those who are currently working in npm) with practicing language artists

- the identification of those characteristics of textuality which are proper to npm (this involves a critique of hypertext, when hypertext is seen as a definitive or determinative genre in npm) and the relationship of these characteristics to demonstrable language art practice

- the (mis-)assimilation by established literary culture of rhetorical technologies which are emergent in npm.

Thus, while the paper's arguments are theoretical or expository, there is also an underlying agenda, which might be expressed in a more polemical fashion:

- programmers must be readers:

Programmers per se, and artists working in programmable media, should pay more attention to language art, especially the (historical) textual practice of poets and literary experimentalists.

- writers must be programmers:

Writers should reconfigure their practice in order to engage, explicitly, the reconfiguration of delivery media.


My use of certain terms may need a little preliminary explanation:


Networked and Programmable Media (npm): This term is used in order to avoid problematic or overly specific alternatives, 'new media,' 'cyberspace,' 'the Net,' 'the Web,' etc., and also because it points to characteristics of the media in question which distinguish them, viz.:

'Networked' highlights the many-to-many access, and downstream-upstream transactionality of media clustered around the Net, which has instantiated text-making which all but denies closure, and enables multi-user, collaborative text-making for social and artistic purposes.

'Programmable' highlights the intrinsic programmability of this complex of (delivery and composition) media, enabling, for example, time-based performance of literary work.

'Cybertext': 'cybertext' is used as a more inclusive 'category of (electronic) textual media' (Aarseth, 1997), encompassing, amongst other types,

'Hypertext': which I reserve for writing in link-node structures.

'Programmaton': When forced to make reference to hardware on which the media under discussion are implemented, I prefer this neologism to 'computer'. It is intended to resonate with 'automaton', and implicate senses such as the 'programmed/programmable thing/thinker/thinking-thing/actor'. The reason for wanting to displace 'computer' is that the original meaning of this word was a *human* arithmetic slave who performed difficult computations (because, pre-Babbage, there was no other way to do them). 'Computer' ties the programmaton too tightly to its original uses as a more flexible calculating machine substitute.


The text will be written up as a conventional, structured research paper with notes and bibliography. However, it will be delivered live, with the projected backdrop of a hypertextual version, including 'live demos' of quoted time-based literary objects.



The key arguments of the paper emerge from discussion around the questions: what do we mean by 'digital' as in 'digital media' or simply 'the digital'? What do we mean by 'digitisation'?


Clearly, there is some shift in the characteristics of cultural production, and in the culture itself, when the media of privileged cultural and artistic forms are digitised. In recent times we have become aware of digitisation chiefly in so far as it has reconfigured audio-visual production, where, logistically, technology has only quite recently allowed the previously analogue to be translated to digital. Apparently, a revolution is underway.

But if we consider the relationship of the digital to linguistic and literary media, our view of digitisation and its significance is radically recast.

Whereas in sound and vision (disregarding, for our particular technological moment, the potential digitisation of other sensoria - smell, taste and touch), what we currently see is the *simultaneous* digitisation of compositional and delivery media, in language the history of digitisation is very different.

Arguably, the compositional medium of language is already 'digitised' as alphabetic writing, and has been for the entire history of writing. Once linguistic inscription had been encoded in a small character set (1700-1500 BC in all but the Chinese culture-sphere), an important field of cultural production was already digitised. Literature was transcribed in a medium that is structured in a manageable number of discrete objects. It is, therefore, divisible into well-understood units down to an elementary level, and so easily editable. Moreover, it is editable in such a way that, after redrafting, the fractures and joins of the editing process may be made invisible. These are all features of so-called 'digital' production - now being applied, in particular, to audio-visual material - which are perceived as novel, the 'new' of new media. In fact, they are quite good examples of features found elsewhere in the culture, migrating from literature to the disciplines of image, sound, kinetics, etc., for reasons which are chiefly technical and logistic. What we see today is simply the digitisation of the *delivery* media of language art, along with the 'more novel' digitisation of the audio-visual.

This is not to say that alphabetic transcription is essentially the same as binary representation. However, alphabetic writing already had digital characteristics and, more importantly for the development from 'computing' - calculating projectile and missile trajectories - to net art, the alphabet allowed relatively easy encoding of language-as-text, because of the relatively small number of transcription elements or letters. When computing machines came to be appreciated as more generalised Turing machines, or 'programmatons,' our traditions of writing were surprisingly well-adapted.

The prior 'digitisation' of alphabetic writing in this sense has had massive critical consequences, for the development of npm in particular. Even when only very low bandwidth communications were possible, as in the early days of the internet, it is arguable that the low-bandwidth/high-significance compositional medium of written text -- which was handily 'byte-sized,' i.e. composed from a relatively small number of transcription elements which could be easily encoded in eight bits -- allowed machine-modulated communication to proliferate usefully and finally to set its rhizomes in the mulch of Media and popular culture. Even now, apart from any juggernaut-like progression of audio-visual digitisation, this 'already digital' compositional medium of writing enables what is perhaps the most radical (both senses) construct in our current culture, the collaborative multi-user texts of MOOspace, which may be conceived as artefactual representations of almost any aspect of social interaction which may be mediated by text (and this turns out to be a very broad range). However, the further investigation of collaborative, co-creative textuality is not the focus of the present paper, which will instead return (shortly) to important aspects of literary culture which emerge from its progressive digitisation, or more specifically from the digitisation of the delivery media of language art.
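The 'byte-sized' point above can be illustrated with a minimal sketch (not from the paper): because alphabetic writing uses a small set of transcription elements, each letter of a line of text maps to a single eight-bit value.

```python
# Illustrative sketch: alphabetic text encodes one-to-one into bytes,
# because the character set is small enough to fit in eight bits.
line = "A throw of the dice"
encoded = line.encode("ascii")   # one 8-bit value per transcription element
print(len(line), len(encoded))   # same count: 19 19
print(list(encoded)[:5])         # first five byte values: [65, 32, 116, 104, 114]
```

The same one-byte-per-element economy does not hold for a writing system with tens of thousands of characters, which is the disjuncture the paper's Chinese examples return to.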




There is a somewhat tangential, but necessary argument here which is concerned with the relationship of poststructuralism and digitisation and which entails a critique of hypertext (in so far as hypertext represents a privileged 'ur'-genre of electronic textuality).

The majority of the subversive tropes and figures of electronic text - intertextuality, non-linearity, the 'writerly' text, the nomadic reader and problematized author - are, arguably, functions of the digital characteristics of (especially) alphabetic writing, regardless of delivery medium. Text was always a medium perfectly adapted for the inherently (post)modernist experiments of collage, intercutting and creative plagiarism, ideal for the development of sampling (in the musical sense), framing and linking. Since the emergence of the codex or book format, literature has also been randomly accessible - including through non-linear links in the form of indexes, cross-references, tables of contents and so on. Hypertext on the net (and in other more recondite forms and formats) is seen to represent an 'advanced' version of literacy, 'late literacy' (Bolter) or 'post-literacy' or 'electracy' (Ulmer). Hypertext practice is proposed as a privileged instantiation of post-structuralist critical theory or even, in a sense, as its objective correlative. However, the critical theory in question was developed as a critique of the literary and philosophical tradition, *prior* to the implementation of networked and link-node text. The tropes and figures of hypertext were latent in literacy and not established by its 'advances'. Structuralist and post-structuralist critiques were precisely designed to show how, for example, the supposed voice of the author was a cultural construct, by no means implied by any essential characteristics of writing. Rather, writing itself -- long before its translation to node-link webs -- subverted these constructions, exposed them as intertextual, non-linear, fragmentary, of problematic origin and reception.

However, the Derridean critique through writing of literary culture and Western metaphysics was more profound and generalised, opening out, as Ulmer has demonstrated, to a mode of cultural practice which remains highly appropriate in npm, a practice of inscription on any surface, however complex, "... thus we say 'writing' for all that gives rise to an inscription in general, whether it is literal or not and even if what it distributes in space is alien to the order of the voice: cinematography, choreography, of course, but also pictorial, musical, sculptural 'writing.' ... It is also in this sense that the biologist speaks of writing and *pro-gram* in relation to the most elementary processes of information within the living cell. And, finally, whether it has essential limits or not, the entire field covered by the cybernetic *program* will be the field of writing." (*Of Grammatology*, trans. by G. Ch. Spivak, Baltimore and London: Johns Hopkins, 1976, p. 9)

Moreover, and more pertinently for the present arguments, Derrida's understanding of writing recognizes the significant cultural determinations of the elements of any particular writing system -- its letters -- down to the ultimate elementary reduction of (two-letter) binary representation, without seeking to reinstate a transcendental signified at any point. If literature is to be digitised, it has to face up to and work at the edge of this abyss. Derrida has something of a prior claim to have recognized that we stand or fall at the edge of this abyss, and his view of writing can, for this reason, engage with digitisation as it extends into the delivery media of literacy.




But still, we have not really asked yet what we mean by digitisation. The hypothesis presented here is that digitisation represents two things: 1) translatability, or perhaps a guarantee of translatability (of 'content', and from one medium to another) and 2) programmability, the ability to manipulate (digital) representations easily and seamlessly, and in ways which may also be systematic or algorithmic. With the further implication that programming may be at the service of translation in this and related senses, and that our understanding of translatability is inflected by programming.




(nb. in the paper these two sections will be the most substantial; in this abstract they are severely abbreviated, since the abstract introduces and provides context for the claims the paper will set out.)

Translatability is invoked here as one possible name for what digitisation promises: the ability to 'translate' material and objects in either or both of the compositional or delivery media of a cultural practice into corresponding digital representations.

Furthermore, because a variety of such cultural practices are successively submitting themselves to digitisation, its underlying binary representations come to be seen and used as an ultimate 'pivot code,' capable of provoking, for example, experimental essays in 'translation' across media, art forms, cultures.

Thus while no one confuses digitisation with translation 'via' a transcendental signified, it enables procedures of translation to be attempted with some hope of culturally significant results.

Translatability is also invoked here because it is a key concept in poststructuralist and, for that matter, earlier important critiques of language:

"In effect, the theme of a transcendental signified took shape within the horizon of an absolutely pure, transparent, and unequivocal translatability. In the limits to which it is possible, or at least *appears* possible, translation practices the difference between signifier and signified. But if this difference is never pure, no more so is translation, and for the notion of translation we would have to substitute a notion of *transformation*: a regulated transformation of one language by another, of one text by another." (Jacques Derrida. Positions. Translated and annotated by Alan Bass. London: The Athlone Press, 1981 (original, Les Editions de Minuit, 1972), p. 20, from the interview with Julia Kristeva, 'Semiology and Grammatology'.)

"Translation attains its full meaning in the realization that every evolved language ... can be considered as a translation of all the others. By the relation ... of languages as between media of varying densities, the translatability of languages is established.

Translation is removal from one language into another through a continuum of transformations. Translation passes through continua of transformation, not abstract areas of identity and similarity." (Walter Benjamin, 'On Language as Such and on the Language of Man,' in One-Way Street, London: Verso, 1979, p. 117, my emphases;

I'm reading the latter part of this sentence as negating abstracted meanings (?transcendental signifieds?) derived from identities or similarities and not identity or similarity per se.)


Both these quotes are striking for 1) their problematized (and problematizing) identification of something which allows at least the appearance of translation, and 2) the recognition that translation, in this context, becomes a matter of process, of regulated (?programmed) and reiterated transformation.

This paper's argument continues by suggesting that 'translation' is, in fact, a flawed concept, irredeemably compromised by the spectre of the transcendental signified (or explicitly, in Benjamin's case, the logos). Taking a hint from the transformational processes which both thinkers find emergent when the problem of translation is addressed, and substituting digitisation for the underpinning pivot code, 'transliteration' is put forward as a working concept. This term recognizes the crucial role of the fundamental elements of any system of inscription -- letters -- and suggests that, apart from transcription, writing, the production of literary objects, should be seen as a programmed process.

Whereas translatability has traditionally underpinned writing, permitting writing in the sense of the 'translation' of experiences or (transcendent) meanings into language, transliteration is writing at the edge of the abyss of 'meaningless' 1s and 0s -- the ultimately reduced character set -- and its associated metaphors reconfigure writing practice accordingly.




Specifically, they reconfigure writing as by definition provisional, always subject to reversioning. They indicate that writing may be better understood as programming just as programming is writing, writing recognized as prior and provisional, the detailed announcement of a performance which may soon take place (on the screen, in the mind), an indication of what to read and how.

And because of actual existing digitisation and its npm, the delivery medium of writing is now programmable. It has full Turing programmability. There is no radical break here. While there might seem to be a disjuncture between, say, the printed page and a literary chatterBot or poetic text-generator - in fact there is a continuum. The programmaton and its associated technologies have allowed writers to increase their intervention in the programming of a text *progressively*. When a writer takes over responsibility for the layout and design of the work, what is this but programming? The once-'new,' now-familiar technology of desk-top publishing allows writers to encode programmatic indications of suggested 'ways to read.' A text-generator, designed by the writer, simply takes the programming of such suggested ways to read a few stages further.
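A writer-designed text generator of the kind mentioned above can be sketched minimally (this is an illustration, not any system the paper describes; the lexicon and the seeding are invented here): the writer's 'programming of suggested ways to read' becomes an explicit, repeatable procedure.

```python
import random

# Illustrative sketch of a tiny writer-designed text generator: a
# regulated (seeded, hence repeatable) recombination of a small lexicon.
# The lexicon and grammar are hypothetical, chosen only for illustration.
nouns = ["dice", "abyss", "letter", "screen"]
verbs = ["abolishes", "performs", "inscribes", "translates"]

def generate_line(rng):
    """Compose one line by programmed (random but regulated) choice."""
    return f"the {rng.choice(nouns)} {rng.choice(verbs)} the {rng.choice(nouns)}"

rng = random.Random(1998)   # a fixed seed makes the 'score' re-performable
poem = [generate_line(rng) for _ in range(3)]
print("\n".join(poem))
```

The point of the sketch is continuity, not rupture: the generator only makes algorithmically explicit the kind of reading instructions that layout and typography already encode.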

As an alternative tradition for these developments, there is also a (largely 20th-century) literary history of relevant programmatic engagements with writing, in many ways running parallel with or counter to the canonical literary history of modern and contemporary writing. This is a history of literary experimentation in which the rhetorical potentialities have been more consonant with those of writing in npm than, for example, the existing and potential rhetorics of hypertext. The paper briefly outlines and provides a number of examples of this history from the seminal moment of Mallarmé's 'A Throw of the Dice Will Never Abolish Chance', and also relates it to the parallel history of typographic design and book art.

Finally, the paper addresses and attempts to provide examples of existing text-making in npm which represent some of its potentialities as the delivery media of language art are digitised and so able to exploit the already 'digital' and transliteral effects intrinsic to 'language as such' (as Benjamin puts it). Language art gains access to modes of publication which are time-based: textual movies, texts as movies (already now with us and soon to be tanked into QuickTime and other standards which underlie multi-media development). Language art already encompasses, for example, kinetic text, holographic text (Eduardo Kac), 3D textual worlds (Jeffrey Shaw, Ladislao Pablo Györi) and other literary objects which are experienced as time-based. This art will demand the development and application of new rhetorical tropes and figures to text which has previously been dominated - up to and including the implementation of link-node hypertext - by spatial structuring, by topographic rhetoric (though enclosed within the easily-granted linearities of print and narrative). Cinema will provide the privileged source of metaphors for these figures. The reader may imagine a significant development in the kinds of *textual* transition and montage effects that we see in experimental typographic design, advertising and cinematic titling. These figures should quickly replace the hollow, passionless 'link', allowing time-based text art to emerge with a rich, cinematic rhetoric that is derived from the art of letters rather than exclusively or predominantly from visual art, music, or, as now, by default, from the arbitrary exigencies of the 'human-computer interface'.

Programming will reconfigure the process of Writing and *incorporate* 'programming' in its technical sense, including the algorithms of text generators, textual movies, all the 'performance-design' publication and production aspects of text-making. Writers are always already programmers (coders of inherently provisional scripts, subject to development, implementation and execution) and they must now be prepared to extend and deepen their practice in ways which embrace the continual -- responsible, creative -- reconfiguration of the delivery medium itself.





(These will, in the final versions of the paper, be worked into the arguments above.)

1) translatability and digitisation:

- demonstration of an algorithmic literary object which performs a transliteral 'morph' from an English text -- the translation of a Chinese poem -- to a corresponding romanized version of the text and finally to a (?the) Chinese text.

This demonstrates one particular literary transition effect and also points to a radical cultural disjuncture between the alphabetic digitisation of language, its subsequent binary encoding, and the adaptation of this digitisation to Chinese. The alphabet is a culturally determined digitisation, but do we say the same for binary representation? How should the Chinese writing system have been digitised? Clearly there is no alternative to binary representation, but there will be a great deal of work to be done -- programmatological work, including literary work -- in order to make sense or make meaning of this transliteral disjuncture. Such work will be simultaneously writing and programming.

2) the extralexical:

A related example is Xu Bing's 'Tianshu' or 'A Book from the Sky' -- a book of unreadable words which nonetheless look as if they should be legible Chinese characters. Again, how should this be digitised to reflect its nonlexical and culturally specific textuality? What programs do we set up?

3) more straightforward examples of contemporary or recent experimental writing demonstrating procedural approaches to transcription, translation, transliteration.

- Emmet Williams; Jackson Mac Low; John Cage.

- homeophonic translation (Kelly, Zukofsky, Caddell)

4) programmability: time-based texts and transition effects

- Kinetic text (Kac, Györi, Burgaud)

- Generative: Balpe, Bootz

- Alternative montage; simultaneities: Jim Rosenberg

- textMorphs
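The transliteral morph of item 1 (and the 'textMorphs' listed above) can be given a schematic sketch. This does not reproduce the demo's actual texts or algorithm; the word pair ('river' morphing toward the pinyin romanization 'jiang') is an invented illustration. What it shows is the transliteral principle: a programmed continuum of letter-by-letter transformations rather than a jump between 'equivalent' texts.

```python
# Schematic sketch of a transliteral morph: transform one string into
# another one letter per step, yielding every intermediate text.
# The word pair below is hypothetical, not from the demo itself.
def transliteral_morph(source, target):
    """Yield the continuum of texts between source and target."""
    assert len(source) == len(target), "sketch assumes equal lengths"
    current = list(source)
    yield "".join(current)
    for i, ch in enumerate(target):
        current[i] = ch
        yield "".join(current)

steps = list(transliteral_morph("river", "jiang"))
print(" -> ".join(steps))
```

A full morph across scripts (alphabet to romanization to Chinese characters) would also have to negotiate the unequal granularities of the two writing systems, which is precisely the disjuncture item 1 identifies.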

(Source: DAC 1998 Author's abstract)

Record posted by: 
Jill Walker Rettberg